Hadoop Administrator
InfojiniInc1
Posted: April 25, 2016
Quick Summary
Hadoop Administrator is responsible for managing and maintaining Hadoop infrastructure, ensuring scalability and performance for large-scale data processing and analytics projects.
Job Description
Infojini Consulting is a full-service IT consulting, services, and staffing firm with offices in Linthicum Heights, Maryland; Washington, DC; and Mumbai, India.
Infojini Consulting is recognized as one of the fastest-growing IT services and software development companies. Partnering with all major technology vendors, Infojini Consulting has built a strong government and commercial customer base that includes Fortune 100 companies and many state and federal agencies, such as the States of North Carolina, South Carolina, Maryland, California, Pennsylvania, Virginia, and Washington.
Infojini Consulting is an equal opportunity employer and considers all qualified individuals for employment irrespective of race, gender, age, color, or sexual orientation. We offer an excellent compensation package.
Job Details:
JOB TITLE/ROLE: Hadoop Administrator
DURATION: 6 Months
LOCATION: Raleigh, NC
The administrator will manage scalable Hadoop environments in a large enterprise system, including monitoring clusters and jobs, managing backups of Hadoop data, and optimizing and tuning environments for optimal performance. The ideal candidate will work with developers and other team members to design, develop, document, test, and debug application software and systems that contain logical and mathematical solutions. The role also involves conducting multidisciplinary research and collaborating with equipment designers and/or hardware engineers in the planning, design, development, and utilization of electronic data processing systems for product and commercial software.
Basic Qualifications:
• Bachelor's degree and a minimum of 10 years of relevant IT experience.
• 14 years of experience will be accepted in lieu of a degree.
• Experience developing with Java in a Linux environment.
• Minimum of 5 years of experience with SQL.
• Minimum of three (3) years of experience designing, developing, and administering the Hortonworks distribution of Hadoop.
• Experience with Sqoop and Flume.
• Experience with the design and development of applications for use in distributed computing environments.
• Experience with design, development, and testing of cloud technology architectural components supporting analytic modernization.
• Must be able to obtain a public trust security clearance.
All your information will be kept confidential according to EEO guidelines.