Hadoop Consultants
Atria Group LLC
Posted: July 4, 2014
Quick Summary
Participate in the pre- and post-sales process, helping both the sales and product teams interpret customers’ requirements; analyze complex distributed production deployments and make recommendations to optimize performance.
Job Description
Responsibilities:
• Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements
• Work directly with prospective customers’ technical resources to devise and recommend solutions based on the understood requirements
• Analyze complex distributed production deployments, and make recommendations to optimize performance
• Work closely with account teams at all levels to ensure rapid response to customer questions and project blockers
• Help develop reference Hadoop architectures and configurations
• Drive POCs with customers to successful completion
• Write and produce technical documentation and knowledge-base articles
• Attend speaking engagements when needed
• 100% travel, Monday–Thursday; travel expenses reimbursed
Qualifications:
• 3+ years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
• 2+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
• Ability to understand and translate customer requirements into technical requirements
• Strong experience implementing software and/or solutions in enterprise Linux or Unix environments
• Ability to compile and install Linux applications from source, the Linux kernel and kernel modules
• Experience integrating components such as LDAP or system/installation management tools into the overall solution
• Strong understanding of network configuration, devices, protocols, speeds and optimizations
• Proficiency in Python, Perl, or another scripting language (required)
• Familiarity with the Java ecosystem and enterprise offerings, including debugging and profiling tools (jconsole), logging and monitoring tools (log4j, JMX), and security offerings (Kerberos/SPNEGO).
• Significant prior experience writing to network-based APIs, preferably REST/JSON or XML/SOAP
• Solid background in database administration or design; Oracle RAC a plus
• Excellent verbal and written communication skills
• Experience in architecting data center solutions – properly selecting server and storage hardware based on performance, availability and ROI requirements
• Demonstrable experience using R and the algorithms provided by Mahout
Nice-to-have (not required) experience:
• Experience working with Apache Hadoop, including knowledge of how to create and debug Hadoop jobs
• Ability to understand big data use-cases, and recommend standard design patterns commonly used in Hadoop-based deployments.
• Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
Duration: 8 weeks, with possible extension
Submit your resume today!