Senior System Administrator Big Data, relocation to Hamburg, Germany
GMSServices
Posted: April 30, 2015
Quick Summary
We are looking for a System Administrator with strong DevOps skills and broad knowledge of Linux and Big Data. The ideal candidate will work with our Big Data team, collaborating with hardware and software providers and handling administration tasks, configuration management, and scripting.
Job Description
We are looking for a System Administrator with strong DevOps skills and broad knowledge of Linux and Big Data. The role is split between our Big Data team, which is already well established in software development and data, and our IT engineering area, which still needs support. The responsibilities are wide-ranging: collaborating with our hardware and software providers, handling administration tasks, and managing configuration and scripting. The position also acts as the go-between for all special Big Data requirements and the conventions and needs of our central IT unit. In addition to working in the Big Data team, the incumbent will work closely with our other administrators and development operators.
Your responsibilities:
• Work in a self-organized, cross-functional team;
• Develop and expand our Big Data infrastructure together with the team;
• Co-design enterprise-wide IT architecture;
• Evaluate and integrate new technologies;
• Optimize our infrastructure using efficient automation processes;
• Work closely with the development and technical service departments;
• Attend to all aspects of our Hadoop Ecosystem.
Your skills:
• Several years’ experience developing and operating Linux-based platforms, with very good knowledge of Linux;
• Successful track record of developing and maintaining Big Data systems;
• Experience with MapReduce, ZooKeeper, HDFS, Impala, Hive, Sqoop;
• Very good knowledge of ETL principles and how to implement them in combination with Hadoop;
• Ability to write shell scripts fluently and strong familiarity with configuration management (e.g. Opscode Chef);
• Ability to read Java code and basic knowledge of PHP, Ruby, Python, Perl and/or JavaScript;
• Ability to see the big picture, with a mind for creative ideas and solutions;
• Team-player who enjoys sharing knowledge;
• Fluent in spoken English;
• Asset: Experience with work-flow scheduling and monitoring (e.g. Oozie);
• Asset: Experience handling data flows (e.g. Fluentd/td-agent, Flume, Kafka);
• Asset: Fluent or native German.
We offer:
• Lots of data to be managed;
• Possibility to work with new technologies;
• A fresh, open-minded and international atmosphere in a growing company;
• Freedom to realize your own ideas and initiatives;
• Team-work and short decision-making processes;
• Extensive reach, with over 300 million users worldwide;
• Attractive location in downtown Hamburg;
• Games room with Foosball table and game consoles.