Big Data Architect
KRG Technologies Inc
Posted: October 20, 2016
Quick Summary
Design and implement large-scale data processing and storage systems, working with multiple technologies and cross-functional teams to deliver scalable and efficient solutions.
Job Description
Ajith
KRG Technologies Inc
ajith at krgtech.com
661-367-8000 EXT 310
Job Title: Big Data Architect
Location: Bethesda, MD
Duration: Contract
Job Description:
Professional Experience Required
• 12-15 years of experience in designing, architecting and implementing large scale data processing/data storage/data distribution systems
• Extensive experience working with large data sets with hands-on technology skills to design and build robust Big Data solutions
• Experience deploying Big Data technologies to production
• Ability to work with multi-technology/cross-functional teams and customer stakeholders to guide and manage the full life cycle of a Hadoop solution
• Extensive experience in data modeling and database design involving any combination of:
• Data warehousing and Business Intelligence systems and tools
• Relational and MPP database platforms like Netezza, Teradata
• Open source Hadoop stack
• Hands-on administration, configuration management, monitoring, performance tuning of Hadoop/Distributed platforms
• Strong understanding of Big Data Analytics platforms and ETL in the context of Big Data
• Ability to frame architectural decisions, provide technology leadership & direction
• Excellent problem solving, hands-on engineering skills and communication skills
Professional Experience Preferred
• Understanding of Lambda design architectures and real-time streaming
• Knowledge/experience of cloud computing infrastructure (e.g. Amazon Web Services EC2, Elastic MapReduce, Azure)
• Considerable understanding of and experience with real-time analytics
• Should be willing to independently code and review modules as required (role is not limited to the solution-architecture level)
Technical Skills Required
Any combination of the technical skills below:
• Big Data: HDFS, MapReduce, Hive, HBase, Pig, Mahout, Avro, Oozie
• Processing & Streaming: Spark, Storm, Samza, Kafka
• NoSQL: Cassandra, MongoDB, HBase
• Appliances: Teradata, Netezza, Greenplum, Aster Data, Vertica
• Languages & Platforms: Java, Scala, Linux, Apache, Perl/Python/PHP