Big Data Architect
INTELLISWIFTINC
Posted: August 7, 2015
Quick Summary
Big Data Architect with 10+ years of experience in designing and implementing scalable data integration solutions, leveraging Hadoop ecosystem components and expertise in data analytics and reporting.
Job Description
Good understanding of the SDLC process and experience working in an agile environment.
Minimum 10 years of experience working on data warehouse and integration solutions.
Experience in data analytics with Big Data.
Good working experience with data integration and warehousing concepts: dimensional and relational models, ETL tools, and reporting.
Good experience integrating multiple Big Data solutions with legacy database systems.
Experience processing large volumes of structured and unstructured data.
Expert-level knowledge of Hadoop ecosystem components: Hadoop, MapReduce, Pig, Hive, Solr, Elasticsearch, Spark, Kafka, Storm, Falcon, Oozie, HAWQ, GemFire XD, etc.
Expert-level knowledge of one or more NoSQL databases: HBase, Cassandra, MongoDB.
Advanced skills in one or more scripting languages (e.g., Python, UNIX shell scripts).
Ability to quickly understand business problems and find patterns and insights.
Ability to quickly learn new technologies and work effectively in a very dynamic environment.
Proven ability to build, manage, and foster a team-oriented environment.
Proven ability to work creatively and analytically in a problem-solving environment.
Excellent communication (written and oral) and interpersonal skills.
Excellent leadership and management skills.
Local candidates preferred but not mandatory.