Big Data Solutions Engineer
Jobsbridge1
Posted: February 24, 2016
Quick Summary
Build distributed, scalable, and reliable data pipelines that ingest and process large amounts of data in Big Data environments, leveraging expertise in Hadoop, Flume, and Storm for efficient data processing.
Required Skills
Job Description
Jobs Bridge Inc is among the fastest-growing IT staffing / professional services organizations, with its own job portal.
Jobs Bridge works extremely closely with a large number of IT organizations in the most in-demand technology skill sets.
Skills: Hadoop, Big Data, Big Data Engineer, Flume, Storm, Hive
Location: Bridgewater, NJ
Total Experience: 2 yrs.
Max Salary: Not Mentioned
Employment Type: Direct Jobs (Full Time)
Domain: Any
Description
OPT and EAD candidates can apply.
Desired Skills and Experience
Job Responsibilities:
Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real-time.
Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases.
Perform offline analysis of large data sets using components from the Hadoop ecosystem.
Evaluate and advise on technical aspects of open work requests in the product backlog with the project lead.
Own product features from the development phase through to production deployment.
Evaluate big data technologies and prototype solutions to improve our data processing architecture.
Candidate Profile:
BS in Computer Science or related area
Software development experience
Minimum 2 years of experience on a Big Data platform
Proficiency with Java, Python, Scala, HBase, Hive, MapReduce, ETL, Kafka, MongoDB, PostgreSQL, visualization technologies, etc.
Flair for data, schemas, and data models, and for bringing efficiency to the Big Data life cycle
Understanding of automated QA needs related to Big Data
Understanding of various visualization platforms (Tableau, D3.js, others)
Proficiency with agile or lean development practices
Strong object-oriented design and analysis skills
Excellent technical and organizational skills
Excellent written and verbal communication skills
Top skill sets / technologies in the ideal candidate:
* Programming languages -- Java (must), Python, Scala, Ruby
* Batch processing -- Hadoop MapReduce, Cascading/Scalding, Apache Spark
* Stream processing -- Apache Storm, Akka, Samza, Spark Streaming
* NoSQL -- HBase, MongoDB, Cassandra, Riak
* ETL tools -- DataStage, Informatica
* Code/Build/Deployment -- git, hg, svn, maven, sbt, jenkins, bamboo
Technologies that we use include:
R
Java
Hadoop/MapReduce
Flume
Storm
Kafka
MemSQL
Pig
Hive
Tableau Integration
ETL
Multiple Openings