Big Data
SonsoftInc
Posted: February 15, 2017
Quick Summary
Developing software applications using a variety of technologies, including Java and Scala, with a focus on data processing, API integration, and system architecture.
Job Description
Sonsoft Inc. is a US-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in Software Development, Software Consultancy, and Information Technology Enabled Services.
• Background in all aspects of software engineering, with strong skills in parallel data processing, data flows, REST APIs, JSON, XML, and microservice architecture.
• Must have strong programming knowledge of Core Java or Scala - Objects & Classes, Data Types, Arrays and String Operations, Operators, Control Flow Statements, Inheritance and Interfaces, Exception Handling, Serialization, Collections, Reading and Writing Files.
• Must have hands-on experience in the design, implementation, and build of applications or solutions using Core Java/Scala.
• Strong understanding of Hadoop fundamentals.
• Must have experience working on Big Data Processing Frameworks and Tools – MapReduce, YARN, Hive, Pig.
• Strong understanding of RDBMS concepts; must be able to write SQL and interact programmatically with RDBMS and NoSQL databases such as HBase.
• Strong understanding of Hadoop file formats, including Parquet.
• Proficient with application build and continuous integration tools – Maven, SBT, Jenkins, SVN, Git.
• Experience working in Agile environments and with the Rally tool is a plus.
• Strong understanding of, and hands-on programming/scripting experience with, UNIX shell, Python, Perl, and JavaScript.
• Should have worked on large data sets, with experience in performance tuning and troubleshooting.
Preferred
• Knowledge of Java Beans, Annotations, Logging (log4j), and Generics is a plus.
• Knowledge of Design Patterns - Java and/or GOF is a plus.
• Knowledge of Spark, Spark Streaming, Spark SQL, and Kafka is a plus.
• Experience in the financial domain is preferred.
• Experience with, and a desire to work in, a global delivery environment.
• Bachelor’s degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
• At least 5 years of design and development experience in Big Data, Java, or data warehousing technologies.
• At least 3 years of hands-on design and development experience with Big Data technologies – Pig, Hive, MapReduce, HDFS, HBase, YARN, Spark, Oozie, Java, and shell scripting.
• Should be a strong communicator, able to work independently with minimal involvement from client SMEs.
• Should be able to work in a team in a diverse, multi-stakeholder environment.
** U.S. citizens and those authorized to work in the United States are encouraged to apply. We are unable to sponsor visas at this time.
Note:
1. This is a full-time, permanent job opportunity.
2. Only US Citizens, Green Card holders, and GC-EAD, H4-EAD, and L2-EAD candidates can apply.
3. No OPT-EAD, H1B, or TN candidates, please.
4. Please mention your visa status in your email or resume.