Hadoop/Scala Developer


Posted: 2017-08-01
Job Title: Hadoop/Scala Developer

Location: Maryland Heights, MO

Job Type: Contract to Hire

Duration: 6 months+

Rate: Open

Required Skills:

  • Scala and Spark.
  • 8-10 years of hands-on experience in handling large-scale software development and integration projects.
  • 2+ years of experience administering Hadoop cluster environments and the surrounding tools ecosystem: Spark, Spark Streaming, Sqoop, HDFS, Kafka, ZooKeeper
  • Experience with Java, Python, Pig, Hive, or other languages a plus

Description:

  • Experience working with general Hadoop; the client is using the Hortonworks distribution
  • Apache Spark development and troubleshooting experience
  • Candidates need hands-on implementation experience and a Java background
  • Ability to use Scala and Spark for general data transformation (extract, transform, load, cleanse); a sketch of this kind of job follows this list
  • Experience with Hadoop formats and tools (Hive, HDFS)
  • Experience with real-time data ingestion using Kafka
  • Experience with standard development IDEs such as Eclipse or similar
  • Agile and Waterfall experience
  • Ability to work with little guidance (very independent)
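
For illustration only, the following is a minimal sketch of the kind of Scala + Spark batch transformation referenced in this list (extract, transform/cleanse, load). The file paths, column names, and the "customers" dataset are hypothetical assumptions, not details from this posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    // Minimal extract-transform-load sketch in Scala + Spark.
    // Paths, column names, and the "customers" dataset are hypothetical.
    object CustomerEtl {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("customer-etl")
          .getOrCreate()

        // Extract: read raw CSV records from HDFS
        val raw = spark.read
          .option("header", "true")
          .csv("hdfs:///data/raw/customers.csv")

        // Transform/cleanse: drop rows missing a key, normalize fields, de-duplicate
        val cleansed = raw
          .na.drop(Seq("customer_id"))
          .withColumn("email", lower(trim(col("email"))))
          .dropDuplicates("customer_id")

        // Load: write the curated data back to HDFS as Parquet, partitioned by state
        cleansed.write
          .mode("overwrite")
          .partitionBy("state")
          .parquet("hdfs:///data/curated/customers")

        spark.stop()
      }
    }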

Summary:

Responsible for the design, development, and implementation of big data projects using Spark and Scala. Resolves issues regarding development, operations, implementation, and system status.

Major Duties and Responsibilities

  • Strong knowledge of Hadoop architecture and its implementation.
  • Strong understanding of best practices in Scala coding on large-scale Hadoop clusters.
  • Proficiency with the software development lifecycle (SDLC).
  • Solid knowledge of the programming language(s), application server, database server, and/or architecture of the system being developed.
  • Good communication skills and a problem-solver mentality.
  • Solid understanding of current programming languages, employing any or all of them to solve the business needs of the client's internal customers.
  • Strong professional functional programming experience using Scala and Java.
  • Strong experience in Scala (functions, generics, implicits, collections) or other functional languages.
  • Excellent understanding of data engineering concepts.
  • Experience working with Spark for data manipulation, preparation, and cleansing
  • Experience across the Hadoop ecosystem, including HDFS, Hive, YARN, Flume, Oozie, Cloudera Impala, ZooKeeper, Hue, Sqoop, Kafka, Storm, Spark, and Spark Streaming, as well as NoSQL database knowledge (a streaming-ingestion sketch follows this list)
  • Good knowledge of Windows/Linux/Solaris operating systems and shell scripting
  • Strong desire to learn a variety of technologies and processes with a "can do" attitude
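
As a rough illustration of the real-time ingestion work mentioned above, here is a minimal Spark Structured Streaming sketch that reads from Kafka and lands records in HDFS. The broker address, topic name, and output/checkpoint paths are hypothetical, and the example assumes the spark-sql-kafka connector is on the classpath.

    import org.apache.spark.sql.SparkSession

    // Minimal Kafka-to-HDFS ingestion sketch with Spark Structured Streaming.
    // Broker address, topic, and paths are hypothetical placeholders.
    object KafkaIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kafka-ingest")
          .getOrCreate()

        // Subscribe to a Kafka topic; records arrive as binary key/value columns
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "events")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload", "timestamp")

        // Land the raw payloads in HDFS as Parquet, checkpointing progress for fault tolerance
        val query = events.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/streaming/events")
          .option("checkpointLocation", "hdfs:///checkpoints/events")
          .start()

        query.awaitTermination()
      }
    }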

Required Qualifications:

Skills / Abilities and Knowledge

  • Ability to read, write, speak and understand English.
  • Ability to communicate orally and in writing in a clear and straightforward manner
  • Ability to communicate with all levels of management and company personnel
  • Ability to handle multiple projects and tasks
  • Ability to make decisions and solve problems while working under pressure
  • Ability to prioritize and organize effectively
  • Ability to show judgment and initiative and to accomplish job duties
  • Ability to use a personal computer and software applications (e.g., word processing, spreadsheets, etc.)
  • Ability to work independently
  • Ability to work with others to resolve problems, handle requests or situations
  • Ability to effectively consult with department managers and leaders

Preferred Qualifications:

  • Experience working with an RDBMS and Java
  • Exposure to NoSQL databases such as MongoDB, Cassandra, etc.
  • Experience with cloud technologies (AWS)
  • Certification in Hadoop development is desired

