Senior Hadoop Developer (1132)

Irving, TX
2017-06-22 11:02:50
Job Type: Contract to Hire

Job description:

The Senior/Lead Hadoop Developer is responsible for designing, developing, testing, tuning and building a large-scale data processing system for Data Ingestion and Data products that allow HMS to improve the quality, velocity and monetization of our data assets for both Operational Applications and Analytical needs. This position supports this goal with strong experience in software engineering and the development of solutions within the Hadoop Ecosystem.

• Responsible for the design, development and delivery of data from operational systems and files into ODSs (operational data stores), downstream Data Marts, and files.

• Troubleshoot and develop on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark, and Impala, as well as Hadoop ETL development via tools such as Informatica.

• Translate, load and present disparate data sets in multiple formats and from multiple sources, including JSON, Avro, text files, Kafka queues, and log data.

• Implement quality logical and physical ETL designs optimized to meet the operational performance requirements of our multiple solutions and products; this includes the implementation of sound architecture, design, and development standards.

• Has the experience to design the optimal performance strategy and manage the technical metadata across all ETL jobs.

• Responsible for building solutions involving large data sets using SQL methodologies and Data Integration tools like Informatica in any database, preferably on an MPP platform.

• Has strong Core Java programming experience to apply in Data Integration.

• Works with BAs, end users and architects to define and process requirements, build code efficiently, and collaborate with the rest of the team on effective solutions.

• Deliver projects on time, to specification, and with quality.

• 8 years' experience managing data lineage and performing impact analyses.

• 5 years' experience with ETL tool development.

• 4 years' experience with the Hadoop ecosystem.

• Experience working in Data Management projects.

• Experience working with Hive or related tools on Hadoop: performance tuning, file formats, designing and executing complex Hive HQL queries, and data migration/conversion.

• Experience working with Spark for data manipulation, preparation, and cleansing.

• Experience working with ETL tools (Informatica/DS/SSIS) for data integration.

• Experience designing and developing automated analytic software, techniques, and algorithms.

• Ability to handle multiple tasks and adapt to a constantly changing environment.

• Self-starter with the ability to work independently and take initiative. Ability to translate ideas and business requirements into fully functioning ETL workflows.

• Strong analytical and problem-solving skills.

• Excellent written and oral communication skills, with the ability to articulate and document processes and workflows for use by individuals of varying technical abilities.

• Excellent organizational skills.

• Knowledge of the healthcare industry is a plus.

 

Minimum Education:

• MS/BS in Computer Science, Information Systems, or a related field preferred, and/or equivalent experience.

• Ability to apply mastery-level knowledge of one of the relational databases (DB2, MSSQL, Teradata, Oracle 8i/9i/10g/11g).

• Ability to apply mastery-level knowledge of one of the Data Integration tools (Informatica, SSIS).

• Expert ability and hands-on experience in SQL and Core Java a must.

• Experience with Unix/Linux and shell scripting.

• Ability to demonstrate experience in distributed UNIX environments.

• Ability to work both independently and in a collaborative environment.

• Excellent problem-solving, communication, and interpersonal skills.

• Ability to analyze information and use logic to address work-related issues and problems.

• Ability to demonstrate proficiency in Microsoft Access, Excel, Word, PowerPoint, and Visio.

• Ability to present to a group.

• Experience working in an Agile/DevOps environment a plus.

• Experience with or knowledge of web architecture (JavaScript, SOAP/XML, WebLogic, Tomcat) is a plus.

• Experience with an ORM framework, SOA architecture, or Microservices is a plus.

• Experience with middleware components (ESB, API Gateway) is a plus.

 


Key Skills:
HDFS, Hive, Pig, Flume, HBase, Spark, Impala, Hadoop ETL development, Hadoop ecosystem