
Hadoop ETL Lead Architect


Posted: 2017-03-17
Job Type: Contract

Job Description

Location: San Jose, CA

Duration: 6 months+

Rate: Open

As our Data Engineering Tech Lead Architect, you'll be a trusted member of our team with the following responsibilities:

  • Build and lead a team of 5 to 10 developers to design, develop, maintain and support data engineering solutions on time and within budget for all Consumer Digital Technology capabilities.
  • Manage the intake, prioritization, assignment and fulfillment of development projects within Digital Technology.
  • Research and deploy new tools, processes and technologies to meet business demand.
  • Collaborate with Project Managers, Product Managers, QA teams and Business SMEs to ensure delivered solutions optimally support the achievement of business outcomes.
  • Work across a number of projects and bridge functional/technical gaps with Product Managers and business stakeholders.
  • Lead the developers through design and implementation decisions to achieve balance between strategic design and tactical needs.
  • Drive the development and enforcement of development and integration standards, patterns, and processes.

Responsibilities:

  • Design and implement MapReduce jobs to support distributed processing using Java, Python, Hive, and Pig; ability to design and implement end-to-end solutions.
  • Build libraries, user-defined functions, and frameworks around Hadoop
  • Research, evaluate, and utilize new technologies/tools/frameworks around the Hadoop ecosystem
  • Develop user-defined functions (UDFs) to provide custom Hive and Pig capabilities (see the sketch after this list)
  • Mentor junior developers in the team
  • Define and build data acquisition and consumption strategies
  • Define & develop best practices
  • Work with support teams in resolving operational & performance issues
  • Work with architecture/engineering leads and other teams on capacity planning
  • Work with Site-Operations team on configuration/upgrades of the cluster
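
Since the responsibilities above include providing custom Hive and Pig capabilities, here is a minimal sketch of a classic Hive UDF in Java. The class name TrimLowerUDF and the normalization it performs are illustrative assumptions, not details from this posting:

    // A minimal Hive UDF sketch; the class name and behavior are
    // illustrative assumptions, not requirements from this posting.
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class TrimLowerUDF extends UDF {
        // Normalizes a string column: trims surrounding whitespace
        // and lowercases the result. Returns null for null input.
        public Text evaluate(Text input) {
            if (input == null) {
                return null;
            }
            return new Text(input.toString().trim().toLowerCase());
        }
    }

Once packaged into a jar, a UDF like this is registered from HiveQL with ADD JAR and CREATE TEMPORARY FUNCTION, after which it can be called like any built-in function.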

Qualifications

What We're Looking For

We're looking for someone special, someone who has had these experiences and clearly demonstrated these skills:
  • MS/BS degree in a computer science field or related discipline
  • 10+ years of experience in large-scale software development
  • 1+ year of experience with Hadoop
  • Strong Java programming, shell scripting, Python, and SQL skills
  • Strong development skills around Hadoop, MapReduce, Hive, Pig, HBase, Flume & Oozie (a minimal MapReduce sketch follows this list)
  • Strong understanding of Hadoop internals
  • Good understanding of Avro and JSON
  • Experience with build tools such as Maven
  • Experience with databases such as Oracle
  • Experience with performance/scalability tuning, algorithms and computational complexity
  • Experience with data warehousing, dimensional modeling and ETL development
  • Ability to understand ERDs and relational database schemas
  • Proven ability to work with cross-functional teams to deliver appropriate resolutions
  • Experience with open-source NoSQL technologies such as HBase and Cassandra
  • Experience with messaging & complex event processing systems such as Kafka and Storm
  • Experience with machine learning frameworks (nice to have)
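
As a concrete anchor for the MapReduce skills listed above, here is a minimal word-count job written against the standard Hadoop MapReduce Java API. It is a sketch for orientation, not code from this employer:

    // Canonical word-count job: the mapper emits (word, 1) pairs and the
    // reducer sums counts per word. Standard Hadoop MapReduce API, Java.
    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        public static class TokenizerMapper
                extends Mapper<Object, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            @Override
            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Emit (token, 1) for every whitespace-separated token.
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        public static class IntSumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            @Override
            public void reduce(Text key, Iterable<IntWritable> values,
                               Context context)
                    throws IOException, InterruptedException {
                // Sum all counts emitted for this word.
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Packaged into a jar, it runs with hadoop jar wordcount.jar WordCount <input> <output>.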

