Sr Hadoop Developer

Costa Mesa, CA
Posted: 2017-09-01 10:14:17
Job Type: Contract

12 month contract


Responsibilities
• Design a generic framework for high throughput and streaming data ingestion, curation and linkage for the Experian Data Fabric – a Data as a Service platform intended to be configured/extended for deployments across the globe
• Develop high-throughput and streaming data pipelines using the Experian Lambda Architecture; internal bulk sources include server logs, file uploads, database logs, relational sources, and CDC; streaming sources include web sockets, HTTP client callbacks, and Kafka events (a minimal ingestion sketch follows this list)
• Develop a rules-driven data transformation layer that can be rapidly configured and deployed at business units and geographies globally
• Build components for standardization, data curation, identity resolution and end-to-end traceability at dataset and record levels
• Develop data provisioning components to deliver data for operational analytics as well as research purposes
• Develop ultra-low-latency microservices to serve data from the State Container within 10 milliseconds
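
For illustration only, the following is a minimal sketch of the kind of streaming ingestion pipeline described above, assuming Spark Structured Streaming with a Kafka source; the broker address, topic name, and output paths are hypothetical placeholders, not Experian systems.

import org.apache.spark.sql.SparkSession

object StreamingIngestSketch {
  def main(args: Array[String]): Unit = {
    // Assumes a Spark 2.x+ runtime with the spark-sql-kafka connector on the classpath.
    val spark = SparkSession.builder()
      .appName("streaming-ingest-sketch")
      .getOrCreate()

    // Read raw events from a Kafka topic as an unbounded streaming DataFrame.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // hypothetical broker
      .option("subscribe", "ingest-events")                 // hypothetical topic
      .load()

    // Kafka delivers key/value as binary; cast the payload to a string for downstream curation.
    val curated = events.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

    // Persist the curated stream to a durable sink with checkpointing for fault tolerance.
    val query = curated.writeStream
      .format("parquet")
      .option("path", "/tmp/fabric/curated")            // hypothetical output location
      .option("checkpointLocation", "/tmp/fabric/ckpt") // hypothetical checkpoint directory
      .start()

    query.awaitTermination()
  }
}

In a production deployment the parquet sink would be replaced by the curation, identity-resolution, and State Container components named above.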

Qualifications
• 5+ years hands-on experience as Software Developer/Engineer
• 4+ years Java/Scala development background
• 1+ year of experience working with native MapReduce and/or Spark
• Desired - 2+ years data engineering/ETL development experience working with data at scale
• Desired - 2+ years of experience with Heroku or Docker
• 1+ year of experience across one or more of the following: Hadoop, MapReduce, HDFS, Cassandra, HBase, Hive, Flume, Sqoop, Spark, Kafka, etc.
• Bachelor's degree in Computer Science, Engineering or quantitative discipline


Position Description
• Designs, develops, tests, and evaluates software and systems that enable computers to perform their applications, applying principles and techniques of computer science, engineering, and mathematical analysis.
• This role excludes paraprofessional positions and requires a degree in software engineering design and development.
• Researches, designs, and develops computer software systems, in conjunction with hardware product development, applying principles and techniques of computer science, engineering, and mathematical analysis.
• Analyzes software requirements to determine the feasibility of a design within time and cost constraints.
• Consults with hardware engineers and other engineering staff to evaluate the interface between hardware and software and the operational and performance requirements of the overall system. Formulates and designs the software system, using scientific analysis and mathematical models to predict and measure the outcome and consequences of the design.
• Develops and directs software system testing procedures, programming, and documentation.
• Consults with customers concerning maintenance of the software system.
• May coordinate installation of the software system.
• 12+ years of experience required.



Key Skills:
Hadoop, Hive, Spark, Java, Scala, Sqoop, Cassandra