
Sr Hadoop / Big Data Developer - Location: San Francisco, CA


2016-12-02 13:15:09
Job Type: Full-Time only
Budget: $100,000 - $200,000


Qualifications Include:

  • Minimum of 5 years of hands-on experience with the Hadoop and Big Data stack using Python, Java, or Scala. Software development experience, ideally in Big Data technologies.
  • Experience with the AWS services stack. Ideally, an open-source contributor or committer to Spark/Hadoop, or a PMC member of an Apache-level project.
  • Experience with SQL RDBMSs like SQL Server, Oracle, and MySQL, or MPP databases like Vertica, Teradata, and Netezza.
  • Experience working with large data sets; experience with distributed computing a plus (MapReduce, Hadoop, Hive, Apache Spark, etc.).
  • Experience extracting data from multiple structured and unstructured feeds by building and maintaining scalable ETL pipelines on distributed software systems.
  • Programming experience in Java and Scala; Python a plus.
  • Knowledge of Hadoop or similar data processing frameworks (such as EMR or Spark) and a good understanding of optimization techniques.
  • Understanding of NoSQL data stores, messaging or pub-sub queuing systems and data processing frameworks.
  • Working knowledge of computer algorithms.
  • Excellent communication skills with both technical and non-technical audiences.
  • You are curious, have a research mindset, and love bringing logic and structure to loosely defined, unstructured problems and ideas.
  • You hold yourself and your colleagues to a high bar, and take great pride in your attention to detail.
  • You inspire us to aim higher.
  • You are resilient, passionate, and creative.
  • You'll work on a small, cross-functional agile team to design and ship features and functionality as rapidly as possible.
Additional Skills:
  • Experience with cloud platforms such as Amazon AWS and Microsoft Azure.
  • Work experience in Hadoop stack, ETL and Data Warehousing.
  • Experience with software development & delivery in a SaaS and PaaS environment.
  • Experience working with large datasets, relational databases (SQL), and distributed systems using Python.
  • Experience developing web services, APIs or data libraries.
  • The ideal candidate will be smart: a top-performing student. Curious.
  • You ask why, you explore, you're not afraid to introduce and defend a crazy idea.
  • Data savvy: you know how to move data around, from a database or an API, through a transformation or two, into a model, and into human-readable form (ROC curve, chart, map, visualization, etc.).
  • You know Python, Java, R, C/C++, Spark, Storm, Julia, or SQL, or you think everything can be done in a SQL one-liner.
  • You have a bias toward action, you try things, and sometimes you fail.
  • Expect to tell us what you've shipped and what's flopped.

