
Hadoop Developer

Charlotte, Pennington, NY
2017-10-06 18:38:41
Job Type: Full Time only
Budget: $100,000 - $200,000

GC/USC only

• At least 10 years of experience in relational databases and SQL programming
• Minimum 5 years of experience working in a large-scale data warehouse environment
• Extensive knowledge of the Hadoop stack and storage technologies (HDFS, YARN, HBase, Hive, Sqoop, Impala, Flume, Kafka, and Oozie) and solid experience with parallel processing using Spark, MapReduce, and Hadoop YARN
• In-depth understanding of Hadoop architecture, data modeling, and performance tuning to support very large data volumes
• Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
• Experience with real-time messaging technologies such as Kafka or AMPS, RESTful APIs, and NoSQL technologies (Cassandra)
• Knowledge of Java/J2EE
• Experience with data warehouse concepts
• Must have experience with big data applications for a banking or financial organization
Good to Have:
• Experience with Big Data analytics and business intelligence using industry-standard tools integrated with the Hadoop ecosystem (R, Python)
• Knowledge of visual analytics tools (Tableau)
• Data integration and data security on the Hadoop ecosystem (Kerberos)
• Awareness of or experience with data lakes on the Cloudera ecosystem



Key Skills: