
Hadoop / Big Data Architect | Location: San Carlos, CA | Full Time | No Sponsorship | Salary: OPEN


Job Type: Direct

Overview
•Purpose of the Team:
o Provide strategic direction and expertise for all of the client's Big Data environments.
o Provide hands-on Big Data support, thought leadership and technology roadmaps to support the client's vision and evolution.
•Purpose of the Job:
o Assume a lead position in providing strategic direction, evaluation and recommendation of Big Data products to support the client's current and future vision of Big Data.
o Represent Big Data Engineering alongside business partners to provide expertise that supports significant M&A, strategic and new technology infrastructure.
o Create process frameworks and lead the design and implementation of large-scale data migration and data processing on Hadoop using massively parallel processing programming techniques.
•Who would be the best fit for this role:
o What technologies do you have hands-on experience with, and to what degree?
o What experience do you have acting as a change agent for the adoption, integration and exploitation of Big Data technologies to deliver tangible value to the business?
o What experience do you have leading and influencing mission-critical initiatives that are strategically important to highly visible business objectives?

Role
Essential responsibilities of the position:
•Work directly with application and development teams to fully understand all requirements for project and ad hoc requests, and provide accurate estimates.
•Provide technical leadership to a team of subject matter experts supporting mission-critical Big Data environments for highly available and contingent platforms.
•Create and maintain documentation and define best practices in support of proper change management, security and operational requirements.
•Investigate and understand new technologies and additional capabilities of current platforms that support the client's vision of Big Data, leading their delivery and adoption.
•Mentor and develop the technical skills and abilities of the team in Big Data concepts and the application of technology to serve business needs.
•Direct maintenance and operational support activities related to Big Data platforms per client standards and industry best practices.

All About You
•Essential knowledge, skills and experience:
o Strong knowledge of Hadoop platforms and other distributed data processing platforms.
o Advanced Linux knowledge is a must, including shell scripting and debugging; the candidate should be able to find their way around Linux and get things working.
o Strong background in delivering mission-critical Big Data project work while interacting with diverse and experienced teammates across the spectrum of enterprise operational functions.
o Ability to engage in solving complex problems, programming problems being a good example.
o Proficiency in at least one programming language, preferably Java; if not Java, the candidate must have a strong command of another language and demonstrate the ability to pick up a new one if required. Experience with programming infrastructure management or automation tools is a big plus.
•Desirable or additional capabilities:
o Understanding of and experience with network engineering in support of large IT organizations.
o Knowledge of statistics and machine learning would be highly beneficial; the candidate should have a well-developed measurement mindset with a focus on metrics.
o Background in ETL; practical experience with large ETL pipelines is a plus.
o Basic SQL knowledge is a must.
o Advanced data warehousing and MPP knowledge is good to have but not required.


Key Skills: