
Big Data Architect
Location: Greater Detroit Area


Posted: 2017-01-10 12:20:43
Job Type: Full Time only
Salary: $100,000 - $200,000

Job description

  • We are looking for a well-qualified Solutions Architect with a background in Big Data, BI, and Data Warehousing.
  • This position will provide technical leadership around the Enterprise Information Management platform, focusing on large-volume information ingestion, storage, and analysis.
  • The focus will be on building and/or optimizing information models and physical data layouts; configuring, optimizing, and monitoring RDBMS and Hadoop environments; and improving overall processing efficiency to support the needs of the business.
GENERAL RESPONSIBILITIES:

  • Lead the design and development of highly scalable and optimized data models, by utilizing modeling software to document and maintain versions to support Data Marts, Cubes, Data Warehouse, and Operational Data Stores (ODS).
  • Establish data standards in terms of nomenclature, storage, design and deployments.
  • Work in concert with a team of ETL developers to ensure efficient and accurate data transfer across the entire EDW ecosystem, including Big Data platforms.
  • Assure optimized source system replication models and operations.
  • Lead design and maintenance of enterprise meta-data solution to communicate data definitions to the BI audience.
  • Act as a DW liaison to our Infrastructure Engineering teammates and coordinate initiatives with the other database administration groups on that sister team.
  • Ensure appropriate technical standards and procedures are defined. Manage the development of centers of excellence around key storage sub-system technologies.
  • Collaborate in planning initiatives across Application Development, System Architecture, Future Roadmaps, Operations, and Strategic Planning.
  • Work with business teams and technical analysts to understand business requirements.
  • Determine how to leverage technology to create solutions that satisfy the business requirements.
  • Present solutions to the business, project teams, and other stakeholders, with the ability to speak to both technical and non-technical audiences.
  • Create architecture and technical design documents to communicate solutions that will be implemented by the development team.
  • Work with development, infrastructure, test, and production support teams to ensure proper implementation of a solution.
  • Assess the impact of new requirements on an existing suite of complex applications.
  • Educate the organization on available and emerging tool sets.
  • Drive the evolution of infrastructure, processes, products, and services by persuading decision makers.
  • Develop proofs-of-concept and prototypes to help illustrate approaches to technology and business problems.
  • Experience building Business Intelligence platforms in an enterprise environment.
  • Data integration (batch, micro-batch, real-time data streaming) across Hadoop, RDBMSs, and Data Warehousing (SQL Server 2016 preferred).
  • Build real-time data pipelines using technologies such as Apache Kafka, Spark, Storm, and Flume.
  • Analyze data using technologies such as Python, R, Scala, Pig, and Hive.
  • Build consumption frameworks on Hadoop (RESTful services, self-service BI and analytics).
  • Optimize the Hadoop environment using MapReduce, Spark, and HDFS footprints.
  • Hadoop security, data management, and governance.
Qualifications:

  • Knowledge of Hadoop required.
  • Understands the capabilities of key technologies (data modeling, data processing, BI analytics) and can quickly assess the applicability of commercial off-the-shelf technology.
  • Excellent grasp of integrating multiple data sources into an enterprise data management platform and can lead data storage solution design.
  • Strong communication skills (oral and written).
  • Good analytical and problem-solving skills.
  • Understanding of the software development lifecycle including agile methodology.
  • Ability to understand business requirements and build pragmatic, cost-effective solutions using Agile project methodologies.
  • Ability to collaborate with business users to understand requirements.
  • Minimum of 8-10 years of enterprise IT application experience, including at least 3 years architecting strategic, scalable BI and Big Data solutions.
  • 6 to 8 years of experience with relational DBMS technology, SQL Server focused.
  • 6 to 8 years of experience developing procedures, packages, and functions in a DW environment.
  • Deep experience with ANSI SQL and stored procedures.
  • 2+ years of experience with MapReduce, Pig, and HiveQL Hadoop languages a plus.

