GCP Data Architect / Lead
Basking Ridge, NJ (On-Site)
$100,000 - $200,000
Job Description:
Role: GCP Data Architect / Lead
Location: Basking Ridge, NJ
Duration: Full Time
About the Role
We are looking for an experienced GCP Data Architect to design and lead the implementation of scalable, secure, and cost-efficient data platforms on Google Cloud. The ideal candidate will have deep expertise across the GCP ecosystem, strong knowledge of data governance and security frameworks, and a proven track record of building enterprise-grade data architectures. This role requires strategic thinking, hands-on architecture design, and leadership in data modernization and cloud transformation initiatives.
Role and responsibilities:
1. Define and implement end-to-end data architecture on Google Cloud Platform (GCP)
2. Design scalable and resilient data pipelines, data lakes, and data warehouses
3. Architect solutions using BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, and Composer
4. Establish and enforce data governance frameworks (data quality, lineage, cataloging, compliance)
5. Design and implement data security and privacy controls (IAM, encryption, masking, access policies)
6. Drive cloud data migration and modernization initiatives from legacy platforms to GCP
7. Optimize data platforms for performance, scalability, and cost efficiency
8. Define best practices for data modeling, partitioning, clustering, and storage optimization
9. Collaborate with stakeholders to translate business requirements into technical architecture
10. Provide technical leadership and mentorship to data engineering teams
11. Evaluate and recommend new tools, frameworks, and architectural patterns
12. Ensure adherence to enterprise architecture standards and compliance requirements
Technical skills requirements
The candidate must demonstrate proficiency in:
• Deep expertise in Google Cloud Platform (GCP) services:
  - BigQuery, Cloud Storage, Dataflow, Dataproc, Pub/Sub, Cloud Composer
• Strong understanding of data architecture patterns (Lambda, Kappa, Data Mesh, Lakehouse)
• Experience in data governance frameworks (data catalog, lineage, metadata management)
• Expertise in data security and compliance (IAM, encryption, GDPR, PII handling)
• Strong experience in designing scalable and high-performance data systems
• Proven ability in cost optimization strategies (query optimization, storage tiering, workload tuning)
• Experience in ETL/ELT architecture and data pipeline design
• Strong knowledge of SQL, Python, and/or PySpark
Nice-to-have skills
• Experience with AI/ML integration (Vertex AI, AI pipelines)
• Knowledge of streaming architectures (Kafka, Pub/Sub, real-time analytics)
• Familiarity with multi-cloud or hybrid architectures
• Exposure to CI/CD, DevOps, and Infrastructure-as-Code (Terraform)
• Experience in data observability and monitoring tools
• Understanding of AI-native engineering tools and LLM-based data solutions
Qualifications
• 10+ years of overall experience, including 8-15 years of relevant experience in Data Engineering / GCP Data Architecture
• B.Tech., M.Tech., or MCA degree from a reputed university
Key Skills:
- GCP