Job ID: JOB_ID_2795
Job Overview:
We are looking for a highly skilled and experienced Google Cloud Data Architect to lead the IAM Data Modernization project. This role involves migrating an on-premises SQL data warehouse to a Data Lake on Google Cloud Platform (GCP). The objective is to enable advanced analytics and GenAI use cases and to establish a unified data strategy for enterprise-wide decision-making. This is a contract position requiring significant hands-on experience with GCP and data architecture.
Project Details:
- Project Goal: Migrate on-premises SQL data warehouse to a GCP Data Lake for enhanced analytics and GenAI capabilities.
- Integration Scope: Ingest data from 30+ source systems and manage multiple downstream integrations.
- Key Capabilities: Enable metrics, reporting, natural language querying, advanced pattern/trend analysis, faster summarizations, and cross-domain metric monitoring.
- Benefits: Achieve scalability, leverage advanced cloud functionality, establish a highly available and performant semantic layer, and create a unified data strategy.
Required Skills and Experience:
- Data Lake Architecture & Storage: Proven experience designing and implementing data lake architectures (e.g., Bronze/Silver/Gold). Strong knowledge of Cloud Storage (GCS) design, Hadoop/HDFS, columnar data formats (Parquet, Avro, ORC), and partitioning strategies.
- Data Ingestion & Orchestration: Experience building batch and streaming ingestion pipelines using GCP-native services (e.g., Pub/Sub, Cloud Composer/Airflow). Knowledge of incremental ingestion and CDC patterns.
- Data Processing & Transformation: Experience developing scalable pipelines using Dataflow (Apache Beam) and/or Spark (Dataproc). Strong proficiency in BigQuery SQL, advanced Python programming, and managing schema evolution.
- Analytics & Data Serving: Expertise in BigQuery performance optimization, building semantic layers, and integrating with BI tools. Understanding of data exposure patterns.
- Data Governance, Quality & Metadata: Experience implementing data catalogs, metadata management, data lineage, data quality frameworks, and data contracts.
- Cloud Platform Management: Strong hands-on experience with GCP, including IAM, security best practices, VPC networking, and encryption (KMS).
- DevOps, Platform & Reliability: Proven ability to build CI/CD pipelines, manage secrets, and implement observability (SLOs, dashboards, alerts).
- Experience: 10–14 years in data engineering/architecture, with 5+ years designing on GCP at scale. Prior on-premises-to-cloud migration experience is a must.
- Education: Bachelor's/Master's in Computer Science, Information Systems, or equivalent experience.
- Certifications: Google Cloud Professional Cloud Architect (required, or to be obtained within 3 months). Professional Data Engineer and Professional Cloud Security Engineer certifications are a plus.
Good to Have:
- Security, Privacy & Compliance: Experience with fine-grained access controls, VPC Service Controls, PII handling, data masking, and audit requirements.
Location Options:
- Dallas, TX
- Charlotte, NC
- Columbus, OH
- New Jersey
- (4 days onsite required for all locations)
This is a critical role for a major data modernization initiative. We are looking for a candidate who can drive technical excellence and deliver impactful solutions.
Compensation & Location
Salary: $120,000 – $180,000 per year
Location: Dallas, TX
Recruiter / Company – Contact Information
Recruiter / Employer: Innovyt
Email: himanshu.yadav@innovyt.com
Recruiter Notice:
To remove this job posting, please send an email from himanshu.yadav@innovyt.com with the subject: DELETE_JOB_ID_2795