Posted 4 hours ago

Job ID: JOB_ID_2411

Role Overview: Google Cloud Data Architect

TestingXperts is seeking a highly experienced Google Cloud Data Architect to lead a critical Identity & Access Management (IAM) Data Modernization initiative. This high-impact role involves spearheading the migration of a massive on-premises SQL data warehouse to a target-state Data Lake on Google Cloud Platform (GCP). The successful candidate will be responsible for designing a scalable, performant, and secure data ecosystem that enables advanced metrics, reporting, and cutting-edge GenAI use cases, including natural language querying and cross-domain trend analysis.

Project Scope and Objectives

The IAM Data Modernization project is a cornerstone of our client’s digital transformation strategy. You will oversee ingestion from more than 30 source systems and manage multiple downstream integrations. The goal is to establish a single source of truth for enterprise-wide, data-driven decision-making. Key project highlights include:

  • Migration of legacy on-premises SQL environments to a modern GCP Data Lake.
  • Implementation of Bronze/Silver/Gold layered data architectures.
  • Enabling GenAI capabilities such as accelerated summarization and pattern analysis.
  • Developing a highly available semantic layer with historical data support.
  • Ensuring a unified data strategy across diverse cyber domains.

Technical Requirements and Responsibilities

As a Senior Architect, you will be expected to demonstrate mastery over the following domains:

Data Lake Architecture & Storage

  • Design and implement robust data lake architectures using Google Cloud Storage (GCS).
  • Define bucket layouts, naming conventions, and lifecycle policies.
  • Apply expertise in Hadoop/HDFS architecture and distributed file systems.
  • Optimize data storage using columnar formats like Parquet, Avro, and ORC.
  • Develop partitioning strategies and backfill mechanisms for large-scale data.

Data Ingestion & Orchestration

  • Build complex batch and streaming pipelines using GCP-native services.
  • Design Pub/Sub-based streaming architectures and event schemas.
  • Implement Change Data Capture (CDC) patterns and ensure idempotency.
  • Utilize Cloud Composer (Airflow) for sophisticated workflow orchestration.
  • Create resilient error handling and replay mechanisms.
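The idempotency and replay bullets above can be sketched in a few lines of plain Python. This is a simplified in-memory stand-in (the event shape, the `event_id` field, and the dict "table" are all assumptions); a production version would persist the seen-ID state durably:

```python
class IdempotentApplier:
    """Apply CDC events safely under at-least-once delivery: replays are no-ops."""

    def __init__(self):
        self._seen = set()   # in production: durable state (e.g. a keyed store), not memory
        self.table = {}      # key -> latest row; a stand-in for the target table

    def apply(self, event: dict) -> bool:
        """Return True if the event was applied, False if it was a duplicate replay."""
        if event["event_id"] in self._seen:
            return False
        self._seen.add(event["event_id"])
        if event["op"] == "delete":
            self.table.pop(event["key"], None)
        else:  # treat insert/update uniformly as an upsert
            self.table[event["key"]] = event["row"]
        return True

applier = IdempotentApplier()
e1 = {"event_id": "m-1", "op": "upsert", "key": "user-42", "row": {"name": "Ada"}}
applier.apply(e1)
applied_again = applier.apply(e1)   # replayed message: safely ignored
```

Because duplicates are filtered by event ID, the same pipeline can serve as the replay mechanism: re-publishing a window of events after a failure converges to the same end state.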

Data Processing & Transformation

  • Develop scalable pipelines using Dataflow (Apache Beam) and Dataproc (Spark).
  • Optimize BigQuery SQL for performance, cost control, and clustering.
  • Maintain advanced Python codebases for data engineering tasks.
  • Manage schema evolution to minimize downstream impact.
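The schema-evolution bullet above is the kind of policy that is easy to encode. A minimal sketch, modeling schemas as simple `{column: type}` dicts (a deliberate simplification of real table metadata): additions are allowed, type changes are rejected, and drops are ignored so downstream readers keep working.

```python
def evolve_schema(current: dict, incoming: dict) -> dict:
    """Additive-only schema evolution: new columns may appear, existing
    columns must keep their types, and removed columns are retained."""
    for col, typ in incoming.items():
        if col in current and current[col] != typ:
            raise ValueError(f"breaking change: {col} was {current[col]}, got {typ}")
    merged = dict(current)
    for col, typ in incoming.items():
        merged.setdefault(col, typ)  # additions only
    return merged

v1 = {"id": "INT64", "email": "STRING"}
v2 = {"id": "INT64", "email": "STRING", "last_login": "TIMESTAMP"}
merged = evolve_schema(v1, v2)
```

Enforcing the rule at ingestion time turns schema drift from a silent downstream breakage into an explicit, reviewable failure.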

Governance, Security & Compliance

  • Implement data catalogs, metadata management, and ownership models.
  • Enforce data quality frameworks, including validation and freshness checks.
  • Design IAM and security best practices using least-privilege access.
  • Manage VPC networking, private access patterns, and encryption (KMS/CMEK).
  • Ensure compliance readiness and audit logging across the platform.
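The freshness-check bullet above reduces to a small, testable predicate. A minimal sketch (the function name and the lag thresholds are illustrative assumptions, not a named framework):

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_loaded: datetime, max_lag: timedelta, now: datetime = None) -> bool:
    """True if the most recent load landed within max_lag of now."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded) <= max_lag

now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
fresh = is_fresh(datetime(2024, 1, 2, 10, 0, tzinfo=timezone.utc), timedelta(hours=6), now)
stale = is_fresh(datetime(2024, 1, 1, 0, 0, tzinfo=timezone.utc), timedelta(hours=6), now)
```

Running checks like this per dataset, and alerting when they fail, is the operational core of a freshness SLA.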

DevOps & Reliability

  • Build CI/CD pipelines for data and infrastructure workloads.
  • Manage secrets securely using GCP Secret Manager.
  • Define and monitor SLOs, dashboards, and alerting systems.
  • Maintain platform reliability through rigorous logging and runbook development.
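The SLO-monitoring bullet above rests on one piece of arithmetic: how much of the error budget a window has consumed. A minimal sketch of that calculation (function name and numbers are illustrative):

```python
def error_budget_remaining(slo: float, good: int, total: int) -> float:
    """Fraction of the error budget left: 1.0 = untouched, 0.0 or less = exhausted.

    slo is the target success ratio (e.g. 0.999); good/total are counted events.
    """
    if total == 0:
        return 1.0
    allowed_failures = (1.0 - slo) * total
    actual_failures = total - good
    if allowed_failures == 0:
        return 1.0 if actual_failures == 0 else 0.0
    return 1.0 - (actual_failures / allowed_failures)

# 999,500 good events out of 1,000,000 against a 99.9% SLO:
# 500 of the 1,000 allowed failures are used, so half the budget remains.
remaining = error_budget_remaining(0.999, 999_500, 1_000_000)
```

Dashboards and alerts are then thresholds on this value, e.g. paging when the remaining budget drops below a chosen fraction.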

Special Requirements

  • Open only to visa-independent candidates (U.S. citizens, Green Card holders, etc.).
  • Requires 4 days per week onsite in Dallas, TX.
  • Domain focus: Identity & Access Management (IAM) Data Modernization.
  • Google Cloud Professional Cloud Architect certification required, or obtainable within 3 months of hire.


Compensation & Location

Salary: $190,000 – $260,000 per year (Estimated)

Location: Dallas, TX


Recruiter / Company – Contact Information

Recruiter / Employer: TestingXperts

Email: rani.alugu@testingxperts.com

