Posted 4 hours ago

Job ID: JOB_ID_4015

Job Description: GCP Lead / Architect (Data Engineering)

We are seeking a highly skilled GCP Lead / Architect with a strong Data Engineering foundation to design and deliver secure, scalable, and cost-optimized data platforms on Google Cloud Platform (GCP). This is a hybrid role based in Hartford, CT.

Key Responsibilities:

  • Architecture & Solution Design: Lead end-to-end architecture for data platforms on GCP, including networking, security, compute, storage, and analytics components. Define high-level design (HLD) and low-level design (LLD), architecture standards, and reference patterns for ingestion, transformation, serving, and governance. Drive architecture decisions balancing performance, reliability, scalability, cost, and security; perform design reviews and technical audits.
  • Data Engineering & Warehousing: Architect and guide the implementation of robust pipelines for structured, semi-structured, and unstructured data using scalable patterns (batch + streaming where applicable). Develop and maintain data models, ETL/ELT workflows, and batch/streaming pipelines. Build and optimize BigQuery-centric data warehouse/lakehouse solutions, including dimensional modeling, partitioning/clustering, query tuning, and workload optimization. Lead DWH design: data modeling (conceptual/logical/physical), SCD strategies, conformed dimensions, data quality rules, and lineage considerations.
  • GCP Platform Engineering (Hands-on): Implement and enforce security and access controls using IAM (least privilege), service accounts, and org/policy guardrails. Engineer and support workloads on GKE and Compute Engine, including configuration, scalability, observability, and operational readiness. Use GCS for governed storage and lifecycle, and Dataproc for Spark/Hadoop-based processing.
  • DevOps / CI-CD / Automation: Build, manage, and optimize data pipelines using GCP-native tools and services. Develop CI/CD automation with Git and GitHub Actions. Establish DevOps best practices: Git branching strategies, environment promotion, artifact/version management, IaC standards, and rollback strategies.
  • AI/ML Enablement (as needed): Collaborate with ML/DS teams to operationalize ML services using Vertex AI (training/inference integration, data access, and platform readiness). Support patterns for secure AI consumption and governance where required (e.g., explainability, privacy controls, audit readiness).

Must-Have Skills:

  • GCP Lead / Architect with a strong Data Engineering foundation (15+ years exp)
  • Hands-on GCP services: IAM, VPC, GCS, BigQuery, Vertex AI, GKE, Compute Engine, Dataproc (6+ years exp)
  • CI/CD & DevOps: GitHub Actions, Git workflows, pipeline automation, environment management (6+ years exp)
  • Data Engineering: SQL (advanced), Python/PySpark, pipeline design, performance tuning, data quality controls (6+ years exp)
  • Data Warehousing: DWH design, dimensional modeling, distributed processing concepts, BigQuery optimization (6+ years exp)
  • Architecture: Define high-level design (HLD) and low-level design (LLD), architecture standards (6+ years exp)

Good-to-Have Skills:

  • Infrastructure as Code: Terraform or GCP Deployment Manager
  • MLOps exposure: model lifecycle, CI/CD for ML, experiment tracking, deployment automation, monitoring (framework exposure)
  • Domain Preference: Healthcare domain exposure is preferred; Medicare STAR ratings experience is a strong plus.

Qualifications:

  • Bachelor’s degree in Computer Science / Engineering or equivalent experience.
  • Cloud certification(s) in GCP (Professional Cloud Architect / Data Engineer) preferred.

This role requires a strategic thinker with deep technical expertise in GCP and data engineering principles. The ideal candidate will be a strong leader capable of guiding teams and influencing architectural decisions.


Special Requirements

Hybrid work environment. Healthcare domain exposure is preferred. Medicare STAR ratings experience is a strong plus. Experience with GCP services including IAM, VPC, GCS, BigQuery, Vertex AI, GKE, Compute Engine, Dataproc. CI/CD and DevOps experience with GitHub Actions, Git workflows. Strong Data Engineering and Data Warehousing skills.


Compensation & Location

Salary: $75 – $100 per year (Estimated)

Location: Hartford, CT


Recruiter / Company – Contact Information

Recruiter / Employer: Client

Email: uma@hclglobal.com


Interested in this position?
Apply via Email
