Posted 6 hours ago

Job ID: JOB_ID_2347

Role Overview

Arkhyatech is seeking a highly skilled GCP Data Engineer to join our team in a hybrid capacity, supporting our clients in Mountain View and Oakland, California. The role centers on optimizing cloud financial operations and requires a deep understanding of Google Cloud Platform (GCP) services. As a Data Engineer, you will take full ownership of data pipelines, focusing on the design, development, and maintenance of robust ETL processes that handle raw GCP Billing Export data and other large-scale datasets.

Key Responsibilities

  • Design, develop, and maintain robust ETL data pipelines for GCP Billing Export data.
  • Implement complex backend logic and data models to attribute shared infrastructure costs (MySQL, Kafka, BigQuery, GCS) to specific business verticals.
  • Own the development lifecycle for core backend services, ensuring high performance and stability.
  • Collaborate with finance and platform teams to integrate organizational mapping into cost systems.
  • Perform deep-dive performance tuning on data processing jobs and database interactions.
  • Use infrastructure-as-code (Terraform) and automation scripts (Python, Bash) for resource management.
  • Implement monitoring and alerting for all pipelines to ensure data quality.
  • Maintain up-to-date documentation detailing data models and ETL logic.

Required Qualifications

  • 6+ years of experience in backend software development for large-scale data processing.
  • Expert-level proficiency in Python, Go, or TypeScript.
  • Hands-on experience with cloud financial data and GCP Billing Export data.
  • Strong familiarity with MySQL, Kafka, BigQuery, and GCS.
  • Deep understanding of backend engineering principles and ETL processes.
  • Hands-on experience with Google Cloud Platform (GCP) services.

Work Environment and Impact

One of your primary responsibilities will be the implementation of complex backend logic and data models for cost attribution. This involves accurately mapping shared infrastructure costs — such as MySQL, Kafka, BigQuery, and GCS usage — to specific business verticals, enabling precise financial reporting and accountability. You will work at the intersection of engineering and finance, collaborating with stakeholders to integrate organizational structures into the cost attribution system.

Your expertise in backend engineering will be vital as you own the development lifecycle for core services, ensuring they are stable, scalable, and performant. Performance tuning is a key aspect of this role; you will perform deep dives into data processing jobs and database interactions to ensure efficiency. You will also apply infrastructure-as-code principles using Terraform to manage cloud resources and develop automation scripts in Python or Bash to streamline operational tasks. Reliability is paramount, so you will implement comprehensive monitoring and alerting for all pipelines to maintain data quality and service continuity.

This position requires a mandatory three-day-per-week onsite presence in the Bay Area, offering a collaborative environment where you can directly impact the financial efficiency of large-scale cloud operations.


Special Requirements

Hybrid: 3 days/week onsite (no flexibility). Work authorization: Green Card (GC), U.S. Citizen (USC), or TN visa holders only.


Compensation & Location

Salary: $155,000 – $215,000 per year (Estimated)

Location: Mountain View, CA


Recruiter / Company – Contact Information

Recruiter / Employer: Arkhyatech

Email: sathish.s@arkhyatech.com


Interested in this position?
Apply via Email
