Job ID: JOB_ID_1452
Role Overview
As a Senior Data Engineer specializing in Google Cloud Platform (GCP), you will join the Mail Analytics Data Engineering team to architect and implement large-scale data solutions. This role is pivotal in driving mission-critical decision-making through the development of robust batch pipelines, data serving layers, and advanced data lakehouse architectures. You will be at the forefront of enabling AI-powered capabilities for one of the world’s largest mail platforms, ensuring that data infrastructure scales seamlessly with business needs.
Key Responsibilities
- Partner with cross-functional teams including Data Science, Product Management, and Engineering to define data ontologies and requirements for Mail Data & Analytics.
- Lead and mentor junior data engineers, providing technical guidance and fostering a culture of engineering excellence.
- Design, build, and maintain highly efficient and reliable batch data pipelines to populate core datasets.
- Develop scalable frameworks and automated tooling to streamline analytics workflows and enhance user interaction with data products.
- Establish and promote industry best practices for data operations, lifecycle management, and data governance.
- Optimize complex code and data processing systems by applying advanced algorithmic concepts and a deep understanding of the underlying system stack.
- Create frameworks to improve the deployment and management of data platforms, working closely with infrastructure teams to resolve complex issues.
- Prototype new metrics and data systems to support evolving business strategies.
- Define and manage Service Level Agreements (SLAs) for all datasets within your ownership area.
- Develop complex queries and high-volume data pipelines to solve intricate engineering problems.
- Collaborate with stakeholders to deliver technical solutions that address specific business challenges.
- Provide engineering consulting on large-scale data lakehouse architectures.
Technical Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related technical field (Master’s or PhD preferred).
- 8+ years of experience in building scalable ETL pipelines using industry-standard orchestration tools such as Airflow, Google Cloud Composer, or Oozie.
- Deep expertise in SQL, PySpark, or Scala for large-scale data processing.
- 3+ years of experience leading data engineering projects in direct partnership with business or data science teams.
- Proven track record of building and maintaining multi-terabyte datasets with expertise in debugging large-scale analytics challenges (e.g., skew mitigation, sampling strategies).
- Hands-on experience with at least one major cloud provider (GCP, AWS, or Azure), with a strong preference for GCP.
- Experience developing or enhancing ETL orchestration frameworks.
- Proficiency in GitOps workflows, including CI/CD systems and PR-based development.
- Solid understanding of GDPR and other data privacy regulations.
Preferred Qualifications
- 3+ years of experience with Google Cloud Platform technologies, including BigQuery, Dataproc, Dataflow, Composer, and Looker.
- Strong communication skills and the ability to manage expectations in a fast-paced environment.
- Detail-oriented mindset with a passion for solving complex data infrastructure challenges.
Special Requirements
Visa: Green Card (GC), U.S. Citizen (USC), or H-1B; Domain: Mail Analytics; Screening: technical interview and algorithmic assessment.
Compensation & Location
Salary: $145,000 – $195,000 per year (Estimated)
Location: Dallas, TX
Recruiter / Company – Contact Information
Recruiter / Employer: Fixity Technologies
Email: sai.p@fixitytech.com