Job ID: 2099
Position Summary: Senior Data Engineer (12+ Years Experience)
In the data-driven economy of 2026, the role of a Senior Data Engineer is pivotal in transforming raw data into a strategic asset. We are seeking a highly experienced professional to join our team in Indianapolis, Indiana, on a hybrid basis. This role requires deep technical mastery of data orchestration, cloud architecture, and governance. As a senior member of the engineering team, you will design and implement the next generation of data pipelines that power advanced analytics and AI-driven decision-making. This is a C2C contract position for experts who thrive in complex, high-scale environments.
Core Technical Responsibilities
- Architect, build, and maintain robust data pipelines and integrations using Databricks, Python, and PySpark within the Azure ecosystem.
- Design and implement end-to-end ETL/ELT processes that ensure data quality, consistency, and low-latency availability for downstream consumers.
- Establish and enforce comprehensive data governance frameworks, including data lineage, metadata management, and security protocols.
- Optimize Spark jobs and SQL queries to handle massive datasets efficiently, focusing on performance tuning and cost optimization in the cloud.
- Integrate diverse data sources, including structured, semi-structured, and unstructured data, into a unified Lakehouse architecture.
- Develop automated monitoring, alerting, and self-healing mechanisms for data pipelines to ensure 24/7 operational reliability.
- Collaborate with data scientists to prepare feature stores and data sets for machine learning models and predictive analytics.
- Lead the migration of legacy on-premise data systems to modern Azure-based cloud solutions with minimal business disruption.
Compliance, Experience, and Mandatory Requirements
- A minimum of 12 years of professional experience in data engineering, software development, or a related field is strictly required.
- Candidates must provide a passport number for mandatory background verification and compliance purposes.
- This position is not open to Green Card holders (no GC); candidates with other valid work authorizations are welcome.
- Local candidates in the Indianapolis area are preferred, and a valid Driver’s License (DL) is mandatory due to the hybrid work arrangement.
- Expert-level proficiency in Python and PySpark is non-negotiable, along with hands-on experience in Azure Data Factory and Databricks.
- Strong understanding of SDLC principles, version control (Git), and CI/CD pipelines for data engineering.
Why Join the Heliogic Team?
Heliogic is a leader in providing cutting-edge data solutions to enterprise clients. By joining our team, you will work on high-impact projects that challenge your technical skills and offer opportunities for professional growth. We foster a culture of innovation where your expertise in data architecture and engineering best practices will be highly valued. In the 2026 market, the ability to manage complex data landscapes is a rare skill, and we offer competitive compensation that reflects your seniority and expertise. This hybrid role in Indianapolis provides the perfect balance of professional challenge and personal flexibility.
Special Requirements
- Work authorization: no Green Card holders (no GC)
- Location: local candidates only, with a valid Driver’s License
- Mandatory: passport number
- Experience: 12+ years minimum
Compensation & Location
Salary: $160,000 – $210,000 per year
Location: Indianapolis, IN
Recruiter / Company – Contact Information
Recruiter / Employer: Heliogic
Email: abhi@heliogic.com