Posted 5 hours ago

Job ID: JOB_ID_6329

Job Summary

We are seeking a highly experienced Data Engineer with over 10 years of professional experience to design, build, and optimize enterprise-scale data pipelines and platforms. This is a remote, full-time contract position with a duration of 12+ months. The role requires advanced expertise in data engineering, AWS cloud technologies, and performance tuning, along with strong hands-on skills in SQL and ETL. The ideal candidate will be able to architect secure, scalable, production-ready solutions.

Key Responsibilities

  • Design and implement robust data pipelines, APIs, and ETL workflows for large-scale data processing.
  • Develop and optimize Oracle SQL & PL/SQL code for maximum performance and scalability.
  • Build and maintain batch jobs using Informatica, Autosys, and other relevant ETL tools.
  • Architect and operate workloads on AWS services such as EKS, S3, and Lambda, along with Kafka.
  • Implement data modeling, data warehousing, and governed access patterns to ensure data integrity and security.
  • Collaborate closely with DevOps teams to integrate CI/CD pipelines for data workflows, enabling continuous integration and delivery.
  • Ensure data quality, lineage, and compliance across all systems, adhering to industry standards and regulations.
  • Partner effectively with business and analytics teams to deliver secure, reliable, and high-performance data services.
  • Troubleshoot complex distributed systems and optimize performance under pressure, ensuring minimal downtime.
  • Mentor junior engineers and provide technical leadership, fostering a collaborative and innovative team environment.

Required Skills & Experience

  • 10+ years of professional data engineering experience.
  • Expert proficiency in Oracle SQL & PL/SQL, including complex queries, tuning, and stored procedures.
  • Strong hands-on experience with ETL tools such as Informatica and Autosys.
  • Demonstrated expertise with AWS cloud services (EKS, S3, Lambda) and Kafka.
  • Solid understanding of data warehousing concepts and technologies (e.g., Snowflake, Redshift, Databricks).
  • Experience with CI/CD pipelines and automated testing for data workflows.
  • Strong background in data modeling, API development, and integration patterns.
  • Familiarity with mid-tier technologies like Java, messaging systems, and web containers.
  • Comfortable working in Agile environments with a focus on continuous integration/delivery.
  • Proven ability to deliver secure, scalable, and high-performance data solutions.

Preferred Qualifications

  • Experience in financial services or other regulated industries.
  • Familiarity with acceptance test-driven development (ATDD).
  • Exposure to cloud-native data engineering frameworks.

Education

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.

Employment Details

  • Location: Remote
  • Employment Type: Contract (W2)
  • Duration: 12+ months
  • Rate: $80/hr

Special Requirements

  • Eligibility: and only.
  • Interview: Face-to-Face (onsite if required)


Recruiter / Company – Contact Information

Email: han.rathor@tekinspirations.com


Interested in this position?
Apply via Email
