Posted 2 hours ago

Job ID: JOB_ID_4008

Job Description:

  • Proficient with Python and SQL
  • Build and manage efficient ETL pipelines using Databricks Workflows or another orchestration framework
  • Familiarity with structured and semi-structured data, and with ingesting and processing it using PySpark
  • Fundamental AWS services (S3, SQS) or the equivalent services on other clouds
  • Terraform and GitLab CI/CD
  • Query tuning and performance optimization in SQL and/or SparkSQL
  • Familiarity with data warehousing (Snowflake or similar)
  • Experience working for a cloud-based data services provider serving large healthcare clients
  • FHIR experience
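The semi-structured ingestion work listed above typically means flattening nested records (JSON events, FHIR resources) into tabular rows before loading. A minimal stdlib Python sketch of that flattening step, using a hypothetical, loosely FHIR-shaped record (a production pipeline would do this with PySpark on Databricks):

```python
import json

def flatten_record(record, parent_key="", sep="."):
    """Recursively flatten a nested dict into dotted column names."""
    flat = {}
    for key, value in record.items():
        col = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten_record(value, col, sep))
        else:
            flat[col] = value
    return flat

# Hypothetical semi-structured patient record (field names are illustrative).
raw = json.loads('{"id": "p1", "name": {"family": "Doe", "given": "Jan"}, "active": true}')
row = flatten_record(raw)
# row == {"id": "p1", "name.family": "Doe", "name.given": "Jan", "active": True}
```

In PySpark the same shape of work is usually done with schema inference or `from_json` plus nested-column selection rather than hand-rolled recursion; the sketch only shows the transformation the role describes.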

Additional Information:

  • Work with clients and the implementation team to understand data distribution requirements.
  • Perform the data analysis and data mapping required to produce client output.
  • Build the data distribution extracts and scripts.
  • Optimize the performance of the data extract scripts.
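The steps above — map client data, build a distribution extract, and tune the query behind it — can be sketched with stdlib Python and SQLite (table and column names are hypothetical; the actual stack would be SQL/Spark SQL against Snowflake or Databricks):

```python
import csv
import io
import sqlite3

# Toy warehouse table standing in for the client data to be distributed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (client_id TEXT, claim_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("c1", "a", 10.0), ("c2", "b", 20.0), ("c1", "c", 5.5)],
)
# Indexing the filter column is the kind of tuning step the posting mentions.
conn.execute("CREATE INDEX idx_claims_client ON claims(client_id)")

def extract_for_client(client_id):
    """Produce a per-client CSV extract: one 'data distribution extract'."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["claim_id", "amount"])
    rows = conn.execute(
        "SELECT claim_id, amount FROM claims WHERE client_id = ? ORDER BY claim_id",
        (client_id,),
    )
    writer.writerows(rows)
    return buf.getvalue()

print(extract_for_client("c1"))  # header plus only c1's rows
```

The parameterized query plus index keeps the extract selective rather than scanning the whole table, which is the essence of the optimization responsibility described.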

Compensation & Location

Salary: $55 – $60 per hour

Location: Remote


Recruiter / Company – Contact Information

Email: _khan@aesinc.us.com


Interested in this position?
Apply via Email
