Posted 7 hours ago

Job ID: JOB_ID_1540

Position Summary

We are looking for a Senior Databricks/Azure Engineer to join our high-performance data engineering team. This role is focused on building resilient, secure, and compliant data platforms using the latest Azure and Databricks technologies. The ideal candidate will have extensive experience in application-level disaster recovery (DR) for data, machine learning (ML), and artificial intelligence (AI) workloads. You will be responsible for designing and implementing cross-region replication patterns and ensuring that our data infrastructure can withstand regional outages while maintaining data integrity and availability.

Technical Responsibilities

  • Design and implement application-level disaster recovery strategies for data, ML, and AI platforms.
  • Utilize Databricks Workflows, Delta Lake, MLflow, and Feature Store to build scalable data pipelines.
  • Develop cross-region replication patterns using Azure Data Lake Storage (ADLS) and Delta tables (see the illustrative sketch after this list).
  • Orchestrate DR environments using Infrastructure as Code (IaC) tools such as ARM templates, Bicep, or Terraform.
  • Integrate Azure Data Factory (ADF) or Microsoft Fabric pipelines into the broader data ecosystem.
  • Implement data lineage mapping and governance using Unity Catalog to ensure transparency and compliance.
  • Automate data retention, archival, and lifecycle governance policies.
  • Tune Delta table versioning and VACUUM retention to balance performance and storage costs.
  • Manage ADLS lifecycle policies, including tiering, archival, and deletion.
  • Design compliance-oriented systems with immutable records and legal hold patterns.
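For illustration only, the cross-region replication responsibility above could be approached with Delta DEEP CLONE between regions. The sketch below is a minimal, hedged example; the catalog, schema, table, and storage account names are placeholders, not real resources.

    # Minimal sketch of an incremental cross-region DR copy using Delta DEEP CLONE.
    # All catalog, table, and storage names below are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # DEEP CLONE copies both data files and table metadata into the DR region's
    # ADLS container. Re-running CREATE OR REPLACE ... DEEP CLONE re-syncs the
    # target incrementally, so it can be scheduled as a Databricks Workflow.
    spark.sql("""
        CREATE OR REPLACE TABLE dr_catalog.sales.orders
        DEEP CLONE prod_catalog.sales.orders
        LOCATION 'abfss://dr-container@drstorageaccount.dfs.core.windows.net/delta/orders'
    """)

A pattern like this is one way to keep a secondary-region copy of critical Delta tables warm for failover; the exact scheduling and retention choices would depend on the platform's recovery objectives.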

Security and Data Governance

A significant portion of this role involves detecting, masking, and protecting PII (Personally Identifiable Information). You will be expected to implement Python-based PII detection using regex or machine learning methods, and to leverage Unity Catalog tags, classifications, and attribute-based access controls (ABAC) to secure sensitive data. The role also requires using Delta Live Tables (DLT) to enforce data quality rules and contracts, and implementing view-based or grant-based masking patterns so that sensitive values are never exposed to unauthorized readers.
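As a hedged illustration of the regex-based detection and masking described above (a sketch, not a production-grade detector), the PySpark snippet below flags and redacts simple SSN and email patterns; the patterns, table, and column names are assumptions.

    # Illustrative sketch of regex-based PII detection and masking in PySpark.
    # Patterns, table, and column names are assumptions, not a complete detector.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    SSN_PATTERN = r"\b\d{3}-\d{2}-\d{4}\b"          # US SSN shape
    EMAIL_PATTERN = r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"  # simple email shape

    spark = SparkSession.builder.getOrCreate()
    df = spark.table("prod_catalog.support.tickets")  # hypothetical source table

    # Flag rows whose free-text column matches a PII pattern, then redact it.
    masked = (
        df.withColumn(
            "contains_pii",
            F.col("body").rlike(SSN_PATTERN) | F.col("body").rlike(EMAIL_PATTERN),
        )
        .withColumn("body", F.regexp_replace("body", SSN_PATTERN, "***-**-****"))
        .withColumn("body", F.regexp_replace("body", EMAIL_PATTERN, "[REDACTED EMAIL]"))
    )
    masked.write.mode("overwrite").saveAsTable("prod_catalog.support.tickets_masked")

In practice this kind of rule-based pass would typically be combined with Unity Catalog tagging and DLT expectations rather than used on its own.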

Required Skills and Experience

  • Proven experience with the Databricks Lakehouse Platform and Azure cloud services.
  • Expertise in Python for data engineering and security automation.
  • Strong understanding of Delta Lake internals and Unity Catalog governance.
  • Experience with Azure DevOps and CI/CD for data infrastructure.
  • Knowledge of data privacy regulations and best practices for PII protection.
  • Ability to work in a fast-paced, remote environment with minimal supervision.

Special Requirements

This is a remote role with a primary focus on disaster recovery and PII protection.


Compensation & Location

Salary: $150,000 – $200,000 per year (Estimated)

Location: Remote


Recruiter / Company – Contact Information

Recruiter / Employer: Nvoids

Email: samk77322@gmail.com


Interested in this position?
Apply via Email
