Posted 8 hours ago
Job ID: JOB_ID_10475
Job Title: Databricks Administrator
We are seeking an experienced Databricks Administrator to join our team. This is a critical role responsible for managing and optimizing our Databricks environment to support our data engineering and platform engineering initiatives.
Responsibilities:
- Administer and maintain the Databricks platform, ensuring high availability, performance, and security.
- Implement and manage Unity Catalog for data governance and access control.
- Configure and optimize cluster policies to ensure efficient resource utilization and cost savings.
- Manage Delta Lake, Spark, and other core Databricks components.
- Oversee workspace configuration and job scheduling.
- Implement and enforce data governance policies, including data modeling, ingestion frameworks, schema enforcement, versioning, and lineage tracking.
- Develop and manage Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) policies within Databricks.
- Monitor system performance, identify bottlenecks, and implement solutions for optimization.
- Manage and monitor costs associated with Databricks usage, implementing cost-saving measures.
- Develop and maintain ETL/ELT pipelines using Python/PySpark and SQL.
- Utilize DLT, Airflow, or Databricks Workflows for pipeline orchestration.
- Collaborate with data scientists, data engineers, and other stakeholders to understand their needs and provide solutions.
- Communicate effectively with technical and non-technical teams, setting standards and influencing best practices.
- Troubleshoot and resolve issues related to Databricks clusters, jobs, and data pipelines.
- Stay up to date with the latest Databricks features and industry trends.
Technical Skills Required:
- 5+ years in data engineering or platform engineering.
- Minimum 2 years of direct experience in Databricks administration.
- Expert knowledge of Unity Catalog, cluster policies, Delta Lake, Spark, workspace configuration, and jobs.
- Strong understanding of data governance principles and practices.
- Proficiency in data modeling, ingestion frameworks, schema enforcement, versioning, and lineage.
- Proven experience implementing RBAC and ABAC in Databricks or similar platforms.
- Experience with cost optimization, monitoring, billing logs, and compute governance.
- Strong Python/PySpark and SQL skills.
- Familiarity with DLT, Airflow, or Databricks Workflows.
- Excellent communication and interpersonal skills.
Employment Type:
C2C (Corp-to-Corp) or C2H (Contract-to-Hire)
Location:
Houston, Texas (Onsite)
This is a direct client-facing role, working with a prime vendor and implementation partner.
Compensation
Salary: $60 – $80 per hour (estimated)
Recruiter / Company – Contact Information
Email: requirementdatabase@gmail.com
Recruiter Notice:
To remove this job posting, send an email from requirementdatabase@gmail.com with the subject DELETE_JOB_ID_10475.