Posted 13 hours ago

Job ID: 1831

Role Overview

Arbor Teksystems is seeking a highly experienced and technically proficient Databricks Developer for a critical 12-month contract engagement with our client, Tech Mahindra. This position is based onsite in Palm Beach, Florida, and requires a seasoned professional with a minimum of 12 years of experience in the data engineering and big data ecosystem. The successful candidate will be responsible for architecting, developing, and maintaining high-performance data pipelines within the Databricks environment, ensuring that data processing workloads are scalable, secure, and optimized for complex analytical requirements.

Key Responsibilities

  • Design, develop, and fine-tune sophisticated data processing pipelines using Databricks, Spark, and Delta Lake technologies.
  • Implement advanced performance optimization techniques to handle large-scale data workloads efficiently, reducing latency and operational costs.
  • Collaborate closely with data science and analytics teams to integrate data solutions into broader business intelligence frameworks.
  • Support enterprise-level data architecture initiatives, leveraging the full suite of Databricks capabilities including Unity Catalog and Photon engine.
  • Provide expert-level troubleshooting and support for Databricks-based tools, workflows, and job clusters to ensure maximum system reliability.
  • Establish and enforce rigorous data governance and security protocols within the Databricks workspace to maintain compliance with regulatory standards.
  • Document all technical processes, system architectures, and best practices to facilitate knowledge sharing and long-term maintainability.
  • Conduct knowledge transfer sessions and mentor junior team members on Databricks optimization and data engineering principles.

Technical Requirements & Qualifications

  • Minimum of 12 years of professional experience in IT, with a significant focus on Big Data and Data Engineering.
  • Proven expertise in Databricks, including experience with Spark (PySpark/Scala), SQL, and Delta Lake.
  • Strong understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
  • Experience with cloud platforms (Azure or AWS) and their integration with Databricks services.
  • Familiarity with CI/CD pipelines, version control (Git), and automated testing in a data environment.
  • Excellent communication skills and the ability to work effectively in a collaborative, onsite team environment.
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.

Project Scope and Benefits

This is a high-impact role within a major digital transformation project for Tech Mahindra. The initial contract duration is 12 months, with a strong possibility of extension based on performance and project needs. Candidates will have the opportunity to work on cutting-edge data challenges in a dynamic environment. Please note that this role requires local candidates who can work onsite in Palm Beach; no relocation assistance is provided. This position is specifically open to H1B candidates.


Special Requirements

  • Visa: H1B candidates only.
  • Location: Local candidates only; no relocation.
  • Client: Tech Mahindra.
  • Duration: 12 months (possible extension).


Compensation & Location

Salary: $165,000 – $215,000 per year (Estimated)

Location: Palm Beach, FL


Recruiter / Company – Contact Information

Recruiter / Employer: Arbor Teksystems

Email: deepakn@arborteksys.com


Interested in this position?
Apply via Email