Job ID: JOB_ID_1655
Role Overview
We are seeking a highly skilled and visionary Databricks Architect to spearhead the design, implementation, and continuous optimization of our enterprise-grade data analytics platform. As a Databricks Architect, you will be at the forefront of our data strategy, leveraging the power of the Databricks Lakehouse architecture to deliver high-performance, scalable, and secure data solutions. This role is critical for bridging the gap between complex business requirements and technical execution, ensuring that our data infrastructure supports advanced analytics, machine learning, and real-time data processing.
Key Responsibilities
- Design and architect end-to-end data solutions that integrate Databricks seamlessly with major cloud providers including AWS, Azure, and GCP.
- Lead the development of scalable and efficient data pipelines, ensuring high data quality and reliability across the entire lifecycle.
- Collaborate closely with data scientists, engineers, and business stakeholders to translate business needs into robust technical architectures.
- Perform deep-dive performance tuning and optimization of Databricks clusters to maximize throughput while minimizing operational costs.
- Establish and enforce best practices for data security, governance, and compliance within the Databricks environment, including Unity Catalog implementation.
- Develop comprehensive technical documentation, architectural diagrams, and standards for Databricks usage across the organization.
- Troubleshoot complex technical issues related to the Databricks platform, Spark engine, and third-party integrations.
- Mentor junior team members and provide technical leadership on big data technologies and cloud infrastructure.
Technical Qualifications
- Proven experience in architecting large-scale, production-grade data solutions specifically using the Databricks platform.
- Expert-level knowledge of Apache Spark, including Spark SQL, DataFrames, and Structured Streaming.
- Proficiency in programming languages such as Python, Scala, or SQL for data engineering and automation tasks.
- Hands-on experience with cloud infrastructure management and deployment (Terraform, ARM templates, or CloudFormation).
- Strong understanding of Delta Lake, data warehousing concepts, ETL/ELT processes, and modern data modeling techniques.
- Familiarity with CI/CD pipelines for data (DataOps) and version control systems like Git.
- Excellent communication skills with the ability to explain complex technical concepts to non-technical stakeholders.
Work Environment
This is a hybrid position based in Tampa, FL, offering a balance of collaborative in-office sessions and the flexibility of remote work. Our client is a leader in the industry, known for fostering a culture of innovation, professional growth, and a healthy work-life balance. You will join a team of passionate professionals dedicated to pushing the boundaries of what is possible with data.
Special Requirements
Hybrid work model; contract engagement; domain focus: Information Technology.
Compensation & Location
Salary: $165,000 – $225,000 per year (Estimated)
Location: Tampa, FL
Recruiter / Company – Contact Information
Recruiter / Employer: Brillius Technologies
Email: uddhavc@brillius.com