Posted 21 hours ago

Job ID: 3180776

Job Overview:

We are seeking a highly skilled and experienced Snowflake Developer to join our client’s team. The role centers on designing, developing, and optimizing scalable data warehouse solutions in Snowflake within a cloud-native environment. The ideal candidate will bring strong expertise in Python-based ETL development, Apache Airflow orchestration, and AWS cloud services for building robust data pipelines.

Key Responsibilities:

  • Design, develop, and maintain scalable and efficient data warehouse solutions specifically within the Snowflake platform.
  • Develop and implement robust ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) pipelines, with a strong emphasis on Python programming.
  • Build, manage, and schedule complex workflow orchestrations using Apache Airflow, ensuring seamless data processing.
  • Integrate Snowflake with critical AWS services, including Amazon S3 for data storage, AWS Lambda for serverless computing, and, where applicable, Amazon Redshift for complementary data warehousing needs.
  • Proactively optimize Snowflake performance through techniques such as effective clustering, partitioning strategies, advanced query tuning, and appropriate warehouse sizing to ensure cost-efficiency and speed.
  • Implement comprehensive data quality checks, establish detailed logging mechanisms, set up monitoring alerts, and develop robust error handling frameworks to ensure data integrity and system reliability.
  • Collaborate closely with cross-functional teams, including Data Engineers, Business Intelligence (BI) specialists, and business stakeholders, to thoroughly gather requirements and translate them into effective technical solutions.
  • Ensure strict adherence to best practices concerning data governance, data security protocols, and regulatory compliance standards throughout the development lifecycle.

Required Skills & Qualifications:

  • Demonstrated strong hands-on experience with Snowflake, including proficiency in schema design, SnowSQL, Snowpipe for data ingestion, and leveraging Streams & Tasks for change data capture.
  • A minimum of 3 years of dedicated experience working with Snowflake, complemented by 5+ years of overall experience in ETL development.
  • Advanced proficiency in Python programming, including libraries such as Pandas for data manipulation and PySpark for big data processing, along with experience in API integrations.
  • Proven experience in building Directed Acyclic Graphs (DAGs) and scheduling complex data pipelines using Apache Airflow.
  • Solid, practical experience with core AWS services, including S3, EC2, Lambda, IAM for access management, and CloudWatch for monitoring.
  • Strong SQL expertise coupled with practical data modeling experience, particularly with Star and Snowflake schemas.
  • Experience implementing CI/CD (Continuous Integration/Continuous Deployment) pipelines and utilizing version control systems like Git for collaborative development.
  • A good understanding of performance tuning methodologies and cost optimization strategies specifically within the Snowflake environment.

Additional Information:

This is a 12+ month contract opportunity located in Charlotte, NC. The role requires a candidate who can hit the ground running and contribute immediately to data initiatives.


Compensation & Location

Rate: $70 – $90 per hour (Estimated)

Location: Charlotte, NC


Recruiter / Company – Contact Information

Recruiter / Employer: Euclid Innovations Inc.

Email: Karthik.lakkoju@euclidinnovations.com


Interested in this position? Apply via email.
