Posted 5 hours ago

Job ID: JOB_ID_3726

Job Summary:

We are seeking an experienced Snowflake Data Engineer with a strong background in SQL, data warehousing, and cloud-based data platforms. The successful candidate will design, develop, and maintain robust ETL/ELT pipelines, implement efficient data ingestion workflows, and construct scalable data models to support advanced analytics and business intelligence initiatives.

Key Responsibilities:

  • Design, develop, and maintain ETL/ELT pipelines utilizing Snowflake, SQL, Python, and relevant cloud technologies (AWS preferred).
  • Construct and optimize data models, including staging layers, data warehouses, and data marts, ensuring data integrity and performance.
  • Implement advanced performance optimization techniques specific to Snowflake, such as micro-partitioning, clustering, and query performance tuning.
  • Develop and manage data ingestion pipelines from various structured and semi-structured sources, including JSON, Parquet, XML, and APIs.
  • Implement business requirements leveraging key Snowflake features like Streams, Tasks, Time Travel, Zero-Copy Cloning, and Snowpipe for automated data ingestion.
  • Collaborate effectively with data architects, data analysts, and business stakeholders to translate complex requirements into scalable and efficient data solutions.
  • Monitor, troubleshoot, and optimize data pipelines to ensure data quality, reliability, and integrity.
  • Document data flows, transformations, and all technical processes to facilitate maintainability and knowledge sharing within the team.
  • Implement and manage job scheduling and orchestration workflows using industry-standard tools.

Required Skills:

  • Extensive hands-on experience with cloud-based databases, with a strong preference for Snowflake.
  • Expert proficiency in SQL and PL/SQL and a deep understanding of database optimization techniques.
  • Proven experience in ETL/ELT pipeline development.
  • Solid experience with Data Warehousing concepts and data modeling principles.
  • Experience with job orchestration / scheduling tools (Tidal preferred).
  • Basic understanding of AWS cloud services.
  • Strong analytical and problem-solving capabilities.

Preferred Skills:

  • Experience with Python for data engineering or automation tasks.
  • Experience handling semi-structured data formats such as JSON, Parquet, or XML.
  • Familiarity with banking domain terminology and financial data workflows.
  • Experience with data pipeline monitoring and troubleshooting tools.

Compensation & Location

Salary: $60 – $80 per hour (Estimated)

Location: Stamford, CT


Recruiter / Company – Contact Information:

Email: kumar@valzosoft.com


Interested in this position?
Apply via Email
