Posted 13 hours ago

Job ID: JOB_ID_1048

Role Overview

We are seeking a highly skilled and motivated Data Engineer to join our dynamic technology team. This role is critical for designing and implementing the next generation of our data infrastructure, focusing on Snowflake, Python, and AWS. The ideal candidate will be responsible for building robust, scalable ETL/ELT pipelines that handle both structured and unstructured data, ensuring high performance and reliability for our business intelligence and analytics platforms.

Key Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for diverse datasets using Python and Snowflake.
  • Implement and optimize data storage, retrieval, and transformation processes within Snowflake and Oracle database environments.
  • Lead cloud migration efforts for legacy data pipelines, transitioning them to modern AWS and Snowflake architectures.
  • Participate in the full software development lifecycle (SDLC), including planning, requirement gathering, development, rigorous testing, and quality assurance.
  • Integrate and manage complex workload orchestration using Airflow and Tivoli Workload Scheduler (TWS) to ensure timely data delivery.
  • Create robust data processing scripts and tools in Python, emphasizing modular, testable, and reusable code structures.
  • Write and optimize complex SQL queries and stored procedures for high-volume, high-performance database applications.
  • Troubleshoot production incidents, identify root causes, and implement permanent preventive measures.
  • Collaborate with cross-functional teams, including data scientists and business analysts, to define and ship new data features.

Technical Requirements

  • Extensive hands-on experience with the Snowflake Data Platform (Snowflake certification is highly preferred).
  • Strong proficiency in SQL and PL/SQL for complex data manipulation and analysis.
  • Proven experience with AWS cloud services, particularly those related to data storage and processing (S3, Lambda, Glue).
  • Expert-level Python programming skills with a focus on ETL concepts and data engineering libraries.
  • Deep understanding of metadata management, data lineage, and data governance principles.
  • Advanced knowledge of SQL analytical functions, Views, and Materialized Views.
  • Strong background in Unix/Linux scripting for automation and system management.
  • Solid understanding of data warehouse concepts, including dimensional modeling (Star/Snowflake schemas) and ETL best practices.

Professional Environment

This position is based in Jersey City, NJ, and offers a collaborative environment where innovation is encouraged. You will work with cutting-edge technologies to solve complex data challenges that directly impact our business strategy. We value continuous learning and provide opportunities for professional growth and certification in cloud and data technologies.


Special Requirements

Candidates local to Jersey City, NJ are preferred.


Compensation & Location

Salary: $135,000 – $185,000 per year (Estimated)

Location: Jersey City, NJ


Recruiter / Company – Contact Information

Recruiter / Employer: WB Solutions LLC

Email: amaan@wbsolutions.com


