Posted 4 hours ago

Job ID: JOB_ID_4486

Job Description: Talend / Snowflake Data Engineer

We are seeking a highly skilled Talend / Snowflake Data Engineer to design, develop, and maintain scalable data integration and data warehouse solutions. The ideal candidate will have strong experience in Talend ETL development, Snowflake data warehousing, and cloud-based data pipelines. This role requires expertise in building efficient data pipelines, optimizing performance, and supporting analytics and business intelligence initiatives.

Responsibilities:

  • Design, develop, and maintain ETL/ELT pipelines using Talend to ingest and transform data from multiple sources.
  • Develop and optimize data models and data pipelines in Snowflake.
  • Implement data integration solutions for batch and real-time data processing.
  • Build scalable data ingestion frameworks from APIs, databases, and flat files.
  • Optimize Snowflake performance, including query tuning, clustering, and warehouse optimization.
  • Implement data quality, validation, and monitoring mechanisms.
  • Work closely with data analysts, BI developers, and business stakeholders to understand data requirements.
  • Maintain and improve data governance, security, and compliance standards.
  • Troubleshoot and resolve data pipeline failures and performance issues.
  • Document data architecture, workflows, and ETL processes.

Required Skills & Experience:

  • 5+ years of experience in Data Engineering or ETL development.
  • Strong hands-on experience with Talend (Talend Data Integration / Talend Cloud).
  • Strong expertise in Snowflake Data Warehouse.
  • Experience with SQL and advanced query optimization.
  • Experience designing data pipelines and ETL/ELT workflows.
  • Hands-on experience with data modeling (star schema, snowflake schema).
  • Experience working with large datasets and high-volume data processing.
  • Knowledge of cloud platforms (AWS / Azure).
  • Experience with Git or version control systems.

Preferred Qualifications:

  • Experience with Python or Shell scripting for automation.
  • Knowledge of ESP or other workflow orchestration tools.
  • Experience with CI/CD pipelines for data engineering.
  • Familiarity with data lake architectures and modern data stack.
  • Understanding of data governance and security best practices.

Nice to Have:

  • Experience with streaming technologies (Kafka, Spark).
  • Experience with BI tools such as Tableau or Power BI.
  • Snowflake or Talend certifications.

Location:

  • Atlanta, GA (Onsite) – Local or nearby candidates preferred.

Special Requirements

Only local candidates holding a driver's license from Georgia or a nearby state will be considered, and only candidates with genuine visa status. Please share the necessary documents along with your details.


Compensation & Location

Salary: $100,000 – $140,000 per year (Estimated)

Location: Atlanta, GA


Recruiter / Company – Contact Information

Recruiter / Employer: Info Tech Spectrum

Email: jaya@infotechspectrum.com


Interested in this position?
Apply via Email

Recruiter Notice:
To remove this job posting, please send an email from
jaya@infotechspectrum.com with the subject:

DELETE_JOB_ID_4486

to delete@join-this.com.