Posted 5 hours ago
Job ID: JOB_ID_3033
Job Description:
We are looking for a skilled GCP / API Data Engineer to join our team in Danbury, CT. This is a contract position focused on designing, building, and maintaining scalable data pipelines and APIs.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines using GCP services and frameworks such as Cloud Dataflow (Apache Beam), Apache Spark, and BigQuery.
- Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts.
- Optimize BigQuery performance through partitioning, clustering, and query tuning.
- Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing.
- Monitor and troubleshoot data pipeline performance, failures, and cost efficiency.
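As an illustration of the BigQuery partitioning and clustering tuning mentioned above, table tuning typically starts with DDL like the following. This is a minimal sketch only; the project, dataset, table, and column names are hypothetical, not taken from this posting:

```python
# Illustrative sketch: builds the kind of BigQuery DDL used to create a
# table partitioned by a DATE column and clustered for query pruning.
# All identifiers below are hypothetical examples.

def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Return a CREATE TABLE statement with PARTITION BY and CLUSTER BY,
    a common first step in BigQuery cost and performance tuning."""
    clusters = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{table}` (\n"
        f"  event_id STRING,\n"
        f"  {partition_col} DATE\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clusters}"
    )

ddl = partitioned_table_ddl("my_project.analytics.events",
                            "event_date", ["event_id"])
print(ddl)
```

Partitioning by a date column lets BigQuery scan only the relevant partitions, and clustering sorts data within each partition so filters on the clustered columns prune blocks, both of which reduce bytes scanned and therefore cost.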
Required Skills:
- Strong expertise in GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.).
- Proficiency in SQL, Python, and Java for data processing and automation.
- Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform.
- Strong understanding of data modeling, warehousing, and distributed computing.
- Experience with real-time and batch processing architectures.
- Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.).
Strong API Skills:
- Strong Core Java and Spring Boot development skills.
- Experience with Apigee and API security patterns.
- Experience designing APIs with Swagger/OpenAPI.
- Knowledge of Microservice Architecture and Patterns.
This role is ideal for an engineer passionate about cloud data technologies and API development, with a focus on delivering robust and efficient data solutions.
Special Requirements
Onsite
Compensation & Location
Salary: $110,000 – $150,000 per year (Estimated)
Location: Danbury, CT
Recruiter / Company – Contact Information
Email: selva@ovstechnologies.com