Posted 4 hours ago

Job ID: JOB_ID_9024

Role Summary

We are seeking a skilled Data Engineer with expertise in SQL, ETL, Kafka, PySpark, Azure Cloud, and DevOps. This role involves working on data pipelines, data governance, and cloud platforms to support our client’s data initiatives. The position requires a hybrid work model with 3 days in the office.

Required Skills (Must Have)

  • Strong SQL database experience (Oracle, PostgreSQL, or SQL Server).
  • ETL experience (hands-on Python).
  • Kafka, PySpark, Azure cloud experience (Event Hub, ADF, Databricks, Delta Lake experience).
  • DevOps/CI-CD (GitHub Actions, Azure DevOps) & Flink.
  • Adobe Experience Platform (AEP) XDM schema modeling.

Nice-to-Have Skills

  • Understanding of telecom regulatory rules (PII, CPNI, DPI, GDPR, SOX).
  • Data cataloging tools (Purview, Collibra, Informatica).
  • Data classification & metadata management.
  • Data ownership, stewardship frameworks.
  • DAMA DMBoK framework knowledge.
  • Adobe XDM Governance Mapping.
  • Adobe Real-Time CDP governance integration.
  • Experience with AI-driven governance automation.
  • Governance workflows (Jira).
  • Data retention policy automation.
  • Strong conceptual, logical, and physical modeling skills.
  • Dimensional modeling (Star/Snowflake).
  • Normalization/denormalization strategies.
  • Relational modeling for RDBMS (Oracle, SQL Server, PostgreSQL).
  • SQL proficiency for sourcing, profiling, validation.
  • Data quality profiling on ingested feeds.
  • Schema evolution automation for streaming data.
  • Knowledge of TM Forum SID / eTOM models.
  • Data Vault modeling.
  • Understanding of telecom data flows & domains (orders, billing, charging, usage, SIM, provisioning, network events).
  • Cloud data storage: ADLS.
  • Access control management (RBAC, ACLs, Key Vault, IAM).
  • Understanding of storage zone patterns (Raw, Curated, Processed).
  • Lakehouse technologies (Delta Lake, Iceberg, Hudi).
  • Multi-cloud governance.
  • Data ingestion tools (ADF, NiFi).
  • ETL/ELT concepts (mapping, transformations, business rules).
  • Python/SQL at intermediate level.
  • Batch and streaming processing basics.
  • Familiarity with version control (Git).
  • PySpark/Scala.
  • dbt core modeling and testing.
  • Experience with Databricks jobs, pipelines.
  • Basics of Kafka or Event Hubs.
  • Understanding real-time vs batch architectural differences.
  • Data classification & policy enforcement.
  • PII and sensitive telecom data handling.
  • Encryption, masking, data access governance.
  • Stakeholder alignment (Business, Marketing, IT).
  • Ability to translate business rules into data rules.
  • Data stewardship & operational governance.
  • Change management.
  • Workflow automation for governance processes.

Special Requirements

Visa: Local candidates only | Type: Contract/C2C | 3 days work from office


Compensation & Location

Salary: $70,200 – $93,600 per year (Estimated)

Location: Dallas, TX & Atlanta, GA


Recruiter / Company – Contact Information

Recruiter / Employer: Client

Email: kcort5@gmail.com


Interested in this position?
Apply via Email
