Posted 2 hours ago

Job ID: JOB_ID_8209

Position Overview

The Data Integration Solution Architect will own the end-to-end design of data movement, transformation, and governance patterns across the Novolex enterprise. This role is the technical authority for how data flows between source systems (SAP ECC, S/4HANA Cloud, Blue Yonder, Workday, Syndigo) and our analytics and AI platforms (Microsoft Fabric, Snowflake, Azure AI Foundry). Because AI models are only as good as the data they consume, this role is foundational to every AI initiative in our pipeline.

Business Need & Strategic Drivers

  • Multi-Version SAP Complexity: Novolex operates both legacy SAP ECC and S/4HANA Cloud instances. Data extraction patterns differ significantly between these environments. Joule AI is only available on S/4HANA Cloud, meaning ECC data must be surfaced through Fabric or Snowflake for AI consumption. A dedicated architect is needed to design standardized extraction pipelines that account for version-specific schemas, APIs, and data models.
  • Microsoft Fabric as the Data Backbone: Fabric has been designated as our primary data integration and analytics platform. Designing OneLake data architectures, building Fabric data pipelines, managing lakehouses and warehouses, and governing data access across domains (manufacturing, finance, supply chain, HR) requires sustained architectural ownership, not ad hoc project work.
  • Snowflake & Third-Party Data Integration: Snowflake serves as a key analytics layer, particularly for Cortex AI capabilities and third-party data sharing (e.g., the planned Circana integration for market intelligence). The SAP-Snowflake zero-copy data sharing capability requires careful architectural design to ensure data freshness, access governance, and cost optimization.
  • AI Model Data Readiness: Every AI use case in our pipeline—demand forecasting, predictive maintenance, spend analytics, computer vision for PPE compliance—requires curated, governed datasets. Without a data integration architect, data preparation becomes the bottleneck for AI delivery, typically consuming 60-80% of project timelines.
  • Data Governance & Quality: Alation (data catalog and governance) and Syndigo (master data management) are in place but need architectural oversight to ensure lineage tracking, data quality rules, and catalog integration are woven into every pipeline, not bolted on after the fact.
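The version-specific schema problem called out above is the crux of the standardized-extraction work: one canonical field mapping applied at extraction time means Fabric and Snowflake consumers never see ECC-versus-S/4HANA differences. A minimal sketch of that pattern follows; the field names, version labels, and sample values are illustrative assumptions, not actual SAP schemas.

```python
# Hypothetical sketch: normalize material-master records extracted from
# SAP ECC and S/4HANA Cloud into one canonical schema before landing them
# in a lakehouse. All field names here are illustrative placeholders.

# Per-version field mappings: source field -> canonical field.
FIELD_MAPS = {
    "ecc": {
        "MATNR": "material_id",
        "MAKTX": "description",
        "WERKS": "plant",
    },
    "s4hana_cloud": {
        "Product": "material_id",
        "ProductDescription": "description",
        "Plant": "plant",
    },
}

def normalize_record(record: dict, sap_version: str) -> dict:
    """Map a version-specific record onto the canonical schema.

    Unmapped source fields are dropped; missing canonical fields default
    to None, so every downstream consumer sees the same columns.
    """
    mapping = FIELD_MAPS[sap_version]
    canonical = {target: None for target in mapping.values()}
    for source_field, value in record.items():
        if source_field in mapping:
            canonical[mapping[source_field]] = value
    return canonical

# The same logical record, as each SAP version might expose it:
ecc_row = {"MATNR": "100042", "MAKTX": "Stretch film", "WERKS": "1000"}
s4_row = {"Product": "100042", "ProductDescription": "Stretch film",
          "Plant": "1000"}

# Both versions converge on one schema for Fabric/Snowflake consumers.
assert normalize_record(ecc_row, "ecc") == normalize_record(s4_row, "s4hana_cloud")
```

The design choice this illustrates: version differences are resolved once, at the pipeline boundary, rather than in every downstream model or report.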

Key Responsibilities

  • Design and maintain the enterprise data integration architecture across Microsoft Fabric, Snowflake, and Azure Data services.
  • Define data extraction, transformation, and loading (ETL/ELT) patterns for multi-version SAP environments (ECC and S/4HANA Cloud).
  • Architect OneLake data organization, including lakehouses, warehouses, and data domains aligned to business functions.
  • Lead the technical design for third-party data integrations (Circana, Syndigo, external market data sources).
  • Establish data pipeline standards, monitoring, and SLA frameworks for AI-consumed datasets.
  • Collaborate with the AI Architect and MLOps Engineer to ensure data readiness for model training and inference.
  • Operationalize data cataloging, lineage, and governance policies in Alation across all integration points.
  • Evaluate and recommend tooling for data quality, observability, and cost management within Fabric and Snowflake.
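The SLA-framework responsibility above can be made concrete with a freshness check: each AI-consumed dataset declares a maximum allowed staleness, and a monitor flags any dataset whose last successful load exceeds it. The sketch below is a hypothetical illustration; the dataset names and SLA values are assumptions, not Novolex standards.

```python
# Hypothetical sketch of a freshness SLA check for AI-consumed datasets.
from datetime import datetime, timedelta, timezone

# Illustrative SLA registry: dataset name -> maximum allowed staleness.
SLAS = {
    "demand_forecast_features": timedelta(hours=4),
    "spend_analytics_curated": timedelta(hours=24),
}

def breached_slas(last_loaded: dict, now: datetime) -> list:
    """Return the datasets whose last successful load is older than their SLA."""
    return sorted(
        name
        for name, max_age in SLAS.items()
        if now - last_loaded[name] > max_age
    )

now = datetime(2025, 1, 15, 12, 0, tzinfo=timezone.utc)
last_loaded = {
    "demand_forecast_features": now - timedelta(hours=6),  # past its 4h SLA
    "spend_analytics_curated": now - timedelta(hours=2),   # within its 24h SLA
}
assert breached_slas(last_loaded, now) == ["demand_forecast_features"]
```

In practice the registry and load timestamps would come from the data catalog and pipeline run history; the point of the sketch is that SLAs are declared per dataset and checked uniformly, not hard-coded into individual pipelines.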

Compensation & Location

Salary: $55 – $60 per hour

Location: Remote


Recruiter / Company – Contact Information

Email: nirajkr147852@gmail.com


Interested in this position?
Apply via Email
