Job ID: JOB_ID_9412
POSITION 1: SAP FERC Consultant
Location: Remote, with limited travel to Tacoma, WA
Core Responsibilities:
- Develop, maintain, and enhance SAP-based regulatory reports supporting FERC, NARUC, and state PUC requirements.
- Translate FERC USoA and state regulatory classifications into SAP master data, cost structures, and reporting logic.
- Support preparation of FERC Form filings (e.g., Form 1, 2, 6, 60, 714) using SAP data extracts and validation routines.
- Ensure SAP configuration aligns with evolving state PUC and NARUC reporting frameworks.
- Perform data quality checks and reconciliations, and maintain audit trails to ensure accuracy and compliance.
- Partner with Accounting, Rates, Regulatory Affairs, and IT to implement reporting changes driven by new rules or orders.
- Document SAP to regulatory mappings, data lineage, and reporting logic for audit and compliance purposes.
Required Skills & Experience:
- Strong understanding of FERC accounting, USoA classifications, and federal reporting requirements.
- Familiarity with NARUC guidelines, state PUC reporting structures, and retail regulatory frameworks.
- Hands-on experience with SAP FI/CO, SAP ECC or S/4HANA, and regulatory reporting data flows.
- Experience working with utility cost structures, allocations, and large datasets.
- Ability to interpret regulatory rules and convert them into SAP technical requirements.
POSITION 2: SAP BTP/ABAP Consultant
Location: Redmond, WA – Onsite
Required Skills & Experience:
SAP Skills
- Strong experience with SAP BTP (Neo / Cloud Foundry preferred).
- Hands-on experience with SAP Integration Suite (CPI).
- Experience integrating SAP systems such as S/4HANA, ECC, SuccessFactors, Ariba, or BW.
- Exposure to OData, REST, SOAP APIs, IDocs, RFCs.
Microsoft Skills
- Strong knowledge of Microsoft Azure architecture and services.
- Experience with Power Platform (Power Apps, Power Automate).
- Experience with Azure AD / Entra ID for identity and access management.
- Understanding of Microsoft 365 integrations is a plus.
Technical Skills
- Programming/scripting experience in Java, JavaScript, Groovy, or Node.js.
- Experience with XML, JSON, XSLT, and API security.
- Familiarity with DevOps, Git, CI/CD pipelines.
- Strong troubleshooting and debugging skills.
Key Responsibilities:
- Design and develop SAP BTP-based extensions and integrations for SAP S/4HANA and related systems.
- Build and manage end-to-end integrations between SAP and Microsoft platforms (Azure, Entra ID/Azure AD, Power Platform, M365).
- Develop services and applications using SAP BTP services such as: Integration Suite (CPI), API Management, Event Mesh, SAP Build (Apps, Process Automation).
- Implement authentication and authorization using OAuth2, SAML, Azure AD / Entra ID.
- Design and deploy solutions using Azure services (Logic Apps, Functions, Service Bus, Data Factory, App Services).
- Collaborate with SAP functional and technical teams to translate business requirements into technical solutions.
- Ensure solutions comply with security, compliance, and enterprise architecture standards.
- Support CI/CD pipelines, monitoring, troubleshooting, and performance optimization.
- Create technical documentation and provide knowledge transfer to support teams.
POSITION 3: Senior Business Analyst
Location: Rahway, NJ or West Point, PA (Onsite from day one, REMOTE NOT ALLOWED)
Mandatory skills: Assay development knowledge
Job Description:
- 10+ years in product/business analysis or similar roles within pharma, biotech, diagnostics, or healthcare tech.
- Must have hands-on assay development knowledge (experimental design, optimization, and common validation attributes).
- Proven experience in requirements gathering, documentation, backlog ownership, and end-to-end SDLC delivery.
- Familiarity with lab operations and assay configuration concepts (methods, instruments, reagents, controls).
- Excellent analytical, communication, and documentation skills; expert at writing user stories and acceptance criteria.
- Proficiency with Jira and Confluence (or similar).
- Experience with LIMS, ELN, instrument integration, and lab informatics.
- Exposure to data models for assay results, metadata, and audit trails.
- Strong experience with agile methodologies and release management.
POSITION 4: Data Modeler with strong Insurance domain expertise
Location: Princeton, NJ – Onsite
Job Description:
Data Modeler with strong Insurance domain expertise to design, develop, and maintain enterprise-level data models that support underwriting, policy administration, claims, billing, and analytics initiatives. The ideal candidate will work closely with business stakeholders, architects, and engineering teams to translate complex insurance business requirements into scalable and compliant data models.
Key Responsibilities:
- Design and maintain conceptual, logical, and physical data models for insurance systems and data platforms.
- Work closely with business analysts, underwriters, actuaries, claims, and IT teams to understand data requirements.
- Model core insurance entities such as Policy, Coverage, Risk, Claim, Exposure, Party, Billing, and Reinsurance.
- Support data modeling for transactional systems, data warehouses, data lakes, and analytics platforms.
- Ensure data models adhere to industry standards, regulatory requirements, and data governance policies.
- Define data definitions, business rules, relationships, and metadata.
- Support data integration, migration, and modernization initiatives (legacy to cloud).
- Review and optimize data models for performance, scalability, and data quality.
- Collaborate with data engineers and developers during implementation.
- Participate in design reviews, impact analysis, and documentation.
POSITION 5: GenAI Engineer/Architect
Location: Florida – Remote
Skill Requirements:
- Expert proficiency in AI/ML model development, including classical ML, deep learning, NLP, and time series forecasting.
- Excellent skills in Python and R.
Job Summary:
This role is responsible for architecting and designing advanced artificial intelligence and machine learning solutions that drive business transformation. The individual leverages deep expertise in scalable data platforms, distributed computing, and model development to deliver robust, secure, and high-performing systems. They provide strategic technical direction, mentor teams, and ensure alignment of technology strategies with organizational objectives.
Key Responsibilities:
- Architect end-to-end AI/ML solutions using Python, TensorFlow, PyTorch, and scikit-learn, ensuring scalability, performance, and security across cloud and on-premise environments.
- Design and implement distributed data processing pipelines with Apache Spark and Kafka, optimizing for real-time analytics and robust data ingestion.
- Develop and operationalize machine learning models for NLP, deep learning, and time series forecasting using Spark MLlib, XGBoost, and LightGBM, ensuring seamless integration with PostgreSQL and Databricks platforms.
- Establish best practices for model deployment and monitoring using Apache Airflow and Bash scripting, enabling automated workflows and continuous delivery.
- Mentor and guide technical teams in advanced ML model development, fostering skill growth in Python, R, and SQL, and mitigating delivery risks through knowledge transfer.
- Collaborate with stakeholders to gather requirements, define technology strategy, and align AI/ML architectures with evolving business needs and industry standards.
- Architect and implement RESTful API integrations to enable seamless communication between AI/ML components and external systems, ensuring scalable, secure, and efficient data exchange across diverse enterprise environments.
POSITION 6: Lead Data Engineer
Location: Rhode Island, USA – Remote
Mandatory Skills: Matillion with Snowflake
- Specializes in ETL (Extract, Transform, Load) using the Matillion ETL tool.
- Specializes in ETL using the dbt tool.
- Experience in Snowflake based data warehouse.
- Azure Data Services such as ADLS, SQL Server (managed and non-managed).
- Good understanding of the latest trends and best practices in ETL/ELT.
- Hands-on experience on data warehouse projects, with an understanding of star schema and snowflake models, normalization, denormalization, aggregations, data quality, and related concepts.
- Working experience in Health Insurance domain.
- Excellent interpersonal and communication skills.
POSITION 7: Senior Java Microservices Engineer (Azure)
Location: Bay Area, CA (SF / Oakland)
Work Model: Hybrid onsite 2 days/week
Interview: In person required
- Strong hands-on experience in Java (Java 8+), Spring / Spring Boot.
- Solid expertise in Microservices architecture and RESTful API design.
- Strong hands-on experience building and deploying applications on Microsoft Azure.
- Azure App Services, AKS, Azure Storage, Azure SQL / Cosmos DB.
- Experience with Docker, Kubernetes (AKS), and cloud native development.
- Hands-on experience with CI/CD pipelines using Azure DevOps.
- Strong understanding of distributed systems, scalability, resilience, and security best practices.
- Smart, high-caliber engineer with strong problem-solving skills.
- Actively uses AI tools (e.g., GitHub Copilot, Claude, ChatGPT, etc.) to improve developer productivity, enhance code quality and velocity, and accelerate debugging, design, and documentation.
- Comfortable working in fast-paced, outcome-driven environments.
- Strong communication and collaboration skills.
- Ownership mindset with focus on quality and delivery.
- Experience building enterprise-scale, cloud-native solutions is a plus.
POSITION 8: Informatica Lead
Company: Truist Bank
Location: Atlanta, GA, or Charlotte, NC – Onsite
Skills:
- 10+ years of Informatica ETL development/lead experience.
- Strong knowledge of IBM DB2 and PostgreSQL.
- Good knowledge of migrating large databases (terabytes in size).
- Coordinate multiple teams for data migration and production deployment.
- Create workflows with parameterized inputs.
- Strong SQL knowledge and query-writing skills.
- Experience creating JDBC connections and retrieving data.
POSITION 9: Senior Software Engineer
Company: Wells Fargo
Location: San Leandro – Local candidates only
Interview: Face-to-face client interview is a MUST
- Practical, hands-on Flink knowledge, a mandatory skill per the customer for the DTI requirement.
- 10+ years application development experience.
- Java, Spring Boot, Microservices, Kafka, MongoDB, OCP, Flink.
POSITION 10: Senior Software Engineer
Company: WELLS FARGO
Location: San Leandro – LOCAL PROFILES ONLY
Interview: FACE TO FACE MEETING IS MUST FOR CLIENT INTERVIEW
- Practical, hands-on Flink knowledge, a mandatory skill per the customer for the DTI requirement.
- 15+ years of experience.
- Strong hands-on Spring Boot Microservice development.
- Hands-on Spring Web application development.
- JDK17+.
- Actively using GitHub Copilot and other AI tools.
- Hands-on Splunk queries, dashboards.
- Familiarity with APM tools and analysis.
- Strong in MongoDB design/queries and Kafka messaging service.
- Familiarity and hands-on Oracle DB.
- Familiarity with BDD development.
- Familiarity with OCP and Kubernetes container platform.
- Very good team player and effective communicator.
- Good problem solver with minimal guidance.
POSITION 11: JAVA DEVELOPER
Company: WELLS FARGO
Location: Charlotte, NC OR Phoenix, AZ – any location is fine
Interview: Face-to-face client interview is a MUST
Note: Should have worked/experience in Wealth and Investment Management.
- Flink technical and practical knowledge is a mandatory skill required by the customer for DTI.
- 6-8 years of strong experience in Java development, including proficiency in Spring / Spring Boot.
- 2-3 years of experience with Python, focused on backend or data-driven development.
- Deep understanding of Reactive Programming (WebFlux, etc.).
- Hands-on experience with Apache Kafka for event-driven architectures.
- Experience with Flink for stream processing and data pipelines.
- Proficiency in Redis for caching and performance optimization.
- Database expertise in both MongoDB (NoSQL) and Oracle (RDBMS).
- Strong experience in building and consuming RESTful APIs.
- GraphQL knowledge is good to have but not mandatory.
- Good understanding of Google Dialogflow CX or similar conversational AI frameworks.
- Exposure to LLM (Large Language Models), agentic architectures, and prompt engineering concepts.
- Familiarity with ADK (Agent Development Kit), Playbook, or similar agentic frameworks.
- Conceptual understanding of machine learning fundamentals and model telemetry.
- Strong problem-solving and debugging skills.
- Experience with microservices architecture, CI/CD pipelines, and cloud-native environments (OCP, Kubernetes, etc.).
- Excellent communication skills; ability to collaborate across engineering and product teams.
POSITION 12: Tech Architect – Java/Angular
Location: San Jose, CA – onsite
- Bachelor’s (BS in Computer Science) or similar degree.
- Proven experience in Java development (ideally 7+ years).
- Solid understanding of object-oriented design patterns.
- Knowledge of Dependency Injection principles and experience with frameworks like Spring or Guice (Guice preferred).
- Proficiency with relational databases and NoSQL databases.
- Experience with front-end development using Angular.
- Experience with designing and implementing system integrations (e.g., API-based, message queues).
- Strong experience with JUnit, Mockito, or other unit testing best practices.
- Experience developing and deploying applications on Unix/Linux environments.
- Familiarity with ORM frameworks such as Hibernate.
- Experience with Web development, including REST/SOAP web services development and design principles.
- Experience with source control (e.g., Git, Piper) and build technologies (e.g., Ant, Maven, Blaze/Bazel).
- Experience with Integrated Workplace Management System (IWMS) applications (e.g., IBM Tririga, Archibus, Planon) or developing applications for real estate management.
- Experience developing within Google’s internal tech stack, including tools and platforms such as: Piper, Cider, Boq, Google App Engine, Spanner, One Platform, ACX Web.
- Familiarity with Google’s use of Tririga and related integrations within the CRES (Core Real Estate Systems) domain.
- Experience with Google’s integration platforms and best practices (e.g., Pub/Sub, Data Bridge tools like Replicator).
- Strong oral and written communication skills.
- Ability to work effectively within a team environment.
- Good problem-solving and analytical skills.
- Eagerness to learn and adapt to new technologies and challenges.
POSITION 13: AI/ML Software Development SME
Location: San Jose, CA – onsite
Mandatory Skills: AI/ML, ML Ops, Java, Python
- Strong experience with Google Cloud Platform (GCP) services, specifically in the MLOps and ML domain (Vertex AI, Kubeflow, Cloud Storage, Artifact Registry).
- Proven ability to design and implement end-to-end ML pipelines for data management, model training, and deployment.
- Hands-on experience with containerization technologies like Docker.
- Familiarity with CI/CD practices and pipeline automation.
- Knowledge of ML frameworks like TensorFlow, and experience with experiment tracking and hyperparameter tuning.
- Excellent problem-solving skills and a strong understanding of the ML lifecycle.
- Experience with the Generative Language API (Gemini model) or other AI Agent integrations is a plus.
POSITION 14: HRIS Workday Integration Specialist
Location: Orlando, FL. Remote profiles considered as second priority.
- Bachelor’s degree in Human Resources, Information Systems, Computer Science, or related field.
- 3-6 years of experience in HRIS management and Workday integrations.
- Hands-on experience with Workday Studio, EIBs, and other integration tools.
- Strong understanding of HR processes, especially offboarding and exit management.
- Excellent problem-solving and communication skills.
- Experience with ticketing systems and change management processes is a plus.
- Knowledge of data privacy regulations e.g., GDPR, HIPAA.
- Experience with other HR platforms e.g., SAP SuccessFactors, Oracle HCM.
- Familiarity with scripting or programming languages used in integrations e.g., XSLT, XML, REST APIs.
POSITION 15: RPA Administrator
Location: Hartford, Connecticut – Hybrid
Mandatory Skills: UiPath installation; maintaining and managing server configurations; handling access and permissions; managing server upgrades and patches; performing deployments; analyzing and maintaining server health.
- Install & Administer UiPath Orchestrator, Robots, and environments (Dev/Test/Prod).
- Monitor platform health, troubleshoot issues, and ensure uptime.
- Manage deployments, upgrades, and CI/CD support.
- Configure queues, assets, roles, and access (RBAC Role Based Access Control).
- Ensure security, compliance, and audit readiness.
- Support incident management, RCA, and performance optimization.
- Collaborate with developers and infrastructure teams.
POSITION 16: RPA UiPath Developer
Location: Hartford, Connecticut – Hybrid
- Strong hands-on experience in UiPath development and solution design.
- Proven expertise in business analysis for automation and process transformation.
- Deep understanding of RPA lifecycle, frameworks, and enterprise deployments.
- Experience with integrations (APIs, databases, Excel, web automation).
- Excellent stakeholder management.
- UiPath Advanced/Professional Certification.
- Experience with AI/ML integrations, Document Understanding, or OCR.
- Familiarity with Agile delivery and CI/CD pipelines.
- Identify, prioritize, and own automation pipelines aligned to business strategy and ROI targets.
- Drive process discovery, due diligence, and feasibility assessments for high-impact opportunities.
- Define and document PDD/SDD, solution architecture, and scalable design patterns.
- Design, develop, and deploy complex automations using UiPath (REFramework, Orchestrator).
- Lead UAT, production rollout, and hypercare support.
- Optimize existing automations to enhance performance, resilience, and cost efficiency.
- Ensure adherence to RPA governance, security, and best practices.
POSITION 17: Platform Administrator / Computer Systems Analyst
Work Location: Lansing, Michigan
Open for Remote: Yes
- Cloud platforms, virtualization, middleware, SaaS platforms, or operating systems.
- PowerShell, Bash, Java, Python, and Gosu.
- Experience with Guidewire Insurance Suite version 9.0 or above, preferably PolicyCenter and BillingCenter.
- Proficiency with DBMS and ETL processes, preferably Oracle and Microsoft SQL Server.
- Kafka, SQS, and messaging protocols such as SOAP and REST APIs.
- Azure, AWS, GCP.
- Platform Management & Operations: Administer, configure, and maintain enterprise platforms (cloud, on-prem or hybrid) to ensure high availability, performance, and resilience. Lead platform upgrades, patching, and lifecycle management while minimizing service disruptions. Monitor system health, performance metrics, and capacity; proactively address risks and bottlenecks. Manage backup, disaster recovery, and business continuity strategies.
- Automation & Optimization: Design and implement automation using scripting, infrastructure-as-code, or orchestration tools to reduce manual effort and improve reliability. Continuously optimize platform cost, performance, and scalability. Standardize configurations, templates, and operational processes.
- Security & Compliance: Enforce security best practices, access controls, and configuration standards across platforms. Partner with security teams to remediate vulnerabilities, apply security patches, and support audits. Ensure platforms adhere to compliance requirements (e.g. HIPAA, or internal governance).
- Technical Leadership: Serve as a subject matter expert (SME) for the platform, providing guidance and technical direction. Mentor and support junior administrators and engineers. Lead root-cause analysis for complex incidents and implement long-term fixes.
- Collaboration & Stakeholder Support: Work closely with application teams, DevOps, architects, and business stakeholders to align platform capabilities with business objectives. Participate in platform architecture design discussions and roadmap planning. Provide Tier 3 support for complex incidents and escalations. Duties also include on-call support whenever required, during weekends and beyond regular working hours on weekdays.
Special Requirements
VISAS: Only W2, C2C, 1099, Contract, Full-Time. SCREENING: Candidates born before 1985. DOMAIN RESTRICTIONS: None specified. Work model and interview requirements by position:
- Position 1: Remote, with limited travel to Tacoma, WA.
- Position 2: Onsite.
- Position 3: Onsite from day one; REMOTE NOT ALLOWED.
- Position 4: Onsite.
- Position 5: Remote.
- Position 6: Remote.
- Position 7: Hybrid onsite 2 days/week; in-person interview required.
- Position 8: Onsite.
- Positions 9 & 10: Local candidates only; face-to-face client interview is a MUST.
- Position 11: Local candidates only; face-to-face client interview is a MUST.
- Positions 12 & 13: Onsite.
- Position 14: Orlando, FL; remote profiles as second priority.
- Positions 15 & 16: Hybrid (Hartford, Connecticut).
- Position 17: Lansing, Michigan; remote allowed.
Compensation & Location
Salary: $100,000 – $150,000 per year
Location: Tacoma, WA
Recruiter / Company – Contact Information
Email: masri@incorporaninc.com