Senior Data Engineer (SAP BTP Integration Suite / ADF / Pentaho)

Our client - a major utility firm based in Westchester County, NY - has an immediate need for a Senior Data Engineer. The particulars of the position are as follows.
Job Functions & Responsibilities:
Cloud Data Engineering & Integration:
• Design and implement data pipelines across AWS, Azure, and Google Cloud.
• Develop SAP BTP integrations with cloud and on-premises systems.
• Ensure seamless data movement and storage between cloud platforms.
ETL & Data Pipeline Development:
• Develop and optimize ETL workflows using Pentaho, Microsoft Azure Data Factory (ADF), or equivalent ETL tools.
• Design scalable and efficient data transformation, movement, and ingestion processes.
• Monitor and troubleshoot ETL jobs to ensure high availability and performance.
API Development & Data Integration:
• Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
• Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
• Implement API-based data extractions and real-time event-driven architectures.
Data Analysis & SQL Development:
• Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
• Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
• Support data transformation logic and business rules for ERP reporting needs.
Data Governance & Quality (Ataccama, Collibra):
• Work with Ataccama and Collibra to define and enforce data quality and governance policies.
• Implement data lineage, metadata management, and compliance tracking across systems.
• Ensure compliance with enterprise data security and governance standards.
Cloud & DevOps (AWS, Azure, GCP):
• Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
• Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
• Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
Collaboration & Documentation:
• Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
• Document ETL workflows, API specifications, data models, and governance policies.
• Provide technical support and troubleshooting for data pipelines and integrations.
Required Skills & Experience:
• 7+ years of experience in Data Engineering, ETL, and SQL development.
• Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
• Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
• Proficiency in SQL (stored procedures, query optimization, performance tuning).
• Experience working with Azure DevOps, GitHub, and CI/CD for data pipelines.
• Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
• Experience working with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
• Strong problem-solving skills and ability to work independently in a fast-paced environment.
Preferred Qualifications:
• Experience working on SAP S/4HANA and cloud-based ERP implementations.
• Familiarity with Python, PySpark for data processing and automation.
• Experience with Pentaho, Microsoft ADF, or equivalent ETL tools.
• Knowledge of event-driven architectures.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions).
Education & Certifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.
Nice-to-have certifications:
• Microsoft Certified: Azure Data Engineer Associate
• SAP Certified Associate - Integration Developer.
Location:
White Plains, NY