SAP DATA ENGINEER

Jamy Interactive

Full Time

Experience: 3 - 7 years

Location: Tripura, India

Salary: INR 2,500,000 - 3,000,000 per year

Job Description

  • The company is a software-enabled services provider, synonymous with SAP on Cloud, focused on delivering superior, highly automated Managed Services to enterprise customers. Our customers span multiple verticals and geographies across the Americas, EMEA, and APAC. We partner with AWS, SAP, Microsoft, and other global technology leaders.
  • The role leverages data, software engineering, and data science techniques to create business value through data accessibility, covering data ingestion, data preparation, and analytics processing. You will identify, acquire, cleanse, prepare, and store data, and develop data products aligned with defined architecture patterns.
  • Data Engineering (DE) provides data-acquisition capabilities in SAP Digital Manufacturing Cloud, enabling customers to source data from their on-premises systems such as SAP ME, SAP MII, SAP ECC, SAP S/4HANA, and other legacy or third-party systems.

Responsibilities and Duties:

  • Develops and operationalizes data pipelines to create data sets that are made available for consumption (BI, advanced analytics, APIs/services).
  • Works in tandem with Data Architects, Data Stewards, and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
  • Designs, develops, and implements ETL/ELT processes using various ETL/replication tools.
  • Utilizes cloud data platforms such as Azure Synapse (SQL DW), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake to improve and speed up delivery of our data products and services (a minimal illustrative pipeline sketch follows this list).
  • Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
  • Identifies, designs, and implements process improvements: automating manual processes, optimizing data delivery.
  • Identifies ways to improve data reliability, efficiency, and quality of data management.
  • Communicates technical concepts to non-technical audiences both in written and verbal form.
  • Performs peer reviews for other data engineers’ work.
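
For illustration only, the sketch below shows the kind of batch ETL pipeline these responsibilities describe, written in PySpark with a Delta Lake sink on Databricks. Every path, table, and column name here is hypothetical; a real pipeline would follow the defined architecture patterns and data quality standards noted above.

```python
# Illustrative only: a minimal batch ETL pipeline of the kind described above.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: read raw CSV extracts landed in the data lake (hypothetical path).
raw = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/landing/orders/*.csv")
)

# Prepare: basic cleansing, typing, and deduplication.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Serve: write a curated Delta table for BI and downstream consumption.
(
    clean.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("/mnt/curated/orders")
)
```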

Key Skills:

  • BA/BS in Computer Science, Engineering, or equivalent software/services experience.
  • Azure/AWS/GCP certifications.
  • Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub) and ETL (an illustrative streaming-ingestion sketch follows this list).
  • Experience with Git / Azure DevOps.
  • Experience delivering data solutions through agile software development methodologies.
  • Exposure to the retail industry.
  • Excellent verbal and written communication skills.
  • 5+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
  • 5+ years’ experience with Data Modeling, ETL, and Data Warehousing.
  • 3+ years’ hands-on experience with ETL/Data Replication tools.
  • 2+ years’ hands-on experience with SAP tools such as SLT and SAP Data Services (BODS).
  • 3+ years’ experience working with cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, AWS Redshift, BigQuery, Snowflake, and other cloud data platforms.
  • Extensive experience working with various data sources: DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, and JSON.
  • Advanced SQL skills. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
  • Strong understanding of database storage concepts (data lake, relational databases, NoSQL, data warehousing).
  • Able to work in a fast-paced agile development environment.
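
For illustration only, the sketch below shows one way the event/message-based integration mentioned above could look, using Spark Structured Streaming to read from a Kafka topic (Azure Event Hubs also exposes a Kafka-compatible endpoint) and append the decoded events to a Delta table. Broker addresses, topic names, and paths are hypothetical.

```python
# Illustrative only: minimal event-based ingestion with Spark Structured
# Streaming reading from Kafka and landing into a Delta table.
# Broker addresses, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events_ingest").getOrCreate()

# Subscribe to a Kafka topic (hypothetical broker and topic names).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders-events")
    .option("startingOffsets", "earliest")
    .load()
)

# Kafka delivers key/value as binary; decode the payload for downstream use.
decoded = events.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

# Continuously append the decoded events to a Delta table.
query = (
    decoded.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders-events")
    .outputMode("append")
    .start("/mnt/raw/orders_events")
)
```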

Benefits:

  • These may include training, health insurance, commuting support, lunch service, etc.