Heidelberg Materials AG

Data Integration Engineer (f/m/d)

Starting immediately

Location:
69115 Heidelberg

Job description

For our HDigital department, we are seeking a Data Integration Engineer (f/m/d) to support our mission of becoming the first industrial tech company in the construction and building materials sector. Digitalization is one of the transformational pillars driving Heidelberg Materials into the future. Our digital efforts span all of our business lines and operations. We are setting up the infrastructure and developing digital products and customer solutions that will keep our core business successful for decades to come.

In HProduce, we are creating the data foundation in the Azure cloud by syncing and harmonizing data from our plants, laboratory databases, external vendors and SAP. This data is the foundation for the development of machine learning models, process optimization and operational excellence. In our work, we collaborate closely with our colleagues from Heidelberg Materials' Competence Center Cement, HConnect and partnering companies.

We are looking for an experienced Data Integration Engineer (f/m/d) to help us integrate data from heterogeneous systems into the Industrial Data Platform of HProduce. You will be part of a strong team taking on this major technical challenge and contributing to Heidelberg Materials' ambitious digitalization and sustainability goals.

You will report directly to the Senior Manager SWE Beta. The position is located in Heidelberg, Germany.

Your next challenge

  • Ensure that data from multiple sources is properly integrated into a target data product
  • Design and implement new data engineering pipelines with cloud tools and frameworks
  • Prepare and accompany the roll-out of our data pipelines, ensuring data availability and consistency in our Data Lake
  • Work closely with infrastructure teams to deploy needed infrastructure
  • Communicate with different stakeholders
  • Contribute as a valued team member to testing, debugging, QA, and documentation of data pipelines and systems
Your profile

Must have:
  • University degree in Computer Science
  • Strong analytical and problem-solving skills
  • Professional experience in data integration engineering, ideally with Python
  • Willingness to learn new technologies and products
  • 3+ years of experience with C#
  • Experience with Infrastructure as Code, ideally Terraform
  • Experience with SQL and relational DB systems (PostgreSQL, SQL Server)
  • Experience with the Azure stack

Nice to have (at least three of the following criteria):
  • Experience with data warehousing, infrastructure, ETL/ELT, and analytics tools such as Azure Synapse or Databricks
  • Experience with the big data technology stack, for example time-series DBs, (Py-)Spark, Delta Tables, etc.
  • Experience with multiple Big Data file formats (Parquet, Avro, Delta Lake)
  • Experience with time-series databases and streaming data processing
  • Experience with PySpark
  • Experience with CI/CD pipelines on GitLab
  • Experience with Terraform and k8s
Our offer

We are convinced that only those who successfully realize their personal goals can also fully contribute professionally. That's why we offer you attractive benefits, such as:

  • Attractive compensation including Christmas and vacation bonuses
  • Flexible working time models
  • Mobile working within Germany for up to three days per week
  • 30 vacation days plus additional special vacation days
  • Individual onboarding with participation in the buddy program
  • Support for advanced training and continuing education
  • Company health management (various company sports groups, health and prevention campaigns, company doctor)
  • Ergonomic workstations with height-adjustable desks
  • Canteen with discounted and healthy meals
  • Employer-sponsored company pension plan
  • Attractive conditions and discounts at our cooperation partners (gyms, banks, online stores, etc.)
  • Job ticket
Contact

Heidelberg Materials AG