Assignment Description

We are seeking a Data Engineer for an assignment with a hybrid setup that requires three days on-site per week.

In this role, you will be responsible for engineering and solution architecture within the Learning Technologies and Data domain, while ensuring that engineering efforts are aligned across teams in the area. The work includes designing, building, and maintaining scalable data pipelines in Databricks using Python and SQL, and ensuring data quality, lineage, and governance across datasets. You will also optimize Spark workloads and SQL queries to improve both performance and cost efficiency. In addition, the role involves implementing data observability practices, including pipeline monitoring and alerting, and developing and maintaining dashboards.

Qualifications:

  • Experience designing and developing data access layers, APIs, and data services, including access control and authentication to support secure data sharing across teams
  • Familiarity with APIs, preferably using Python
  • Experience with CI/CD pipelines for data platforms, infrastructure as code, and monitoring
  • At least two years of experience working with Databricks and Python
  • Experience in general backend engineering work
  • A minimum of three years of experience in data engineering
  • Experience developing scalable solutions using PySpark, Delta Lake, and Databricks Workflows
  • Experience with Git-based development workflows
  • At least two years of experience with cloud platforms, preferably Azure

Placement requirement: Three days on-site per week in Malmö

Language requirement: Fluency in either Swedish or English

Details

Reference: 170645

Location: Malmö, SE

Remote work: Hybrid

Workload: 100%

Start date: 2026-06-01

End date: 2026-12-31

Apply by: 2026-04-28

Publication date: 2026-04-21

Consultant broker

It is no longer possible to apply for this position.