Lead Data Engineer

Industry - Other
Level - Lead
  • Business Intelligence (BI)
  • Data Modelling
  • Team Management
Employment - Full time
Work Model - Hybrid

Client - STARK

Our vision is to be the most trusted business-to-business distributor of heavy materials in Northern Europe. We take responsibility for the entire value chain in our industry by providing the best terms possible for our customers and suppliers.


As part of STARK’s transformation process, STARK is looking for a team to help build a BI and DWH solution on an Azure Databricks and Power BI based Analytics Platform.

Lisbon, Portugal


We’re looking for an experienced Data Engineer/Developer who can help us not only deliver our BI and DWH solutions, but also build up our best practices, mentor more junior developers, and drive a technical community of BI/DWH engineers.
Your key responsibilities will include:
  • Design, develop and maintain data streams and pipelines in an Azure Databricks and Power BI based Analytics Platform
  • Drive the establishment of practices and ways of working in a data mesh architecture at STARK, including defining data modelling standards, ETL/ELT best practices, Power BI report-building guidelines, and domain-driven development approaches.
  • Perform technical reviews of more junior developers’ solutions.
  • Collaborate with the Enterprise Architects, Solution Architects, and other Technical Leads to establish the most effective way of processing data in an event-driven architecture.
Your assignments can include:
  • Build a new logistics data mart with complex KPI definitions.
  • Establish best practices for creating data mart load and refresh processes using PySpark and Spark SQL in a lambda architecture context.
  • Provide guidance on and review of other developers’ back-end (Databricks) and front-end (Power BI) reporting solutions.


Must-have skills:
  • 8+ years of working experience as a Data Engineer, Data Developer, DWH Developer, or in a similar role
  • Broad understanding of Data Warehouse and BI methodologies and concepts
  • Experience in building enterprise-wide data models
  • Experience with:
    • Spark (Databricks)
    • Python
    • SQL
  • Fluency in English
Nice-to-have skills:
  • Practical knowledge of Git and CI/CD
  • Good understanding of industry trends and methodologies used in the Cloud environment (ideally Azure)
  • Power BI
  • Kafka


Bachelor’s or Master’s degree in Computer Science or a similar field, or equivalent relevant experience.


No travel requirement, but some roles may benefit from visiting teams and stakeholders in STARK business units.


  • Greenfield project - building from scratch
  • The possibility to work with data streaming - the intention is for data to be consumed in as close to real time as possible in the future
  • Ongoing transformation programme