Data Lake DevOps Engineer

Industry - IT
Level - Senior
  • DevOps
  • AWS
  • Python
Employment - Full time
Work Model - Hybrid

Client - Topdanmark

Topdanmark is Denmark's second-largest insurance company, serving a wide range of customers: more than one million personal customers, every second Danish farm, and one in six businesses in Denmark. These are exciting times in the insurance space, and Topdanmark is making major investments in IT projects and technology. The ambition is to offer customers some of the very best digital experiences on the market.

Topdanmark is a publicly traded company headquartered in Copenhagen, Denmark, with 2,100 employees across the country (350 of them in IT). Its principal task is to help the people who have shown confidence in Topdanmark by entrusting it with their insurance. Topdanmark offers an attractive workplace with high professionalism and job satisfaction. You will work with competent and motivated colleagues from both Malaysia and Denmark. Mutual trust and no micromanagement are hallmarks of working at Topdanmark.

For more information, see www.topdanmark.com

Highlights

We are seeking a skilled and motivated Data Lake DevOps Engineer to join our dynamic team as we revolutionize data management through cutting-edge technologies and practices. You will work in the data lake data management space, where you will monitor, enhance, automate, and support our data lake services, built on AWS and Snowflake as the key platforms.

Location - Kuala Lumpur, Malaysia

Responsibilities

As a Data Lake DevOps Engineer, you will play a crucial role in monitoring and maintaining our data pipelines and AWS services, and in ensuring seamless data flow within our data lake ecosystem. Your technical prowess will be put to the test as you diagnose, troubleshoot, and resolve complex data pipeline issues, all while supporting stakeholders and improving the overall efficiency of our data operations on AWS and Snowflake, the key platforms.
You will collaborate closely with cross-functional teams to enable efficient data storage, processing, and analysis for our organization's data-driven initiatives.

Your primary tasks will be to:

• Monitor data lake pipeline quality controls and AWS services such as Step Functions, Glue, and Lambda (Elasticsearch experience is also good to have)
• Diagnose and troubleshoot data pipeline issues, identify their root cause, and resolve them
• Provide support and maintenance to data lake stakeholders
• Standardize the deployment process
• Handle disaster recovery
• Ensure security and legal compliance
• Manage the annual patch schedule
• Plan operations
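To give a flavour of the pipeline-monitoring work above, here is a minimal Python sketch. It is illustrative only, not Topdanmark's actual tooling: the function name and 24-hour window are assumptions, and the record shape mirrors the entries that AWS Step Functions' `list_executions` API returns (a `status` string and a timezone-aware `startDate`).

```python
from datetime import datetime, timedelta, timezone

def failed_executions(executions, since_hours=24):
    """Return recent executions that ended in a bad state.

    `executions` is a list of dicts shaped like the entries from the
    Step Functions `list_executions` call; "FAILED", "TIMED_OUT", and
    "ABORTED" are the statuses that warrant triage and alerting.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(hours=since_hours)
    bad_statuses = {"FAILED", "TIMED_OUT", "ABORTED"}
    return [e for e in executions
            if e["status"] in bad_statuses and e["startDate"] >= cutoff]
```

In practice such a check would be fed by a `boto3` Step Functions client and wired into an alerting channel; the filtering logic itself is the same either way.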

Qualifications

Must have:
• At least five years of proven experience as a DevOps Engineer or in a similar role, with a focus on data lake environments
• Proficiency in Python, JavaScript, and SQL
• Experience working in an Agile Scrum team

Good to have:
• Experience with AWS services (e.g., Step Functions, Glue, Lambda) and familiarity with Snowflake
• Cloud certification
• CI/CD experience

Education

B.S. or higher degree in Computer Science, Engineering, or another technical field.

Travelling

Travel to Denmark for induction is expected; subsequent travel depends on project requirements but is likely.

Remarks

All successful applicants will receive an official invitation within two weeks of applying to discuss the next steps in the application process.
