
Data Engineer in Berlin

Careem

Workplace: Onsite
Hours: Full-Time
Internship: No

Job Description

Data Engineer – Marketplace

Careem is the leading technology platform for the greater Middle East. A pioneer of the region’s ride-hailing economy, Careem is expanding services across its platform to include payments, delivery and mass transportation. Careem’s mission is to simplify and improve the lives of people and build a lasting institution that inspires. Established in July 2012, Careem operates in more than 120 cities across 15 countries and has created more than one million job opportunities in the region.

As a Data Engineer you will:

  • Build ETL pipelines and data-measurement solutions that enable data scientists and product teams to analyze data effectively (Spark, Hive, etc.)
  • Develop techniques to analyze and enhance both structured and unstructured data
  • Implement streaming pipelines using Spark Streaming, Kafka (Kafka Connect, Kafka Streams, etc.) and AWS products (DynamoDB, Lambda, Kinesis, S3, etc.)
  • Ensure pipelines are production-ready, following Infrastructure-as-Code (IaC) best practices and including monitoring
  • Own data products – from development to production
  • Work in an agile team with other Data Engineers and collaborate with different product teams

Requirements

The ideal candidate will have a passionate commitment to improving the lives of people, an insane focus on excellence and customer service, and a strong alignment with our core values: being bold, focused, agile and collaborative.

  • In-depth experience in programming efficient, scalable, reliable data pipelines in Java or Scala
  • Solid knowledge of database systems concepts (data modeling, partitioning, indexing, joins, etc.)
  • Familiarity with theoretical distributed systems concepts and parallel data processing
  • Hands-on experience deploying and using the Apache Big Data Stack (Hive, Spark, Kafka, etc.)
  • Solid Linux system and CLI skills: ssh, process monitoring, storage management, tailing logs, etc.
  • Hands-on RDBMS (e.g., Oracle or MySQL) skills

It’s a big plus if you also bring one or more of the following:

  • Python scripting, especially for data processing
  • A great agile mindset and hands-on experience
  • Experience working with and maintaining clusters of tens to thousands of machines
  • Experience with AWS (EC2, S3, EMR, Kinesis, Lambda)
  • Working knowledge of containerization (Docker) and supporting technologies
  • Experience with orchestration tools like Airflow
  • Knowledge of and experience with version control systems (Git) and Java build systems (Maven)
  • Experience with CI/CD tools, including Jenkins

What do we offer you?

Working in an international environment with colleagues from 70+ nationalities, ownership culture, flexible working hours, unlimited (paid!) holidays and the latest technologies.

About Careem

  • Logistics
