YOU will …
- implement, optimise and maintain ETL processes.
- manage our Big Data infrastructure, including the petabyte-scale Hadoop cluster.
- collaborate with Data Scientists on building scalable and reliable data pipelines.
- develop new and improve existing business intelligence tools and dashboards.
- build data expertise and take responsibility for data quality in our data warehouse.
YOU have …
- proficiency with one or more programming languages we use every day (Java, Scala, Python).
- solid experience working with databases (SQL and/or NoSQL). Ideally you can bring expertise in one technology to the table (be it PostgreSQL or HBase, you name it).
- a DevOps attitude and the will to take responsibility for operating the software that you design and implement.
- some hands-on experience with data processing at scale.
- an understanding of the core concepts behind Hadoop and/or Kafka.
- enthusiasm for problem solving and debugging complex systems.
We offer …
- a small, young and highly passionate team of extraordinary co-workers.
- an open culture that embraces the latest technologies.
- an agile, focused, yet relaxed atmosphere with flexible working hours.
- a high level of freedom and responsibility.
- a competitive salary.
- a nice and spacious Berlin-style office in the heart of Neukölln.
- awesome coffee and a large-scale reservoir of Club-Mate that can be accessed in real time!
Send an informal application (including your CV and earliest possible start date) to: firstname.lastname@example.org
About mbr targeting