This position has been filled and the offer is no longer available.

BI Data Engineer in Barcelona

Kiwi.com

Workplace: Onsite
Hours: Full-Time
Internship: No
Job Description

Our toolchain democratizes access to data for everyone and makes it easy and painless to run experiments to establish cause and effect. The team focuses on the complete data life cycle to ensure that any data leaving Kiwi.com is of the highest quality.

If you are interested in taking our data loads and collectors to the next level, covering both batch and real-time processing of our data routines, you are the ONE we are looking for! If you love to experiment and build on top of technologies like Airflow, custom Python apps, or anything else from the open-source world, come and see us!

A few examples of our Data Engineers’ work:

Data workflow management: to manage our data loads for the analytics world, we use Apache Airflow. Airflow lets us schedule data-related workflows with a code-as-configuration model and a web front end, and it drives the data routines that feed our data provisioning customers.
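For illustration only, here is a minimal sketch of what the code-as-configuration model looks like in practice; the DAG id, schedule, and task callables are hypothetical and not taken from Kiwi.com’s actual pipelines:

```python
# Minimal Airflow DAG sketch -- illustrative only; dag_id, task names and
# schedule are hypothetical, not Kiwi.com's real configuration.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def extract():
    # Pull raw data from a source system (placeholder).
    pass


def load():
    # Load the prepared data into the analytics warehouse (placeholder).
    pass


default_args = {"owner": "data-eng", "retries": 1, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="daily_analytics_load",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # task dependencies are declared in code
```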

Real-time streaming infrastructure: to enable our analytics teams to move quickly, getting accurate data with minimal delay is a core focus of data provisioning & engineering. We are currently building out real-time infrastructure that allows for the easy development of streaming applications, including anomaly detection and forecasting.
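As a rough sketch of this kind of streaming application (the topic name, broker address, message format, and threshold rule are all assumptions, and kafka-python is just one possible client), a consumer that flags anomalous events could look like:

```python
# Streaming consumer sketch using kafka-python -- illustrative only; the topic,
# broker address, message schema and anomaly rule are assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "booking-events",                      # hypothetical topic name
    bootstrap_servers=["localhost:9092"],  # hypothetical broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)

THRESHOLD = 1000.0  # hypothetical anomaly threshold

for message in consumer:
    event = message.value
    # Naive anomaly check: flag events whose amount exceeds a fixed threshold.
    if event.get("amount", 0.0) > THRESHOLD:
        print(f"Anomaly detected: {event}")
```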

Interactive dimensional analysis: our data analysts need to query data and compute aggregates over various dimensional cuts on a “yesterday was too late” timeframe. To address this, we are building a query tool stack that lets users interactively slice and dice large datasets.
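As an illustrative sketch of such a dimensional cut (the host, catalog, schema, table, and column names are hypothetical, and PyHive is just one possible Presto client), an aggregate by market and day could be computed like this:

```python
# Slice-and-dice query sketch using PyHive's Presto client -- illustrative only;
# connection details, table and column names are assumptions.
from pyhive import presto

conn = presto.connect(host="presto.internal", port=8080, catalog="hive", schema="analytics")
cursor = conn.cursor()

# Aggregate bookings by market and day -- one possible dimensional cut.
cursor.execute(
    """
    SELECT market, date_trunc('day', created_at) AS day, count(*) AS bookings
    FROM bookings
    GROUP BY 1, 2
    ORDER BY day DESC
    """
)

for market, day, bookings in cursor.fetchall():
    print(market, day, bookings)
```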

What will you do?

  • Develop, monitor, and support our data workflow management environments and ELT/ETL routine tooling as a service, and decommission any service or tool that is no longer used, handling the data associated with those decommissioned tools appropriately
  • Provide continuous support for data workflow management and ETL jobs for our data infrastructure services; maintain and provide all relevant information on the current infrastructure and tools
  • Educate colleagues on the current tooling and data used within the data provisioning stack in order to make access easier for anyone in the company
  • Regularly and clearly communicate the team’s achievements and the progress of its main projects

What do we expect?

  • 2+ years of full-time, industry experience
  • Experience with and interest in technologies like Airflow, Postgres, Redis/Kafka, or Presto
  • Working knowledge of relational databases and query authoring (SQL)
  • Experience working with batch and real-time data processing routines
  • Strong coding skills in Python (preferred) / Ruby
  • Rigor in high code quality, automated testing, and other engineering best practices
  • Experience operating robust distributed systems in the cloud (AWS, Google Cloud) is a strong plus
  • BS/MS in Computer Science or a related field (ideal)


About Kiwi.com

  • Industry: Travel
  • Headquarters: Brno, Czech Republic
  • Company size: 1,000 - 5,000 employees
  • Founded: 2012
