This offer is no longer available

Senior Data Engineer in Madrid or remote

Playtomic

Workplace: Remote
Hours: Full-Time
Internship: No

Job description

We are the world’s largest racket sports app for players and clubs. Our goal is to make sports practice social: through the app, players can book courts, find other players with a similar skill level, and join a community dedicated to playing. Since 2017, Playtomic has earned the trust of national and international investors, consolidating its position as one of the main global players in the digitalization of sports. You will find 160+ co-workers around the world, with headquarters in Spain and offices in Italy, Portugal, Sweden, Finland, Belgium, the UK, the US and Mexico.


Your mission

We are seeking an experienced and passionate Senior Data Engineer to join our team and play a key role in building and maintaining our Customer Data Platform (CDP) on Google Cloud Platform (GCP). The ideal candidate will have a deep understanding of GCP data services and be able to work independently to design, develop, and deploy complex data pipelines and architectures. You will also be responsible for collaborating with cross-functional teams to ensure the successful integration and utilization of our CDP.


Responsibilities:


Design and Develop a Real-time Data Synchronization Pipeline:

  • Leverage GCP's event streaming capabilities, specifically Apache Kafka, to establish a near real-time data synchronization pipeline between our disparate data sources and our CDP.
  • Ensure seamless and continuous data ingestion from various sources, including CRM systems, marketing platforms, and transactional systems, to the CDP.
  • Implement data quality checks and monitoring mechanisms to maintain data integrity and consistency throughout the synchronization process.
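
The bullets above boil down to consuming events from Kafka, applying a quality gate, and landing valid records in the CDP’s warehouse. Below is a minimal sketch of that flow, assuming the kafka-python and google-cloud-bigquery client libraries; the topic name, table ID, and required fields are illustrative placeholders, not Playtomic’s actual schema.

```python
# Minimal sketch: Kafka -> quality check -> BigQuery streaming insert.
# Topic, table ID, and REQUIRED_FIELDS are assumptions for illustration.
import json

from google.cloud import bigquery
from kafka import KafkaConsumer

REQUIRED_FIELDS = {"user_id", "event_type", "event_ts"}  # assumed schema


def has_required_fields(record: dict) -> bool:
    """Basic data quality gate: reject records missing mandatory keys."""
    return REQUIRED_FIELDS.issubset(record)


def run(bootstrap_servers: str, topic: str, table_id: str) -> None:
    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=bootstrap_servers,
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        auto_offset_reset="latest",
    )
    bq = bigquery.Client()

    for message in consumer:
        record = message.value
        if not has_required_fields(record):
            # In a real pipeline this would go to a dead-letter topic and
            # feed a monitoring dashboard instead of being silently dropped.
            continue
        errors = bq.insert_rows_json(table_id, [record])
        if errors:
            raise RuntimeError(f"BigQuery insert failed: {errors}")


if __name__ == "__main__":
    run("localhost:9092", "crm-events", "my-project.cdp.customer_events")
```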


Build a Scalable and High-Performance CDP:

  • Apply deep proficiency in GCP data services, including BigQuery, Dataflow, and Dataproc, to construct a scalable and high-performance CDP architecture (a Dataflow sketch follows this list).
  • Utilize BigQuery's data warehousing capabilities to store and analyze vast volumes of customer data with exceptional performance and scalability.
  • Employ Dataflow for stream processing and Dataproc for batch processing to handle the diverse data processing needs of our CDP.
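
To illustrate the Dataflow piece referenced above, here is a hedged sketch of an Apache Beam streaming pipeline that reads the same hypothetical Kafka topic and appends rows to BigQuery. ReadFromKafka is Beam’s cross-language Kafka connector and needs a Java expansion service at runtime; the project, region, bucket, and table names are placeholders.

```python
# Sketch of a Beam/Dataflow streaming pipeline: Kafka -> parse JSON -> BigQuery.
import json

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",                # placeholder project
        region="europe-west1",
        temp_location="gs://my-bucket/tmp",  # placeholder bucket
        streaming=True,
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # ReadFromKafka yields (key, value) pairs as bytes.
            | "ReadEvents" >> ReadFromKafka(
                consumer_config={"bootstrap.servers": "broker:9092"},
                topics=["crm-events"],
            )
            | "ParseJson" >> beam.Map(lambda kv: json.loads(kv[1]))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:cdp.customer_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```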


Collaborate and Optimize Data Infrastructure:

  • Collaborate closely with data analysts, scientists, and business stakeholders to understand data requirements and provide technical solutions that align with business objectives.
  • Optimize data pipelines and infrastructure for performance and cost-efficiency, ensuring optimal utilization of GCP resources.
  • Document and maintain code and infrastructure to promote transparency, maintainability, and future development.
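
On the cost-efficiency point above, one common BigQuery lever is partitioning and clustering event tables so queries scan only the slices they need. Below is a hedged sketch using the google-cloud-bigquery client; the table ID and schema are assumptions for illustration.

```python
# Sketch: create a day-partitioned, clustered BigQuery table so date-bounded
# and per-customer queries scan fewer bytes. Table ID and schema are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
]

table = bigquery.Table("my-project.cdp.customer_events", schema=schema)
# Partition by day on the event timestamp: date filters prune whole partitions.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
# Cluster by user and event type: per-customer lookups read fewer blocks.
table.clustering_fields = ["user_id", "event_type"]

client.create_table(table, exists_ok=True)
```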


Stay Ahead of Data Technology Trends:

  • Continuously stay up-to-date with the latest GCP data services, best practices, and emerging technologies to ensure our CDP remains at the forefront of data innovation.
  • Explore and evaluate new data technologies, such as machine learning and artificial intelligence, to enhance the capabilities of our CDP.
  • Actively participate in industry events and conferences to network with peers and stay abreast of the latest trends in data engineering and analytics.


Our Ideal Player

  • Bachelor's degree in Computer Science, Engineering, or a related field
  • 8+ years of experience as a Data Engineer
  • Proven experience in designing, developing, and implementing data pipelines and architectures on GCP
  • Strong understanding of data warehousing concepts, including data modeling, data transformation, and data quality
  • Experience with various data processing tools and frameworks, such as Python, SQL, and Apache Spark
  • Familiarity with cloud computing concepts and experience with AWS or other cloud platforms is a plus
  • Excellent communication and collaboration skills
  • Ability to work independently and as part of a team
  • Passion for data and a commitment to data quality and integrity
  • Strong understanding of incrementality testing across multiple platforms.
  • Fluent English communication skills.


What’s in it for you

  • Salary: Depending on experience, to be discussed in the first meeting
  • 23 days of vacation + a day off on your birthday + 2 wellness days off in August.
  • 1 additional vacation day for every year worked with us.
  • Hybrid / Remote work.
  • Flexibility to attend team-building events.
  • Social benefits (€63 a month toward either health insurance or meal vouchers), plus flexible pre-tax spending on training, childcare, and transportation.
  • Gympass discount
  • Perks related to the brand and our sponsorship agreements
  • Summer work schedule (July-August 7h/day Monday to Friday)


Our hiring process

Finally, and to help you start warming up, here is what our process looks like:

  • People Interview: first contact and cultural fit at Playtomic
  • Technical Interview (1) with our Head of Data
  • Technical Interview (2) with our CTO
 

About Playtomic

  • Sector: Health & Fitness

  • Location: Madrid, Spain

  • Company size: 50 - 200

  • Founded: 2017
