CTS EVENTIM
Data Engineer (m/f/d) - Customer 360 & CRM

As a Data Engineer for Customer 360 & CRM, you will build and maintain scalable data pipelines - real-time, near-time, and batch - connecting internal systems with our SaaS Customer Data Platform (CDP) and Customer Engagement Platform (CEP).

You’ll be responsible for ensuring accurate, reliable, and compliant data flows between systems that power personalized customer engagement and marketing automation. Your work will directly enable targeted campaigns, segmentation, analytics, and insights that drive Eventim’s data-driven growth strategy.

Working closely with Marketing, Product, Data Science, and Engineering teams, you will be the technical backbone behind our Customer 360 ecosystem - turning complex, multi-source data into structured, privacy-compliant, and actionable information.

Key Responsibilities:

  • Design, implement, and maintain real-time, near-time, and batch data pipelines connecting internal and external data sources to CDP & CEP.
  • Implement and optimize event streaming, API integrations, and data transformations across cloud environments (e.g., GCP, BigQuery, Airflow, dbt).
  • Ensure pipeline stability, observability, and performance across development, staging, and production environments.
  • Guarantee data quality, security, and compliance, including handling of consent and GDPR requirements.
  • Define, maintain, and document data contracts, interface standards, and API requirements with development teams.
  • Develop data models that support campaign activation, customer segmentation, and analytics.
  • Collaborate with CRM, Marketing, and Data Science to align technical data pipelines with business requirements.
  • Continuously optimize data architecture for scalability, reliability, and cost efficiency.

Key Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
  • 5+ years of experience as a Data Engineer or in a similar role.
  • Proven experience in building real-time and batch data pipelines (e.g., Kafka, Pub/Sub, Airflow).
  • Strong proficiency in SQL, Python, and cloud data tools (preferably GCP / BigQuery).
  • Hands-on experience with ETL/ELT frameworks (Airflow, dbt, Dataflow, etc.).
  • Deep understanding of data modeling, data governance, and API integration patterns.
  • Experience with CDP / CEP platforms (e.g., Zeotap, Tealium, Segment, Twilio, Bloomreach, Salesforce Marketing Cloud, Braze, Insider, etc.).
  • Familiarity with CI/CD pipelines, Terraform, and Git-based workflows.
  • Strong focus on data quality, observability, and monitoring (e.g., Datadog, Cloud Monitoring).
  • Understanding of GDPR, consent frameworks, and secure data management.

Benefits

  • Sofa concerts & employee events
  • Discounts on ticket purchases & event clearing assignments
  • 25 days of workation from other EU countries
  • 30 days of vacation & the option of 15 days of unpaid leave
  • Corporate benefits & discounts at Kess
  • Flexible working hours
  • Central location & public transport subsidy
  • Bike leasing
  • Mental health program & company pension scheme
  • Language learning platform as well as Lunch & Learn sessions