We are seeking an experienced Data Platform Architect with 5+ years of experience and a deep understanding of data, AI and automation to design high-impact business systems that drive operational efficiency and increase value for Eventim customers. In this role, you will work closely with business leaders and technical teams to understand complex business needs, architect scalable solutions and integrate modern tools and platforms.
What to expect:
- Data-Platform Architecture: Design and implement a lakehouse-style, multi-cloud data platform on AWS and Google Cloud, balancing cost, performance, and resilience.
- Ingestion & Streaming Frameworks: Design, build, and operate batch, real-time, and streaming pipelines to move data from diverse internal and external sources into the unified platform while preserving scalability, low-latency delivery, and data integrity.
- Unified Customer Data Model: Architect schemas that collect and reconcile events from marketing, advertising and web-tracking channels into a single customer profile, synchronized with CRM and CDP systems.
- AI & ML Platform Architecture: Define the end-to-end framework - feature pipelines, distributed training, automated evaluation, model registry, and low-latency serving - on managed cloud AI services, so use cases such as text summarization, anomaly detection, and predictive analytics can move from experimentation to production safely and at scale.
- Security & Compliance: Define zero-trust policies, fine-grained IAM, row- and column-level security, tokenization and enterprise API-level protections, and ensure regulatory compliance.
- Governance: Embed data lineage, quality and cataloguing across the platform (e.g. OpenLineage, DataHub).
- Thought Leadership & Mentoring: Set reference architectures, review designs, coach engineers and champion best practices in event-driven, product-oriented data platforms.
What you’ll need:
- MSc in Computer Science, Electrical Engineering, Software Engineering, ML Engineering or related fields (or equivalent experience).
- 5+ years designing and operating cloud-native data platforms that power large-scale analytics and AI products.
- Expert data-modeling skills (dimensional, Data Vault, entity-relationship, graph) and hands-on mastery of Snowflake or similar cloud data warehouses.
- Proven track record building end-to-end, production-grade data pipelines - both batch and streaming - using tools such as Airflow, Databricks, Google Pub/Sub, Dataflow, Kafka, or comparable orchestration frameworks.
- Experience with API management (e.g. MuleSoft, Keycloak).
- Deep knowledge of modern ELT/ETL patterns, data-quality and observability toolchains, and data-governance best practices (incl. GDPR and RBAC).
- Programming fluency in SQL and Python (Scala, Java or C++ a plus), along with experience in CI/CD and Infrastructure-as-Code (Terraform, GitLab CI/CD, GitHub Actions).
- Working understanding of MLOps concepts - feature stores, model registries, retrieval-augmented generation (RAG) architectures - and how to shape data layers that serve them.
- Strong communication and presentation skills, and awareness of new trends and technologies in the field of information architecture.
- Fluency in English; German would be ideal.