Join us!
We are looking for a Fullstack Data & Platform Engineer (m/f/d) to strengthen our team.
At IntegrityNext, we are building a shared AI and data platform on AWS on top of our supply chain and product compliance platform. This platform will power semantic data access, BI, APIs, and agentic product experiences.
Our platform is initially centered on PostgreSQL-backed structured data and will evolve toward unstructured data and more lakehouse-style capabilities on AWS. We work hands-on across Python-based data services, pipelines, asynchronous integrations, AWS infrastructure, and SQL/PostgreSQL-based processing.
We work according to the principle “You build it, you run it”: platform capabilities are owned end-to-end, from design and implementation to operational responsibility in production. We also follow spec-driven development and actively use AI-assisted engineering tools such as Claude Code and Cursor.
This is a hands-on engineering role focused on building and operating reliable data infrastructure for internal product and platform use cases.
What can you expect?
Data Infrastructure & Platform Foundations
- Build and maintain the core data infrastructure that powers the platform and enables scalable, reliable data usage across the company
- Design and implement data ingestion flows, ETL/ELT pipelines, and shared processing patterns
- Continuously evolve the platform’s structured data foundation, centered around PostgreSQL and extending toward S3-backed and lakehouse-style architectures
Event-Driven Architecture & Integrations
- Design and implement event-driven and asynchronous workflows using AWS-native services (e.g. Amazon EventBridge)
- Ensure integrations are reliable, resilient, and maintainable in production environments
- Establish scalable patterns for decoupled system communication and data flow
Data Quality, Access & Reusability
- Implement data quality checks, validation rules, and freshness controls to ensure trusted data inputs
- Enable reliable data usage across semantic models, APIs, and AI-driven use cases
- Build and maintain curated datasets and shared access layers to support reusable data consumption across teams
Platform Operations & Continuous Delivery
- Implement and maintain CI/CD pipelines for data workflows
- Manage infrastructure using Infrastructure-as-Code principles
- Support production readiness through monitoring, troubleshooting, and continuous improvement
Collaboration & Engineering Excellence
- Collaborate closely with Data & Platform Architects, Analytics Engineers, and AI Engineers to align the data foundation with downstream needs
- Contribute to building a reusable, scalable data platform across domains
- Actively explore and apply spec-driven development and AI-assisted engineering workflows
- Help establish modern engineering practices that improve quality, speed, and maintainability
What should you bring along?
Experience & Domain Focus
- Strong hands-on experience in data engineering, building and operating data platforms in production
- Experience with data ingestion pipelines and ETL/ELT workflows, ideally with dbt or similar tools
- Experience in AWS-based environments and modern SaaS / multi-team setups
- Familiarity with event-driven architectures and distributed systems
Technical / Methodological Skills
- Strong skills in Python and SQL, with solid knowledge of PostgreSQL (querying, performance tuning, production workloads)
- Experience with workflow orchestration and scheduling
- Good understanding of event-driven systems (idempotency, retries, dead-letter handling, resilient processing)
- Hands-on experience with AWS services (e.g. SQS, Lambda, Step Functions), as well as CI/CD, infrastructure as code, and deployment automation
- Strong understanding of data quality, validation, observability, and operational reliability
Nice to have:
- Experience with Airflow, Dagster, or similar orchestration tools
- Experience with S3-based storage, query engines, and lakehouse architectures
- Experience with Terraform or AWS CDK
- Exposure to semantic layers, reusable data products, or analytics engineering
- Experience with unstructured data ingestion and hybrid architectures
Ways of Working & Mindset
- Strong ownership mindset, comfortable running data workflows in production end-to-end
- Structured, pragmatic approach to solving complex data and platform challenges
- Comfortable with spec-driven development and AI-assisted engineering workflows
- Willing to contribute across adjacent layers to enable end-to-end delivery
- Focus on building scalable, reliable, and maintainable systems
Communication & Collaboration
- Strong collaboration with architects, analytics engineers, and cross-functional teams
- Ability to translate requirements into robust technical solutions
- Clear, pragmatic communication style
- Fluent in English, comfortable working in international environments
What do we offer?
Purpose & Impact
- A role with real meaning that is both enjoyable and impactful
- The opportunity to make a sustainable contribution through your work
- Attractive compensation as part of a growing company
Attractive Benefits
- 30 days of paid vacation
- EGYM Wellpass membership to support your work-life balance
- Flexible working models to better balance work and personal life
Modern Work Environment & Flexibility
- Inspiring office spaces in the heart of Munich
- Flexible remote work from home or anywhere within Germany
Great Team
- A professional, welcoming, and highly motivated team
- Collaboration as equals, with an open feedback culture
- An environment where people support each other and grow together
Flat Hierarchies & Ownership
- Short decision-making paths and real opportunities to shape things
- Freedom to contribute and implement your own ideas
- A high level of ownership and responsibility