
Senior Data Engineer
- Hybrid
- Hoofddorp, Noord-Holland, Netherlands
Job description
At Dataleaps, we're not just hiring engineers—we’re looking for people who love to build, break, fix, and improve.
People who get excited about solving real problems—not just writing clean code. For us, a great job isn’t just about the work. It’s about learning, growing, and being part of a team that genuinely cares.
No egos. No pointless meetings. Just meaningful tech, smart people, and a culture where your voice matters.
Are you passionate about data? Let’s talk.
🚀 The Mission
Design and ship reusable domain data products—not one-off, point-to-point data feeds.
Instead of building a quick 10-field data extract for one stakeholder, you'll create robust, versioned data products (~40+ fields) that serve a wide range of current and future needs across teams and domains.
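The contrast above can be sketched in a few lines of Python. This is a hypothetical illustration of what a versioned, reusable data contract might look like (the product name, field names, and registry helpers are invented for this sketch, not part of our actual stack):

```python
from dataclasses import dataclass

# Hypothetical sketch: a versioned data contract registered per
# (product, major version), so consumers pin to a stable schema
# instead of requesting one-off extracts.

@dataclass(frozen=True)
class DataContract:
    product: str   # domain data product name, e.g. "customer_profile"
    version: str   # semantic version; breaking changes bump the major
    fields: dict   # column name -> type, the published schema

CONTRACTS = {}

def publish(contract: DataContract) -> None:
    """Register a contract under its major version."""
    major = contract.version.split(".")[0]
    CONTRACTS[(contract.product, major)] = contract

def resolve(product: str, major: str) -> DataContract:
    """Look up the current contract for a pinned major version."""
    return CONTRACTS[(product, major)]

# A broad, reusable product rather than a 10-field one-off extract:
publish(DataContract(
    product="customer_profile",
    version="1.2.0",
    fields={
        "customer_id": "STRING",
        "signup_date": "DATE",
        "lifetime_value": "NUMERIC",
        # ... ~40+ fields in a real domain data product
    },
))
```

The point of the design: consumers depend on `("customer_profile", "1")` and keep working through non-breaking releases, while breaking changes ship as a new major version alongside the old one.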
🛠️ Tech Stack You'll Use
Python – Core language for building APIs and data services
Terraform – Infrastructure as Code
Cloud Platforms – GCP preferred (Cloud Run, BigQuery), but AWS or Azure experience welcome
dbt – Data modeling and transformation
Apigee (or equivalent) – API management and monitoring
ETL/ELT – Pipeline orchestration, testing, and lineage
CI/CD – GitHub Actions, Bitbucket Pipelines, etc.
Networking – Secure service-to-service communication
🧩 What You’ll Do
Own the full lifecycle of data products: ingest, transform (dbt), store (BigQuery or equivalent), and serve via APIs (Python on Cloud Run or similar)
Translate business needs into robust data contracts with well-designed schemas, versioning, and change management
Design for scale and reuse, replacing brittle, ad-hoc solutions with maintainable and scalable data products
Expose and secure data via APIs and warehouse interfaces with proper access controls and compliance
Automate infrastructure with Terraform, containerize services, and optimize for performance and cost
Collaborate cross-functionally with analysts, product managers, and engineers; lead technical discussions and mentor others
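The lifecycle in the first bullet can be sketched end to end in plain Python. This is a minimal, hypothetical stand-in (in production the transform would be a dbt model, storage would be BigQuery, and serving would be a Python API on Cloud Run; all names here are illustrative):

```python
import json

# Illustrative ingest -> transform -> serve pipeline for a data product.

def ingest(raw_rows):
    """Ingest: accept raw source records as-is."""
    return list(raw_rows)

def transform(rows):
    """Transform: the renaming/typing a dbt model would handle."""
    return [{"customer_id": r["id"], "country": r["country"].upper()}
            for r in rows]

def serve(rows, customer_id):
    """Serve: the versioned JSON payload an API endpoint would return."""
    match = [r for r in rows if r["customer_id"] == customer_id]
    return json.dumps({"version": "1", "data": match})

raw = [{"id": "c1", "country": "nl"}, {"id": "c2", "country": "de"}]
payload = serve(transform(ingest(raw)), "c1")
# payload -> '{"version": "1", "data": [{"customer_id": "c1", "country": "NL"}]}'
```

Note the `"version"` key in the payload: it is the same contract-versioning idea applied at the API boundary, so downstream consumers are insulated from schema changes.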
Job requirements
✅ What We’re Looking For
Must-Haves
5+ years of experience building and maintaining end-to-end data products in production environments
Strong expertise in Python and cloud data warehouses (preferably BigQuery)
Solid experience with dbt, API design, and data modeling/lineage
Proven ability to guide stakeholders from vague requests to well-structured, reusable data solutions
Clear, proactive communicator; confident working directly with technical and business teams
Nice-to-Haves
Experience with Apigee, Terraform modules, and serverless containers
Strong foundation in observability, security-by-design, and cost management
Background in complex data modeling (e.g., dimensional models, domain-driven design)
🎯 Why Join Us?
Work on data that powers real-world decisions
Build high-impact, reusable systems—not just pipelines
Join a small, senior team that values autonomy, ownership, and growth
Be part of a company that moves fast and builds thoughtfully