Hunkemöller · Location: Hilversum · 17 April 2026

Life at Hunkemöller

Cloud Data Engineer


Hunkemöller is looking for a Cloud Data Engineer to take a core role in our digital data transformation. This is a fantastic opportunity for a forward-thinking, adaptable data engineering professional to help build and scale the ingestion and infrastructure backbone of our next-generation data platform on GCP.

We believe in an "AI-first" development approach. We are looking for an agile learner who leverages modern AI tools (like Gemini, Claude, and Copilot) alongside strong Python and GCP engineering skills to accelerate development and creatively solve complex integration challenges.

This position is tailored for a Data Engineer who loves the "plumbing" of data: connecting APIs, orchestrating workflows, and moving data seamlessly between systems. You will own our upstream ingestion framework, manage our Google Cloud infrastructure (Cloud Composer, Dataflow, Cloud Run), and power our Reverse ETL processes to ensure our operational systems have the data they need.

  • Build Ingestion Pipelines: Design, develop, and deploy robust data ingestion pipelines from various third-party APIs, webhooks, and source systems into Google Cloud.

  • AI-Augmented Engineering: Actively leverage advanced AI coding assistants to accelerate pipeline development, generate boilerplate API connection code, debug complex scripts, and automate repetitive tasks.

  • GCP Infrastructure & Orchestration: Build and manage data workflows using Cloud Composer (Airflow), and leverage Cloud Run and Dataflow for scalable, containerized data processing.

  • Drive Reverse ETL: Architect and maintain the data pipelines that push refined data from BigQuery back into our operational platforms (marketing tools, CRM, etc.) to drive business action.

  • Manage Operational Databases: Utilize Firestore and other NoSQL/relational databases to support operational data needs and microservices.

  • Collaborate and Learn: Partner with our data modeling specialists to ensure smooth handoffs between ingestion and transformation. Participate in code reviews and continuously share new engineering best practices.
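To give a flavour of the Reverse ETL work described above, here is a minimal Python sketch of pushing query results back to an operational system in batches. All names are hypothetical: `send` stands in for a call to a marketing/CRM API, and `rows` for the output of a BigQuery query; in production this logic would typically run inside a Cloud Composer task.

```python
from typing import Callable, Dict, Iterable, Iterator, List


def batch(rows: Iterable[Dict], size: int) -> Iterator[List[Dict]]:
    """Yield fixed-size batches from an iterable of rows."""
    chunk: List[Dict] = []
    for row in rows:
        chunk.append(row)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk


def reverse_etl(rows: Iterable[Dict],
                send: Callable[[List[Dict]], None],
                batch_size: int = 500) -> int:
    """Push refined rows to an operational platform in batches.

    `send` is a placeholder for the target system's bulk API
    (e.g. a POST to a CRM's /contacts/bulk endpoint).
    Returns the number of rows sent.
    """
    sent = 0
    for chunk in batch(rows, batch_size):
        send(chunk)  # one bulk API call per batch
        sent += len(chunk)
    return sent
```

Batching matters here because most operational APIs cap request sizes and bill per call, so pushing 500 rows at a time is far cheaper than one call per row.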

  • Core Data Engineering Experience: 3 to 5 years of hands-on experience in data engineering, with a strong focus on data integration, APIs, and pipeline architecture.

  • AI Adaptability & Continuous Learning: A strong desire to learn quickly and adapt to new technologies. You embrace modern development practices and are highly comfortable using AI tools as a force multiplier in your daily work.

  • API Integration & Python: Strong Python programming skills with a proven track record of building custom API extractors, handling pagination, rate limiting, and working with REST/GraphQL endpoints.

  • GCP Service Expertise: Hands-on experience with Google Cloud Platform's ecosystem, specifically Cloud Composer, Dataflow, Cloud Run, and Firestore.

  • Code Quality & CI/CD: Proficient in writing clean, well-documented, and tested code (e.g., pytest), with strong experience using Git, Docker, and CI/CD pipelines.

  • Bonus Skills (Nice to Have): Experience managing Infrastructure as Code (specifically Terraform) or working with downstream data transformation tools (dbt).

  • English Proficiency: Excellent written and verbal English communication skills.
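The pagination and rate-limiting skills listed above can be illustrated with a small sketch. The `fetch_page` callable is a stand-in for one HTTP GET against a cursor-paginated REST endpoint (hypothetical; a real extractor would wrap a library such as `requests` and also honour 429 responses and Retry-After headers).

```python
import time
from typing import Callable, Dict, Iterator, List, Optional, Tuple

# A page fetcher takes a cursor and returns (items, next_cursor);
# next_cursor is None when there are no more pages.
PageFetcher = Callable[[Optional[str]], Tuple[List[Dict], Optional[str]]]


def extract_all(fetch_page: PageFetcher,
                min_interval: float = 0.2,
                sleep: Callable[[float], None] = time.sleep) -> Iterator[Dict]:
    """Walk a cursor-paginated API, yielding every record.

    Enforces a minimum delay between calls as crude client-side
    rate limiting; `sleep` is injectable so tests can skip waiting.
    """
    cursor: Optional[str] = None
    last_call = 0.0
    while True:
        wait = min_interval - (time.monotonic() - last_call)
        if wait > 0:
            sleep(wait)  # stay under the API's request-rate ceiling
        last_call = time.monotonic()
        items, cursor = fetch_page(cursor)
        yield from items
        if cursor is None:
            break
```

Injecting the fetcher and the sleeper keeps the traversal logic independent of any one HTTP client and makes the extractor easy to unit-test with pytest, which fits the code-quality expectations above.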

Hunkemöller strives to be a much-loved, social & inclusive brand: a place where people love to work, are proud of the brand, and where we create true brand ambassadors. You will work in a passionate, energetic, design-led and performance-driven environment where our key customer persona 'Shero' sits at the heart of everything we do. Hunkemöller is certified TOP EMPLOYER of the Netherlands 2024, which underlines our people initiatives and achievements.

Together Tomorrow - Join a retailer that's on the move to be better for our planet, better for people, better together! From diversity & inclusion and reducing waste to product care and how we work with our suppliers, our Together Tomorrow initiative reflects what we do and helps drive change across our business. Ready to help us achieve our ambitious goals? Wherever you start working with us, whether in our stores or our HQ, you can contribute!


WE OFFER YOU

25 HOLIDAYS

You will get 25 holidays per annum, based on a full-time role.

BUY & SELL DAYS

Every year in April, you will be eligible to buy or sell additional vacation days.

HYBRID WORKING

Benefit from the flexibility of working from home, ensuring you can maintain productivity from anywhere. We have a 60:40 policy based on your working hours.

HUNKEMOLLER ACADEMY

Continuous learning opportunities and professional growth are supported through various training programs and workshops.

25-55% STAFF DISCOUNT

Our products are much loved by our employees. As part of the brand, you will receive a 25% discount on all Hunkemöller products. On some Mondays we have an additional sale on our newest collections: on top of the 25% discount, you will receive an additional 25-35% discount.

WORK FROM ABROAD

Get the opportunity to work from abroad for up to 2 weeks!





Vacancy details


Job group
Other
Role
Big data engineer
Industry
Fashion / Textiles / Cosmetics
Employment type
Permanent
Hours
40 hours per week
Education level
HBO
Career level
Experienced
Work location
Liebergerweg, Hilversum
