
Senior Data Engineer



Data Science
Rotterdam, Netherlands
Posted on Friday, January 19, 2024

About HousingAnywhere

HousingAnywhere is Europe’s largest mid-term rental platform. With Kamernet and Studapart under its umbrella, it represents three fast-growing brands with over 30 million yearly unique visitors combined, 160,000+ properties available for rent, and 100,000+ tenants securing their new homes, based on 2022 figures. HousingAnywhere serves young professionals and students, primarily aged between 18 and 35, connecting them with accommodation providers. Through its advanced technology platform, tenants rent accommodation for 3 to 12 months outside of their country of origin. Headquartered in Rotterdam, HousingAnywhere operates in most European cities and recently expanded to key cities in the US, establishing a presence in over 125 cities. Driven by the mission to enable people to live wherever and however they choose through a flexible renting experience, the technology scale-up employs 340 professionals globally.

Our mission

Rent Easy, Live Free.

We are empowering people to live wherever and however they choose. To find comfort and peace of mind on the other side of the world or the other side of town. All while feeling confident and totally at ease, whatever their adventure might involve. We are doing it by creating a new standard of renting. Safe. Harmonious. More options. Less hassle. With the help of our trusted networks of landlords and partners.

Our Values

  • Ownership
  • We are Enablers
  • We are Changemakers
  • We are Connectors

The team

This role will be part of the Data Engineering team, which sits within the Data Science & Engineering organization alongside the Analytics Engineering and Data Science teams. The Data Engineering team's mandate is to empower our customers with data by building the systems and infrastructure needed to support that. The team currently consists of two Data Engineers.

Our stack

Our data tech stack:

  • Snowflake (data warehouse)
  • Airflow
  • RudderStack
  • Stitch
  • Cube
  • dbt

APIs and data pipelines built and managed by the DE team run on the GCP Kubernetes cluster. This is the stack we have today; it will evolve as the company and team grow.
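To give a feel for the kind of batch pipeline work described above, here is a minimal extract-transform-load step in Python. This is purely illustrative, not HousingAnywhere code: the event names, fields, and sample data are all hypothetical.

```python
from datetime import datetime

# Hypothetical raw events, as they might arrive from an ingestion tool.
RAW_EVENTS = [
    {"event": "booking_confirmed", "city": "Rotterdam", "ts": "2024-01-18T09:30:00"},
    {"event": "booking_confirmed", "city": "Berlin", "ts": "2024-01-18T11:05:00"},
    {"event": "listing_viewed", "city": "Rotterdam", "ts": "2024-01-18T12:00:00"},
]

def extract(events):
    """Keep only the event type this pipeline cares about."""
    return [e for e in events if e["event"] == "booking_confirmed"]

def transform(events):
    """Aggregate confirmed bookings per city per day."""
    counts = {}
    for e in events:
        day = datetime.fromisoformat(e["ts"]).date()
        key = (e["city"], day)
        counts[key] = counts.get(key, 0) + 1
    return counts

def load(counts, warehouse):
    """Write the aggregates to a (here: in-memory) warehouse table."""
    for (city, day), n in counts.items():
        warehouse.append({"city": city, "day": day, "bookings": n})
    return warehouse

table = load(transform(extract(RAW_EVENTS)), warehouse=[])
```

In production, a step like this would typically run as an Airflow task writing to Snowflake rather than to an in-memory list.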

Your role & Impact

  • Lead the development and optimization of data pipelines that enable seamless data access for customers through APIs, ensuring high performance and data quality.
  • Improve and maintain the central event bus infrastructure, employing pub-sub mechanisms to enable real-time data streaming and distribution across the organization's systems.
  • Design, implement, and manage data ingestion pipelines that capture, transform, and store event data for advanced analytics, working with both batch and real-time data processing technologies.
  • Drive data contracts with data producers, outlining data formats, quality expectations, and delivery schedules, ensuring a smooth data ingestion process.
  • Collaborate closely with cross-functional teams to understand data requirements and business use cases, translating them into efficient and scalable data engineering solutions.
  • Build data integrations and automation tools that enable data applications for sales, marketing, and product teams.
  • Ensure the reliability, availability, and security of the data pipelines, implementing best practices for data governance and monitoring.
  • Mentor junior team members, fostering their growth and cultivating a culture of technical excellence within the team.
  • Stay current with industry trends and advancements in data engineering and streaming technologies, and proactively propose relevant enhancements to our data infrastructure.
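The pub-sub mechanism behind the event-bus responsibility above can be sketched with a toy in-memory publish/subscribe hub. This is an illustrative sketch only; a real event bus would run on dedicated streaming infrastructure, and all names here are made up.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-memory pub-sub: subscribers register per topic, and
    a publish fans the event out to every subscriber of that topic."""

    def __init__(self):
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

# Hypothetical usage: two downstream consumers of the same booking events.
bus = EventBus()
analytics_log, crm_log = [], []
bus.subscribe("bookings", analytics_log.append)
bus.subscribe("bookings", crm_log.append)
bus.publish("bookings", {"booking_id": 42, "city": "Rotterdam"})
```

The key property this illustrates is decoupling: the publisher never knows which systems consume an event, so new consumers can be added without touching the producers.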

Your profile

  • 4 to 5 years of professional experience in data engineering, including expertise in designing and implementing data pipelines.
  • Proficiency in at least one programming language (preferably Python).
  • Experience working with containerization.
  • Experience building and scaling batch data pipelines.
  • Experience building APIs and designing real-time data pipelines using technologies like Kafka, Kinesis, Pub/Sub, etc. is a plus.
  • Strong understanding of technology choices through the lens of trade-offs.
  • Excellent problem-solving skills and the ability to troubleshoot complex data engineering challenges.
  • Strong communication skills, with the ability to convey technical concepts to technical and non-technical stakeholders effectively.

What’s in it for you

  • Diverse international community (46+ nationalities).
  • Hybrid working policy.
  • Unlimited paid holidays (a minimum, not a maximum).
  • 1,000 EUR personal development budget.
  • Complete coverage for commuting.
  • Personal equipment, including laptop and ergonomic setup.
  • Relocation support & 30% ruling application assistance.
  • Gym membership discount with GoVital or OneFit.
  • Variable pension scheme.
  • Dutch/English classes budget.
  • Fun team-building and after-work drinks every Friday.

If you have further questions, please email

By applying to work at HousingAnywhere, you agree to our Candidate Privacy Policy.