
Software Engineer - Data Platform Team

Berlin, DE

Remerge helps leading mobile app marketers increase revenue and retention by activating, re-engaging and retaining high-value users through programmatic in-app ads. Founded in 2014, Remerge has established itself as a leading app retargeting player globally, with offices spanning Berlin, New York, Singapore, Seoul and Tokyo. Our international team of experts has contributed to the growth of hundreds of apps across all major verticals, including gaming, on-demand delivery, e-commerce, and finance.

Job Mission

As a Software Engineer on the Data Platform team, your primary mission will be to develop and maintain a robust, scalable, performant, and cost-effective data platform that serves as the backbone for our data-driven initiatives and empowers our internal teams to operate autonomously and efficiently.

You should apply if:

  • You are motivated to strengthen both data engineering and infrastructure skills, guided by senior and principal mentors.
  • You like to take ownership of your tasks and work autonomously, with quick feedback loops.
  • You care about developer experience and want to build tools, services, and processes that other internal teams enjoy using.
  • You can discuss technical concepts with both technical and non-technical audiences, and care about clear, usable documentation.
  • You know that what matters most are the outcomes and the impact your work has on its users.

About The Team

We are a small platform team (3 engineers + team lead) in charge of maintaining data pipelines, managing databases, and enabling users to access relevant datasets. Our main stakeholders are other internal development, analytical, and operational teams. We think of ourselves as enablers for the other internal teams, and we strive to treat our Data Platform as a product. Hence we value a product mindset: the ability to understand, and to discuss clearly and transparently, both technical and business-related challenges, within and outside our team. We also value autonomy and the ability to work in a self-organising fashion.

Job Responsibilities

  • Data Engineering
    • Pipeline Development & Maintenance: Develop and update batch data pipelines using our existing stack (e.g., Prefect, dbt, dlt). 
    • Data Ingestion & Transformation: Collaborate with data scientists and analysts to understand new data requirements, then build ingestion and/or transformation steps.
    • Troubleshooting & Optimisation: Investigate pipeline failures, performance issues, and data quality concerns, propose and implement solutions.
    • Monitoring & Alerting: Ensure the reliability and quality of data pipelines by implementing and maintaining dashboards and alerting mechanisms.
  • Platform Development
    • Platform Features: Contribute to new features that enable internal teams to use our platform more efficiently.
    • Data Quality & Governance: Support the implementation of data cataloging, governance frameworks, and other quality initiatives.
    • ML & Streaming Capabilities: Assist in building or enhancing features like a feature store for ML pipelines or real-time/streaming solutions.
  • Platform & Infrastructure Support
    • Docker & CI/CD: Build and maintain containerised services, set up CI/CD pipelines with GitHub Actions, and automate deployments.
    • Apache Druid administration: Assist with or learn about Druid administration, configuration, and integrations.
    • Infrastructure-as-code: Gain exposure to Terraform and Nomad, learn about cluster management, and production monitoring.
    • Production Troubleshooting: Diagnose issues in live environments, using Linux-based tooling and SSH access when needed.
  • Enablement & Collaboration
    • Internal Tech Enablement: Enhance developer experience for other technical teams by creating well-documented, user-friendly platform tools and APIs.
    • Cross-Functional Work: Collaborate closely with data scientists, ML engineers, and analysts to ensure data services align with their needs.
    • Knowledge Sharing: Participate in code reviews and presentations, and write internal documentation.

Job Requirements 

  • Programming Proficiency 
    • Python: 3+ years of experience, with strong skills for building data pipelines, RESTful APIs, scripting, and automation.
    • SQL: Proficient in writing efficient queries and modelling data in relational or analytical databases.
    • Go: Experience is a plus (our backend production services are primarily written in Go).
  • Data Engineering Proficiency 
    • Comfortable working with data warehousing (BigQuery or similar) and relational databases (Postgres).
    • Comfortable working with one major cloud provider (preferably Google Cloud Platform).
    • Experience with workflow orchestration systems (Prefect, Airflow, or similar).
    • Exposure to data ingestion and transformation tooling (dbt, dlt, or similar).
  • CI/CD & DevOps Exposure 
    • Experience with Docker in development and production.
    • Experience setting up or maintaining CI/CD pipelines (GitHub Actions or equivalent).
    • Basic understanding of infrastructure-as-code, with room for mentorship.
    • Willingness to learn more about infrastructure topics, including Druid deployment and monitoring.
  • Other Relevant Tools (Any experience is a plus) 
    • Batch & Streaming Solutions: Spark, Kafka, etc.
    • Search engines: Elasticsearch
    • Key-Value Stores: Aerospike
    • ML-focused workflows (feature stores, MLOps frameworks)

Our promise

  • Unlimited vacation days - for real. We give you the freedom to figure out the most productive work-life balance for you
  • A truly modern place to work: work from home, from our brand new office in Berlin, or remotely - your work environment is yours to design
  • Generous remuneration package including virtual shares, a dedicated education budget
  • End of the year bonus determined by company performance
  • Enrol in our Short Term Assignment (STA) program and travel to our offices around the globe for up to a month - subject to eligibility and internal policy
  • Comfortable work setup - laptop, phone, screen(s), standing desk etc.
  • Support for your setup while working from home
  • Wellness benefits such as sports memberships, Nilo Health subscription and internet reimbursements
  • Company events including breakfast twice a week, team lunches, birthday and milestone celebrations, quarterly team events and company-wide parties & trips

Remerge is an Equal Opportunity Employer: all qualified applicants are considered for positions regardless of race, ethnic origin, gender, age, religion or belief, marital status, gender identification, sexual orientation, veteran status or disability. We're looking forward to your application!

Important notice: Protect yourself from scammers representing Remerge

Do not respond to job offers from third parties claiming to recruit on behalf of Remerge, or from individuals impersonating Remerge employees who offer freelance or remote work opportunities. We never ask job applicants for personal information such as bank account numbers or addresses, nor do we request money transfers or other confidential details on any messaging platform. If someone reaches out to offer you a job, verify that they are a genuine Remerge employee by checking their identity through official Remerge channels.

Please note that the only job openings we offer are listed on our official website: remerge.io/careers

Stay vigilant and protect yourself from potential scams.

Please review our Applicants Privacy Policy. 
