The salary range for this position (contract of employment) is:
mid: 14 200 - 19 690 PLN gross
A hybrid work model requiring 1 day a week in the office.
We are seeking a passionate Data Engineer to join the newly forming A/B Testing Platform team in the Data Science Hub, where we apply analytical techniques, mathematics, and machine learning to solve a wide range of business problems.
About the team
The A/B Testing Platform team is a multidisciplinary group of product analysts, software engineers, and data engineers. Our mission is to strategically enhance our A/B testing platform, a critical tool that empowers data-driven decision-making regarding the rollout of new features by assessing their potential impact through user behavior analysis. Through its work, the team plays a pivotal role in shaping the overall user experience on Allegro, one of the world's largest eCommerce platforms.
We are looking for people who:
Have a Bachelor's or Master's degree in Computer Science, Mathematics or a related field.
Know English at a minimum B2 level.
Have proven experience as a Data Engineer or in a similar role.
Possess the necessary data-related skill set, meaning they:
Can work fluently with SQL, preferably GCP BigQuery.
Have knowledge of Big Data tools on Google Cloud Platform, AWS, or Azure.
Have experience with message broker systems and streaming data processing, e.g. Pub/Sub, Apache Beam.
Are familiar with data pipeline orchestration tools like Apache Airflow.
Have experience in Python programming and are familiar with software engineering best practices (PEP8, clean architecture, code review, CI/CD, etc.).
Have experience with Infrastructure as Code tools; Terraform is a plus.
Have proven commercial experience in DevOps and CI/CD practices.
Have strong communication skills, capable of conveying complex ideas in a clear, concise manner.
Are detail-oriented and capable of working in a fast-paced, dynamic environment.
Have a positive attitude and ability to work in a team.
Are eager to constantly develop and broaden their knowledge.
In your daily work, you will handle the following tasks:
Designing, developing, and maintaining robust, scalable data pipelines.
Collaborating closely with product managers, UX designers, data analysts and software engineers to understand their requirements and deliver high quality, prepared data to enable their work.
Building, testing, and maintaining data systems, ensuring accuracy and readiness for a larger pipeline that includes streaming data flows.
Designing and implementing data schemas, data models, message brokers, and SQL/NoSQL databases.
Optimizing data systems and building them from the ground up to deliver insights for data analytical systems.
Implementing data pipelines and automated workflows required for the A/B testing platform.
Ensuring data privacy and compliance standards across all projects.
Working with multiple platforms and technologies such as Google Cloud Platform, Azure Cloud, and Allegro Data Centers.
Delivering solutions for multiple markets.
Balancing your engagement between project work and ad-hoc support for Product Managers' and Data Analysts' requests.
Why is it worth working with us:
Data plays a key role in the operation of Allegro - we are a data-driven technology company, and through the models and analyses provided, you will have a significant impact on one of the largest eCommerce platforms in the world.
Gain invaluable experience and deepen your skills through continuous learning and development opportunities.
Collaborate with a network of industry experts, enhancing your professional growth and knowledge sharing.
We are happy to share our knowledge. You can meet our speakers at technological conferences such as the Data Science Summit and the Big Data Technology Warsaw Summit. We also publish content on the allegro.tech blog.
We use, depending on teams and their needs, the latest versions of Java, Scala, Kotlin, Groovy, Go, Python, Spring, Reactive Programming, Spark, Kubernetes, TensorFlow.
Microservices – a few thousand microservices and 1.8m+ rps on our business data bus.
In the Data&AI team, you would join over 200 data, ML, and product specialists who oversee dozens of products and a few hundred production ML models, and who govern all data at Allegro (several dozen petabytes in scale).
We practice Code Review, Continuous Integration, Scrum/Kanban, Domain Driven Design, Test Driven Development, Pair Programming depending on the team.
GenAI tools (e.g., Copilot, internal LLM bots) support our everyday work.
Our internal ecosystem is based on self-service and widely used tools, such as Kubernetes, Docker, GitHub (including CI/CD). This will allow you, from day one, to develop software using any language, architecture and scale, restricted only by your creativity and imagination.
We actively participate in the life of the biggest user groups in Poland centered around technologies we use at work (Java, Python, DevOps).
Technological autonomy: you get to choose which technology solves the problem at hand (no need for management's consent); you are responsible for what you create.
Once a year, you can take advantage of the opportunity to work in a different team, or more often if there's an internal business need (known as team tourism).
What we offer:
A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms).
An annual bonus of up to 10% of your annual gross salary (depending on your annual assessment and the company's results).
A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g., medical, sports or lunch packages, insurance, purchase vouchers).
English classes, paid for by us, related to the specific nature of your job.
16" or 14" MacBook Pro with M1 processor and 32GB RAM or a corresponding Dell with Windows (if you don’t like Macs) and other gadgets that you may need.
Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise.
A high degree of autonomy in terms of organizing your team’s work. We encourage you to develop continuously and try out new things.
Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation to work and various technologies and subject-matter issues).
If you want to learn more, check out this webpage or listen to the Allegro Tech Podcast Episode about recent projects in the Data Science Hub.
Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)
Are you a passionate Data Engineer looking to make a significant impact? Join the Data Science Hub at Allegro in Warsaw, Poland, as we assemble a dynamic A/B Testing Platform team focused on harnessing data to drive critical business decisions. In this role, you'll develop and maintain scalable data pipelines, collaborating with talented product managers, software engineers, and data analysts to ensure accurate data delivery for insightful user behavior analysis. We value problem solvers who thrive in fast-paced environments and are eager to learn and grow. Your day-to-day tasks will involve designing and implementing data schemas, optimizing systems, and maintaining data integrity across various platforms. Plus, with our hybrid work model that offers flexibility in working arrangements, you can enjoy the best of both worlds. As part of our team, you'll have access to advanced tools and technologies, participate in hackathons, and have opportunities for continuous learning. The Data&AI team consists of data specialists and ML experts who manage vast datasets while encouraging autonomy in tech choices. Enjoy perks like a competitive salary, a comprehensive benefits cafeteria, and the chance to attend notable tech conferences. If you're ready to shape the future of one of the world's largest eCommerce platforms, apply to be a Data Engineer at Allegro and discover why it's #dobrzetubyć (#goodtobehere).
Allegro is the most popular Polish shopping destination with about 17 million users monthly and over 1.1 million items sold on the platform daily. Making a site like this work requires a lot of engineering and as the site grows, we learn and adopt...