The salary range for this position (contract of employment) is:
Mid role: 14 200 - 19 690 PLN gross
A hybrid work model requires 1 day a week in the office.
Allegro Pay is the largest fintech in Central Europe – we are growing fast and need engineers who want to learn and develop while solving problems related to serving thousands of requests per second (RPS). If, like us, you enjoy flexing your mental muscles on complex problems and would be happy to co-create the infrastructure which underpins our solutions, make sure you apply!
In this role, you will be a contributor, helping us expand our modern cloud-based analytical solutions. We embrace challenging and interesting projects and take quality very seriously. Depending on your preference, your position may be more business-oriented or platform-oriented.
We are looking for people who:
Have 2+ years of experience in building data-driven solutions using Python
Have practical knowledge in creating efficient data processing applications
Simply enjoy efficient data processing and feel satisfied when they spin up a lot of cores to crunch terabytes of data quickly
Can optimize SQL queries in traditional engines (SQL Server, Oracle), Big Data (Spark) or cloud engines (BigQuery, Snowflake)
Have experience in working with large data sets and understand database algorithms and data structures (e.g. they know the difference between a merge join and a hash join)
Can independently make decisions in the areas entrusted to them and take responsibility for the code they create
Are not afraid of new technologies and want to expand their range of skills
Know how to build and deploy containerized applications on the cloud
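As an aside on the merge join vs hash join distinction mentioned in the requirements above, the two strategies can be sketched in a few lines of Python (an illustrative sketch only, not part of any Allegro codebase):

```python
def hash_join(left, right, key):
    """Build a hash table on one input, then probe it with the other.
    Roughly O(n + m); needs no ordering, but the build side should fit
    in memory."""
    table = {}
    for row in right:
        table.setdefault(row[key], []).append(row)
    return [(l, r) for l in left for r in table.get(l[key], [])]

def merge_join(left, right, key):
    """Walk both inputs in key order with two cursors.
    Roughly O(n + m) once sorted; shines when the data is already
    sorted or clustered on the join key."""
    left = sorted(left, key=lambda r: r[key])
    right = sorted(right, key=lambda r: r[key])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i][key] < right[j][key]:
            i += 1
        elif left[i][key] > right[j][key]:
            j += 1
        else:
            # Emit every right row matching the current left key.
            k = j
            while k < len(right) and right[k][key] == left[i][key]:
                out.append((left[i], right[k]))
                k += 1
            i += 1
    return out
```

The practical difference query optimizers weigh: a hash join pays a build cost but ignores ordering, while a merge join exploits pre-sorted inputs and avoids the in-memory build.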
Nice to have:
The ideal candidate has DevOps experience - they know how to set up CI/CD pipelines and have worked with IaC tools such as Terraform or Pulumi to deploy and maintain cloud infrastructure
Experience in programming in statically typed languages (Java, Scala, C#) will be an advantage
What will your responsibilities be?
Design, monitor and improve data flow processes implemented in Python, SQL, Airflow and Snowpark
Implement and maintain Data Mesh processes collecting data from many microservices and cloud sources
Work with various data formats and sources, utilizing novel storage solutions
Optimize the costs associated with the cloud operations in Snowflake, Azure Cloud and GCP
Work with the latest technologies, such as Snowflake, Airflow, dbt, .NET, Azure, GCP and GitHub Actions
Play an active role in decision-making processes regarding the selection and implementation of data frameworks
What we offer
We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
16" or 14" MacBook Pro with an M1 processor and 32 GB RAM, or a corresponding Dell with Windows (if you don’t like Macs), and other gadgets that you may need
Hackathons, team tourism, training budget and an internal educational platform, MindUp (including training courses on work organization, means of communications, motivation to work and various technologies and subject-matter issues)
English classes, paid for by us, related to the specific nature of your job
Why you would like to work with us:
You will work with an experienced team that carries out complex and demanding projects related to real-time processing of data produced by our back-end and front-end systems. We design our data processes with software engineering rigour.
You will work on projects related to the area of finance where the scale, advancement of algorithms, business impact and technical requirements will be a key challenge
You will directly influence data processes that change in real time how millions of users use Allegro
Our employees regularly attend and present at conferences in Poland and abroad (Europe and the USA)
Apply to Allegro and see why it is #dobrzetubyć (#goodtobehere)
Allegro is the most popular Polish shopping destination with about 17 million users monthly and over 1.1 million items sold on the platform daily. Making a site like this work requires a lot of engineering and as the site grows, we learn and adopt...