Ruby Labs is a leading tech company that creates and operates innovative consumer products. We offer a diverse range of opportunities across the health, education, and entertainment industries. Our teams are driving the future of consumer-led products, and we're always looking for passionate people to join us. Learn more about our story at: https://rubylabs.com/about-us/
We are seeking a Data Engineer to help build and maintain a robust, trustworthy, and well-documented data platform. In this role, you will be responsible for developing and maintaining data pipelines, implementing monitoring and alerting systems, establishing data quality checks, and ensuring that data structures and documentation stay up to date and aligned with business needs.
You will work closely with the lead data engineer to enhance the reliability, observability, and scalability of our data stack (Python, SQL, Airflow, BigQuery), supporting critical data-driven decision-making across the organization.
Develop and maintain ETL/ELT data pipelines to ingest, transform, and deliver data into the data warehouse (a sketch of this kind of pipeline, with failure alerting, follows this list).
Design and implement monitoring and alerting systems to proactively detect pipeline failures, anomalies, and data quality issues.
Establish data quality validation checks and anomaly detection mechanisms to ensure accuracy and trust in data.
Define and maintain data structures, schemas, and partitioning strategies for efficient and scalable data storage.
Create and maintain comprehensive documentation of data pipelines, workflows, data models, and data lineage.
Troubleshoot and resolve issues related to data pipelines, performance, and quality.
Collaborate with stakeholders to understand data requirements and translate them into reliable engineering solutions.
Contribute to the continuous improvement of the data platform’s observability, reliability, and maintainability.
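To give a flavour of the day-to-day work behind the first two points, here is a minimal, illustrative sketch of an Airflow DAG that loads a daily extract into BigQuery and alerts on failure. It is not Ruby Labs code: the project, bucket, and table names (rubylabs-dwh, rubylabs-raw, analytics.raw_events) and the notify_slack callback are assumptions made for illustration.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_daily_extract(ds: str, **_) -> None:
    # Load one day of raw JSON events from GCS into the warehouse table.
    client = bigquery.Client(project="rubylabs-dwh")  # assumed project id
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition="WRITE_APPEND",
    )
    uri = f"gs://rubylabs-raw/events/{ds}/*.json"  # assumed bucket layout
    client.load_table_from_uri(uri, "analytics.raw_events", job_config=job_config).result()


def notify_slack(context) -> None:
    # Failure callback: a real pipeline would post this to an on-call channel.
    print(f"ALERT: {context['task_instance'].task_id} failed for {context['ds']}")


with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule="0 3 * * *",  # run once a day after the raw export lands
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=10),
        "on_failure_callback": notify_slack,
    },
):
    PythonOperator(task_id="load_daily_extract", python_callable=load_daily_extract)

In practice the load step would be followed by the data quality checks described in the requirements below, so that failures and anomalies surface through the same alerting path.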
Proficiency in Python for data pipeline development, automation, and tooling.
Strong SQL skills and experience working with cloud data warehouses (BigQuery preferred).
Experience with workflow orchestration tools such as Airflow.
Familiarity with data quality frameworks (e.g., Great Expectations, dbt tests) and anomaly detection methods; a sketch of a simple volume check follows this list.
Experience building monitoring and alerting systems for data pipelines and data quality.
Ability to write clear, maintainable, and actionable technical documentation.
Strong problem-solving skills and attention to detail.
Experience with observability tools (e.g., Grafana, Prometheus).
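To illustrate the kind of data quality and anomaly check referred to above, here is a minimal sketch (again, not Ruby Labs code) that compares yesterday's row count in a BigQuery table against its trailing average and raises if the volume drops sharply. The table name, the event_date column, and the threshold are assumptions.

from google.cloud import bigquery

TABLE = "analytics.raw_events"  # assumed table with an event_date DATE column
MIN_RATIO = 0.5                 # alert if volume falls below 50% of the recent norm


def check_daily_volume() -> None:
    client = bigquery.Client()
    sql = f"""
        WITH daily AS (
            SELECT event_date, COUNT(*) AS n_rows
            FROM `{TABLE}`
            WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 29 DAY)
            GROUP BY 1
        )
        SELECT
            (SELECT n_rows FROM daily
             WHERE event_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)) AS yesterday,
            (SELECT AVG(n_rows) FROM daily
             WHERE event_date < DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)) AS baseline
    """
    row = next(iter(client.query(sql).result()))
    if row.yesterday is None or row.baseline is None:
        raise ValueError(f"Volume check on {TABLE}: missing yesterday or baseline data")
    if row.yesterday < MIN_RATIO * row.baseline:
        raise ValueError(
            f"Volume anomaly in {TABLE}: {row.yesterday} rows vs trailing average {row.baseline:.0f}"
        )


if __name__ == "__main__":
    check_daily_volume()

A check like this would typically run as a downstream Airflow task, or be expressed as a Great Expectations suite or dbt test, so that data quality failures trigger the same alerts as pipeline failures.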
You'll play a key role in shaping a scalable, reliable, and transparent data ecosystem. This is an opportunity to combine hands-on engineering with data quality, monitoring, and documentation practices to build trust in data and enable data-driven decisions.
Ruby Labs operates within the CET (Central European Time) zone. Applicants from any country are welcome to apply for the position as long as they are located within approximately ± 4 hours of CET. This ensures optimal collaboration and communication during working hours.
Discover the perks of being part of our vibrant team! We offer:
Remote Work Environment: Embrace the freedom to work from anywhere, anytime, promoting a healthy work-life balance.
Unlimited PTO: Enjoy unlimited paid time off to recharge and prioritize your well-being, without counting days.
Paid National Holidays: Celebrate and relax on national holidays with paid time off to unwind and recharge.
Company-provided MacBook: Experience seamless productivity with top-notch Apple MacBooks provided to all employees who need them.
Flexible Independent Contractor Agreement: Enjoy the flexibility, autonomy, and entrepreneurial opportunities of working as an independent contractor, including tax advantages, networking opportunities, reduced employment obligations, and the freedom to work from anywhere. Read more about it here: https://docs.google.com/document/d/1dHF4ctKlez75whdn-ybUwP5d5Wr0BdwVrorrm_fM40Q/preview
Be part of our fast-growing team and seize this excellent opportunity for personal and professional growth!
After submitting your application, we conduct a thorough review, which typically takes 3 to 5 days, but may occasionally take longer due to the volume of applications received. If we see a potential fit, we proceed with the following steps:
Recruiter Screening (40 minutes)
Technical Interview (90 minutes)
At Ruby Labs, we are more than a team; we're a community united in pushing the boundaries of technology and innovation. Our combined passion fuels our ambition for excellence, driving impact that resonates around the globe.
We are an equal-opportunity employer and celebrate diversity, recognizing that a diversity of thought and backgrounds builds stronger teams. We approach diversity and inclusion seriously and thoughtfully. We do not discriminate based on race, ethnicity, religion, color, place of birth, sex, gender identity or expression, sexual orientation, age, marital status, military service status, or disability status. Join us and be part of a company that is crafting the future of technology across multiple industries.
#LI-Remote