Remo is building the new standard of dementia care. As a virtual dementia care provider, our expert clinical team designs personalized, comprehensive care around patient and family needs (instead of a one-size-fits-all approach). We empower family caregivers by connecting them with a vibrant community of other caregivers, expert content, and tools to manage the entire dementia journey – from anywhere, at any time.
Our mission is simple — to provide accessible, comprehensive, quality dementia care for every person who needs it.
We are looking for our first highly skilled Data Engineer to play a key role in developing and maintaining our Data and Analytics Platform. As a key member of our team at Remo, you'll be supporting our mission to provide accessible, comprehensive, and quality dementia care by engineering robust data pipelines, developing analytical models, and empowering our stellar team to leverage our data across the organization.
Manage the full lifecycle of data at Remo Health, from ingestion to transformation and downstream consumption
Design, develop, and maintain data pipelines and ETL processes using ingestion platforms such as Fivetran and orchestration tools like Google Cloud Composer (Apache Airflow) (a brief illustrative sketch follows this list)
Manage the Customer Data Platform and its data lifecycle
Manage GCP services such as BigQuery, Dataflow (Apache Beam), Kubeflow, and Vertex AI Pipelines for data processing and model deployment
Implement and manage tools such as dbt and a semantic layer to support data transformation and modeling
Optimize SQL queries for data warehousing in BigQuery, including table partitioning strategies
Develop Python scripts and/or packages for data processing and automation tasks
Act as a lead analyst to empower and support analytical roles across various departments
Enable data discovery and provide expert guidance on augmenting and leveraging available data and obtaining missing data
Ensure compliance with data governance policies and data loss prevention (DLP) standards
Handle Protected Health Information (PHI), Personally Identifiable Information (PII), and de-identified data securely
Collaborate with data scientists to codify and deploy machine learning models to staging and production environments
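To give candidates a concrete feel for the work described above, here is a minimal, hypothetical sketch of one such pipeline: a Cloud Composer (Airflow) DAG that loads a single day of data into a date-partitioned BigQuery table. The DAG, project, dataset, and table names are illustrative placeholders, not Remo's actual systems.

```python
# Hypothetical sketch only: project, dataset, and table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Pull one day of raw events; {{ ds }} is Airflow's templated logical run date.
LOAD_SQL = """
SELECT *
FROM `my-project.raw_src.events`
WHERE DATE(event_ts) = '{{ ds }}'
"""

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Write the day's rows into the matching partition of an ingestion-time
    # partitioned table; WRITE_TRUNCATE on that partition keeps reruns idempotent.
    load_events = BigQueryInsertJobOperator(
        task_id="load_events_partition",
        configuration={
            "query": {
                "query": LOAD_SQL,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "events${{ ds_nodash }}",
                },
                "timePartitioning": {"type": "DAY"},
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
        location="US",
    )
```

In practice, downstream transformations on a load like this would typically live in dbt models rather than in the DAG itself.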
3+ years of experience as a Data Engineer and 3+ years of combined experience in an Analytics Engineering role in the healthcare industry
Experience building, maintaining, and monitoring robust ETL pipelines
Experience in a lead analytical role spanning forecasting, cohort analysis, and time-series wrangling with healthcare and product data (required)
Experience developing a strategy for a data semantic layer in a healthcare setting (required)
Mastery of administering and augmenting BI and other analytical tooling such as Tableau or Power BI (required)
Hands-on Google Cloud experience (2+ years): BigQuery, Composer, Dataflow (required)
Expert-level SQL: Postgres and BigQuery (required)
Experience developing and maintaining Python packages used as part of data transformation pipelines (required)
Strong experience with transformation and modeling tools such as dbt
Proficiency in leveraging platforms such as Segment, Fivetran, dbt
Experience with version control for data models (Git)
Strong understanding of CDC (change data capture)
Above all, excellent problem solving, analytical, and communication skills
Knowledge of HIPAA/HITRUST compliance and related security best practices
Hands-on experience with Terraform is preferred
Having implemented a CDC solution is a huge plus
Experience collaborating with Data Science teams to codify and deploy ML models into production is a plus
Experience enabling the responsible usage of AI within an organization is not required, but highly desirable
Experience developing BigQuery datasets, views, and tagging for DLP is a huge plus
Experience with additional GCP technologies such as Cloud Run, Cloud Functions, Cloud Scheduler, and Pub/Sub is a plus
At Remo Health, we value diversity in the workplace because it allows us to better understand and meet the needs of our customers and the communities we serve. We want to ensure every job applicant is treated fairly and with respect regardless of race, national or ethnic origin, religion, age, gender, sexual orientation, or disability. If you require any support in the application process, including disability accommodation, please contact hr@remo.health.
We use E-Verify to confirm the identity and employment eligibility of all new hires: Participation Poster (PDF), Right to Work Poster (PDF). Background checks are required for all new hires.