The Data Quality Engineer for the Operational Data Platform plays a crucial role in ensuring the quality, reliability, and performance of the team’s data pipelines, systems, and tools. This position requires a blend of technical expertise and a strong understanding of data quality methodologies to support the development and maintenance of robust data solutions. The Data Quality Engineer will collaborate closely with data engineers, analysts, and other stakeholders to identify issues, create test strategies, and maintain the highest standards for the team’s deliverables.
This position is based in St. Louis, MO, with a remote option.
Primary Responsibilities
· Collaborate with the Data Engineering team to understand requirements, design specifications, and technical implementations.
· Develop and execute test plans, test cases, and test scripts for ETL processes, data pipelines, and data transformation workflows.
· Perform end-to-end testing of data platforms to ensure accuracy, reliability, and scalability.
· Validate data integrity across multiple systems and environments.
· Identify, document, and track defects, working with the engineering team to resolve issues in a timely manner.
· Automate testing processes for data validation and system performance to improve efficiency.
· Implement testing and validation using tools such as dbt and Great Expectations.
· Monitor data quality and identify opportunities for improving test coverage.
· Validate APIs and RESTful services to ensure proper integration and data flow between systems.
· Provide regular updates on testing progress, issues, and risks to the team and stakeholders.
· Stay up-to-date with industry best practices and new tools for data testing and quality assurance.
Preferred Skills
· Experience with big data technologies (e.g., Hadoop, Spark, Kafka) and cloud platforms (e.g., AWS, Azure, GCP).
· Familiarity with CI/CD pipelines and tools such as Jenkins or GitLab.
· Knowledge of data governance and data security best practices.
· Experience testing APIs and RESTful services for integration and functionality validation.
Qualifications
· Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent work experience.
· Strong understanding of data quality methodologies, tools, and processes.
· Hands-on experience with SQL and data querying for validation and testing.
· Familiarity with ETL processes, data pipelines, and data warehousing concepts.
· Experience in scripting and test automation using tools such as Python, Java, or other relevant technologies.
· Familiarity with tools like dbt and Great Expectations for data testing and validation.
· Knowledge of data visualization tools (e.g., Tableau, Power BI) and their role in validating data outputs.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration abilities.
About Focus Financial Partners
Focus is a leading partnership of fiduciary wealth management and related financial services firms. Focus provides access to best practices, greater resources, and continuity planning for its affiliated advisory firms, which serve individuals, families, employers, and institutions with comprehensive financial services. Focus firms and their clients benefit from the solutions, synergies, scale, economics, and best practices offered by Focus to achieve their business objectives. For more information about Focus, please visit www.focusfinancialpartners.com.