Remote - candidates must be willing to work ET hours. Expected to work within core business hours; must not exceed 40 hours/week.

Machine Learning Engineer Senior / Data Engineer Senior

As a Machine Learning Engineer Senior / Data Engineer Senior, you'll contribute to highly available and scalable solutions built on cutting-edge cloud architecture. You will partner with Data Science and Agile IT teams focused on using new technologies, modern CI/CD practices, and the right tools to enhance our ability to deliver value to our customers. You'll build the data pipelines needed to support Data Science models and enable efficient access to multiple data sources, support packaging and deploying new or updated models to production, and create the infrastructure needed for model training.

Day-to-Day Responsibilities
• Partner with the Data Scientist assigned to the project to support model construction, training, and deployment
• Create new data pipelines or improve the efficiency of existing pipelines needed to support the project
• Package and deploy new or updated models to production
• Create the infrastructure needed to support model training
• Build data pipelines that enable automated data processing and efficient change
• Work with the Data Scientist and Application team to ensure the design is fit for requirements
• These contractors will be focused on Data Engineering needs for Claims IT Enterprise priorities
• They will be part of a standing scrum team
• They will be working with many new technologies (please reference the posting below)

Required Skills
• Significant experience developing applications in a hybrid cloud/on-prem environment
• Significant experience with AWS networking components - VPC, security groups, subnets, endpoints, etc.
• Significant experience developing AWS applications with Lambda, S3, and DynamoDB
• Significant experience developing applications with Python
• Significant experience with the Linux/Unix operating system and command-line tools
• Significant experience creating flows registered in Prefect Cloud
• Significant experience building custom Docker images and containers
• Significant experience building Docker image-backed Python AWS Lambda functions
• Experience creating and maintaining AWS assets with Terraform Cloud/Enterprise
• Experience creating YAML pipelines in ADO and GitHub Actions workflows

Preferred Skills and Experience
• Familiarity with related tools a plus (e.g., Snowflake and/or Tecton)
• Experience with analytical databases (e.g., Snowflake)
• Experience with graph databases (e.g., Neo4j) a plus
• Experience with vector databases a plus

Education and/or Experience Required
• Bachelor's Degree or higher in an Information Technology discipline or related field of study, and a minimum of two years of work experience designing, programming, and supporting software programs or applications
• In lieu of a degree, a minimum of four years of related work experience designing, programming, and supporting software programs or applications may be accepted