Location: CINCINNATI, OH, USA
Salary: $70.00 - $75.00 USD Hourly
Description
Title: Data Engineer Level 3
Location: Remote
Duration: 6 Months contract
Job Description
Position Overview
As a Data Engineer, you will play a pivotal role in developing and delivering technology solutions aligned with targeted business outcomes. Your focus will be on data architecture and information delivery, treating data as a valuable enterprise asset. You will follow reusable standards, design patterns, and guidelines to ensure streamlined integration and operational efficiency. Close collaboration with cross-functional teams, including direct engagement with stakeholders, will be essential, as will demonstrating our core values of respect, honesty, integrity, diversity, inclusion, and safety.
Additional Details
• Top Technical Skills: Databricks, PySpark, Delta Live Tables, Delta Sharing.
• Soft Skills: Excellent communication.
• Project Focus: Data Integration Modernization.
• Team Details: 12-member Agile hybrid team.
• Work Location: Remote.
• Travel Requirement: None.
• Working Hours: 40 hours per week.
Key Responsibilities
• Data Solutions and Integration:
• Utilize enterprise standards for data domains and solutions.
• Streamline operational and analytical use of data.
• Maintain clear alignment across ongoing projects, escalating issues when necessary.
• Collaborate directly on data initiatives.
• Innovative Technologies and Transformation:
• Leverage cutting-edge technologies to enhance existing data assets.
• Renovate, extend, and transform core data platforms.
• Address gaps between current and future states through high-level migration plans.
• Architectural Decision-Making:
• Contribute to cost/benefit analysis for leadership.
• Detect critical deficiencies in technology environments.
• Recommend solutions for improvement.
• Data Asset Reuse and Documentation:
• Promote the reuse of data assets.
• Manage the data catalog for reference.
• Draft architectural diagrams and interface specifications.
Qualifications
Required:
• 4+ years of Python experience.
• Strong SQL skills, including querying with extensive join conditions and conditional updates.
• DevOps proficiency (Linux, GitHub, Bash).
• Experience with CI/CD lifecycles.
• Unit, integration, and regression testing expertise.
• Familiarity with static code analysis tools, Linters, and adherence to PEP 8 standards.
• Exposure to Microsoft Azure and/or Google Cloud Platform.
• Agile SCRUM project experience.
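To give a sense of the SQL skill level described above (querying with extensive join conditions and conditional updates), here is a minimal sketch. The schema, table names, and values are invented for illustration only, and Python's built-in sqlite3 module stands in for whatever database platform the role actually uses.

```python
import sqlite3

# Hypothetical schema and data, invented purely to illustrate multi-table
# joins with extra join conditions plus a conditional UPDATE.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         status TEXT, total REAL);
    INSERT INTO customers VALUES (1, 'OH'), (2, 'KY');
    INSERT INTO orders VALUES (10, 1, 'open', 120.0),
                              (11, 2, 'open', 80.0);
""")

# A join that filters inside the ON clause, not just in WHERE.
rows = cur.execute("""
    SELECT o.id, c.region, o.total
    FROM orders o
    JOIN customers c
      ON c.id = o.customer_id AND c.region = 'OH'
    WHERE o.status = 'open'
""").fetchall()

# A conditional update: flag high-value open orders for one region.
cur.execute("""
    UPDATE orders
    SET status = 'review'
    WHERE status = 'open'
      AND total > 100
      AND customer_id IN (SELECT id FROM customers WHERE region = 'OH')
""")
conn.commit()
```

After this runs, only the qualifying order is flagged for review; the other remains open.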
Highly Desired:
• Data Management expertise (Databricks, PySpark, Spark Structured Streaming, Delta Live Tables, Delta Sharing).
• Expert knowledge of SQL.
• Familiarity with Streaming Technologies (Apache Kafka, Azure EventHubs, Avro).
• DevOps skills (GitHub Actions, Terraform, Artifactory).
Desired:
• Knowledge of Java, Spring Boot, Kubernetes, Docker, and NoSQL databases.
Contact: kjones10@judge.com
This job and many more are available through The Judge Group. Find us on the web at www.judge.com