
Senior Data Engineer

Job Description:

You will be a senior member of our US IT team's data engineering practice. You will join this team early and work closely with the Director of Data Engineering to help establish it. We will have our work cut out for us delivering planned work, building the practice and evolving our technical landscape. This is an exciting opportunity for the right individual to join at the very beginning and grow with the new team.

As a senior engineer you will be expected to lead, mentor others and deputise for the Director of Data Engineering when required. You will also continue to contribute as a hands-on engineer. You will have the knowledge, best earned through 5+ years of experience, to truly inhabit this senior role in an ambitious team.

What you’ll be doing:

  • Working with business stakeholders and our delivery team to understand high-value business problems that can be solved through the application of data processing and analytical systems.
  • Developing, expanding and evolving our existing databases and ETL pipelines.
  • Working with architects to design, build and support a transformational Databricks cloud data platform for the business.
  • Being a core and professional member of the new data engineering practice.
  • Understanding business requirements, helping refine them into development tasks and estimating their complexity.
  • Researching, evaluating and adopting new technologies with a right-tool-for-the-job mentality.
  • Focusing on both speed of delivery and quality, with suitable pragmatism – ensuring your solutions are always appropriate and not over-engineered.
  • Progressing projects quickly from proof of concept to production.
  • Communicating and presenting ideas to colleagues in all parts of the wider tech team.
  • Participating in code reviews for the data engineering practice.
  • Acting as a servant leader in our practice – ensuring team members have what they need to complete their work and influencing them to follow best practices.

Person Specification:

  • Rich knowledge of Databricks and recent hands-on experience implementing a Databricks lakehouse at enterprise scale.
  • Advanced Python development skills.
  • Good knowledge of SQL.
  • Experience working using a modern DevOps approach. You should not just have written data code but also the CI/CD pipelines necessary to test and deploy that code in a professional environment.
  • A robust understanding of core data engineering topics – ETL vs ELT, structured and unstructured data, data quality and data governance.
  • Ability to contribute to all aspects of a solution – design, infrastructure, development, testing and maintenance.
  • The ability to design and advocate for technical solutions to business problems.
  • Effective collaboration with technical and non-technical team members through agile ceremonies – roadmap planning, feature workshops, backlog elaboration, code review.
  • Understanding of cloud technology and significant hands-on experience working with it. You should understand how cloud-native solutions are built differently from traditional ones.
  • Track record of taking initiative and delivering projects end-to-end; clear evidence of being self-driven and motivated.
  • Immense curiosity, high energy and the desire to go the extra mile to make a difference.
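To make the data quality and governance expectations above concrete, here is a minimal sketch of a row-level quality check in plain Python. The field names and rules are hypothetical examples, not part of our actual stack:

```python
# Minimal sketch of a row-level data quality check in plain Python.
# The field names and rule set below are hypothetical examples.

def check_row(row, required_fields, ranges):
    """Return a list of data quality violations for a single record."""
    errors = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    for field, (lo, hi) in ranges.items():
        value = row.get(field)
        if value is not None and not (lo <= value <= hi):
            errors.append(f"{field}={value} outside [{lo}, {hi}]")
    return errors

rows = [
    {"policy_id": "P-1", "premium": 1200},
    {"policy_id": "", "premium": -50},
]
report = {
    (r.get("policy_id") or "<unknown>"): check_row(
        r, ["policy_id"], {"premium": (0, 1_000_000)}
    )
    for r in rows
}
```

In a real pipeline these checks would run inside the platform (for example as Databricks notebook tasks or dbt tests) rather than as standalone scripts; the sketch only shows the shape of the logic.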


Desirable Skills

Beyond the required skills we are open to individuals of diverse talents. Experience with additional technologies, data science knowledge or a business background are all valued. We want to know how your unique abilities can contribute to our team.

Our Technology

We operate in a diverse technical landscape and are looking for flexible engineers who can adapt to and use many different tools. We would not expect any engineer to be familiar with the entire tech stack – no engineer can. Instead, we seek people with a good understanding of data structures and algorithms and the ability to apply this knowledge in learning new tools.

We have an existing data landscape involving both MS SQL Server and Oracle databases. We use SSIS for ETL. We will also need to source data from our integration layer – webMethods.

If some of that sounds like it needs bringing up to date, we know! We hope to start building a modern cloud data platform to realise the full value of our data. Some of the technologies for that platform are still to be decided, but it will be based on Databricks with Azure Data Factory for ELT. Other parts of our business have had success using Prefect for orchestration and dbt for transformation, and we will reuse those where it is sensible.
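The ELT pattern behind that platform can be sketched in a few lines of plain Python: land source records unmodified first, then transform them downstream. The names here are purely illustrative; in practice the load step would be played by Azure Data Factory and the transform step by Databricks or dbt:

```python
# Conceptual ELT sketch in plain Python. The zone names and record
# fields are illustrative stand-ins, not our actual schemas.

raw_zone = []      # stands in for cloud storage / a bronze table
curated_zone = []  # stands in for a curated silver/gold table

def load(records):
    """E + L: land source records unmodified in the raw zone."""
    raw_zone.extend(records)

def transform():
    """T: clean and reshape raw records into the curated zone."""
    curated_zone.clear()
    for rec in raw_zone:
        if rec.get("amount") is not None:
            curated_zone.append({"id": rec["id"], "amount": round(rec["amount"], 2)})

load([{"id": 1, "amount": 10.567}, {"id": 2, "amount": None}])
transform()
```

The key design point, and the difference from classic ETL, is that the raw zone keeps every record as received, so transformations can be re-run or revised without going back to the source systems.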

We prepare data for use by analysts working with a variety of tools – Tableau, PowerBI, and even Excel.

We take a DevOps approach and strive for continuous integration / continuous deployment. We use Azure Pipelines to deploy our code. We deploy infrastructure the same way using Terraform and Docker.

Our technology landscape is not fixed. Our engineering and architecture teams drive the technology we adopt.



Salary range: $140,000 - $170,000


Hiscox is a leading specialist insurer, headquartered in Bermuda, with roots dating back to 1901. We target niche risks that other insurers often find too complex to underwrite. Hiscox USA was established in 2006 and is now the fastest-growing div...

Date posted: July 23, 2023
