About EasyKnock
EasyKnock is the country’s first home equity solutions platform. Our innovative programs give homeowners flexible, quick solutions for their financial needs. Whether paying off debt, purchasing a new home, or funding a business, EasyKnock empowers homeowners to convert their equity to cash without strict lender qualifications through our suite of sale-leaseback solutions. Customers sell their homes to us and remain as renters while working toward their goals. We’re passionate about helping American homeowners access the equity they’ve built up in their homes by giving them back liquidity, flexibility, and control.
We are looking for compassionate people who find joy in connecting others with creative solutions to access the value of their home. If you have a growth mindset, find absolute thrill in building a new business and excel in a dynamic work culture, we want to talk to you.
About the Position
As a Data Engineer, you will collect and maintain data that provides the organization with analytical capabilities in support of its mission. Reporting to the VP of Engineering, you will be part of a diverse, dynamic, and agile squad responsible for data pipelines, data integration, data quality, data visualization, self-service analytics, and the enterprise data catalog.
Roles & Responsibilities
- Quickly learn about the business domain and the associated data and analytics products that the team works on
- Understand and take over tools that support data-driven decisions and allow our teams to access and prepare data sets and reports easily and reliably
- Support and maintain data workflows using Airflow and Redshift
- Apply working knowledge of configuring databases and data warehouses for optimal performance, reliability, and fault tolerance
- Thrive in a collaborative fast-paced environment where data expertise and responsibility are shared with teammates across disciplines
Requirements
- 3+ years of hands-on experience enhancing and building large scale data warehouse solutions across the entire data lifecycle, from raw data to powerful insights and analytics
- Experience in working with and optimizing existing data ingestion, cleansing, transformation, integration, and validation flows and helping to move them into production
- Ability to quickly understand EasyKnock products, how source systems generate data, and how data is organized in the data mart
- Strong communication skills which enable you to work with owners of source systems on better ways to capture and store transactional data
- Fluent in Python and/or Scala
- Experience with cloud computing platforms (e.g. GCP, AWS, Azure)
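As a flavor of the ingestion, cleansing, and validation work described above, here is a minimal sketch in plain Python. The record fields (`property_id`, `sale_price`, `closed_at`) are hypothetical and do not reflect EasyKnock's actual schema:

```python
from datetime import datetime

# Hypothetical raw records from a source system; field names are
# illustrative only, not EasyKnock's actual schema.
raw_rows = [
    {"property_id": "101", "sale_price": "250000", "closed_at": "2023-04-01"},
    {"property_id": "102", "sale_price": "", "closed_at": "2023-04-02"},       # missing price
    {"property_id": "101", "sale_price": "250000", "closed_at": "2023-04-01"}, # duplicate
]

def cleanse(rows):
    """Drop incomplete records, cast types, and de-duplicate by property_id."""
    seen = set()
    clean = []
    for row in rows:
        if not row["sale_price"]:   # validation: required field must be present
            continue
        key = row["property_id"]
        if key in seen:             # de-duplication on the business key
            continue
        seen.add(key)
        clean.append({
            "property_id": int(key),                  # type casts from raw strings
            "sale_price": float(row["sale_price"]),
            "closed_at": datetime.strptime(row["closed_at"], "%Y-%m-%d").date(),
        })
    return clean

clean_rows = cleanse(raw_rows)
print(clean_rows)  # one valid, de-duplicated record survives
```

In production this kind of step would typically run as an Airflow task writing into the warehouse rather than an in-memory function, but the shape of the work is the same.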
Preferred Qualifications
- 4+ years of hands-on experience with Python in data engineering or application development
- Background in data engineering and data integration, with hands-on experience in Airflow, Redshift, Tableau, and the AWS and GCP cloud platforms
- Solid knowledge of infrastructure as code using a tool such as Pulumi or Terraform
- Experience working in Agile product teams
- Experience designing and maintaining tools that support ETL/ELT pipelines and downstream business use cases of data
- Knowledge of data architecture and data management best practices
- Overall understanding of data security and privacy best practices
- Clear project management and prioritization skills
- Experience managing SQL databases, particularly administrative tasks using DDL
- Experience in a dynamic, startup-like environment, with a collaborative working style that supports larger team goals and outcomes
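To illustrate the DDL administrative tasks mentioned above, here is a small sketch using Python's built-in SQLite module as a stand-in for a production warehouse; the table and column names are hypothetical:

```python
import sqlite3

# SQLite (stdlib) used here purely as a stand-in for a production database;
# table and column names are illustrative, not EasyKnock's actual schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# CREATE: define a table for sale-leaseback transactions
cur.execute("""
    CREATE TABLE leaseback_transactions (
        id INTEGER PRIMARY KEY,
        property_id INTEGER NOT NULL,
        sale_price REAL NOT NULL,
        closed_at TEXT NOT NULL
    )
""")

# ALTER: evolve the schema in place without rewriting existing data
cur.execute("ALTER TABLE leaseback_transactions ADD COLUMN monthly_rent REAL")

# Inspect the resulting schema
columns = [row[1] for row in cur.execute("PRAGMA table_info(leaseback_transactions)")]
print(columns)
```

On Redshift the syntax and tooling differ (e.g., distribution and sort keys factor into table design), but the CREATE/ALTER lifecycle is the same kind of work.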
Tech Stack
- Python
- Airflow
- AWS Redshift
- Postgres
- Kubernetes
- AWS & GCP
Benefits
- Remote-friendly environment
- Competitive base salary commensurate with experience and geographic location. Range: $160,000 - $200,000
- Bonus eligible position
- Full benefits and unlimited PTO
- Generous stock options
- 401k match
- Opportunity to be part of a fast growing company in the financial technology industry
- A chance to work with incredible teammates who are super-bright, creative, talented, and passionate