Data Analytics Engineer

Remote

Applications have closed

FloatMe

Meet your new BFF that’s helped our members save over $50M in overdraft fees with instant cash advances up to $50 between paydays. No credit checks. No interest.


About the Role

Come help shape the future of FloatMe! As our Data Analytics Engineer, you will help define the experiences we build to guide the 150M Americans living paycheck to paycheck on their path to financial resilience. In this role, you’ll continue to improve the foundation for a trusted and performant data platform that enables the entire company to make rapid data-driven decisions and ship scalable data-driven features.

What You’ll Do

  • Maintain and improve our data pipelines and warehouse to ensure our internal stakeholders have timely access to accurate data. Our current stack uses Python, SQL, dbt, ThoughtSpot, Snowflake, and more.

  • Own our API integrations with multiple data providers, such as Twilio, Iterable, and the App Store.

  • Write, configure, deploy, and maintain the tools needed to deliver accurate and clean data, rapidly.

  • Partner with Product Analytics to understand our current use cases and in tandem create, maintain and debug data models and reports.

  • Build and maintain consistent data models that act as the source of truth for all metrics within the company.

  • Build and share a deep understanding of our chosen languages, frameworks, and other areas of focus. Push yourself and the team to keep learning, growing, and implementing best practices.

  • Demonstrate ownership for your work including understanding requirements, crafting solutions, and building robust testing plans to ensure the highest data quality.

Who You Are

  • Thoughtful. You guide teams using data and insights and consistently deploy the best solutions.

  • Driven by ownership. You are maniacal about improving and maintaining our data pipelines to ensure we are able to rapidly pull insights for all stakeholders. 

  • Product’s best friend. Your insights ensure teams both understand their users and bring them value.

Requirements 

  • 5+ years of data engineering experience

  • Hands-on experience with Kinesis Streams

  • Extensive experience building and optimizing dbt jobs

  • Experience managing multiple data sets via APIs

  • Experience with Terraform

  • Experience writing Python

  • ETL experience with AppFlow, DynamoDB, Postgres, and Snowflake

  • Exceptional analytical, organizational, interpersonal, and communication (both oral and written) skills

  • Self-motivated, driven, resourceful, and able to get things done

Nice to Have:

  • Previous BI reporting experience

Benefits

  • Health insurance

  • Dental & Vision

  • Long-term disability

  • 401(k) with standard & Roth options

  • Team outings (lunches, happy hours, games, and more)

  • Opportunities for growth and professional development

  • Unlimited PTO

  • Sick Leave & Personal Days

Tags: APIs Data Analytics Data pipelines Data quality dbt DynamoDB Engineering ETL Kinesis Pipelines PostgreSQL Python Snowflake SQL Terraform Testing

Perks/benefits: Career development Equity Health care Insurance Team events Unlimited paid time off

Region: Remote/Anywhere
