Lead Data Engineer (Snowflake/ETL)
Hyderabad, Remote
Tide
Our award-winning business account gives time back to people who work for themselves. Join our community of entrepreneurs just like you!
Who are Tide:
At Tide, we’re on a mission to save businesses time and money. We’re the leading provider of UK SME business accounts and one of the fastest-growing FinTechs in the UK. Using the latest tech, we design solutions with SMEs in mind and our member-driven financial platform is transforming the business banking market. Not only do we offer our members business accounts and related banking services, but also a comprehensive set of highly connected admin tools for businesses.
Tide is about doing what you love. We’re looking for someone to join us on our exciting scale up journey and be a part of something special. We are wanting passionate Tideans to drive innovation and help build a best-in-class platform to support our members. You will be comfortable in ambiguous situations and will be able to navigate the evolving FinTech environment. Imagine shaping how millions of Tide members discover and engage with business banking platforms and building this on a global scale.
What we’re looking for:
As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports, and dashboards. We are heavily dependent on Snowflake, Airflow, Fivetran, dbt, and Looker for our business intelligence, and we embrace AWS as a key partner across our engineering teams.
As an Analytics Engineer, you'll be:
- Developing end-to-end ETL/ELT pipelines, working with the Data Analysts of each business function
- Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
- Mentoring junior engineers in the team
- Being a “go-to” expert for data technologies and solutions
- Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges
- Troubleshooting and resolving technical issues as they arise
- Looking for ways of improving both what and how data pipelines are delivered by the department
- Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests, and reports, and owning the delivery of data models and reports end to end
- Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future
- Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Data Capture (CDC) and other “delta loading” approaches
- Discovering, transforming, testing, deploying and documenting data sources
- Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
- Building Looker dashboards for use cases where required
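The “delta loading” approach mentioned above can be sketched as a minimal high-watermark incremental load. This is a hedged, plain-Python illustration only: the table shape, `updated_at` column, and helper functions are assumptions for the sketch, not Tide's actual pipeline, which would typically run as a Snowflake `MERGE` or a dbt incremental model orchestrated by Airflow.

```python
from datetime import datetime

# Hypothetical source rows; in practice these would come from a
# Fivetran sync or a Snowflake staging table.
SOURCE = [
    {"id": 1, "amount": 100, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "amount": 250, "updated_at": datetime(2024, 1, 3)},
    {"id": 1, "amount": 120, "updated_at": datetime(2024, 1, 5)},  # late update
]

def extract_delta(rows, watermark):
    """Return only rows changed since the last successful load (the 'delta'),
    plus the new high watermark to persist for the next run."""
    delta = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in delta), default=watermark)
    return delta, new_watermark

def merge_into_target(target, delta):
    """Upsert delta rows into the target keyed by id, mimicking the
    effect of a MERGE statement or dbt incremental materialisation."""
    for row in delta:
        target[row["id"]] = row
    return target

target = {}
wm = datetime(2024, 1, 2)            # watermark saved by the previous run
delta, wm = extract_delta(SOURCE, wm)
target = merge_into_target(target, delta)
print(len(delta), target[1]["amount"])  # → 2 120
```

The key design point is that only rows newer than the stored watermark are read and upserted, so each run touches a small delta rather than the full table.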
What makes you a great fit:
- You have 7+ years of extensive development experience using Snowflake or a similar data warehouse technology
- You have working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker
- You have experience in agile processes, such as Scrum
- You have extensive experience in writing advanced SQL statements and performance tuning them
- You have experience in data ingestion techniques using custom or SaaS tools like Fivetran
- You have experience in data modelling and can optimise existing/new data models
- You have experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets
- Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
- You have experience working in agile cross-functional delivery teams
- You have high development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment
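The data-quality testing expected above (the unique and not-null checks familiar from dbt schema tests) can be sketched in plain Python. This is a hedged illustration only; the `accounts` table and its column names are made up for the example, and in practice these checks would be declared in a dbt `schema.yml` rather than hand-written.

```python
def check_not_null(rows, column):
    """Return rows where the column is missing or None (dbt-style not_null test)."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once (dbt-style unique test)."""
    seen, dupes = set(), set()
    for r in rows:
        value = r.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Hypothetical member-accounts model.
accounts = [
    {"account_id": "A1", "member_id": "M1"},
    {"account_id": "A2", "member_id": None},   # violates not_null
    {"account_id": "A1", "member_id": "M3"},   # violates unique
]

null_failures = check_not_null(accounts, "member_id")
dupe_failures = check_unique(accounts, "account_id")
print(len(null_failures), dupe_failures)  # → 1 ['A1']
```

As with dbt tests, each check returns the failing rows or values, so a non-empty result fails the pipeline run rather than letting bad data reach downstream reports.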
What you’ll get in return:
Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get:
- Competitive salary
- Self & Family Health Insurance
- Term & Life Insurance
- OPD Benefits
- Mental wellbeing through Plumm
- Learning & Development Budget
- WFH Setup allowance
- 15 days of Privilege leaves
- 12 days of Casual leaves
- 12 days of Sick leaves
- 3 paid days off for volunteering or L&D activities
Tidean Ways of Working
At Tide, we’re Member First and Data Driven, but above all, we’re One Team. Our Working Out of Office (WOO) policy allows you to work from anywhere in the world for up to 90 days a year. We are remote first, but when you do want to meet new people, collaborate with your team or simply hang out with your colleagues, our offices are always available and equipped to the highest standard. We offer flexible working hours and trust our employees to do their work well, at times that suit them and their team.
Tide is a place for everyone
At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity status or disability status. We believe it’s what makes us awesome at solving problems! We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard.
#LI-NN1 #LI-Remote