Data Engineer

Dublin, Ireland

Applications have closed

Intercom

Faster resolutions, higher CSAT, and lighter support volumes with the only platform to combine the power of automation and human customer support.


Communication has changed for people. It’s changed for businesses, too.

Intercom is the only suite of customer messaging products that drives growth at every stage of the customer lifecycle—across acquisition, engagement, and support. Today more than 30,000 businesses use Intercom to connect with a billion people worldwide.

Join the company helping businesses grow revenue through live chat, and so much more.


What's the opportunity? 🤔

The Data Engineering team builds distributed systems and tools that empower Intercom to make data-informed decisions. We’re looking for a senior data engineer with a passion for tackling large-scale data challenges to join our growing team.

At Intercom, data scientists, analysts, and data engineers collaborate closely. We’re building a highly configurable data platform that enables our partners to self-serve by building their own end-to-end data workflows, from transforming data and evaluating experiments to analysing usage and running predictive models. As a member of the Data Engineering team, you will evolve and extend this platform to support a variety of high-impact business and product-focused projects.

What will I be doing? 🚀

  • Collaborate with engineers from other teams, product managers, and go-to-market teams, as well as data analysts and data scientists, to develop tooling and infrastructure that supports their needs

  • Design, build, and evolve batch and real-time data pipelines that scale with the growing velocity and volume of our data

  • Develop tooling to support the creation and discovery of high quality analytics data in an environment where dozens of changes can be shipped daily

  • Implement systems that monitor our infrastructure and detect and surface data quality issues

  • Use machine learning to identify data quality issues in our core pipelines

  • Build our next-generation metrics framework, adding anomaly detection and alerting and enabling easier discovery and consumption

What skills do I need? 📖

  • You have 4+ years of experience building and operating distributed, highly-scalable software systems that involve significant data processing elements

  • You are highly autonomous, have a great sense of what should be worked on next and know how to break big ambiguous problems into small workable chunks

  • You have practical experience and proficiency with a high-level programming language (e.g. Python, Java, Ruby, Perl)

  • Hands-on experience with SQL

  • Experience with Apache Airflow

  • Familiarity with AWS technologies. We spend most of our time working with EC2, Redshift, Aurora, S3, and Kinesis; experience with analogous technologies (Azure, Google Cloud) is also welcome

Bonus skills & attributes 🙌

  • Experience with, and opinions on, tools and technologies on our radar, such as Snowflake, dbt, and Debezium

Benefits 😍

We are a well-treated bunch, with awesome benefits! If there’s something important to you that’s not on this list, talk to us! :)

  • Competitive salary and equity in a fast-growing start-up

  • We serve lunch every weekday, plus a variety of snacks and a fully stocked kitchen

  • Regular compensation reviews - we reward great work!

  • Peace of mind with life assurance, as well as comprehensive health and dental insurance for you and your dependents

  • Open vacation policy and flexible holidays so you can take time off when you need it

  • Paid maternity leave, as well as 6 weeks paternity leave for fathers, to let you spend valuable time with your loved ones

  • If you’re cycling, we’ve got you covered with the Cycle-to-Work Scheme, plus secure bike storage

  • MacBooks are our standard, but we’re happy to get you whatever equipment helps you get your job done


We are a diverse bunch of people and we want to continue to attract and retain a diverse range of people into our organisation. We're committed to an inclusive and diverse Intercom! We do not discriminate based on gender, ethnicity, sexual orientation, religion, civil or family status, age, disability, or race.
