Senior Data Engineer
Remote
Calendly
Calendly is the modern scheduling platform that makes “finding time” a breeze. When connecting is easy, your teams can get more done.
What is Calendly?
Calendly takes the work out of scheduling so our customers have more time to work on what’s really important. Our software is used by millions of people worldwide—with hundreds more signing up each day. To maintain this exciting growth, we’re looking for top talent to join our team and help shape the future of our product and data usage at Calendly.
Why join Calendly’s Data Engineering team?
Calendly is looking for a Senior Data Engineer to join our fast-growing team. This role will reside on the Data Engineering (DE) team with a focus on building our Data Platform. This platform, built on the Google Cloud Platform (GCP), ensures our company has access to timely, accurate, and relevant insights that proactively inform decisions and efficiently grow the business.
Our ideal candidate will develop and maintain data architectures that align with business requirements. They will work closely with our product, sales, and marketing teams to support and implement high-quality, data-driven decisions while ensuring data accuracy and consistency. The right person for the job will have experience developing data pipelines built for today and for the future, have a keen eye for reliability and performance, and be a team player. This individual should have a hunger and passion for helping the business, asking the right questions, and delivering pipelines that push the business forward.
What are some of the high-impact opportunities you’ll tackle?
- Create a high-volume data platform linking all aspects of Calendly’s business
- Ensure proper, secure, and scalable instrumentation of data models in partnership with Product Engineering
- Manage performance, integrity, and security of data transfer and storage
- Identify process improvements for data intake, validation, mining, and engineering as well as modeling, visualization, and communication deliverables
- Work with stakeholders to design and deliver the assets they need to meet their objectives and key results, spanning both Business Intelligence and customer-facing reporting
- Ensure timeliness, accuracy, and relevance of data and deliverables
This opportunity is for you if you have/are:
- Bachelor’s degree in computer science, information systems, engineering or related field
- 5–7 years of experience in a position designing and maintaining data pipelines, data models, and reporting systems
- Strong programming skills in imperative and declarative languages, especially Python and SQL
- Strong data modeling background and cloud data warehouse experience
- Experience with big data tools such as Google BigQuery
- Experience with batch and streaming pipelines (e.g., Apache Spark, Apache Beam)
- Experience with orchestration and messaging frameworks such as Airflow and Google Cloud Pub/Sub
- Authorized to work lawfully in the United States of America as Calendly does not engage in immigration sponsorship at this time
If you are an individual with a disability and would like to request a reasonable accommodation as part of the application or recruiting process, please contact us at recruiting@calendly.com.
Calendly is registered as an employer in many, but not all, states. If you are not located in or able to work from a state where Calendly is registered, you will not be eligible for employment.
Perks/benefits: Career development Startup environment