Senior Software Developer - Data Operations
Oakville, ON, Canada
Geotab Inc.
Our GPS fleet tracking & management system equips thousands of fleets worldwide with technology to automate, track and manage a truly optimized operation. Geotab is a global leader in IoT and connected transportation and a certified “Great Place to Work.” We are a company of diverse and talented individuals who work together to help businesses grow and succeed, and increase the safety and sustainability of our communities.
Geotab is advancing security, connecting commercial vehicles to the internet and providing web-based analytics to help customers better manage their fleets. Geotab’s open platform and Marketplace, offering hundreds of third-party solution options, allow both small and large businesses to automate operations by integrating vehicle data with their other data assets. Processing billions of data points a day, Geotab leverages data analytics and machine learning to improve productivity, optimize fleets through reduced fuel consumption, enhance driver safety and maintain strong compliance with regulatory changes.
Our team is growing and we’re looking for people who follow their passion, think differently and want to make an impact. Join us for a fulfilling career with opportunities to innovate, great benefits, and our fun and inclusive work culture. Reach your full potential with Geotab. To see what it’s like to be a Geotabber, check out our blog and follow us @InsideGeotab on Instagram, Twitter or Facebook.
Who you are:
You have deep knowledge of developing and documenting processes for onboarding new data into Geotab’s production data environment. You are highly organized and able to manage multiple tasks and projects simultaneously. You are also a strong mentor and coach to team members, with the ability to engage with all levels of the organization. If you love technology and are keen to join a data-driven company, we would love to hear from you!
What you'll do:
The Senior Software Developer, DataOps is the foundation of Geotab’s Data Factory, responsible for developing scalable and reusable pipelines that minimize the time from insight to production. The role involves continuous collaboration with data analysts and data scientists to design innovative pipelines using new tools and frameworks, as well as close work with the Data Architect and Data Quality experts to ensure all data assets are delivered with the highest quality, with the right schema and into the right location.
How you’ll make an impact:
- Deploy and maintain ETL/ELT pipelines using SQL, Python and Airflow.
- Design and publish reusable pipeline templates through an in-house Python package (e.g. templates for data integration, derived metrics, reporting, custom runs).
- Collaborate with data analysts and data scientists to develop complex pipeline templates involving big data tools (e.g. Spark, Apache Beam, Kubernetes pods).
- Lead optimization of pipelines based on requirements and pipeline performance.
- Develop connectors to extract/receive data from external sources.
- Manage pipeline releases through Git and CI/CD.
- Contribute to the development of an alerting framework for production pipelines.
- Ensure metadata is captured and stored across the pipeline lifecycle (i.e. creation, execution, deprecation/update).
- Support remediation of issues within production pipelines.
- Collaborate with data quality analysts and specialists to ensure all pipelines include automatic quality checks.
- Collaborate with the data architect to ensure data schemas comply with DataOps guidelines.
- Support the data architect in evaluating model-based pipelines through Dataform.
- Recommend features and enhancements to infrastructure and pipeline framework.
- Contribute to the migration of data assets and pipelines from legacy data structures.
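To make the "reusable pipeline template" responsibilities above concrete, here is a minimal sketch of an extract/transform/load template in plain Python. Geotab's in-house package is not public, so every name here (`PipelineTemplate`, its fields, the sample data) is hypothetical and purely illustrative of the pattern, not of Geotab's actual code.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Iterable

# Hypothetical sketch of a reusable ETL template; not Geotab's actual package.
@dataclass
class PipelineTemplate:
    """A reusable extract -> transform -> load pipeline."""
    name: str
    extract: Callable[[], Iterable[Any]]          # pulls raw rows from a source
    transforms: list = field(default_factory=list)  # applied to each row in order
    load: Callable[[list], None] = print            # delivers rows to a sink

    def run(self) -> list:
        rows = list(self.extract())
        for transform in self.transforms:
            rows = [transform(r) for r in rows]
        self.load(rows)
        return rows

# Example use: a "derived metrics" pipeline adding miles from raw odometer km.
sink = []
pipeline = PipelineTemplate(
    name="derived_metrics_demo",
    extract=lambda: [{"vehicle": "v1", "km": 100}, {"vehicle": "v2", "km": 250}],
    transforms=[lambda r: {**r, "miles": round(r["km"] * 0.621371, 1)}],
    load=sink.extend,
)
pipeline.run()
```

In a production setting each stage would typically become an Airflow task, with the template publishing the DAG so analysts can reuse it by supplying only their own extract, transform and load callables.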
What you’ll bring to this role:
- Post-secondary Degree/Diploma specialization in Computer Science, Engineering, Mathematics, or a related field.
- 5+ years of Python development experience.
- 3-5 years of experience building ETL/ELT production pipelines.
- 1-2 years of experience with SQL and databases.
- Previous experience interacting with REST APIs in Python is an asset.
- Hands-on experience using Apache Airflow in a work setting is an asset.
- Experience working in a cloud-based infrastructure, especially Google Cloud Platform, is an asset.
- Experience in defining coding standards and providing mentorship to the team is required.
- Previous experience with Python package development is highly regarded.
- Must stay current with technology and have the flexibility to adapt to evolving technology and market demands.
- Strong team-player with the ability to engage with all levels of the organization.
- Work from home and flex work arrangements
- Baby bonus
- Home office reimbursement program
- Online learning and networking opportunities
- Electric vehicle purchase incentive program
- Competitive medical and dental benefits (full-time employees only)
- Retirement savings program (full-time employees only)
How we work:
At Geotab, we understand that the world is always changing and that we need to change with it. Geotab has adopted a hybrid model for working, including a flexible work from home program, with the opportunity to work in our safe, clean offices. When working from home, you are required to have a reliable internet connection with at least 50 Mbps download / 10 Mbps upload. Remote work is supported with cloud-based applications, collaboration tools and asynchronous working. The health and safety of employees are a top priority. We encourage work-life balance and keep the Geotab culture going strong with online social events, chat rooms and gatherings. Join us and help reshape the future of technology!
Geotab encourages applications from all qualified candidates. Geotab is committed to accommodating persons with disabilities throughout the interview, selection and employment process. If you require accommodation at any stage of the application process or want more information about our accommodation policies and practices, please contact us at careers@geotab.com.