Senior Software Developer - Data Operations

Oakville, ON, Canada

Full Time Senior-level / Expert
Geotab Inc.

Posted 3 days ago

Who we are:
Geotab is a global leader in IoT and connected transportation and certified “Great Place to Work.” We are a company of diverse and talented individuals who work together to help businesses grow and succeed, and increase the safety and sustainability of our communities.
Geotab is advancing security, connecting commercial vehicles to the internet and providing web-based analytics to help customers better manage their fleets. Geotab’s open platform and Marketplace, offering hundreds of third-party solution options, allow both small and large businesses to automate operations by integrating vehicle data with their other data assets. Processing billions of data points a day, Geotab leverages data analytics and machine learning to improve productivity, optimize fleets through the reduction of fuel consumption, enhance driver safety and achieve strong compliance with regulatory changes.
Our team is growing and we’re looking for people who follow their passion, think differently and want to make an impact. Join us for a fulfilling career with opportunities to innovate, great benefits, and our fun and inclusive work culture. Reach your full potential with Geotab. To see what it’s like to be a Geotabber, check out our blog and follow us @InsideGeotab on Instagram, Twitter or Facebook.
Who you are:
You have deep knowledge of developing and documenting processes for onboarding new data into Geotab’s production data environment. You are highly organized and able to manage multiple tasks and projects simultaneously. You are also a strong mentor and coach to team members, with the ability to engage with all levels of the organization. If you love technology and are keen to join a data-driven company — we would love to hear from you!
What you'll do:
The Senior Software Developer, DataOps is the foundation of Geotab’s Data Factory. They are responsible for developing scalable and reusable pipelines that minimize the time from insight to production. They collaborate continuously with data analysts and data scientists to design innovative pipelines using new tools and frameworks, and work closely with the Data Architect and Data Quality experts to ensure all data assets are delivered at the highest quality, with the right schema and in the right location.

How you’ll make an impact:

  • Deploy and maintain ETL/ELT pipelines using SQL, Python and Airflow.
  • Design and publish reusable pipeline templates through an in-house Python package (e.g. templates for data integration, derived metrics, reporting, custom runs).
  • Collaborate with data analysts and data scientists to develop complex pipeline templates involving big data tools (e.g. Spark, Apache Beam, Kubernetes pods).
  • Lead optimization of pipelines based on requirements and pipeline performance.
  • Develop connectors to extract/receive data from external sources.
  • Manage pipeline releases through Git and CI/CD.
  • Contribute to the development of an alerting framework for production pipelines.
  • Ensure metadata is captured and stored across the pipeline lifecycle (i.e. creation, execution, deprecation/update).
  • Support remediation of issues within production pipelines.
  • Collaborate with data quality analysts and specialists to ensure all pipelines include automatic quality checks.
  • Collaborate with the Data Architect to ensure data schemas comply with DataOps guidelines.
  • Support the Data Architect in the evaluation of model-based pipelines through Dataform.
  • Recommend features and enhancements to infrastructure and pipeline framework.
  • Contribute to the migration of data assets and pipelines from legacy data structures.
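To give a flavour of the reusable pipeline templates mentioned above, here is a minimal, hypothetical sketch of the pattern: extract, transform, and load steps are supplied as callables so that analysts can assemble new pipelines without rewriting orchestration code. The class name, fields, and fuel-efficiency example are illustrative assumptions, not Geotab’s actual in-house package.

```python
from dataclasses import dataclass
from typing import Any, Callable, Iterable, List

# Hypothetical reusable ETL pipeline template: each step is injected as a
# callable, so a new pipeline is just a new combination of steps.
@dataclass
class PipelineTemplate:
    extract: Callable[[], Iterable[Any]]
    transform: Callable[[Any], Any]
    load: Callable[[List[Any]], None]

    def run(self) -> List[Any]:
        # Apply the transform to every extracted record, then load the batch.
        rows = [self.transform(record) for record in self.extract()]
        self.load(rows)
        return rows

# Illustrative use: derive a fuel-efficiency metric from raw trip records.
raw_trips = [
    {"trip_id": 1, "km": 120.0, "litres": 9.6},
    {"trip_id": 2, "km": 80.0, "litres": 6.0},
]

sink: list = []  # stand-in for a warehouse table
pipeline = PipelineTemplate(
    extract=lambda: raw_trips,
    transform=lambda t: {**t, "l_per_100km": round(t["litres"] / t["km"] * 100, 2)},
    load=sink.extend,
)
pipeline.run()
```

In practice such a template would be published in a shared package and scheduled by an orchestrator such as Airflow, with the extract/load callables pointing at real sources and sinks.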

What you’ll bring to this role:

  • Post-secondary Degree/Diploma specialization in Computer Science, Engineering, Mathematics, or a related field.
  • 5+ years of Python development experience.
  • 3-5 years of experience building ETL/ELT production pipelines.
  • 1-2 years of experience with SQL and databases.
  • Previous experience interacting with REST APIs in Python is an asset.
  • Hands-on experience using Apache Airflow in a work setting is an asset.
  • Experience working in a cloud based infrastructure, especially Google Cloud Platform, is an asset.
  • Experience in defining coding standards and providing mentorship to the team is required.
  • Previous experience with Python package development is highly regarded.
  • Stays current with technology and has the flexibility to adapt to growing technology and market demands.
  • Strong team-player with the ability to engage with all levels of the organization.
Why job seekers choose Geotab:
  • Work from home and flex work arrangements
  • Baby bonus
  • Home office reimbursement program
  • Online learning and networking opportunities
  • Electric vehicle purchase incentive program
  • Competitive medical and dental benefits (full-time employees only)
  • Retirement savings program (full-time employees only)

How we work:
At Geotab, we understand that the world is always changing and that we need to change with it. Geotab has adopted a hybrid model for working, including a flexible work from home program, with the opportunity to work in our safe, clean offices. When working from home, you are required to have a reliable internet connection with at least 50 Mbps download / 10 Mbps upload. Remote work is supported with cloud-based applications, collaboration tools and asynchronous working. The health and safety of employees are a top priority. We encourage work-life balance and keep the Geotab culture going strong with online social events, chat rooms and gatherings. Join us and help reshape the future of technology!
Geotab encourages applications from all qualified candidates. Geotab is committed to accommodating persons with disabilities throughout the interview, selection and employment process. If you require accommodation at any stage of the application process or want more information about our accommodation policies and practices, please contact us.
Job tags: Airflow Big Data Data Analytics Engineering ETL Machine Learning Python Security Spark SQL
Job region(s): North America