Software Developer - Big Data Platform

Toronto, ON, Canada

Geotab Inc.

Who we are:
Geotab is a global leader in IoT and connected transportation and a certified “Great Place to Work.” We are a company of diverse and talented individuals who work together to help businesses grow and succeed, and increase the safety and sustainability of our communities.
Geotab is advancing security, connecting commercial vehicles to the internet and providing web-based analytics to help customers better manage their fleets. Geotab’s open platform and Marketplace, offering hundreds of third-party solution options, allows both small and large businesses to automate operations by integrating vehicle data with their other data assets. Processing billions of data points a day, Geotab leverages data analytics and machine learning to improve productivity, optimize fleets through the reduction of fuel consumption, enhance driver safety and achieve strong compliance with regulatory changes.
Our team is growing and we’re looking for people who follow their passion, think differently and want to make an impact. Join us for a fulfilling career with opportunities to innovate, great benefits, and our fun and inclusive work culture. Reach your full potential with Geotab. To see what it’s like to be a Geotabber, check out our blog and follow us @InsideGeotab on Instagram, Twitter or Facebook.
Who you are: 
We believe that there is collective value in the massive amount of data arising from all of our sensors across the globe. And most importantly, we believe this data can be used to improve safety, infrastructure, and productivity for our customers and communities. Our fast-growing, industry-best Data and Analytics team is looking to hire a Software Developer - Big Data Platform who is passionate about building big data infrastructure for large-scale, fast-growing datasets and about helping our team solve real-world problems.
What you'll do:
The successful candidate will work within the Data Engineering team of Geotab's Data and Analytics department. This team is responsible for delivering the data ingestion pipeline, consumption API and stream analytics infrastructure that handle over 5 billion raw and 50 billion processed records generated daily by over 2 million vehicles across 7 continents. You will work collaboratively to recommend, prototype, build and debug data infrastructure on Google Cloud Platform (GCP). To be successful in this role, you will be an individual who is passionate about software design and development, with a willingness to develop your analytical skills related to the implementation of Big Data solutions. In addition, the successful candidate will have keen attention to detail and a strong understanding of the importance of system reliability and performance, and of how to tune both. If you love working with big data, are excited about working with cutting-edge, open-source data technologies and software paradigms, and are keen to join an industry leader, we would love to hear from you!

How you’ll make an impact:

  • Work with the team lead and other data engineering team members to design and implement the enterprise data platform, tools and applications on Google Cloud Platform, including but not limited to the data ingestion pipeline, data API, stream processing and metadata repository.
  • Contribute to the design, development and delivery of the team’s Data Enablement Program and Data Intelligence Program roadmap.
  • Implement and contribute to the design of the enterprise data ingestion pipeline, data API and data catalogue within the organization's Big Data platform.
  • Interact with the project owner, data engineering team lead and related internal customers for feature development and troubleshooting.
  • Participate in and contribute to technology stack research and assessment as required.
  • Contribute to detailed component designs and development plans.
  • Implement approved designs following industry best practices and to a high quality standard.
  • Oversee the collection and analysis of key metrics and reporting dashboards to monitor enterprise data platform performance and reliability.
  • Support the development and maintenance of data engineering guidelines, policies, standards and process narratives for in-scope business functions.

What you’ll bring to this role:

  • Bachelor's degree in Computer Science, Software Engineering, Computer Engineering, or a related field
  • Practical experience in core Java/C#/Python with solid understanding of the Software Development Life Cycle
  • Solid understanding of and experience with relational databases (MySQL, Postgres, SQL Server, etc.)
  • Excellent knowledge of advanced SQL for working with large data sets
  • Familiarity with common design patterns; strong component design and modeling ability
  • Familiarity with ETL technologies and tools for large-scale data
  • Experience developing and using virtualization, container-based and cloud platforms such as Docker, Kubernetes, Docker Swarm, etc.
  • Experience with Apache Kafka, Apache Airflow, Google Cloud Platform and Google BigQuery
  • Familiarity with mainstream big data related technologies such as distributed frameworks
How we work:
At Geotab, we understand that the world is always changing and that we need to change with it. Geotab has adopted a hybrid model for working, including a flexible work from home program, with the opportunity to work in our safe, clean offices. When working from home, you are required to have a reliable internet connection with at least 50 Mbps download / 10 Mbps upload speeds. Remote work is supported with cloud-based applications, collaboration tools and asynchronous working. The health and safety of employees are a top priority. We encourage work-life balance and keep the Geotab culture going strong with online social events, chat rooms and gatherings. Join us and help reshape the future of technology!
Geotab encourages applications from all qualified candidates. Geotab is committed to accommodating persons with disabilities throughout the interview, selection and employment process. If you require accommodation at any stage of the application process or want more information about our accommodation policies and practices, please contact us.
Job tags: Airflow Big Data BigQuery Data Analytics Engineering ETL Java Kafka Kubernetes Machine Learning MySQL Open Source Postgres Python Research Security SQL
Job region(s): North America