Software Developer - Big Data Platform

Toronto, ON, Canada

Geotab Inc.


Who is Geotab:
Geotab, a global leader in IoT and connected transportation, is one of the fastest-growing technology companies in North America and a certified “Great Place to Work.” Each day, Geotab processes billions of data points from over 2 million connected vehicles, extracting actionable insights that empower businesses to better manage their fleets. Recognized as the world’s #1 commercial telematics provider, Geotab’s solutions are used by more than 40,000 customers in over 130 countries around the globe. Geotab understands that telematics is critical to creating safer drivers and more sustainable businesses and communities, and actively works to help businesses improve driver behavior, reduce greenhouse gas emissions and create safer roads for all. Global demand for telematics is increasing, and Geotab is leading the way. Are you ready to join us? To see what it’s like to be a Geotabber, follow us @InsideGeotab on Instagram, Twitter or Facebook today!
Who we are looking for: 
We believe that there is collective value in the massive amount of data arising from all of our sensors across the globe. Most importantly, we believe this data can be used to improve safety, infrastructure, and productivity for our customers and communities. Our fast-growing, industry-best Data and Analytics team is looking to hire a Software Developer - Big Data Platform who is passionate about building big data infrastructure for large-scale, fast-growing datasets and about helping our team solve real-world problems.
What are the details of this position:
The successful candidate will work within the Data Engineering team of Geotab's Data and Analytics department. This team is responsible for delivering the data ingestion pipeline, consumption API and stream analytics infrastructure that handle over 5 billion raw and 50 billion processed records generated daily by over 2 million vehicles across 7 continents. You will work collaboratively to recommend, prototype, build and debug data infrastructure on Google Cloud Platform (GCP). To be successful in this role, you will be passionate about software design and development, with a willingness to develop your analytical skills related to the implementation of Big Data solutions. In addition, the successful candidate will have keen attention to detail and a strong understanding of the importance of system reliability and performance, and of how to tune them. If you love working with big data, are excited about working with cutting-edge, open source data technologies and software paradigms and are keen to join an industry leader, we would love to hear from you!

Duties and tasks/essential functions:

  • Work with the team lead and other data engineering team members to design and implement the enterprise data platform, tools and applications on Google Cloud Platform, including but not limited to the data ingestion pipeline, data API, stream processing and the metadata repository
  • Contribute to the design, development and delivery of the team’s Data Enablement Program and Data Intelligence Program roadmap
  • Implement and contribute to the design of the enterprise data ingestion pipeline, data API and data catalogue within the organization's Big Data platform
  • Interact with project owners, the data engineering team lead and related internal customers on feature development and troubleshooting
  • Participate and contribute to technology stack research and assessment as required
  • Contribute to detailed component designs and development plans
  • Implement approved designs following industry best practices and to a high standard of quality
  • Oversee the collection and analysis of key metrics and reporting dashboards to monitor enterprise data platform performance and reliability
  • Support the development and maintenance of data engineering guidelines, policies, standards and process narratives for in-scope business functions

Experience/Skills Required:

  • Bachelor's degree in Computer Science, Software Engineering, Computer Engineering, or other related degree
  • Practical experience in core Java, C#, or Python, with a solid understanding of the Software Development Life Cycle
  • Solid understanding of and experience with relational databases (MySQL, Postgres, SQL Server, etc.)
  • Excellent knowledge of advanced SQL for working with large data sets
  • Familiarity with common design patterns; strong component design and modeling skills
  • Familiarity with ETL technologies and tools for large-scale data
  • Experience developing with and using virtualization, container-based and cloud platforms such as Kubernetes, Docker Swarm and Docker
  • Experience with Apache Kafka, Apache Airflow, Google Cloud Platform and Google BigQuery
  • Familiarity with mainstream big data technologies, such as distributed processing frameworks
Our Core Values:
Geotab is shaping the future of telematics. Using leading technology, we embrace change and challenge the status quo. To stay ahead of the curve, we keep Geotabbers energized with data bootcamps, course subsidies, Friday lunches, and more. We believe collaboration leads to innovation; our teams stretch across floors, cities and continents. All employees, from the CEO to summer students, maintain an open-door policy. Whether we’re fine-tuning our products or boosting our office culture, we’re building a foundation for long-term success. To us, this means safer roads, more efficient fleets and a team of dynamic Geotabbers!
Geotab encourages applications from all qualified candidates. Geotab is committed to accommodating persons with disabilities throughout the interview, selection and employment process. If you require accommodation at any stage of the application process or want more information about our accommodation policies and practices, please contact us at (416) 434-4309.
Job tags: Airflow Big Data BigQuery Engineering ETL Java Kafka Kubernetes MySQL Open Source Postgres Python Research SQL