Software Developer - Big Data Platform/Development Operations (Recent Graduate)
Oakville, ON, Canada
Geotab Inc.
Our GPS fleet tracking and management system equips thousands of fleets worldwide with technology to automate, track, and manage a truly optimized operation. Geotab is a global leader in IoT and connected transportation and a certified “Great Place to Work.” We are a company of diverse and talented individuals who work together to help businesses grow and succeed, and to increase the safety and sustainability of our communities.
Geotab is advancing security, connecting commercial vehicles to the internet and providing web-based analytics to help customers better manage their fleets. Geotab’s open platform and Marketplace, offering hundreds of third-party solution options, allow both small and large businesses to automate operations by integrating vehicle data with their other data assets. Processing billions of data points a day, Geotab leverages data analytics and machine learning to improve productivity, optimize fleets through reduced fuel consumption, enhance driver safety, and maintain strong compliance with regulatory changes.
Our team is growing and we’re looking for people who follow their passion, think differently and want to make an impact. Join us for a fulfilling career with opportunities to innovate, great benefits, and our fun and inclusive work culture. Reach your full potential with Geotab. To see what it’s like to be a Geotabber, check out our blog and follow us @InsideGeotab on Instagram, Twitter or Facebook.
Who you are and what you’ll do:
Our Data Engineering and DevOps teams are growing, and we are looking for smart developers like you to join us! As a DevOps Engineer, your key area of responsibility will be managing and monitoring the performance of our server infrastructure on the public cloud, specifically Google Compute Engine, as part of the DevOps team. You are an IT expert in automated infrastructure management and will spend some of your time scripting in PowerShell or Python. Our Data Engineering team is responsible for delivering the data ingestion pipeline, consumption APIs, and stream analytics infrastructure that handle over 5 billion raw and 50 billion processed records daily from over 2 million vehicles across 7 continents. You will work collaboratively to recommend, prototype, build, and debug data infrastructure on Google Cloud Platform (GCP).
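To give a flavour of the infrastructure scripting mentioned above, here is a minimal, hypothetical Python sketch (not Geotab code; the endpoint list and function name are illustrative) of the kind of automated health check a DevOps engineer might write to monitor server availability:

```python
import urllib.request
import urllib.error

def check_endpoints(urls, timeout=5):
    """Poll each URL and record whether it answered with HTTP 200.

    Returns a dict mapping each URL to True (healthy) or False
    (unreachable or non-200 response).
    """
    results = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[url] = (resp.status == 200)
        except (urllib.error.URLError, OSError):
            # Connection refused, DNS failure, timeout, etc.
            results[url] = False
    return results
```

In practice a script like this would feed a monitoring or alerting system rather than print results; on GCP, similar checks are typically delegated to managed tooling such as Cloud Monitoring uptime checks.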
How you’ll make an impact:
- Translate application storyboards and use cases into functional applications
- Design, build, and maintain efficient, reusable, reliable code to develop high throughput, performance critical, scalable applications
- Ensure the best possible performance, quality, and responsiveness of applications by following Geotab coding best-practices, peer collaboration and continuous learning
- Perform Test-Driven and Security-Driven Development
- Conduct comprehensive code reviews and ensure that coding standards are followed
- Analyze call stacks, trace files, debug logs or performance data to troubleshoot and identify the root cause of bugs
- Participate in development of prototypes, minimum viable products, and PoCs
- Help maintain code quality, organization, and automation
What you’ll bring to the table:
- Bachelor's degree in Computer Science, Software Engineering, Computer Engineering, or other related degree
- Practical experience in core Java/C#/Python with solid understanding of the Software Development Life Cycle
- Solid understanding of and experience with relational databases (MySQL, PostgreSQL, SQL Server, etc.)
- Excellent knowledge of advanced SQL for working with large data sets
- Familiarity with various design patterns, good component and modeling ability
- Familiarity with ETL technologies and tools for large-scale data
- Experience developing with and using virtualization, container-based, and cloud platforms such as Kubernetes, Swarm, and Docker
- Experience with Apache Kafka, Apache Airflow, Google Cloud Platform and Google BigQuery
- Familiarity with mainstream big data technologies such as distributed computing frameworks
- Work from home and flex work arrangements
- Baby bonus
- Home office reimbursement program
- Online learning and networking opportunities
- Electric vehicle purchase incentive program
- Competitive medical and dental benefits (full-time employees only)
- Retirement savings program (full-time employees only)
How we work:
At Geotab, we understand that the world is always changing and that we need to change with it. Geotab has adopted a hybrid working model, including a flexible work-from-home program, with the opportunity to work in our safe, clean offices. When working from home, you are required to have a reliable internet connection with at least 50 Mbps download / 10 Mbps upload speeds. Remote work is supported with cloud-based applications, collaboration tools and asynchronous working. The health and safety of employees are a top priority. We encourage work-life balance and keep the Geotab culture going strong with online social events, chat rooms and gatherings. Join us and help reshape the future of technology!
Geotab encourages applications from all qualified candidates. Geotab is committed to accommodating persons with disabilities throughout the interview, selection and employment process. If you require accommodation at any stage of the application process or want more information about our accommodation policies and practices, please contact us at careers@geotab.com.