Data Engineer
Melbourne, Australia
Swift Navigation
Swift Navigation provides precise positioning solutions for automotive, autonomous vehicle, mobile and mass-market applications. What began as the GNSS industry’s first low-cost, high-accuracy, real-time kinematic (RTK) receiver has evolved into an ecosystem of positioning solutions for autonomous applications. From the nationwide GNSS corrections delivered from the cloud by the Skylark™ precise positioning service and the hardware-independent, integrated Starling® positioning engine to the centimeter-level-accurate Piksi® Multi and the ruggedized Duro® and Duro Inertial RTK receivers, Swift Navigation is enabling a future in which autonomous vehicles can navigate and understand the world. Learn more online at swiftnav.com, or follow Swift on Twitter @Swiftnav.
Meet The Team
The data platform product team at Swift Navigation is building a toolkit for location-enabled applications across the last-mile delivery, machine automation, and fleet management industries. Our product team consists of software engineers and product managers, and works closely with the CTO. We are designing systems that ingest real-time location data from tens of thousands of customer devices and transform that data into business intelligence, providing solutions to a wide range of customers.
What You’ll Do:
- Select products and solutions for the data infrastructure that meet the needs of the data product platform
- Set up cloud infrastructure for the selected data solutions on cloud providers such as AWS and GCP
- Design and implement scalable, reliable systems for ingestion, processing, and real-time analysis of large, disparate data sets from diverse sources, deployed on AWS using tools such as MSK, Kinesis, AWS EMR, AWS Athena, AWS Glue, Lambda, and API Gateway, or champion the adoption of GCP equivalents
- Develop tools and applications to proactively measure, monitor, and improve data quality and consistency during loading and analysis processes
- Analyze and improve efficiency, reliability, and scalability of data infrastructure and processes
- Work with the data team to define and promote best practices for data management and analysis, and to build and improve the systems that implement and support these practices
- Build appropriate logging of metrics and diagnostics into code to enable effective monitoring and data observability using tools such as Splunk, Datadog, Grafana, or Prometheus
- Design, architect, build and scale a modern and cloud-native data platform with requisite tooling and APIs necessary to help democratize access to data
- Manage the orchestration of batch jobs with solutions like Apache Airflow
- Support building a modular set of data services for data transformation and business-logic processing, using containerized microservices such as Apache Spark on AWS EMR and ETL technologies such as AWS Glue, dbt, and SQL
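The data-quality bullet above can be sketched concretely. Below is a minimal, stdlib-only Python sketch of the kind of validation pass you might run on location records during loading; the field names (`device_id`, `lat`, `lon`) and range checks are illustrative assumptions, not Swift's actual schema:

```python
from dataclasses import dataclass, field


@dataclass
class QualityReport:
    """Summary of a batch-level data-quality check."""
    total: int = 0
    valid: int = 0
    errors: list = field(default_factory=list)  # (record index, issues)


# Hypothetical required fields for a location record.
REQUIRED_FIELDS = ("device_id", "timestamp", "lat", "lon")


def check_record(record: dict) -> list:
    """Return a list of quality issues found in a single record."""
    issues = []
    for name in REQUIRED_FIELDS:
        if record.get(name) is None:
            issues.append(f"missing field: {name}")
    lat, lon = record.get("lat"), record.get("lon")
    if isinstance(lat, (int, float)) and not -90 <= lat <= 90:
        issues.append(f"latitude out of range: {lat}")
    if isinstance(lon, (int, float)) and not -180 <= lon <= 180:
        issues.append(f"longitude out of range: {lon}")
    return issues


def check_batch(records: list) -> QualityReport:
    """Validate a batch of records and summarize pass/fail counts."""
    report = QualityReport(total=len(records))
    for i, rec in enumerate(records):
        issues = check_record(rec)
        if issues:
            report.errors.append((i, issues))
        else:
            report.valid += 1
    return report
```

In a real pipeline, checks like these would typically run inside the loading step (e.g. a Glue job or Spark stage), with the resulting counts emitted as metrics to a monitoring tool.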
What You'll Need To Succeed:
- Experience in building highly scalable data architectures from scratch on AWS or GCP
- Experience with cloud-native development on the AWS ecosystem, and with event-driven architectures and pub/sub systems such as Kafka, Kinesis, Amazon EventBridge, and Flink, as well as WebSockets and SSE, is a must
- Experience retrieving, parsing, cleaning, and transforming data from multiple formats and delivery mechanisms
- Experience with a modern data engineering platform on AWS using MSK, Kinesis, AWS EMR, AWS Athena, AWS Glue, Lambda, API Gateway, etc.
- Experience with batch data processing using tools like Apache Spark
- Experience with structured Spark streaming is highly desired
- Familiarity and experience with data pipeline frameworks, such as Luigi
- Experience with gRPC and Protocol buffers is highly desired
- Familiarity with machine learning and data science ecosystems such as AWS Sagemaker and Databricks
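The requirement on retrieving, parsing, cleaning, and transforming data from multiple formats can be illustrated with a small, stdlib-only Python sketch that normalizes JSON and CSV payloads into one uniform record shape; the field names are hypothetical, chosen only for illustration:

```python
import csv
import io
import json


def parse_records(payload: str, fmt: str) -> list:
    """Parse a raw JSON or CSV payload into a list of uniform dicts.

    The schema (device_id, lat, lon) is an assumed example; a real
    pipeline would derive it from the upstream producers.
    """
    if fmt == "json":
        data = json.loads(payload)
        rows = data if isinstance(data, list) else [data]
    elif fmt == "csv":
        rows = list(csv.DictReader(io.StringIO(payload)))
    else:
        raise ValueError(f"unsupported format: {fmt}")
    # Normalize types regardless of source format:
    # coordinates become floats, device IDs become strings.
    return [
        {
            "device_id": str(r["device_id"]),
            "lat": float(r["lat"]),
            "lon": float(r["lon"]),
        }
        for r in rows
    ]
```

For example, `parse_records('[{"device_id": 7, "lat": "1.5", "lon": "2.5"}]', "json")` and a CSV payload with the same columns both yield records of the same shape, which keeps downstream transformation logic format-agnostic.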