Data Engineer

Toronto, ON, CA

TELUS

By choosing TELUS you get access to amazing plans, phones and high-speed internet on Canada's most-awarded network, and help connect Canadians in need.


Description


Join our team and what we'll accomplish together

 

As a Data Engineer, you will be responsible for designing, building and supporting the data pipelines that enable innovative, customer-centric digital experiences. You will work as part of a friendly, cross-discipline agile team whose members help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility and design to achieve the highest quality of service for our customers. Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js.
 


You will be part of the team building data pipelines that support our marketing, finance, campaign and Executive Leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you'll be building pipelines that deliver the insights powering our business partners' analytics and campaigns. You are a fast-learning, highly technical and passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.
 

 

What you'll do
 

  • Learn new skills & advance your data development practice
  • Analyze and profile data
  • Design, develop, test, deploy, maintain and improve batch and real-time data pipelines
  • Assist with design and development of solution prototypes
  • Support consumers with understanding the data outcomes and technical design
  • Collaborate closely with multiple teams in an agile environment

 

Qualifications


What you bring

 

  • You are a developer with 3+ years of hands-on technical experience in IT platform implementation
  • Bachelor's degree in Computer Science, Engineering or equivalent
  • Understanding of data warehouses/cloud architectures and ETL processes
  • Working knowledge of SQL, experience with relational databases and query authoring, and familiarity with a variety of databases
  • Experience with Google Cloud Platform (GCP) and its related technologies (Kubernetes, Cloud SQL, Pub/Sub, Cloud Storage, Cloud Logging, dashboards, Airflow, BigQuery, Bigtable, Python, BigQuery SQL, Dataplex, Datastream, etc.)
  • Experience with Python and software engineering best practices
  • Experience with API development using Node.js and API testing using Postman/SoapUI
  • Experience working with message queues such as JMS, Kafka and Pub/Sub
  • A passion for data quality

 

Great-to-haves

 

  • Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi and other related ETL tools
  • Familiarity with Informatica MDM is preferred, but strong skills in other MDM tools are still an asset
  • Proficiency in Python and Java
  • Understanding of TMF standards

