Python Developer/Data Engineer
Canada
Paramount Commerce
Paramount Commerce is a global payments technology company that provides online gaming and remittance merchants a safe, simple way to accept bank account-based payments with state-of-the-art risk management. As a leading solution in both iGaming and sports betting, we are trusted by industry-leading brands such as DraftKings, theScore, PartyPoker, and Bet365.
The online gaming industry is at the forefront of innovation in alternative payments, and our team continues to do what’s never been done before. We are looking for world-class people who want to help us solve for success together and create raving fans of merchants and consumers. Join our global team of smart, dedicated people on a mission to simplify payments!
We are seeking an intermediate-level Python Developer/Data Engineer to join our Data Engineering Team. You will work alongside our team to design and implement comprehensive Extract, Transform, Load (ETL) pipelines that move data from our 10+ products into a centralized data lake and data warehouse using a unified data model. You will work closely with our diverse, interdisciplinary teams to understand both the technical and business requirements of our platform, and design performant real-time pipelines that drive our reporting and business intelligence platforms throughout the company. You will also generate data models, visualizations, and engineering solutions in support of both internal requests and business intelligence requirements, following standardized software engineering best practices and the agile methodology.
This is a remote position, so applicants are expected to be able to successfully connect to video chat programs like Google Meet and Zoom using both video and voice communication.
What will you do?
- Work closely with internal teams to understand requirements and build comprehensive pipelines in support of data intelligence and data engineering initiatives, reaching out to teams with suggestions on where their work could be improved by data engineering support.
- Build comprehensive, efficient ETL (Extract, Transform, Load) pipelines using Python in AWS cloud with tools such as AWS Glue, AWS Lambda, Apache Airflow, and Apache Spark.
- Write and optimize highly performant SQL queries and stored procedures to explore and transform existing product data into a unified warehouse data model.
- Generate reports for both internal and external business needs using modern Python tools and libraries.
- Participate in design and planning sessions within the data engineering team as we map out changes to our overall architecture and establish an overall vision for data within the company.
- Continuously enhance existing pipelines and technologies to improve efficiency and performance.
- Prototype and test new technologies, with an eye to performance and accuracy, to identify those that can improve our platform.
- Document, monitor, and test existing and new pipelines to ensure both long-term support and quality assurance of data in our data lake and data warehouse.
- Other duties as assigned.
Who you are:
- Driven with a desire to work in a fast-paced, constantly evolving environment.
- Able to work independently as well as in a team with a proven ability to multi-task and bring new ideas and proposals to every discussion.
- Excellent communication skills in English, both verbal and in writing.
- Always learning with a demonstrated history of adapting to new technologies quickly.
- Able to take business requirements and translate them into solid implementation plans including accurate estimation of time requirements.
- Detail-oriented with a track record of generating high-quality, documented, and testable code.
- Desire to help, support, and teach your coworkers.
What we are looking for:
- A bachelor’s degree or equivalent in computer science, information science, or mathematics.
- 4+ years of experience with Python, including familiarity with common data science libraries like Requests, Pandas, NumPy, Spark, Keras, SciPy, Plotly, PyTest, and Matplotlib, and technical expertise with data models, data mining, and asynchronous techniques.
- 3+ years of experience in a data engineering (or similar) role at a previous company.
- 3+ years of experience designing and maintaining database models, with hands-on experience generating and optimizing complex SQL queries and stored procedures.
- 2+ years of experience working in an AWS cloud environment, with familiarity with DevOps practices like containerization, pipelines, and automated testing.
- Proficient in software engineering fundamentals, including object-oriented development, test-driven development, and source code management with Git.
- Experience with data lake/warehouse tools like Snowflake, AWS Athena, PostgreSQL, and AWS Redshift, and with designing unified data models for diverse datasets.
- Experience with big data pipeline and orchestration tools like Apache Spark, Apache Airflow, Elasticsearch, AWS Lambda, and AWS Glue.
- Strong troubleshooting skills and experience setting up automated testing and monitoring of data pipelines to ensure near-constant uptime of real-time data flows. Bonus points for working knowledge of Datadog or Splunk.
- Experience working in an agile development environment including key concepts such as epics, stories, sprints, and stand-ups.
- Strong ability to translate customer requirements into a formal design.
At Paramount Commerce, we are proud to be an equal opportunity employer. We practice ethical and fair hiring processes and strongly encourage applications from diverse backgrounds. Accommodations are available on request for candidates taking part in all aspects of the selection process.
We are committed to creating a place for our employees to be their authentic selves. We believe diversity of thought, background, and experience are essential to achieve our mission to simplify payments.