Senior Data Engineer

Salt Lake City, Utah, United States

Full-time | Mid-level / Intermediate
Go1

Posted 3 weeks ago

GO1 is on a mission to unlock positive potential through a love of learning. The GO1 platform makes learning easy with thousands of courses in one simple solution. Since our founding in 2015, we have grown to an organization of 200+ in 9 global offices. At GO1, we go to infinity and beyond through excellent work, prioritising actions over words and encouraging creativity and candour. It is with this spirit that we are driving hypergrowth and international expansion.

We are currently looking for an experienced Senior Data Engineer who will be responsible for technical delivery within the Go1 data platform technical domain. Working closely with business and product teams and our technical architects, you will set data platform development best practice aligned to the architecture design. You will be instrumental in delivering our data-driven organisation's future.

You will be a champion for good development practice and will demonstrate this through practical work, constructive feedback and close collaboration. You will be a critical thinker who is able to rapidly solve problems. You bring your practical experience to this domain and hit the ground running from day one. You have a talent for getting the balance and fidelity of documentation and practical knowledge transfer just right.

Your main responsibilities

  • Build data pipelines to acquire, store, transform and surface enterprise data.
  • Develop a batch processing framework and deliver use-case pipelines.
  • Deploy change data capture for MySQL, landing data in Azure Data Lake Storage Gen2.
  • Develop transformation pipelines supplying analytics data to Azure Analysis Services.
  • Develop Power BI apps, dashboards and reports.
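As a rough illustration of the change-data-capture work described above, here is a minimal plain-Python sketch of the merge step: applying an ordered batch of MySQL change events to the current snapshot. The event shape, field names and helper function are illustrative assumptions only; in the actual role this logic would typically run as a PySpark/Delta Lake merge on Azure Data Lake.

```python
def apply_change_events(snapshot, events):
    """Apply insert/update/delete change events (ordered by commit
    timestamp) to a snapshot keyed by primary key. Returns a new dict;
    the input snapshot is left untouched."""
    result = dict(snapshot)
    for event in sorted(events, key=lambda e: e["ts"]):
        key = event["pk"]
        if event["op"] in ("insert", "update"):
            result[key] = event["row"]   # upsert the full row image
        elif event["op"] == "delete":
            result.pop(key, None)        # tolerate already-missing keys
    return result

snapshot = {1: {"id": 1, "name": "intro-to-sql"}}
events = [
    {"op": "insert", "pk": 2, "ts": 1, "row": {"id": 2, "name": "pyspark-101"}},
    {"op": "update", "pk": 1, "ts": 2, "row": {"id": 1, "name": "sql-basics"}},
    {"op": "delete", "pk": 2, "ts": 3},
]
print(apply_change_events(snapshot, events))
# {1: {'id': 1, 'name': 'sql-basics'}}
```

The key design point is idempotent, ordered application of change events, which is what lets a batch pipeline safely replay or reprocess a CDC feed.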

This is a permanent, full-time position ideally based out of Salt Lake City, Utah, but we will also consider Seattle and San Francisco Bay Area locations. Salary is circa $115k to $140k base plus benefits, depending on skills, experience and location.

Requirements

  • Azure Data Lake Storage Gen2.
  • Parquet, Avro and Delta file formats.
  • Acquiring data from Kafka, REST APIs and databases.
  • Azure Data Factory / Synapse Orchestrate (Dataflows).
  • Databricks / Synapse notebooks.
  • Power BI, DAX, setting up shared datasets and dataflows.
  • Exceptional SQL, PySpark (Python) and .NET development (Azure Functions).

Benefits

  • We love what we do, and we work hard but we’re also flexible and have fun in the process
  • We work as a team – we are open to new ideas, resolve issues together and continue to learn from each other
  • Enjoy unlimited access to the Go1 Content Hub - thousands of training courses are waiting for you!
  • Go1 is on a stable and rapid-growth curve and you will have the opportunity to grow with us
  • We offer extensive benefits, including but not limited to:
    • Unlimited leave policy
    • Health Care Plan (Medical, Dental & Vision) and Retirement Plan (401k)
    • Life Insurance


Job tags: Avro Data pipelines Kafka MySQL Parquet PowerBI PySpark Python Rest API SQL
Job region(s): North America