Core Cloud Data Engineer

Atlanta, New York, Chicago

Cloudreach

Cloudreach is the leading multi-cloud services provider. Our mission is to help companies navigate their unique journeys to the cloud and build new foundations for future growth. We’re a team of multi-cloud natives with certifications across AWS, GCP and Azure. Businesses that work with Cloudreach adopt cutting-edge technologies to solve challenges and create new opportunities. Working exclusively on public cloud, we deliver unrivaled value for more than 1000 enterprise clients globally.

 

Behind our services are our Cloudreachers. 

 

We pride ourselves on being the go-to destination for curious, talented, and driven people looking for unique work experiences to maximize their potential. We are passionate about asking questions, finding solutions, playing with the latest tools & technology, doing our lives' work, and having fun along the way. You can learn more about our innovative culture, global workforce, and Cloudy Values on our website.

 

We're not content with the status quo. We're here to do things better, and then do better things. 

 

What team will you be on?

Joining the Data and Analytics (DnA) team, you will be part of the broader Cloud Professional Services team and will form the core of Cloudreach’s Data Engineering capability. We currently help customers of all shapes and sizes on their public cloud adoption journey, working alongside infrastructure-as-code, DevOps automation and operational governance experts.

 

 

What will your role be? 

The purpose of a Cloud Data Engineer is to enable data scientists and analysts to gain insights from data by architecting and implementing data-driven, cloud-based solutions. At Cloudreach, you will be a subject matter expert and will provide technical leadership on data ingestion and processing engagements. A good understanding of cloud platforms and prior experience with big data tooling and frameworks are required.

  • Build the technical solutions required for optimal ingestion, transformation, and loading of data from a wide variety of data sources, using open-source, AWS, Azure or GCP ‘big data’ frameworks and services (see the sketch after this list).
  • Work with the product and software teams to provide feedback on data-related technical issues and to support data infrastructure needs uncovered during customer engagements and testing.
  • Understand and formulate processing pipelines for large, complex data sets that meet functional and non-functional business requirements.
  • Create and maintain optimal data pipeline architectures.
  • Work alongside the Cloud Architect and Cloud Enablement Manager to implement Data Engineering solutions.
  • Collaborate with the customer’s data scientists and data stewards/governors during workshop sessions to uncover more detailed business requirements related to data engineering.
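
To give a flavour of the ingestion and transformation work described above, here is a minimal PySpark sketch of a simple extract-transform-load step. The bucket paths, column names and schema are hypothetical and purely illustrative, not drawn from any real engagement:

    # Minimal ETL sketch in PySpark -- paths, columns and schema are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

    # Extract: read raw CSV files landed in object storage (e.g. S3, GCS or ADLS).
    raw = spark.read.option("header", "true").csv("s3://raw-bucket/orders/")

    # Transform: cast columns, drop malformed rows, stamp the load date.
    orders = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .dropna(subset=["order_id", "amount"])
           .withColumn("load_date", F.current_date())
    )

    # Load: write partitioned Parquet for downstream analysts and data scientists.
    orders.write.mode("overwrite").partitionBy("load_date").parquet(
        "s3://curated-bucket/orders/"
    )

In a customer engagement the same pattern would typically run on a managed service such as EMR, Dataproc or Databricks, with orchestration and testing built around it.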




What are we looking for? 

  • Has experience successfully implementing simple enterprise solutions using a wide range of data engineering technologies. Has experience deploying and using the services of a major cloud platform.
  • Delivers - Makes significant code contributions to the more complex components of ETL pipelines; writes tests, test cases, and code documentation with a high level of detail; has skill mastery in a few areas.
  • Coding - Pushes back on bad code and employs ETL best practices.
  • Experience with object-oriented and/or functional programming languages such as Python, Java and Scala.
  • Experience with Machine Learning toolkits.
  • Experience building scalable end-to-end data ingestion and processing solutions.
  • Good understanding of data infrastructure and distributed computing principles.
  • Proficient at implementing data processing workflows using Hadoop and frameworks such as Spark and Flink.
  • Good understanding of data governance and of how regulations such as GDPR and PCI DSS can impact data storage and processing solutions.
  • Ability to identify and select the right tools for a given problem, such as knowing when to use a relational or non-relational database.
  • Working knowledge of non-relational databases and of row- and column-oriented relational databases.
  • Stakeholder Management - Has guided and influenced customer stakeholders; manages customer expectations on project deliverables; is aware of the customer’s political landscape beyond the technology; clarifies customer objectives and communicates them internally.



What are our perks?

    • Meaningful and impactful work opportunities at a pioneering, cutting-edge cloud services provider
    • People-first mentality. We know that you and your mental health & wellbeing are #1. That’s why we give you an uncapped holiday allowance (+ your birthday off!), employee assistance programs, and resources to support your mental health & wellbeing
    • We embrace technologies that unlock agile & flexible ways of working. We trust our people to do their work when and how they work best. Work-life blend is a priority!
    • Our dynamic work environment enables autonomy while also promoting a sense of belonging to a global community
    • Opportunity for growth & development. Not only will you work alongside and learn from industry thought leaders, you will also be reimbursed for function-based certifications. We’re multi-cloud and proud!
    • An inclusive workplace where varying backgrounds, ideas, and points of view are celebrated and the individual is respected, included and empowered to bring their whole self to work
    • Transparency in business updates & communications. Whether you’re on the senior leadership team or a brand new employee, you’re an integral part of the team and we’ll make sure you know what’s up
    • Recognition-rich company culture where daily wins are celebrated and individuals living out our values are applauded

 

We strive to remove barriers, eliminate discrimination and ensure equal opportunity through our transparent recruitment process. We are open to all groups of people without regard to age, disability, marital status, gender identity, race, colour, sexual orientation, religion, military status, veteran status or any other legally protected characteristic.
