Data Engineer - Great Lakes

Wisconsin

CDC Foundation

The CDC Foundation is a global nonprofit that manages public health programs addressing chronic and infectious diseases and emergency threats like COVID-19.


The CDC Foundation helps the Centers for Disease Control and Prevention (CDC) save and improve lives by unleashing the power of collaboration between CDC, philanthropies, corporations, organizations and individuals to protect the health, safety and security of America and the world. The CDC Foundation is the go-to nonprofit authorized by Congress to mobilize philanthropic partners and private-sector resources to support CDC’s critical health protection mission. Since 1995, the CDC Foundation has raised over $1.9 billion and launched more than 1,300 programs impacting a variety of health threats, from chronic disease conditions including cardiovascular disease and cancer, to infectious diseases like rotavirus and HIV, to emergency responses including COVID-19 and Ebola. The CDC Foundation managed hundreds of programs in the United States and in more than 90 countries last year. Visit www.cdcfoundation.org for more information.

Job Highlights

  • Location: Remote, must be based in the United States
  • Salary Range: $115,000-$165,000, plus benefits
  • Position Type: Grant funded, limited-term opportunity
  • Position End Date: June 30, 2025 

Overview

  • The Data Engineer will play a crucial role in advancing the CDC Foundation's mission by designing, building, and maintaining data infrastructure for a public health organization, working within the Great Lakes Inter-Tribal Epidemiology Center. The Data Engineer will deliver the architecture needed for data linkages, storage, processing, and analysis, and will collaborate with data content experts, analysts, data scientists, data modelers, warehouse architects, IT staff, and other organization staff to design and implement solutions and architectures that meet the needs of the public health agency. The Data Engineer will also build the knowledge of current staff so that the Tribes can sustain the project as more data sets are added. The Data Engineer will be hired by the CDC Foundation, aligned to the Workforce Acceleration Initiative (WAI), and assigned to the Great Lakes Inter-Tribal Epidemiology Center, a program of the Great Lakes Inter-Tribal Council. This position is eligible for a fully remote work arrangement for U.S.-based candidates.

Responsibilities

  • Create and manage the systems and pipelines that enable efficient and reliable flow of data, including ingestion, processing, and storage.
  • Collect data from various sources, transforming and cleaning it to ensure accuracy and consistency, and load it into storage systems or data warehouses (a minimal pipeline sketch follows this list).
  • Optimize data pipelines, infrastructure, and workflows for performance and scalability to handle growing data volumes associated with public health initiatives.
  • Monitor data pipelines and systems for performance issues, errors, and anomalies, and implement solutions to address them.
  • Implement security measures to protect sensitive public health information, complying with relevant regulations (e.g., HIPAA).
  • Collaborate with data scientists, analysts, and other partners to understand their data needs and requirements, and to ensure that the data infrastructure supports the organization's goals and objectives.
  • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
  • Implement and maintain ETL processes to ensure the accuracy, completeness, and consistency of data.
  • Design and manage data storage systems, including relational databases, NoSQL databases, and data warehouses.
  • Stay knowledgeable about industry trends, best practices, and emerging technologies in data engineering, and incorporate them into the organization's data infrastructure.
  • Provide technical guidance to other staff.
  • Provide technical guidance to Tribes as requested.
  • Communicate effectively with partners at all levels of the organization to gather requirements, provide updates, and present findings.
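
For illustration only (not a requirement of the posting): a minimal sketch of the kind of extract-transform-load pipeline described above, written in Python. The source file, warehouse, table, and column names are hypothetical, and SQLite stands in for whichever storage system the program actually uses.

    """Minimal extract-transform-load sketch (illustrative only).

    Assumptions, not part of the posting: the source is a CSV of case counts
    (case_counts.csv) and the warehouse is a local SQLite database
    (warehouse.db). Table and column names are hypothetical.
    """
    import csv
    import sqlite3
    from datetime import date

    def extract(path: str) -> list[dict]:
        # Read raw rows from the source file.
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows: list[dict]) -> list[tuple]:
        # Clean and standardize: drop rows with missing counts, normalize
        # county names and dates, and cast counts to integers.
        cleaned = []
        for row in rows:
            if not row.get("case_count"):
                continue
            cleaned.append((
                row["county"].strip().title(),
                date.fromisoformat(row["report_date"]).isoformat(),
                int(row["case_count"]),
            ))
        return cleaned

    def load(records: list[tuple], db_path: str = "warehouse.db") -> None:
        # Load into a warehouse table, creating it if needed.
        with sqlite3.connect(db_path) as conn:
            conn.execute(
                """CREATE TABLE IF NOT EXISTS case_counts (
                       county TEXT, report_date TEXT, case_count INTEGER
                   )"""
            )
            conn.executemany(
                "INSERT INTO case_counts VALUES (?, ?, ?)", records
            )

    if __name__ == "__main__":
        load(transform(extract("case_counts.csv")))

In a production setting the same extract, transform, and load steps would typically live inside an orchestration framework rather than a standalone script, which is the distinction drawn in the Qualifications below.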

Qualifications

  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
  • Proficiency in programming languages commonly used in data engineering, such as Python, Java, Scala, or SQL. Candidates should be able to implement data automation within existing frameworks rather than writing one-off scripts.
  • Experience with big data technologies and frameworks like Hadoop, Spark, Kafka, and Flink.
  • Strong understanding of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Experience with engineering best practices such as source control, automated testing, continuous integration and deployment, and peer review (an illustrative test sketch follows this list).
  • Knowledge of data warehousing concepts and tools.
  • Experience with cloud-based computing and data storage solutions (e.g., AWS S3, Azure).
  • Expertise in data modeling, ETL (Extract, Transform, Load) processes, and data integration techniques.
  • Familiarity with agile development methodologies, software design patterns, and best practices.
  • Strong analytical thinking and problem-solving abilities.
  • Excellent verbal and written communication skills, including the ability to convey technical concepts to non-technical partners effectively.
  • Flexibility to adapt to evolving project requirements and priorities.
  • Outstanding interpersonal and teamwork skills; and the ability to develop productive working relationships with colleagues and partners.
  • Experience working in a virtual environment with remote partners and teams.
  • Proficiency in Microsoft Office.
  • Knowledge of data security best practices specific to healthcare data (e.g., HIPAA compliance).
  • Experience with DevOps principles and tools for automating data pipelines and infrastructure deployments.
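
As one hedged illustration of the automated-testing practice mentioned above (the tooling is assumed, not specified by the posting): a pytest-style unit test for the hypothetical transform() step sketched under Responsibilities. In a continuous integration workflow, a test like this would run automatically on every commit.

    # test_etl_pipeline.py -- illustrative only; assumes the hypothetical
    # transform() from the earlier sketch lives in a module named etl_pipeline.
    from etl_pipeline import transform

    def test_transform_drops_incomplete_rows_and_normalizes_fields():
        rows = [
            {"county": " dane ", "report_date": "2024-01-15", "case_count": "12"},
            {"county": "Vilas", "report_date": "2024-01-15", "case_count": ""},
        ]
        # Only the complete row should survive, with the county name
        # title-cased and the count cast to an integer.
        assert transform(rows) == [("Dane", "2024-01-15", 12)]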

Special Notes

  • This role is part of a dynamic public health program. As such, roles and responsibilities are subject to change as situations evolve. The roles and responsibilities listed above may be expanded or updated to match priorities and needs once written approval is received by the CDC Foundation, in order to best support the public health programming.
All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, sex, national origin, age, mental or physical disabilities, veteran status, or any other characteristic protected by law.
We comply with all applicable laws including E.O. 11246 and the Vietnam Era Readjustment Assistance Act of 1974 governing employment practices and do not discriminate on the basis of any unlawful criteria in accordance with 41 C.F.R. §§ 60-300.5(a)(12) and 60-741.5(a)(7). As a federal government contractor, we take affirmative action on behalf of protected veterans.
The CDC Foundation is a smoke-free environment.  Relocation expenses are not included.
Category: Engineering Jobs

Tags: Agile Architecture AWS Azure Big Data Cassandra Computer Science Data pipelines Data Warehousing DevOps Engineering ETL Flink Hadoop Java Kafka MongoDB MySQL Nonprofit NoSQL Pipelines PostgreSQL Python R RDBMS Scala Security Spark SQL Testing


Regions: Remote/Anywhere North America
Country: United States
