Senior Data Engineer

Remote, US

As a Data Engineer at TSPi, you will implement data engineering solutions for a variety of government agencies including, but not limited to, clients with conservation and environmental missions. You’ll develop and deploy pipelines and platforms that organize disparate data and make it meaningful. At TSPi you will work with a multi-disciplinary team of analysts, data engineers, developers, and data consumers in a fast-paced, Agile environment. You’ll apply your experience in analytical exploration and data examination as you manage the assessment, design, building, and maintenance of scalable data platforms for your clients.

Job Duties

  • Optimize queries and data processing to improve performance and reduce costs.
  • Perform data transformations and data cleansing to prepare data for analysis and reporting.
  • Monitor data pipelines and systems for performance, reliability, and data quality issues.
  • Document data pipelines, data models, and data infrastructure components.
  • Share knowledge with team members and stakeholders to ensure a clear understanding of data processes.
  • Implement data validation and data enrichment processes.
  • Design and develop robust and scalable data pipelines to extract, transform, and load (ETL) data from various sources into data warehouses or data lakes.

Experience:

  • Ability to work in a fast-paced, dynamic environment with tight deadlines and competing requirements. Strong organizational, planning, and time management skills.
  • 3+ years of professional and technical experience as a Data Architect, Data Engineer, or related role developing data models and architectural solutions.
  • Knowledge of developing and applying data standards covering common formats and the representation, definition, structuring, manipulation, tagging, transmission, use, and management of data.
  • 3+ years of experience working with data programming languages (Python, Java, SQL, etc.) and with data orchestration and integration pipelines.
  • Knowledge of cloud computing platforms such as Google Cloud Platform (GCP), Amazon Web Services (AWS), Azure, or equivalent.
  • Understanding of the software development lifecycle, including Agile and traditional project management and delivery methodologies.
  • Strong written and oral communication skills.
  • Strong math and analytical skills.

Required Skills:

  • 3+ years of technical work or other relevant industry experience
  • 2+ years of experience with SQL, database development, and administration
  • 2+ years of experience with AWS, Azure, or GCP cloud services (AWS preferred)
  • Experience building dashboards with data visualization tools such as Tableau, Qlik, Power BI, or QuickSight
  • Excellent oral and written communication skills
  • The successful candidate is subject to a background investigation by the government and must be able to meet the requirements to hold a position of public trust

Education:

  • Bachelor’s degree in Computer Science or related field
