Data Engineer

Remote


AnaVation is looking for a talented Data Engineer who is passionate about technology and about working with customers and a strong team to deliver solutions for our mission-critical customer. The ideal candidate appreciates partnering with our customer and a group of engineers to create innovative engineering solutions, and will work with a small team of developers building a cyber data lake and related technologies. If you are looking to be challenged, this is the environment for you. This position supports remote work with occasional in-person team meetings in the Chantilly, VA area.

Responsibilities

  • Develop tools and processes to ingest cyber data into an enterprise data lake (an illustrative sketch follows this list).
  • Make recommendations on standards for a common cyber data model and for harmonization of common data elements on data ingest.
  • Develop and evaluate tools to search, analyze, discover, and otherwise exploit data in the data lake to support investigative operations.
  • The ideal candidate will also develop an understanding of AnaVation's prior experience, frameworks, and solutions in the big data management domain.
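
To make the first two bullets concrete, here is a minimal, illustrative Python sketch of ingesting JSON-lines cyber event records into a relational landing table while harmonizing a few common fields. All table, column, file, and connection names are assumptions, not a description of the actual environment.

```python
import json
import sys
from datetime import datetime, timezone

import psycopg2  # assumes a PostgreSQL landing zone; any relational target would do

# Hypothetical common data model: every record is harmonized to these columns.
INSERT_SQL = """
    INSERT INTO cyber_events (event_time, source_ip, event_type, raw_record)
    VALUES (%s, %s, %s, %s)
"""


def harmonize(record: dict) -> tuple:
    """Map heterogeneous source fields onto the common model (illustrative only)."""
    # Different feeds name the timestamp differently; fall back to "now" if absent.
    raw_ts = record.get("timestamp") or record.get("@timestamp")
    event_time = (
        datetime.fromisoformat(raw_ts.replace("Z", "+00:00"))
        if raw_ts
        else datetime.now(timezone.utc)
    )
    source_ip = record.get("src_ip") or record.get("source", {}).get("ip")
    event_type = record.get("event_type", "unknown")
    return (event_time, source_ip, event_type, json.dumps(record))


def ingest(path: str, dsn: str) -> None:
    """Stream a JSON-lines file into the landing table."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur, open(path) as fh:
        rows = [harmonize(json.loads(line)) for line in fh if line.strip()]
        cur.executemany(INSERT_SQL, rows)


if __name__ == "__main__":
    # Usage: python ingest.py events.jsonl "dbname=datalake user=ingest"
    ingest(sys.argv[1], sys.argv[2])
```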

Required Qualifications

  • 5 or more years of relational database design and development (PostgreSQL, Oracle, Microsoft SQL Server)
  • 5 or more years of ETL/ELT development experience
  • 2 or more years of Linux environment experience
  • 2 or more years of shell scripting experience
  • Experience implementing data access controls (see the sketch following this list)
  • Experience analyzing unstructured, structured, and semi-structured data
  • Strong technical and computational skills, coupled with the ability to relate data to use cases, mission requirements, and end-user experience
  • Experience with development in one or more programming or scripting languages (Java/Python/Go)  
  • Bachelor’s degree in Computer Science, Information Systems, or a related discipline
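
The data access control qualification above can take many forms; one common approach is PostgreSQL row-level security. The sketch below applies illustrative policies from Python, with hypothetical role, table, and column names that would in practice be driven by the customer's data model and marking scheme.

```python
import psycopg2

# Hypothetical roles, table, and sensitivity column; illustrative only.
RLS_STATEMENTS = [
    "ALTER TABLE cyber_events ENABLE ROW LEVEL SECURITY",
    # Analysts may only read events whose sensitivity marking permits it.
    """
    CREATE POLICY analyst_read ON cyber_events
        FOR SELECT
        TO analyst_role
        USING (sensitivity = 'routine')
    """,
    # The ingest service may insert records but gets no implicit read access.
    """
    CREATE POLICY ingest_write ON cyber_events
        FOR INSERT
        TO ingest_role
        WITH CHECK (true)
    """,
]


def apply_row_level_security(dsn: str) -> None:
    """Apply illustrative row-level security policies to the landing table."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for stmt in RLS_STATEMENTS:
            cur.execute(stmt)


if __name__ == "__main__":
    apply_row_level_security("dbname=datalake user=dba")
```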

Desired Qualifications

  • Experience with PostgreSQL
  • Experience with Elasticsearch
  • Experience with cloud concepts and big data architectures such as Hadoop and Kafka (see the streaming sketch following this list)
  • Knowledge of Continuous Integration/Continuous Delivery tools and practices
  • Experience with cloud platforms such as AWS
  • Experience working in Agile Environments
  • Experience with DevOps toolsets
  • Familiarity with containerization (Docker, Containerd, etc.)
  • Experience with microservices
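
Several of the desired items (Kafka, big data architectures, cloud) point toward streaming ingest into the data lake. Below is a minimal sketch using the kafka-python client, with a hypothetical topic name and broker address; the real pipeline and topic layout are not specified in this posting.

```python
import json

from kafka import KafkaConsumer  # kafka-python client; an assumption, not specified above

# Consume cyber events from a hypothetical topic and hand them to the same
# harmonization/ingest path used for batch files.
consumer = KafkaConsumer(
    "cyber-events",                        # topic name is an assumption
    bootstrap_servers=["localhost:9092"],  # placeholder broker address
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    record = message.value  # already a dict thanks to the deserializer
    # In a real pipeline this would call the harmonize/ingest code shown earlier.
    print(record.get("event_type", "unknown"), record.get("src_ip"))
```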

Clearance Requirements

  • IRS Suitability

