Data Analytics Engineer

Coronado, California, United States

Endeavor Consulting Group

Endeavor is a leader in logistics and supply chain management providing clarity, guidance, and expertise from new product launch through to commercialization.

Endeavor Consulting Group, LLC is a specialty project management, technology, and integration consulting firm laser-focused on Operational Systems, Supply Chain Integration, and Product Development for Life Science enterprises. At Endeavor we utilize industry-accepted best practices, methodologies, and tools, and we are experienced across software, hardware, and network platforms, Quality by Design product development, manufacturing equipment, and integration. We constantly monitor our industry to stay on top of the latest trends in technology so that we can always bring the solution that best fits our clients’ needs.

Endeavor Consulting Group was formed in 2006 and has experienced solid, continual growth since its inception. Our core values of integrity, quality, creativity, and freedom have allowed us to continually expand our service offerings and grow our client base.

Our experience and skills in program management, supply chain collaboration, industry standards and regulations, serialization technologies, and manufacturing processes, equipment, and systems enable us to offer our clients a high-value, extremely cost-effective approach to their manufacturing serialization programs.

We are looking for an experienced Data Analytics Engineer to provide Force-wide data analytics, data engineering, and full-stack web software development support. This includes analyzing requirements and system flows and determining architecture, software, database, data storage, and usage requirements while engineering software and analytics solutions that facilitate agile DevSecOps and DataOps. The Data Analytics Engineer will develop and implement web tools, manage the Neptune web application in support of the NSW Data Environment (NDE), and serve as lead NSW engineer in support of Advana/Jupiter and JAIC JCF environment requirements.

This position bridges data engineering, software development, and data analytics. The engineer will work on both on-premises and cloud-based, customer-facing data engineering requirements for data science and analytics, providing engineering support for business systems, data analysts, data scientists, machine learning engineers, and other technology stakeholders. The Data Analytics Engineer will be part of the NSW Data Engineering Team and will serve as subject matter expert for data analytics engineering.

Additionally, the engineer will support administrative and governance requirements such as agreements, policies, procedures, drawings, data pipelines, user training and resources, Remedy change management and incident ticketing, Azure DevOps and other Kanban boards and backlogs, and cybersecurity accreditations. The role also provides NSW enterprise software development and data analytics engineering support for Microsoft O365, Power BI, Neptune, Microsoft Azure, AWS, Advana/Jupiter, DefenseReady, data science/AI/ML, and NSW/SOF business systems.


Role and Responsibilities:

  • Assemble large, complex sets of data that meet non-functional and functional business requirements. Manage data mart and web service requirements in support of SOF and NSW business systems. Manage Microsoft Power BI Report Server and Microsoft M365 Data Gateway data requirements for analytics solutions, including Power Platform (Power Apps, Power Automate, and Power BI).
  • Manage the Neptune web application, including microservices for External Data Load (EDL), web services, and the agreements builder, as well as future requirements for open-source and customized tools.
  • Manage Advana/Jupiter data requirements for analytics solutions (e.g., Databricks, Qlik, Trifacta, SQL Builder, and DataRobot).
  • Build software tools that help data scientists and data analysts work more efficiently (e.g., writing an internal tooling package for analysts to use). Collaborate with data engineers on infrastructure projects and advocate for and emphasize the business value of applications.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Build and optimize ‘big data’ data pipelines, architectures, and data sets.
  • Manipulate, process, and extract value from large, disconnected datasets. Identify, design, and implement internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using AWS, Azure, or SQL technologies; a minimal illustrative sketch follows this list. Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency and customer acquisition.
  • Work with stakeholders, including data, design, product, and executive teams, and assist them with data-related technical issues. Identify optimal data engineering solutions and maintain data system infrastructures in accordance with industry best practices and security policies for continuous integration/continuous delivery (CI/CD).
  • Support all aspects of data engineering and DataOps, including data wrangling, data triage, data pipelines, data visualizations and connections supporting the integration of platforms and features across multiple subsystems, containerization, and microservices architecture.
  • Support the Government in system lifecycle sustainment operations and configuration, testing, training, release, and operations. Archive final-copy documentation and deliverables that meet a definition of done (built, reviewed, verified, tested, ready to operate, and releasable).
  • Support the Government in Product Backlog (PBL) grooming, Sprint Planning, and Sprints. Document user stories using INVEST criteria (independent, negotiable, valuable, estimable, sized, testable) that capture acceptance criteria and include a definition of done (built, reviewed, verified, tested, ready to operate, and releasable).
  • Analyze information and user requirements, design and debug software, and document technical notes within Government Off-the-Shelf (GOTS), Commercial Off-the-Shelf (COTS), and approved open-source software.
  • Provide testing, problem solving, requirements collection, stakeholder collaboration, user support, user training, Standard Operating Procedures (SOPs), and system maintenance. Capture and document initial requirements, system requirements, issues, infrastructure requirements, incidents, change requests, and change management processes using the Remedy change management system.
  • Plan, create, and manage data systems infrastructure, develop back-end systems functionality, and design Application Programming Interfaces (APIs). Educate stakeholders on the implementation of new data engineering and DataOps technologies and initiatives.
  • Design, develop, deploy, modify, and improve modular data engineering solutions in accordance with industry standards and best practices for a Modular Open Systems Approach (MOSA) implementation, applied to data science and analytics workflows that obtain, clean, analyze, model, and interpret data.
  • Ensure efficient operation of data storage and processing functions in accordance with company security policies and security best practices, compliant with the DoD Zero Trust Architecture for cloud and the DoD Enterprise DevSecOps Reference Design.
  • Identify, analyze, and resolve infrastructure vulnerabilities and software deployment issues and regularly review existing systems to make recommendations for improvements. Create and maintain configuration mapping and engineering documentation.
  • Support the U.S. Government (USG) in data engineering and DataOps processes and in the accreditation, scans, and documentation needed to obtain an Interim Authority to Test (IATT), Authority to Operate (ATO), continuous Authority to Operate (cATO), Continuous Monitoring (CONMON) Strategy, Configuration Management (CM) Plan, Social Security Number (SSN) memorandum, Privacy Impact Assessment (PIA), Security Technical Implementation Guides (STIGs), and Risk Management Framework (RMF) Body of Evidence documents and checklists. Continuously maintain and update on-premises and cloud technology stacks.
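For illustration only (not part of the original posting): a minimal Python sketch of the kind of extract-transform-load work referenced in the list above. The endpoint, connection string, and table name are hypothetical placeholders, and the structure shown is one common way to organize such a pipeline, not a prescribed implementation.

    # Illustrative ETL sketch: pull JSON from a REST endpoint, reshape it,
    # and load it into a SQL Server staging table. The URL, credentials,
    # and table name below are placeholders, not real systems.
    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    API_URL = "https://example.internal/api/metrics"  # hypothetical web service
    DB_URL = (
        "mssql+pyodbc://user:password@server/db"
        "?driver=ODBC+Driver+17+for+SQL+Server"  # hypothetical connection
    )

    def extract() -> list[dict]:
        """Extract: fetch raw records from the web service."""
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return resp.json()

    def transform(records: list[dict]) -> pd.DataFrame:
        """Transform: deduplicate the raw records and stamp load time."""
        df = pd.DataFrame(records).drop_duplicates()
        df["loaded_at"] = pd.Timestamp.now(tz="UTC")
        return df

    def load(df: pd.DataFrame) -> None:
        """Load: append the cleaned frame to a staging table."""
        engine = create_engine(DB_URL)
        df.to_sql("stg_metrics", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        load(transform(extract()))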


Requirements

  • Five (5) or more years of experience within the past ten (10) years with full-stack web application development, data engineering, data analytics, data modeling, data quality, data performance, data visualization, data security, web services and Application Programming Interfaces (APIs), and serving data to end users in on-premises and cloud environments.
  • Five (5) or more years of experience within the past ten (10) years in RESTful APIs, web development, software engineering, and Extract, Transform, Load (ETL/ELT), including merging, manipulating, transforming, cleansing, and validating data; experience in databases, data stores, data marts, data lakes, and data warehouse design and operations, report writers, scripting, ad hoc querying, online analytical processing (OLAP), online transactional processing (OLTP), data visualization, data mining, and statistical analysis; excellent analytical and problem-solving skills; and fundamental knowledge of working with structured, semi-structured, and unstructured data.
  • Five (5) or more years of experience within the past ten (10) years in all of the following: Structured Query Language (SQL), Microsoft (MS) C#.NET, MS SQL Server, MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), MS ASP.NET, MS Internet Information Services (IIS), MS Visual Studio, MS Power BI, and the Microsoft SQL Server BI stack including SQL Server (2012+), Python, and JavaScript.
  • DoD-approved IAT Level II baseline Information Assurance (IA) cyber certification for administrative privileges, in accordance with (IAW) DoD 8570.01-M, paragraph C2.3.9, required at time of proposal.
  • Currently active Secret security clearance, required prior to the start of work.
  • Bachelor's or master's degree in Computer Science, Engineering, Mathematics, Statistics, Science, Information Systems, Technical, Education, Business, or Computer Engineering.
  • Clearance: Active DoD Secret/SCI clearance.

Benefits

A comprehensive benefits package that includes medical, dental, vision, disability, and life insurance, a 401(k), and a generous PTO policy.

