Data Engineer
Warsaw, Poland
Ivanti
Ivanti finds, heals, and protects every device, everywhere – automatically – so employees can work better from anywhere.
The Data Engineer is responsible for assessing, improving, and governing the quality of critical data assets. This role gathers and analyzes data requirements and develops the complex logical and physical database designs and data models that support enterprise data management. A successful candidate will be able to build data and data models and measure their timeliness and correctness. You are a good fit for this role if you have a natural curiosity, a tendency to investigate oddities, and a desire to organize the disorderly.
Responsibilities:
- Own the design, development, and maintenance of data pipelines, warehouse layers, and integrations.
- Enable effective decision-making by retrieving and aggregating data from multiple sources and compiling it into a digestible, actionable format.
- Ensure data accuracy by validating data for new and existing tools.
- Learn and understand a broad range of data resources, and know how and when to use each, and when not to.
- Manage metrics across multiple projects simultaneously.
- Work with technology and analytics teams to support the development of tools and dashboards.
- Communicate with and support various internal stakeholders and external audiences.
- Develop processes for delivering data collected from operational applications and distributing it to the informational platform and data marts for analysis and reporting.
- Implement data governance standards, policies, and procedures for the control, protection, consistency, and delivery of corporate data and information assets.
- Implement strategies for maintaining data quality, data integrity, and performance metrics in the informational infrastructure.
- Perform impact analysis for changes to operational application processes and data that may affect the informational environment, and define conceptual data models.
- Review and guide the development of logical and physical data models that have informational impact.
- Develop and maintain the master data management solution and metadata repository.
- Work with a US-based, primarily remote team that may also expand globally.
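To give candidates a flavor of the data-quality work described above, here is a minimal sketch of the kind of validation a pipeline might run. This is illustrative only: the table, field names, and checks are hypothetical examples, not part of Ivanti's actual stack, which this posting does not specify.

```python
# Illustrative data-quality checks (completeness and uniqueness).
# All names here (order_id, amount, etc.) are hypothetical.

def validate_batch(rows, key="order_id", required=("order_id", "amount")):
    """Return basic quality metrics for a batch of records."""
    seen = set()
    null_rows = 0
    duplicate_keys = 0
    for row in rows:
        # Completeness: count rows missing any required field.
        if any(row.get(col) is None for col in required):
            null_rows += 1
        # Uniqueness: count repeated primary-key values.
        k = row.get(key)
        if k in seen:
            duplicate_keys += 1
        seen.add(k)
    return {"rows": len(rows), "null_rows": null_rows,
            "duplicate_keys": duplicate_keys}

batch = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},  # fails completeness
    {"order_id": 1, "amount": 4.50},  # duplicate key
]
print(validate_batch(batch))
# {'rows': 3, 'null_rows': 1, 'duplicate_keys': 1}
```

In practice checks like these would run inside the warehouse or orchestration layer rather than in standalone scripts, but the underlying metrics (row counts, null rates, key uniqueness) are the same ones the role is asked to build and measure.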
A strong candidate will have:
- 3 years of experience in data engineering or data analytics
- Strong understanding of RDBMS concepts
- Experience writing data dictionaries and documenting entity-relationship diagrams (ERDs)
- Good understanding of Snowflake
- Good understanding of AWS products
- Strong understanding of ETL concepts and popular products, especially Fivetran
- Ability to learn quickly and maintain a diverse workload in a fast-paced environment
- Experience supporting performance and load testing, analysis, and evaluation of results, and providing recommendations
- Skilled in understanding and maintaining data integrity and developing documentation, with accountability for overall data validation
- Able to assimilate unstructured information and produce clear data plans for large enterprise-level IT projects
- Experience architecting and maintaining data warehouses
- Excellent interpersonal skills, with the ability to build relationships across a variety of departments
- Curiosity about technology and a drive to explore the “art of what is possible”
- Excellent verbal and written communication, with the ability to liaise between technical and non-technical stakeholders
- Desire to take your work to the next level