Senior Systems Data Engineer
NC - Greensboro (HQ), US
Qorvo
Qorvo’s diverse and innovative team creates semiconductor solutions that help connect, protect and power the planet. Qorvo (Nasdaq: QRVO) supplies innovative semiconductor solutions that make a better world possible. We combine product and technology leadership, systems-level expertise and global manufacturing scale to quickly solve our customers' most complex technical challenges. Qorvo serves diverse high-growth segments of large global markets, including consumer electronics, smart home/IoT, automotive, EVs, battery-powered appliances, network infrastructure, healthcare and aerospace/defense. Visit www.qorvo.com to learn how our diverse and innovative team is helping connect, protect and power our planet.
Summary
As a Senior Systems Data Engineer, you will be part of the EBS Reporting Team under Qorvo’s IT Enterprise Business Applications organization. Your primary focus will be leading and overseeing the design, development, and management of the data infrastructure on the Databricks platform within an AWS government cloud. You will manage key client technical projects and workstreams, coordinating the work of more junior engineers and often working alongside them. You will also partner closely with business analysts and project managers to complete projects on time, within budget and scope, and with high customer satisfaction. This position can be based in Greensboro, NC, Richardson, TX, or Hillsboro, OR.
Qualifications
- B.S. in Computer Science/Engineering or relevant field; Master's degree preferred
- 8+ years of experience in the IT industry
- 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure and functions
- Government Cloud experience
- Expert understanding of data warehousing concepts (dimensional/star schema, SCD2, Data Vault, denormalized) and experience implementing highly performant data ingestion pipelines from multiple sources
- Expert level skills with Python / PySpark and SQL
- Demonstrable experience in developing and nurturing a Data Engineering framework to include package/dependency management tools, functional testing (e.g., Pytest, Pytest-Cov, PyLint), and load testing
- Experience with CI/CD on Databricks using tools such as Unity Catalog, Jenkins, GitHub Actions, and Databricks CLI
- Experience integrating the end-to-end Databricks pipeline to take data from source systems to target data repositories, ensuring data quality and consistency are always maintained
- Strong understanding of Data Management principles (quality, governance, security, privacy, life cycle management, cataloging)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery/DevOps methodology to deliver proof of concept and production implementation in iterative sprints
- Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT)
- Hands on experience developing batch and streaming data pipelines
- Able to work independently
- Energetic and self-motivated, with a willingness to learn and openness to change
- Ability to work in a fast-paced environment, collaborate with all levels of the organization, and cope with rapidly changing information
- Power BI/Tableau/QuickSight experience preferred
- Nice to have: experience with SAP ECC or S/4, AWS Redshift
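For candidates less familiar with the SCD2 (Type 2 slowly changing dimension) pattern named in the qualifications, here is a minimal plain-Python sketch of the idea. It is illustrative only (the role itself would use PySpark/SQL on Databricks, e.g. a `MERGE` into a Delta table); all field names here are hypothetical:

```python
from datetime import date

def scd2_upsert(dim_rows, key, new_attrs, effective):
    """Type 2 SCD update: close out the current record for `key`
    (set end_date, flip is_current) and append the new version,
    so full history is preserved."""
    for row in dim_rows:
        if row["key"] == key and row["is_current"]:
            row["is_current"] = False
            row["end_date"] = effective
    dim_rows.append({**new_attrs, "key": key, "is_current": True,
                     "start_date": effective, "end_date": None})
    return dim_rows

# Example: a customer moves region; the old row is retained as history.
dim = [{"key": 1, "region": "EMEA", "is_current": True,
        "start_date": date(2023, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, 1, {"region": "APAC"}, effective=date(2024, 6, 1))
# dim now holds two rows: the closed EMEA record and the current APAC record
```

The same close-and-append logic is what a Databricks `MERGE` statement against a Delta dimension table implements at scale.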
Responsibilities
- Establish and grow a data engineering framework to ensure the reliability, scalability, quality, and efficiency of data pipelines, storage, processing, and integration
- Establish data pipelines to ingest and curate data containing SAP business content from S/4 to Databricks
- Improve, maintain and execute the data strategy at Qorvo including governance, project prioritization, resourcing, and value delivery
- Follow the Medallion Architecture (Bronze, Silver, Gold) to logically organize data in a lakehouse, with the goal of incrementally and progressively improving the structure and quality of data as it flows
- Work effectively in an Agile Scrum environment
- Create technical, functional, and operational documentation for data pipelines and applications
- Use business requirements to drive the design of data solutions/applications and technical architecture
- Provide overall project management and administration activities for the data engineering organization
- Work with other developers, designers, and architects to ensure data applications meet requirements and performance, data security, and analytics goals
- Anticipate, identify, track, and resolve issues and risks affecting delivery
- Lead the configuration, build, and testing of data applications and technical architecture
- Coordinate and participate in structured peer reviews/walkthroughs/code reviews
- Provide application/technical support
- Maintain and/or update technical and/or industry knowledge and skills through continuous learning activities
- Mentor and develop junior engineers
- Adhere to lean principles and standard processes to ensure continuous improvement
- Communicate clearly and effectively
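The Medallion Architecture referenced in the responsibilities above can be sketched in a few lines of plain Python. This is a toy illustration of the Bronze → Silver → Gold refinement idea only (in practice these layers would be Delta tables transformed with PySpark or Delta Live Tables); the sample records and field names are made up:

```python
# Bronze: raw records exactly as landed (duplicates, strings, mixed casing).
bronze = [
    {"order_id": "1", "amount": "100.0", "region": "emea"},
    {"order_id": "1", "amount": "100.0", "region": "emea"},  # duplicate load
    {"order_id": "2", "amount": "250.5", "region": "apac"},
]

def to_silver(rows):
    """Silver: deduplicate on the business key, cast types, standardize values."""
    seen, silver = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        silver.append({"order_id": int(r["order_id"]),
                       "amount": float(r["amount"]),
                       "region": r["region"].upper()})
    return silver

def to_gold(rows):
    """Gold: business-level aggregate, e.g. revenue per region for reporting."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))  # {'EMEA': 100.0, 'APAC': 250.5}
```

Each layer improves structure and quality incrementally while leaving the raw Bronze data intact for reprocessing, which is the core of the pattern.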
Soft Skills
- Team Player
- Excellent communication skills
- Collaboration
- Logical thinking
- Time Management
#LI-LR1
MAKE A DIFFERENCE AT QORVO
We are Qorvo. We do more than create innovative RF and Power solutions for the mobile, defense and infrastructure markets – we are a place to innovate and shape the future of wireless communications. It starts with our employees. As a unified global team, we bring a commitment to excellence, growth and a passion for creating what's next. Explore the possibilities with us.
We are an Equal Employment Opportunity (EEO) / Affirmative Action employer and welcome all qualified applicants. Applicants will receive fair and impartial consideration without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, military or veteran status, physical or mental disability, genetic information, and/or any other status protected by law.