Data Engineer (Python/AWS/Spotfire) - Remote
Tampa, FL
Nielsen
A global leader in audience insights, data and analytics, Nielsen shapes the future of media with accurate measurement of what people listen to and watch. As Data Engineer, you will be part of the Big Data Platform team, composed of software developers and analysts, whose purpose is to develop, configure, maintain and deploy big data application platforms that allow the team to manage multiple large data sources. These software development activities enable Nielsen to collect, store, process and analyze complex data sets for market trends and insights through its digital products and solutions.
You will be responsible for designing, developing, maintaining, implementing and deploying data warehouse solutions for processing and analyzing data (e.g., using complex programming and software engineering tools to create data pipelines, implement platform frameworks and automated application workflows to extract, transform and load (ETL) data).
As part of the Big Data Platform team, you will develop highly complex, business-critical big data platform applications and automation solutions for data ingestion, processing, cleansing, and transformation using programming, application/platform development and software, systems, data and engineering tools such as Python, the Apache Spark framework on AWS, PySpark (the Spark Python API), Lambda, Athena, Airflow and Presto.
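To illustrate the extract-transform-load pattern this role centers on, here is a minimal sketch in plain Python; the input data, field names, and cleansing rule are hypothetical, not from the posting, and a production pipeline would use PySpark against sources such as S3 rather than in-memory strings:

```python
import csv
from io import StringIO

# Hypothetical raw input; in practice this would be read from S3, a database, etc.
RAW_CSV = """viewer_id,minutes_watched
a1,42
a2,
a3,17
"""

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(StringIO(text)))

def transform(rows):
    """Cleanse: drop rows with missing values and cast minutes to int."""
    return [
        {"viewer_id": r["viewer_id"], "minutes_watched": int(r["minutes_watched"])}
        for r in rows
        if r["minutes_watched"]
    ]

def load(rows, sink):
    """Append cleansed rows to a sink (a stand-in for a warehouse table)."""
    sink.extend(rows)

warehouse = []
load(transform(extract(RAW_CSV)), warehouse)
```

The same three-stage shape carries over directly to PySpark, where extract/transform/load become DataFrame reads, column operations, and writes.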
Using software and platform engineering tools, you will monitor and maintain automation services, jobs, and server health, and tune the performance of client-facing business intelligence reports and data analytics dashboards.
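A toy sketch of the kind of health-check logic such monitoring might involve; the metric names and thresholds are hypothetical, and a real deployment would pull values from a monitoring agent or CloudWatch rather than a literal dict:

```python
# Hypothetical alert thresholds per metric; real values would be tuned per service.
THRESHOLDS = {"cpu_pct": 90.0, "disk_pct": 85.0, "query_latency_ms": 2000.0}

def check_health(metrics, thresholds=THRESHOLDS):
    """Return the names of metrics that exceed their alert threshold."""
    return [name for name, value in metrics.items()
            if name in thresholds and value > thresholds[name]]

# Example reading from one server: CPU and query latency are over threshold.
alerts = check_health({"cpu_pct": 95.5, "disk_pct": 40.0, "query_latency_ms": 2500.0})
```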
Job Responsibilities:
- Hands on developer to work on data modeling, code, and test iteratively on technical solutions for monitoring and analytics requirements
- Implement ETL processes, UI and dashboards for appropriate trends and alerts in BI tools (Spotfire, Superset, Tableau, etc.) for monitoring
- Implement and monitor automated deployment solutions for cloud hosted monitoring applications
- Work as part of the DevOps team to create CI/CD pipelines and automation services
- Create appropriate documentation (process flows and technical specs)
- Assist in Monitoring platform support and troubleshooting production issues
- Analyze data contained in the existing data lake to define monitoring solutions
- Work from your home office! #remote
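The responsibilities above revolve around orchestrated ETL. As a rough illustration of the dependency ordering an orchestrator such as Airflow provides, here is a minimal sketch in plain Python; the task names are hypothetical, and a real Airflow DAG would be declared with Airflow's own operators and `>>` dependencies rather than like this:

```python
# Hypothetical ETL task graph: each task maps to the tasks it depends on.
DAG = {
    "extract": [],
    "transform": ["extract"],
    "load": ["transform"],
    "refresh_dashboard": ["load"],
}

def run_order(dag):
    """Topologically sort tasks so each runs only after its upstream tasks."""
    order, done = [], set()

    def visit(task):
        if task in done:
            return
        for upstream in dag[task]:  # schedule dependencies first
            visit(upstream)
        done.add(task)
        order.append(task)

    for task in dag:
        visit(task)
    return order
```

Airflow additionally handles scheduling, retries, and backfills on top of this basic ordering guarantee.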
Desired Qualifications:
- BS degree in Computer Science, or equivalent experience
- Architecture, design, implementation, and operational support expertise with Amazon Web Services such as IAM, EMR, EC2, S3, Lambda, RDS (Relational Database Service), and SNS (Simple Notification Service); AWS certifications desired
- Strong problem-solving skills; experience with infrastructure-as-code and deployment automation tools such as Terraform or CloudFormation
- Expertise in Apache Airflow
- Strong database platform background and knowledge, with at least 5 years of experience with large-scale PostgreSQL/MySQL/Hive production environments
- Experience with scripting and data-processing languages such as Python, PySpark, and shell scripting
- Experience with business intelligence tools such as TIBCO Spotfire, Superset and Tableau
- Team player with strong verbal and written communication skills
Required experience:
- High Performance Computing Solutions: 4 years
- AWS Cloud Architecture: 2 years
- Business Intelligence development: 2 years
Tags: Airflow Athena AWS Big Data Business Intelligence CI/CD Computer Science Data Analytics Data pipelines DevOps EC2 Engineering ETL HPC Lambda MySQL Pipelines PostgreSQL PySpark Python Spark Spotfire Tableau Terraform
Perks/benefits: Team events