Data Analytics Developer - Location Flexible

Austin, TX; Remote - US

Applications have closed

Dropbox

Dropbox helps you simplify your workflow. So you can spend more time in your flow.


Role Description 

The Customer Intelligence team builds analytical solutions and generates insights to drive strategy within the Customer Experience organization at Dropbox. Using your technical skills and creativity, and leveraging the best open-source and proprietary tools available, you will develop cutting-edge data pipelines, analytical tools, and reporting solutions. You’ll be involved in projects from inception to delivery, ensuring your work delivers high-quality, relevant data to intelligently grow our business.

Working alongside talented analysts and data scientists, you will champion data-driven insights that add value for our stakeholders across Customer Experience. You're a self-starter with an affinity for problem-solving and deep knowledge of data, data architecture, and analytical tools and approaches. You will work with teams across Customer Experience to understand their evolving requirements, and alongside other analysts on the team to deliver impactful and innovative solutions. You'll use your creativity, curiosity, and drive to build solutions at scale, and to provide training and guidance to team members and partners.

Responsibilities

  • Devise, drive, and deliver technical solutions that generate data-driven insights.
  • Build, automate, and refactor robust, scalable data pipelines and datasets.
  • Troubleshoot complex issues throughout the data lifecycle, and identify and implement effective solutions.
  • Proactively engage with stakeholders across Customer Experience to ensure relevant and accurate delivery of business requirements.
  • Build informative, insightful dashboards and data platforms across different domains and audiences.
  • Create new workflow processes, associated monitoring, and alerts.
  • Contribute to schema design, data consistency, and overall data governance.

Requirements 

  • Bachelor’s degree in Computer Science, Statistics, or related field
  • 3+ years’ work experience in a data-centric role
  • Proficient SQL skills and strong hands-on experience with ETL principles
  • Strong hands-on experience with R and/or Python
  • Ability to tackle and solve complex technical problems with little direction
  • Flexible and adaptable, willing to quickly learn and understand new technologies
  • Strong communication skills
  • Strong problem solving and collaboration skills
  • Excellent debugging and troubleshooting skills

Preferred:
  • Experience with Python, R, Airflow, React/JavaScript
  • Experience with shell scripting and the Linux/Unix command line
  • Experience with sound code management practices and expertise with Git
  • Expertise in data warehousing principles, schema design, data governance, data pipeline automation

Tags: Airflow Computer Science Data pipelines Data Warehousing ETL Git JavaScript Linux Open Source Pipelines Python R React SQL Statistics

Perks/benefits: Flex hours

Regions: Remote/Anywhere North America
Country: United States
