Data Engineer
Remote or San Francisco, California
Applications have closed
Curology
Powerful, personalized skincare formulas prescribed by real dermatology providers. Start the skin quiz to try it for 30 days.

As a team, Data Engineering enables teams throughout the company to make decisions with data they feel confident in. Data underpins virtually everything we do at Curology: from a truly individualized patient experience, to efficient business operations, to cutting-edge Marketing workflows. Data Engineering is the foundation a data-driven company is built on. For this role, we believe in the following:
- Data as a product. We believe the true potential of Foundation teams lies in a product-oriented mindset and that this is even more relevant to data.
- Exceptional impact. Data Engineers are force-multipliers who enable others to work better and faster. Because data is deeply integrated into what we do, this role and team are key to our continued success.
- Modern data stack. We use the best tools for the job, and you will be part of growing and cultivating our modern data stack. With AWS, Snowflake, and our S3 data lake, we build for the future.
- A talented and passionate team. Our small team of data engineers has achieved outsized results by maintaining a high bar for ownership and product quality.
Essential Functions and Impact Areas
- Design, manage and optimize the flow of data throughout the organization.
- Utilize a modern stack to build a cloud-first product.
- Work closely with application and site reliability engineers to ensure data quality, integrity and availability.
- Work with the team to integrate consistent and high coding standards.
- Automate manual processes by working closely with teams like Marketing, BizOps, and Product to discover opportunities for programmatic efficiency.
- Work on initiatives to keep our system elegant and productivity high — such as improving our metrics, analytics, and experimentation infrastructure.
- Keep Privacy and Data Protection (PDP) a first-order consideration in all data work.
- Respond quickly to data strike team requests.
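To make the data-quality responsibility above concrete, here is a minimal, stdlib-only sketch of the kind of batch validation step such a role involves. The table, column names, and checks are hypothetical, and sqlite3 stands in for a warehouse like Snowflake purely for portability:

```python
import sqlite3

def run_quality_checks(conn: sqlite3.Connection, table: str, key_column: str) -> dict:
    """Basic batch data-quality checks: non-empty table, no NULL keys, unique keys."""
    cur = conn.cursor()
    row_count = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    null_keys = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL"
    ).fetchone()[0]
    # COUNT(DISTINCT ...) ignores NULLs, so compare against non-null rows
    distinct_keys = cur.execute(
        f"SELECT COUNT(DISTINCT {key_column}) FROM {table}"
    ).fetchone()[0]
    return {
        "row_count": row_count,
        "null_keys": null_keys,
        "duplicate_keys": (row_count - null_keys) - distinct_keys,
        "passed": row_count > 0 and null_keys == 0 and distinct_keys == row_count,
    }

# Hypothetical example: an in-memory table standing in for a warehouse table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (patient_id INTEGER, signup_date TEXT)")
conn.executemany(
    "INSERT INTO patients VALUES (?, ?)",
    [(1, "2023-01-01"), (2, "2023-01-02"), (3, "2023-01-03")],
)
report = run_quality_checks(conn, "patients", "patient_id")
```

In practice checks like these would run as a task in a scheduled pipeline and fail the run (rather than return a dict) when `passed` is false.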
Competencies and Skills
- At least 3 years' experience building software with Python.
- Write idiomatic Python.
- Write object-oriented, reusable code.
- Familiar with industry-standard coding and documentation best practices.
- At least 4 years' experience modeling SQL and NoSQL data.
- Understand the tradeoffs between different database models.
- Given an analysis problem, suggest data models and key reasons for choices.
- Know major data warehousing tools and concepts.
- Used Airflow or other job schedulers to develop and monitor batch data pipelines.
- Strong expertise in data architecture for efficient storage and retrieval of data, especially in modern usage-based systems.
- Experience with AWS data services.
- Worked with modern database systems, especially Snowflake.
- Worked with permissions and regulated or controlled data (HIPAA/GDPR/CCPA/FDA).
- Experience with the following technologies: Python, MySQL, Snowflake, Airflow, Terraform, AWS
- Passionate about getting the right data to the right person.
- High emotional intelligence and a kind demeanor
- Willing to lead in areas of strength and learn new skills when needed.
- Requires minor to moderate supervision when creating new pipelines or data marts.
- Requires no supervision when editing existing pipelines or data marts.
- Writes SQL that is correct, readable, and efficient.
- Creates data models that correctly address the needs of data storage, access, and usage.
- Has strong expertise in Python and git.
- Has a thorough understanding of the Lattice framework and is able to expand and extend it.
- Is the primary consultant responsible for many different pipelines and data marts.
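As an illustration of the batch-pipeline skills listed above, here is a hedged, stdlib-only sketch of the idempotent "delete-then-insert by partition" pattern commonly used in Airflow-scheduled jobs, so that retries and backfills do not duplicate rows. The table and column names are hypothetical, and sqlite3 stands in for the warehouse:

```python
import sqlite3

def load_daily_partition(conn: sqlite3.Connection, ds: str, rows: list) -> int:
    """Idempotently load one day's partition: re-running for the same
    execution date (ds) replaces that day's rows instead of duplicating them."""
    with conn:  # one transaction: delete and insert commit together
        conn.execute("DELETE FROM orders_fact WHERE order_date = ?", (ds,))
        conn.executemany(
            "INSERT INTO orders_fact (order_id, order_date, amount) VALUES (?, ?, ?)",
            rows,
        )
    return conn.execute(
        "SELECT COUNT(*) FROM orders_fact WHERE order_date = ?", (ds,)
    ).fetchone()[0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders_fact (order_id INTEGER, order_date TEXT, amount REAL)")
day = "2023-05-01"
batch = [(101, day, 29.99), (102, day, 19.99)]
first = load_daily_partition(conn, day, batch)   # initial scheduled run
second = load_daily_partition(conn, day, batch)  # retry/backfill: same row count
```

In an Airflow DAG, a function like this would typically be wrapped in a task parameterized by the execution date, which is what makes backfills safe.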
Additional Optional Skills
- Worked with Apache Spark or other Big Data tools.
- Built systems at scale with AWS Lambda or a similar server-less technology.
- Worked closely with Marketing teams and has a deep understanding of paid channel automation.
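For the serverless experience mentioned above, the core unit of work is a small event handler. The following is a hedged sketch of a Lambda-style handler shape; the event structure is a simplified, hypothetical version of an S3 put notification, and no AWS dependency is needed to illustrate it:

```python
import json

def handler(event: dict, context=None) -> dict:
    """Lambda-style entry point: extract object keys from an S3-style event
    and return a JSON summary of what was processed."""
    keys = [rec["s3"]["object"]["key"] for rec in event.get("Records", [])]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(keys), "keys": keys}),
    }

# Hypothetical sample event mimicking a simplified S3 notification
sample_event = {
    "Records": [
        {"s3": {"object": {"key": "landing/2023-05-01/orders.json"}}},
    ]
}
result = handler(sample_event)
```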
Physical Characteristics of Work Environment
- Remote-based work environment
- Typing and computer usage throughout the day
- 9-5 business hours, M-F (business hours may depend on regional and local time zones)
- Occasional travel requirements to office or offsite
This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.
This job description is not designed to cover or contain a comprehensive listing of activities, duties or responsibilities that are required of the employee. Other duties, responsibilities and activities may change or be assigned at any time with or without notice.
Curology encourages applications from people of all races, religions, national origins, genders, sexual orientations, gender identities, gender expressions and ages, as well as veterans and individuals with disabilities. Notice to Applicants under the CCPA.