Data Engineer Intern
United States
Lacework
At Lacework, we strive to provide a supportive, collaborative environment where people are empowered to do the best work of their careers.
Our team members enjoy solving complex problems, thinking big, and obsessing over getting the details right—all while building bonds of teamwork and friendships that last a lifetime. We love what we do and are proud of our work to secure clouds and container environments for thousands of users worldwide.
Lacework is changing cloud security forever, with data & automation. The Lacework security platform, powered by Polygraph, automates cloud security at scale so our customers can innovate with speed and safety. Polygraph is the only security platform that can collect, analyze, and correlate a variety of data across an organization’s AWS, Azure, and GCP environments, and narrow it down to a handful of security events that matter.
The Data Science & Engineering (DSE) team works on some of the most complex problems in security efficacy, alert detection, product adoption, user engagement, customer health, and cloud efficiency, to name a few areas. We use a variety of Data Science and Data Engineering techniques to solve these problems and help the company build the next generation of security solutions & features.
The Role: We are looking for a Data Engineer Intern to help the DSE team explore & exploit newer datasets & technologies. This is a hands-on engineering role with the potential to deliver a large impact within a few months of work. You will be responsible for delivering a tangible scope of work involving:
- Scalable data pipelines, datasets and models, forming the data foundation that can accelerate work across multiple use cases
- Tools & technologies such as Snowflake, Python, Spark, Parquet/Iceberg, Airflow and others, in a cloud native environment
- Data modeling & data quality techniques to build resilient and accurate datasets
- Actionable insights delivered via visualizations in Tableau, Sigma and Grafana
- Cross-functional partnerships & collaboration to explore, understand, and define the problem space, as well as come up with suitable solutions
- Effective communication to present your work to the engineering organization & leadership
Minimum Qualifications
- Currently enrolled in a degree program in a quantitative field (e.g. Computer Science, Engineering, Mathematics, Statistics, or another related field)
- Experience in building data pipelines using batch or real-time techniques
- Experience in profiling & analyzing data to discover anomalies that need addressing or patterns that can be leveraged
- Experience in programming languages such as Python, Scala, Java (or others) and SQL
Preferred Qualifications
- Everything listed in Minimum Qualifications
- Experience in handling large datasets in cloud environments such as Snowflake or Spark
- Experience in cloud platforms such as AWS, GCP, Azure
- Experience in Cloud Security or Infrastructure Security
Lacework is an Equal Opportunity Employer. It is the policy of Lacework to provide equal employment opportunity to all persons, regardless of age, race, religion, color, national origin, sex, political affiliations, marital status, non-disqualifying physical or mental disability, sexual orientation, membership or non-membership in an employee organization, or on the basis of personal favoritism or other non-merit factors, except where otherwise provided by law.