Data Engineer
Geneva / Annecy / Bochum
Applications have closed
SonarSource
Accelerate Clean Code for developers and teams to enable clear, readable, understandable, maintainable, portable, reliable, and secure code standards across your organization. The Data Team is looking for a passionate Data Engineer to support the fast growth of the data platform. By joining us, you will be part of an autonomous team and have a say in what you work on. You will be supported by great colleagues who are always willing to exchange knowledge. Using your skills, you will contribute to the development and professional growth of the team and focus on what matters.
The Impact you can have
We want to make the best decisions we can for our products, and to do that we want those decisions to be data-driven. To achieve this goal, we are looking for a talented Data Engineer - you! - to join the Data team.
- Take technical ownership of the data platform and produce high-quality self-served components that the domain teams will be able to use autonomously
- Get involved in the definition of new features and technologies which will have an impact on all SonarSource's teams as well as hundreds of thousands of developers around the world
- Help improve the team's effectiveness by challenging and continuously improving existing processes, designs, and engineering practices
- Have an opportunity to contribute to the growth of the team by participating in the recruitment process
- Lead by example, and mentor and grow more junior members of the team
On a daily basis
You will design and develop the data platform to support the functional domains in handling their data, following the state of the art, while helping them in their journey to data ownership. This means:
- Design and implement streaming data capabilities following Data Mesh principles
- Continuously enhance the data platform to offer data building blocks as a service according to stakeholders' needs (Product Managers, Marketing, Sales, other development teams, ...)
- Establish top-class end-to-end pipelines including data cleaning and modeling
- Develop and deploy connectors to data sources of interest
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Maintain evergreen data flow documentation across systems
- Collaborate on Data Governance with other stakeholders/domains
- Implement appropriate entitlement controls around data access
- Work with the security function to ensure we adhere to expected security standards
- Collaborate with the incident manager, bringing your expertise when asked, and own the implementation of recommendations from post-mortems
The skills you will demonstrate
You have solid experience in software engineering, with a focus on developing analytics components in the cloud. You have been exposed to big data architecture essentials: streaming, data manipulation using SQL, ETL batches, data store administration, and more. You see yourself as a developer working on data and are motivated by both the technical and functional aspects of services. You are a friendly, enthusiastic, and organized team player who actively shares knowledge and gives and receives feedback to improve the team and yourself.
- Strong experience building end-to-end ETL or ELT pipelines with batch or streaming technologies in Python
- Strong experience with the AWS platform, including but not limited to services such as MSK, Glue, Redshift, CloudFormation, S3, CloudWatch, and Lambda, or equivalents
- Good knowledge of data engineering concepts (Data Lake, Data Warehouse, etc.)
- Experience querying, manipulating and modeling data with SQL. Knowledge of dbt a plus
- Experience with Infrastructure as code tools (Terraform, CloudFormation, CDK)
- Knowledge of how to use visualization tools such as, but not limited to, Tableau, AWS Insights, or Looker
- You are fluent in English, both written and spoken
Why you will love it here
- Safe work culture - we value respect, kindness, and the right to fail.
- Flexible hours - we schedule our days in order to be effective at work, while also being able to enjoy life's important moments.
- Great people - we value people skills as much as technical skills and strive to keep things friendly and laid back. Still, that does not prevent us from being passionate leaders in our domains. Our 300+ SonarSourcers from 33 different nationalities can relate!
- Work-life balance - keeping a healthy work-life balance is important, which is why we have a hybrid work policy and some people prefer working some days from home.
- Always keep learning - in an ever-changing industry, learning new skills is a must, and we're happy to help our team acquire them.
What we do
Sonar was started by a team of developers that wanted to change the way code is built in an agile development process. The company was created to develop the open-source tool SonarQube, which is now the standard in code quality management with over 350,000 instances deployed today. Every day we are focused on solving developers’ next big problem.
Who we are
At Sonar we believe in people, excellence, and delivery. We’re a team of problem solvers and overachievers who seek out others who are also passionate and relentless in their respective missions. We want to work with people who are ready to fasten their seat belts and be part of an incredible ride. We work hard not because we’re told to, but because we genuinely love what we do and do what we love. If there’s one main message we want you to remember about us, it’s that we push others to be best in class at whatever they do: choose your battle, innovate, take risks, and lead change. Join us; we’ll be smarter and stronger together.
If this sounds like you, apply now!