Data Scientist - 4201

REMOTE

Applications have closed

We are looking for a Data Scientist (4201-#LI-KA1) to work for us remotely from any European country.

Job Responsibilities

  • Development of features to enhance the user experience.
  • Building reusable code for future use.
  • Understanding and implementation of security and data protection.
  • Cross-platform optimisation.
  • Implement core business logic.
  • Optimisation of the application for maximum speed and scalability.
  • Maintenance of code integrity and organisation.
  • Writing clean, functional code.
  • Compile and analyse data, processes, and code to troubleshoot problems and identify areas for improvement.
  • Testing and fixing bugs or other coding issues.
  • Coordination of the implementation of the technical architecture.
  • Participation in technical working groups, progress meetings and meetings with the users.
  • Design and development of architecture components and building blocks using cloud components.
  • Translating project-specific needs into a cloud structure that suits those needs, taking resource usage and scalability into account.
  • Designing the cloud environment from a holistic point of view, ensuring it meets all of the company's requirements.
  • Carrying out deployment, maintenance, monitoring, and management tasks within this implemented cloud structure.
  • Completing the integration of new applications into the cloud environment.
  • Monitoring cloud resource usage and making sure the structure is operating at full efficiency.
  • Conducting migration, where necessary, to bring tools and other elements into the cloud.
  • Maintenance of cloud services.
  • Operational support for cloud services.
  • Setting up and maintaining the required tools for IaC (infrastructure as code).
  • Creating and documenting IaC working procedures and processes.
  • Development of infrastructure code.
  • Testing infrastructure code and developing automated tests.
  • Interaction with business analysts, customers, users, project leaders and developers.
  • Production of technical documentation.
  • Participation in meetings with the various stakeholders.
  • Periodic reporting on the work performed.

Job Requirements

  • Good knowledge of data processing architecture best practices and patterns.
  • Good knowledge of data processing tools in the cloud.
  • Good knowledge of modelling tools.
  • Experience in administering systems in a virtualized Cloud environment.
  • Strong understanding of security, risk and compliance frameworks, disaster recovery, high availability architectures, hardware, operating systems and networking connectivity.
  • Good knowledge of cloud infrastructure setups from vendors such as, but not limited to, Azure and Amazon.
  • Analysis and problem-solving skills.
  • Capability to write clear and structured technical documents.
  • Ability to participate in technical meetings, and very good communication skills with technical and non-technical audiences.
  • Fluency in English both in oral and written expression.

Due to the particular nature of the client, candidates should also have the following non-technical skills:

  • Capability to integrate into an international/multicultural environment, rapid self-starting capability and experience working in a team;
  • Ability to participate in multilingual meetings;
  • Ability to work in a multicultural environment, on multiple large projects;
  • Excellent team player;
  • High degree of discretion and integrity.

SPECIFIC EXPERTISE

  • At least 4 years of specific expertise with data processing languages such as R, Python, Scala or Julia (min. competence level 4).
  • At least 2 years of specific expertise with distributed computing tools such as Spark, Hadoop or Hive (min. competence level 3).
  • At least 2 years of specific expertise with cloud services operation, preferably AWS (min. competence level 3).
  • Good knowledge of architectural design and implementation of scalable modern data stores.
  • Knowledge of data management in PostgreSQL and Elasticsearch.
  • Knowledge of code versioning tools such as Git or GitLab.

CERTIFICATIONS & STANDARDS:

  • Microsoft Certified Data Scientist, or
  • Microsoft Certified Data Engineer, or
  • AWS Certified Data Analytics, or
  • AWS Certified Big Data

The following documents/procedures will be requested to successfully complete the hiring process:

  • A copy of your university degree(s)
  • A copy of your criminal record
  • Security Clearance Procedure

 

EU nationality is mandatory for all our job openings.
We can also consider candidates who hold a work permit for the location where the client is based or where the job is to be performed.

 

WHO ARE WE?

CRI Group belongs to VASS GROUP as of November 2021 (https://vasscompany.com/en/).

VASS is a leading digital solutions group of companies headquartered in Madrid, Spain, present in 25 countries in Europe, the Americas and Asia with more than 4,300 professionals.

VASS helps large companies in their digital transformation process, developing and executing the most innovative and scalable projects, from strategy to operations.

All our growth comes from our talented people, passion for innovation, and a constant search for improvement, always the VASS way: “Complex made simple”.
