Big Data Software Engineer

Paiania, Attica, Greece

Intracom Telecom


INTRACOM TELECOM is a global telecommunication systems and solutions vendor operating for over 40 years in the market. The company innovates in the wireless access and transmission field, offers a competitive telco software solutions portfolio and combines its offerings with a complete range of professional services.

Our mission is to shape the future through technology, and we recognize that human capital is the key factor in achieving this in today's business environment. Our company's highly specialized and experienced personnel are pivotal to achieving demanding objectives and advancing the capabilities of the company to better serve its customers. Within this framework, we are looking for an agile and highly motivated “Big Data Engineer” to join a team of future shapers.
The candidate will be part of a team that is passionate about the applications and libraries they engineer, and will be deeply involved in the design and implementation of applications that facilitate the flow, transformation and storage of various data, securely and at scale. We are keen to work with talented engineers in a highly collaborative working environment. This is a hands-on role where you will be expected to contribute at every level of the design, development and implementation.


Responsibilities

  • Develop applications and data pipelines from various data repositories and APIs across the enterprise, ensuring data quality and application efficiency
  • Develop relational and NoSQL data models that conform data to users’ needs
  • Participate in large-scale client engagements
  • Help investigate performance and scalability issues in large-scale data lake environments
  • Work with Data Scientists and fellow engineers to develop, extend and scale applications according to requirements
  • Collaborate with multi-disciplinary research groups in the context of collaborative research and innovation projects
  • Apply good organizational and time-management skills, with the ability to prioritize and complete multiple complex projects

Requirements

  • A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • Proficiency in at least one programming language (Java or Python)
  • SQL programming and optimization
  • Understanding of column stores and modern binary file formats for optimal data storage (e.g. Parquet, Avro)
  • Experience with distributed file systems (cloud-based or on-premises)
  • Knowledge of, and some basic experience with, distributed data processing frameworks (Spark or Flink)
  • Experience with GNU/Linux and basic network skills
  • Excellent communication, analytical and problem-solving skills

Considered a plus

  • Software engineering tools such as Git and CI/CD (e.g. Jenkins/GitLab)
  • Relational data modelling for Transactional and Data Warehouse schemas
  • Knowledge of ELK / Grafana / Prometheus
  • Knowledge of Docker and Kubernetes
  • BI Analytics and experience with data visualization solutions
  • Workflow / Scheduling frameworks
  • Java frameworks and tools such as Spring, Java EE, and Web Services (SOAP/REST)
  • Python frameworks like Flask, Django, Celery, NumPy, Pandas
  • Scala, JavaScript, C, C++, Rust, Go
  • Experience in designing, building, tuning, testing & validating statistical and ML/AI algorithms/models
  • An automation engine, such as Ansible or SaltStack
  • Test-driven development

Benefits

INTRACOM TELECOM provides an excellent working environment which encourages team spirit, cooperation and continuous learning, in which the career prospects depend on each employee’s performance. Remuneration is competitive and aligned with the company’s credo “our competitive advantage is our human capital”.

Education and continuous personal improvement are major priorities for the company, enabling it to keep abreast of technological evolution and to maintain its high growth rate and strategic position.

