DevOps Engineer

Eddyfi/NDT

Software Engineering
Barcelona, Spain
Posted on Sep 13, 2025
Purpose:

The goal of the Big Data DevOps Engineer is to automate, secure, and build reliable, scalable systems. You will develop scripts and tools to automate deployment tasks, monitor critical aspects of the operation, and resolve engineering problems and incidents. You will collaborate with architects and developers to help create platforms for the future, and work closely with Big Data and Software engineers to understand our applications and services and keep them running smoothly.

Tasks:
  • Plan and maintain infrastructure in the Big Data and Software environments
  • Mentor, support, and coach colleagues, Data Scientists, and Engineers on Big Data and Data Engineering topics
  • Automate tasks and monitor systems to continuously improve our platform
  • Design and prepare the Big Data platform for machine learning and data processing in production environments
  • Document developed solutions and maintain consistency of information
  • Make the infrastructure resilient and efficient
Requirements:
  • A bachelor's degree in Computer Science, Engineering, or a related field
  • A minimum of 3 years of advanced experience with Docker and Kubernetes environments (OKD, OpenShift, Rancher)
  • Advanced experience with Linux and scripting languages (Bash, Python)
  • Solid understanding of virtualization and container technologies (e.g., writing Dockerfiles, managing registries, troubleshooting containers and VMs)
  • Intermediate experience working in cloud environments (AWS or Azure)
  • Intermediate experience with NoSQL databases (Redis) and SQL databases (MySQL, Microsoft SQL Server)
  • Experience with Apache NiFi is a plus

Skillset

  • Proven ability to design, automate, and maintain scalable infrastructure for Big Data and Software environments
  • Skilled in monitoring systems and implementing automation to improve reliability and performance
  • Experience preparing Big Data platforms for machine learning and data processing in production
  • Strong collaboration skills to work with architects, developers, and cross-functional teams
  • Capable of documenting solutions and maintaining consistency across systems and environments
  • Committed to building resilient and efficient infrastructure