Data Engineer

Eddyfi/NDT

Software Engineering, Data Science

Calgary, AB, Canada

Posted on Apr 30, 2026

Data Engineer

Goal / Purpose of the Role

The Data Engineer will design, build, and maintain scalable data pipelines that move data efficiently and securely from source to destination. The Data Engineer is also responsible for automating processes and tasks, which may include writing scripts to move files between locations or developing algorithms to process data more efficiently.

Job Overview / Key Responsibilities

Design and build data pipelines by creating systems to collect, move, and transform data from various sources into a usable format.

Manage data storage by constructing and maintaining databases, data warehouses, and data lakes to store and organize data securely and efficiently.

Automate processes by building API-driven integrations that move data seamlessly across systems.

Collaborate with teams including Data Architects, BI teams, and other stakeholders to understand their data needs and build the necessary infrastructure.

Ensure data quality by implementing Master Data Management and related processes to clean and validate data and to monitor reliability and performance.

Maintain infrastructure by ensuring the data infrastructure is scalable, secure, and compliant with company policies. Monitor and troubleshoot data workflows to ensure high availability and performance. Develop and maintain documentation for data processes and architecture.

Qualifications / Education

Bachelor's degree in Computer Science, Data Engineering, or a related field.

Must be proficient in spoken and written English.

Experience

Minimum of four years of experience working within a Data Engineering team building data automation solutions.

Strong proficiency in SQL and experience with relational and NoSQL databases.

Hands-on experience with ETL tools and frameworks such as SSIS or Azure Data Factory.

Familiarity with data warehousing concepts and tools such as Snowflake or BigQuery.

Knowledge of Python or other scripting languages used for data processing.

Experience working with cloud services such as Azure or AWS.

Understanding of data governance, security, and compliance best practices.

Exposure to machine learning pipelines and advanced analytics environments is preferred.

Strong problem-solving and communication skills.

Competencies / Behaviors

Solutions Oriented Mindset
Demonstrates resilience and analytical thinking to break down complex issues and develop viable solutions.

Teamwork
Fosters a collaborative environment where teamwork is valued and contributes to the success of the team, department, and organization.

Customer Mindset
Places internal and external customers first and proactively anticipates and exceeds expectations.

Communication
Builds and maintains strong communication practices that foster trusted business relationships.

Accountability
Takes ownership and accountability for responsibilities and outcomes while demonstrating a strong work ethic.

Safety Mindset
Maintains a strong awareness and commitment to safety in all aspects of work.

Global Mindset
Promotes integrity, respect for individual differences, and cultural diversity in a respectful and inclusive workplace.

Leading for High Performance
Builds and supports high-performing teams through engagement, collaboration, and reliability.

Decision Making
Demonstrates the ability to make sound, evidence-based decisions while assessing risks related to delivery and execution.

Business Acumen
Understands business operations and the competitive environment while striving for results and responding effectively to challenges.

Inspire and Value People
Promotes an inclusive environment that encourages individuals to perform at their best while supporting professional development.

Strategic Leadership and Innovation
Encourages strategic thinking and champions innovative ideas, methods, products, and solutions.

Software / Technology / Equipment

Advanced knowledge of ETL tools such as SSIS, Azure Data Factory, or equivalent.

Advanced knowledge of SQL and NoSQL databases.

Intermediate knowledge of cloud services such as Azure or AWS.

Basic knowledge of Python or R scripting and machine learning pipelines.

Basic knowledge of analytical tools such as Power BI or similar platforms.

Work Environment

This role operates within a general office environment.

Occasional travel may be required for deployment or training purposes.