Flexiti is one of Canada’s fastest-growing fintech lenders. We aim to make our customers’ lives more affordable and help our retail partners grow their sales by offering flexible financing options. Through our award-winning omni-channel platform, customers can be approved instantly to shop with their FlexitiCard®, which they can use online or in-store to make multiple purchases, within their credit limit, without needing to reapply.
At Flexiti, we work hard, we love what we do, and we have some fun along the way! If you are looking for an energizing and innovative work environment with great people and big ideas, we’d love to have you join us!
To learn more about Flexiti, please visit www.flexiti.com
Flexiti is looking for a resourceful person who will be responsible for end-to-end ELT/ETL development of data into our Data Lake, Data Warehouse, and Data Mart environments. The individual should be experienced in the design, development, testing, review, and implementation of ELT/ETL code in a manner that enables sustainability, maintainability, repeatability, and scalability.
What you will be doing:
- Analyze business, IT, and data requirements to design ELT/ETL processes feeding our data environments.
- Leverage a combination of Python, PySpark, and SQL in our Databricks environment to build end-to-end ELT/ETL processes that power data environments for analytics, reporting, operations, finance, machine learning, and more.
- Work with a variety of structured, semi-structured, and unstructured data coming from a variety of upstream sources.
- Assist in the QA process to ensure the ELT/ETL pipelines you build are adequately unit tested, creating a trusted gold data layer for downstream consumption.
- Optimize code as needed to keep the environment running efficiently.
- Collaborate with analytics and business teams to improve and enhance data models that feed business intelligence tools.
- Partner with stakeholders to understand their business drivers so you can build the best possible data environment to meet their needs.
- Work on high visibility projects that drive business transformation and growth.
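To give a flavor of the unit-testing discipline behind the "trusted gold layer" responsibility above, here is a minimal illustrative sketch. The function and field names are hypothetical, and in our Databricks environment this kind of logic would normally operate on PySpark DataFrames rather than plain dictionaries; the point is simply that transformations are written as small, testable units before data is promoted downstream.

```python
from typing import Optional

# Hypothetical cleaning step of the kind an ELT/ETL developer unit-tests
# before promoting raw purchase data to a trusted "gold" layer.
def clean_purchase(record: dict) -> Optional[dict]:
    """Validate and normalize one raw purchase record; return None if unusable."""
    if record.get("purchase_id") is None:
        return None  # reject rows missing a primary key
    try:
        amount = round(float(record["amount"]), 2)
    except (KeyError, TypeError, ValueError):
        return None  # reject rows with a missing or non-numeric amount
    return {
        "purchase_id": str(record["purchase_id"]).strip(),
        "amount": amount,
        "channel": (record.get("channel") or "unknown").lower(),
    }
```

A unit test for this step is a one-line assertion, e.g. `assert clean_purchase({"amount": "5"}) is None`, which is what "adequately unit tested" looks like in practice at the function level.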
Why you would love to work here:
- You’ll be a part of an award-winning, fast-growing company
- Our innovative culture promotes ongoing learning opportunities with training and mentorship
- Competitive compensation package commensurate with experience, plus benefits
- Comprehensive drug/medical/dental insurance
- Remote working capability across Argentina
- Flexible time off in addition to public holidays, and a paid volunteer day
What you should have:
- Bachelor's degree in Computer Science, a related technical degree, or equivalent experience.
- 3+ years of experience in an ELT/ETL developer role using Python/PySpark for data transformation and data pipelining.
- Experience collaborating with Data Modelers and Data Architects to design data environments.
- Wide exposure to technologies such as SQL databases, NoSQL, cloud environments, containerization, visualization tools, microservices, streaming, APIs, SFTP, etc.
- Experience in Waterfall and Agile project delivery methodologies.
- Experience with version control software and CI/CD.
- Excellent communication skills and a demonstrated ability to work with business partners, understand their requirements, and deliver results that exceed their expectations.
- Experience working in a Databricks environment in a Data Engineering capacity is an asset.
- Relevant certification is an asset.
- Financial services experience is an asset.
Flexiti embraces diversity. We are committed to building a team that represents a variety of backgrounds, perspectives, and skills. All qualified applicants will receive consideration without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, gender expression, disability, age, marital status, or family status. If you require disability-related accommodation during the application or interview process, simply let us know and we’ll work with you to ensure you have a positive experience.