Principal Data Scientist
Flexiti Financial
Data Science
Canada · North York, Toronto, ON, Canada
What’s in it for you as an employee of QFG?
- Health & wellbeing resources and programs
- Paid vacation, personal, and sick days for work-life balance
- Competitive compensation and benefits packages
- Work-life balance in a hybrid environment with at least 3 days in office
- Career growth and development opportunities
- Opportunities to contribute to community causes
- Work with diverse team members in an inclusive and collaborative environment
What’s it like working in Growth Portfolio?
Most financial institutions are built to maintain; we are building to disrupt. As part of the Growth Portfolio, you join a small, highly talented team where Questrade's most ambitious ideas go from 0 to 1. We operate with a specific mandate: expand our risk tolerance, venture where others hesitate, never settle for the status quo, and build the tools that will redefine financial success for Canadians. We work on two fronts: pioneering entirely new ventures while reinventing existing product lines. This is a high-visibility, high-impact environment for individuals who thrive in startup and scale-up cultures. The Growth Portfolio operates like a startup, with Questrade as its incubator. You won't just execute an ambitious roadmap; you will shape it.
We’re looking for our next Principal Data Scientist. Could It Be You?
The Alternative Investments Team is launching new crypto and alternative product capabilities. We're looking for a subject matter expert (SME) in Data and AI Engineering within the Growth Portfolio who can take us from 0→1 on the data value chain: implementing and maintaining flows into BigQuery, building compliance-grade reporting, establishing evergreen dashboards, and enabling self-serve analytics for the team.
We are not starting from scratch: Alternative Investing is already live with Precious Metals trading, and we have existing patterns for shipping data for analytics, creating views, and building dashboards/reporting pipelines. The expectation is that you build on those patterns and accelerate, then push us into the next maturity level. Once the foundation runs itself, you shift into higher-leverage work: predictive analytics and forecasting that sharpen business strategy, data-driven experimentation that makes our products better, and AI-powered automation that makes our operations faster. You resolve complex technical challenges within the existing data framework, and you adapt and optimize AI tools to meet evolving compliance and reporting requirements for crypto products.
Need more details? Keep reading…
In this role, responsibilities include but are not limited to:
- Reporting & compliance foundation (near-term): Build and operate daily reconciliation and supervision/surveillance reporting flows for crypto, derivatives, and precious metals products and business lines; ensure outputs are reliable, traceable, and delivered via approved secure mechanisms.
- Data integration & modeling: Ingest, model, and curate core datasets in BigQuery; define metric definitions with clear grains and ownership.
- Dashboards & monitoring: Execute the full lifecycle from data pipeline to shipped dashboard. Whether it's a Looker explore for business users, a Grafana panel for real-time ops, or a custom PyShiny app for interactive decision-making, you pick the right tool, build it, deploy it, and keep it running. We care about how you think and whether the result actually provides value.
- Self-serve enablement: Build and maintain queryable tables/views, documentation, interfaces, and examples so internal users can answer questions, explore data, research ideas, and generate new insights. Rather than becoming the analyst bottleneck, you make data insights possible without one.
- Production engineering & delivery: Write and ship production code. Collaborate on shared codebases with software engineers, deploy through CI/CD pipelines, and monitor services on scalable infrastructure (GKE/Kubernetes). You're comfortable with end-to-end delivery of simple full-stack solutions for internal users.
- Stakeholder partnership & problem solving: Engage directly with compliance, ops, product, and business leaders with an owner's mindset to understand their needs, diagnose issues, and deliver solutions. You don't wait for a ticket; you solicit feedback, discover problems, and fix them.
- Automation & AI that compounds: Automate recurring workflows, shrink manual operational load, and then turn that same instinct on your own process. Use AI to draft pipelines, generate tests, explore data faster, and rapidly prototype solutions that would have taken weeks in 2023. Your goal is to make the "must-have" work boring so you can spend your time on the work that actually moves the business.
- Stack pragmatism: Our baseline today is BigQuery + Looker + Python microservices in GKE. We care about outcomes, not stale patterns. If you see a better way, you have the autonomy to shape direction, and the pragmatism to work with legacy systems when that's the fastest path.
So are YOU our next Principal Data Scientist? You are if you have…
- Strong SQL + Python and a track record of owning production data pipelines and analytics outputs end-to-end
- Experience with modern cloud data analytics platforms (GCP/BigQuery, Databricks, Snowflake, etc.)
- Strong data modeling instincts and the ability to create stable metric definitions
- Experience building dashboards that people actually use and evolving them with the business
- Production software engineering skills: you write code that ships, you collaborate on codebases via Git, and you're comfortable with CI/CD and container orchestration (Docker, Kubernetes/GKE)
- Comfort operating in regulated environments with strong privacy/PII awareness and auditability discipline
- Exceptional written and verbal communication skills; you can translate "data speak" into business decisions
- High autonomy: you can prioritize, ship thin slices, and navigate enterprise constraints without stalling
- An AI-first mindset: concrete examples of using AI to accelerate your engineering/analytics work, automate repetitive tasks, and continuously improve your own workflow
Additional kudos if you have…
- Product analytics instrumentation/experimentation experience (e.g., Amplitude)
- Time-series forecasting or operational modelling experience
- Experience with GCP data services beyond BigQuery (e.g., Dataflow, Composer/Airflow, Pub/Sub) or a similar suite in AWS/Azure
- A track record of enabling self-serve analytics / data democratization in complex orgs
- Enterprise or hobby projects building LLM applications with an agent framework (e.g., LangChain/LangGraph, Google's ADK, Open-Claw, etc.). Bonus points for explaining how your evals, traces, and LLMOps make it robust!
Compensation Information:
- Base salary range: $130,000 - $170,000
- The final compensation package will be commensurate with the successful candidate's experience, skills, and geographic location (Canada). It includes a comprehensive benefits plan and a competitive incentive (bonus) program for Full-Time Permanent roles.
- Vacancy statement: This job posting is for an existing vacancy.
- AI in hiring: QFG's Applicant Tracking System utilizes artificial intelligence (AI) for application screening. The AI system operates on predetermined criteria, with final decisions subject to human review.
At Questrade Financial Group of Companies, with multiple office locations around the world, we are committed to fostering a diverse, inclusive and accessible work environment.
Sounds like you? Click below to apply!
#LI-JW1 #LI-Hybrid
