Data Engineer
Tactable
Our Mission at Tactable
What if you could build software that didn’t just scale, but transformed entire organizations?
At Tactable, this is what drives us. We’re building a world-class cloud, data, and API engineering firm on a mission to power the most influential tech of tomorrow through expert-led delivery, strong partnerships, and a relentless focus on quality.
We don’t just consult. We build.
Founded by engineers who care deeply about people, process, and product, we go beyond just solving problems. We embed with clients, work across full project lifecycles, and operate at the speed of startups while upholding the rigor of enterprise-grade engineering.
From financial institutions to emerging tech ventures, our work is behind some of the most mission-critical systems in production today, and we’re just getting started. With growing demand from top-tier clients and a strong runway for expansion, we’re building a team of curious, ambitious developers who want to build meaningful things with meaningful people.
Take a look at tactable.io to learn more about our work and what sets us apart.
The Role: Data Engineer (Cloud, Streaming)
This is a hands-on, hybrid role designed for data engineers who thrive at the intersection of infrastructure, pipelines, and product. As a Data Engineer, you’ll be part of a deeply technical team solving real-world data challenges for enterprise clients, from building real-time Kafka pipelines and distributed systems to optimizing large-scale workflows in production environments.
You’ll have the opportunity to own architectural decisions, drive platform migrations, and mentor others across data and software engineering domains. Whether you're scaling a trading system or modernizing a legacy data warehouse, your work will be high-impact and highly visible.
If you care about clean pipelines, scalable architecture, and building systems that power real users, you’ll thrive at Tactable.
What You’ll Do
End-to-End Data Engineering
- Build scalable, reliable ETL/ELT pipelines using PySpark and SQL
- Design and optimize real-time data pipelines using Kafka, Java Spring, and streaming APIs
- Integrate diverse data sources across Azure Blob Storage, ADLS, and both relational and non-relational systems
- Migrate legacy data architectures into modern, cloud-native environments
- Optimize workloads for performance, cost, and fault tolerance
Technical Leadership
- Own critical components of infrastructure across batch and streaming systems
- Break down complex technical challenges into executable components
- Mentor junior engineers through code reviews, architectural guidance, and technical pairing
- Automate and standardize pipelines to reduce overhead and increase velocity
- Contribute to internal tooling, documentation, and engineering best practices
Client & Project Exposure
- Work directly with client engineering teams in finance, trading, and enterprise tech
- Rotate across domains and industries every 6–12 months, gaining broad exposure to systems and technologies
- Tackle greenfield builds, system migrations, and enterprise-scale data transformations
- Engage as a true partner, not a vendor, in shaping data product strategy
What You Bring
Must-Have Experience
- Extensive experience in software or data engineering roles
- Strong programming skills in Java and Python
- Deep understanding of Kafka, event-driven architectures, and streaming frameworks
- Deep experience with Spark/PySpark and large-scale production data systems
- Strong SQL and experience working with both structured and unstructured data
- Hands-on experience with cloud infrastructure (preferably Azure) and CI/CD pipelines
- Familiarity with data orchestration and transformation tools (e.g., Airflow, dbt) and containerized environments
Bonus Points For
- Experience with full-stack development (e.g., React) and building data-driven frontends
- Prior work in finance, trading, or other regulated/high-performance environments
- Strong documentation, collaboration, and knowledge-sharing mindset
- Experience with Databricks
Why You’ll Love This Role
- Growth-First Culture: From custom career paths to project rotation, we design roles around your goals, not just business needs.
- Full Ownership & Impact: You’ll own critical parts of delivery, architecture, and technical decision-making. No red tape, no silos.
- Tight-Knit Team: We’ve built a culture of trust, collaboration, and curiosity. Whether it’s team lunches, hack days, or a new internal tool, we move as one unit.
- Real Work, Real Users: We’re not building MVPs that sit on a shelf. You’ll work on systems that millions rely on, every day.
- Flexibility with Structure: We’re a hybrid-first team with a strong appreciation for in-office collaboration, especially at our downtown Toronto HQ. We encourage in-person presence to foster mentorship and connection.
Why This Might Not Be a Fit
- You’re looking for narrowly scoped responsibilities or long-term focus on a single domain
- You prefer a rigid hierarchy with formal titles and isolated workstreams
- You’re more comfortable in a vendor-style delivery model than deep technical partnerships
Compensation, Benefits & Perks
- Salary Range: Competitive and flexible, based on experience and fit for the role.
- Comprehensive health and dental plan
- Generous PTO and holidays
- Laptop & home office equipment provided
- Career coaching and personalized development plans
- Regular social events, team outings, and wellness activities
Ready to Build the Next Generation of Data Infrastructure?
No cover letter required. Just apply, and let’s start building.