
Junior Data Engineer

Citylitics

Data Science
Toronto, ON, Canada
Posted on Apr 4, 2024

About Citylitics

Citylitics delivers predictive intelligence on local utility & public infrastructure markets

What is infrastructure? It is the roadways you rely on to safely get to Grandma's house, the potable water that comes out of your kitchen tap to wash your family's food, and the energy that heats our homes and powers our digital lifestyles.

Every year, trillions of dollars are spent on all areas of infrastructure to maintain our quality of life and move our economy forward. However, our infrastructure is no longer equipped to meet the needs of the future. We hear about infrastructure failures, whether bridge collapses, power blackouts, or water main breaks, every day in the news. Climate change and extreme weather events are disrupting the basic infrastructure we have taken for granted for years.

Citylitics is solving the hardest data problems in infrastructure while building the sales intelligence platform that enables a faster, more transparent, and more efficient infrastructure marketplace. We turn millions of unstructured documents into high-value intelligence feeds and datasets, delivered through an intuitive user experience. Our goal is to enable solution providers to connect with cities that have relevant infrastructure needs faster and more digitally than through historic market channels. As more companies adopt our platform, cities & utilities will be able to access solutions that deliver on the promise of moving towards a more resilient, sustainable, and equitable infrastructure future.

Who Are We Looking For?

We are seeking a Junior Data Engineer with an analytical mindset and a passion for using data to drive decision-making. As a member of our team, you will play a vital role in developing and interpreting data insights, as well as building and maintaining data pipelines and dashboards.

What Will You Accomplish?

  • Data Pipeline Development and Maintenance:
    • Design, develop, and maintain data pipelines using Airflow, Django, and SQL to ensure efficient extraction, transformation, and loading (ETL) of data (see the sketch after this list).
  • Stakeholder Collaboration:
    • Collaborate with stakeholders to gather data requirements and translate them into technical solutions.
  • Data Integrity and Quality Control:
    • Ensure the integrity, quality, and security of data throughout the ETL process.
  • Database Optimization:
    • Assist in the optimization and performance tuning of database queries and processes.
  • Integration and Orchestration:
    • Support the integration of Airflow/Cloud Composer into existing data workflows.
  • Dashboard Development:
    • Build and maintain dashboards (in Looker Studio and Dash) to visualize and communicate data insights.
  • Documentation and Best Practices:
    • Contribute to the documentation of data pipelines, processes, and best practices.
  • Continuous Learning:
    • Stay updated with industry trends and best practices in data science and related tools.
  • Other duties as assigned.
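
To make the pipeline work above concrete, here is a minimal sketch of an Airflow ETL DAG. The task names, schedule, and placeholder records are illustrative assumptions only, not Citylitics' actual pipelines, and the example assumes a recent Airflow install (2.4+) where the DAG takes a `schedule` argument.

```python
# Minimal illustrative ETL DAG: extract -> transform -> load.
# All data and names are placeholders, not Citylitics' real pipelines.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw documents from a source system.
    return [{"city": "Toronto", "doc_type": "capital_budget"}]


def transform(ti, **context):
    # Placeholder: normalize raw records into a tabular shape.
    raw = ti.xcom_pull(task_ids="extract")
    return [{**row, "loaded_at": datetime.utcnow().isoformat()} for row in raw]


def load(ti, **context):
    # Placeholder: in practice this would write rows to Cloud SQL.
    rows = ti.xcom_pull(task_ids="transform")
    print(f"Would load {len(rows)} rows")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```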

Technologies We Use:

  • Backend: Python, Django, Cloud SQL, Airflow/Cloud Composer
  • Cloud Infrastructure: Google Cloud Platform
  • Data Visualization: Looker Studio, Dash
  • Other Tools: JavaScript, React
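
On the visualization side of this stack, a Dash dashboard can be as small as the sketch below. The column names and placeholder data are assumptions for illustration and do not reflect Citylitics' actual dashboards or metrics.

```python
# Minimal illustrative Dash app with placeholder data.
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Placeholder data standing in for an intelligence feed.
df = pd.DataFrame(
    {
        "region": ["Ontario", "Alberta", "Quebec"],
        "projects": [120, 85, 97],
    }
)

app = Dash(__name__)
app.layout = html.Div(
    [
        html.H1("Infrastructure Projects by Region"),
        dcc.Graph(figure=px.bar(df, x="region", y="projects")),
    ]
)

if __name__ == "__main__":
    app.run(debug=True)
```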