Data Scientist / Engineer

Québec City, Canada
Full-time

We’re looking for a Data Scientist with hands-on expertise in the design, implementation, operation (DevOps), and management of a Big Data architecture.

You will work with product owners to understand data requirements and build ETL pipelines that ingest information into our Data Platform and transform it into relevant insights. You are an expert at designing, implementing, and operating stable, scalable, low-cost solutions that move operational data from APIs to databases and into end-user applications such as business intelligence software. You possess excellent communication skills and thrive in a fast-paced, ever-changing environment. Above all, you are passionate about data and how to wield it to meet user needs and business requirements.

Responsibilities

  • Propose, investigate, develop and refine new analytical capabilities through exploratory proofs of concept and targeted data analysis;
  • Architect, design and build new data models and pipelines on our Data Platform;
  • Build algorithms, tools and custom solutions to empower the engineering team;
  • Design and develop visual dashboards on our BI software;
  • Prepare reports and present investigation findings to management.

Minimum qualifications

  • BS or MS in computer science, computer engineering, software engineering, or an equivalent field;
  • 2-5 years of experience in data engineering, or equivalent demonstrated experience;
  • Experience in natural language understanding, computer vision, machine learning, data mining, deep learning or artificial intelligence;
  • Strong programming skills in one or more of the following: C, C++, Python, Java, R;
  • SQL, RDBMS, ETL and data architecture skills;
  • Excellent analysis skills and the ability to explore complex data sets and extract insights;
  • Experience applying data science to business challenges;
  • Enthusiasm to work in a fast-paced engineering team.

Desirable assets

  • Experience leveraging cloud platforms such as AWS to automate tasks, build data pipelines, and handle stream processing;
  • Knowledge of big data technologies and NoSQL databases;
  • Hands-on experience with large-scale data processing;
  • Experience in the research, development, and support of data visualization techniques and analytics.

Technical environment

Python, C++, AWS, Sisense, Docker, Linux, SQL, NoSQL.