
Senior Data Engineer to support the ingestion, processing and delivery of batch and real-time streaming data for a retail client

Job Type: Permanent
Positions to fill: 1
Start Date: Aug 08, 2022
Job End Date: Aug 08, 2022
Pay Rate: Salary: Negotiable
Job ID: 122118
Location: Calgary, Edmonton, Halifax, Montreal, Ottawa, Regina, Toronto, Vancouver, Victoria, Winnipeg
Our valued retail client is seeking a Senior Data Engineer to support the ingestion, processing, and delivery of batch and real-time streaming data.

This is a permanent, full-time, 100% remote opportunity. Several openings are also available for Junior and Intermediate Data Engineers.

As the successful candidate, you will be responsible for the development and support of real-time data pipelines built on AWS technologies such as EMR, Lambda, Glue, Kinesis, Redshift/Spectrum, and Athena.

Responsibilities:
  • Capture business requirements for the data lake and data warehouse, and support end-to-end data migration (from requirements to implementation)
  • Build, test and operate stable, scalable data pipelines that cleanse, structure and integrate disparate data sets into readable and accessible formats in the AWS data lake and Snowflake data warehouse
  • Maintain and monitor orchestration tasks for pipelines; ensure data is securely transferred throughout the data pipeline and identify/drive data quality improvements
  • Develop and write scripts for API integrations and automated data extractions
  • Develop code using Python, Scala, and R
  • Develop required transformation models in dbt
Must-Haves:
  • 4+ years of experience as a Data Engineer or similar role
  • Strong Python, R, and SQL experience
  • Experience building data pipelines in AWS using Lambda and Glue (or similar experience with a public cloud platform)
  • Demonstrated experience with Cloud Data Warehouse technologies (AWS Redshift or Snowflake)
  • Demonstrated expertise with an ETL tool (Informatica, DataStage, Talend, Pentaho, Lambda) or a workflow automation platform (Airflow, dbt, etc.)
Nice-to-Haves:
  • Prior experience working in a start-up environment or within the retail industry
  • SnowPro certification or AWS, Azure, or GCP certifications
  • Experience with dbt (data build tool) or infrastructure-as-code tooling (Terraform)
  • Experience working with Graph API
  • Experience and interest in advanced analytical capabilities (ML/AI and predictive analytics)
  • Working experience in IoT projects