
Data Engineers to work data quality projects

Job Type: Permanent
Positions to fill: 2
Start Date: Jul 01, 2022
Job End Date: Jul 01, 2022
Pay Rate: Salary: Negotiable
Job ID: 120257
Location: Toronto

Location: Toronto (Hybrid Model)
* Permanent Position *

Job Responsibilities:

• Work with stakeholders to understand data sources and support the Data, Analytics and Reporting team strategy within our on-premises environment and enterprise AWS cloud solution
• Work closely with Data, Analytics and Reporting Data Management and Data Governance teams to ensure all industry standards and best practices are met
• Ensure metadata and data lineage is captured and compatible with enterprise metadata and data management tools and processes
• Run quality assurance and data integrity checks to ensure accurate reporting and data records
• Ensure ETL pipelines are produced to the highest quality standards, with metadata captured and outputs validated for completeness and accuracy
• Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity
• Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
• Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and the business processes that depend on it
• Write unit/integration tests, contribute to the engineering wiki, and document work
• Perform the data analysis required to troubleshoot data-related issues and assist in their resolution
• Define company data assets (data models) and the Spark and Spark SQL jobs that populate them
• Design data integrations and a data quality framework
• Design and evaluate open-source and vendor tools for data lineage
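As a rough illustration of the kind of data integrity checks described in these responsibilities, here is a minimal sketch using only the Python standard library. The field names ("id", "amount") and validation rules are hypothetical, not taken from the posting:

```python
# Minimal data-quality check sketch: validate records for completeness
# and basic accuracy before they enter a reporting pipeline.
# Field names and rules here are illustrative assumptions.

def check_record(record, required_fields=("id", "amount")):
    """Return a list of quality issues found in a single record."""
    issues = []
    for field in required_fields:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        issues.append("negative amount")
    return issues

def run_quality_checks(records):
    """Split records into valid rows and rejected rows with reasons."""
    valid, rejected = [], []
    for record in records:
        issues = check_record(record)
        (rejected if issues else valid).append((record, issues))
    return valid, rejected

valid, rejected = run_quality_checks([
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},    # fails the accuracy rule
    {"id": None, "amount": 3.0},  # fails the completeness rule
])
```

In a production pipeline such checks would typically run inside the ETL tool itself, with rejected rows routed to a quarantine table for review rather than silently dropped.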
Must Have:

• Knowledge of database, storage, collection and aggregation models, techniques, and technologies, and how to apply them in a business context
• Working experience with ETL tools, querying languages, and data repositories
• Experience in data cleansing
• Working experience with data modeling, relational modeling, and dimensional modeling
• Knowledge of file formats (e.g. XML, CSV, JSON), databases (e.g. Redshift, Oracle), and different types of connectivity is also very useful
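To make the file-format point concrete, here is a small sketch (standard library only, with made-up sample data and column names) that reads CSV rows and re-emits them as JSON:

```python
# Sketch: reading CSV and emitting JSON with the standard library.
# The sample data and column names are illustrative, not from the posting.
import csv
import io
import json

csv_text = "id,city\n1,Toronto\n2,Ottawa\n"

# csv.DictReader maps each data row to a dict keyed by the header line;
# note that all values are read as strings.
rows = list(csv.DictReader(io.StringIO(csv_text)))

json_text = json.dumps(rows)
```

The same round-trip idea extends to XML (via `xml.etree.ElementTree`) and to database connectivity, where rows come from a driver cursor instead of a file.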

Nice to Have:

• Working experience with the following Cloud platforms is a plus: Amazon Web Services, Google Cloud Platform, Azure
• Working knowledge of a source code control tool such as Git