
Intermediate Data Engineer to develop ETL and provide input on architecture using Talend for a large banking client-27988

Job Type: Contract
Positions to fill: 1
Start Date: Nov 14, 2022
Job End Date: Mar 31, 2023
Pay Rate: Hourly: Negotiable
Job ID: 124839
Location: Toronto

Contract Length: 4 months (potential for extension)
Location: Hybrid (1 day a week in Scarborough)
Hours: 9am-5pm

Story Behind the Need:

Business Group: CBT Data Architecture and Database Services
The Data and Analytics Engineering team delivers data integration solutions for a variety of business lines and is responsible for the CBT data aggregation/ETL and data delivery applications. We support and/or interface with a variety of platforms and technologies: Talend, Spark, Information Builders (Data Migrator and Data Quality Centre), Python, Java, Scala, DB2, COBOL, Hadoop, Hive, etc.

The Data Engineer will focus on the TSYS project, developing ETL and providing input on the architecture being built to process new statements using Talend. The work will also involve handling complex data formats.
The project is currently in the initial stage of reviewing requirements and assessing solutions.

Typical Day in the Role:
  • Develop and design ETL jobs using Talend.
  • Provide technical guidance to diverse projects that differ in size and scope, from inception to delivery, overseeing the technical solution and identifying, reducing, and mitigating risks.
  • Guide and assist in the definition of non-functional requirements and the delivery of a highly scalable, secure, and flexible solution.
  • Collaborate with multiple technical teams to understand interdependencies, commonality and variability of the solutions under development and then help the teams deliver a platform made of reusable and configurable capabilities.
  • Provide input to the continuous improvement of processes and the adoption of the latest technologies and methodologies.
  • Ensure Agile and DevOps practices are applied to software development and architecture design.

Must Have Skills/Requirements:
  • 4+ years of Data Engineer/ IT development experience
  • 3+ years: Strong knowledge and hands-on experience running Talend Data Fabric.
  • 3+ years: Strong knowledge and experience with Java and Python, SQL/HQL, Linux OS scripting/commands, and REST APIs.
  • 4+ years: Working knowledge and experience with Spark and Hadoop (Hive)
  • 4+ years: Knowledge and experience with IDEs such as Eclipse; CI/CD tools, including code repository, version control, and code promotion tools such as Bitbucket and Jenkins; Confluence; and defect-tracking tools such as Jira.

Nice to Haves:
  • Apache open-source projects such as Sqoop or Zookeeper
  • Hands-on experience with scaled Agile methodologies
  • Talend certification
  • Bachelor's degree in Computer Science or a related field

Interview Process:
  • 2-step process: one technical interview (45 minutes) and a take-home Talend assignment
  • Interviews to take place ASAP