Sr. ETL Developer to design, build, automate, and optimize complex data ETL/ELT processes for a big data ingestion project for our large financial client – 31540
Location Address: Hybrid – 2 days/week on site (Mondays and Wednesdays)
Contract Duration: approx. 4 months – ASAP to 02/29/2024
Number of Positions: 1
Reason: Additional Workload
Story Behind the Need
Business group: Emerging Technology & Platform Engineering – data ingestion and pre-staging work within the finance and risk teams
Project: Support development of FRDP landing zone/pre-staging (LZ/Pre-stg) ingestion pipelines for regulatory reporting; the project is mid-flight, currently sourcing big data ingestion
Typical Day in Role:
- Act as a technical lead for the data initiative. Deliverables include: mapping specifications, data models, design documents, complex ELT pipelines, unit tests, and deployment using DevOps
- Work with business analysts to translate business requirements and functional specifications into an application design that meets business and operational/IT needs
- Work with Quality Assurance (QA) to code, test, and debug extensions, tools, and integration services
- Perform analysis, design, and development work on complex data pipelines
- Ensure compliance with the architecture framework, system standards, and the Agile development methodology
- Work across the end-to-end Software Development Lifecycle (SDLC) using Agile/Scrum and Waterfall methodologies
- Primarily work on data ingestion for batch and real-time processing (sketched below)
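For illustration only – a minimal sketch of the batch vs. real-time ingestion split described above, using PySpark (Spark appears under the nice-to-have tools). All paths, table names, topic names, and broker addresses are hypothetical placeholders, not actual FRDP artifacts, and the Kafka source assumes the spark-sql-kafka connector is available:

# Illustrative sketch: batch load of a daily file drop plus a real-time
# feed of the same data from Kafka. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# Batch: pick up a daily source extract and append it to pre-staging.
batch_df = spark.read.option("header", True).csv("/landing_zone/positions/2024-01-15/")
(batch_df
    .withColumn("ingest_ts", F.current_timestamp())
    .write.mode("append")
    .parquet("/pre_staging/positions/"))

# Real time: consume the same feed continuously from a Kafka topic.
stream_df = (spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "positions")
    .load())

query = (stream_df
    .selectExpr("CAST(value AS STRING) AS payload")
    .withColumn("ingest_ts", F.current_timestamp())
    .writeStream.format("parquet")
    .option("path", "/pre_staging/positions_stream/")
    .option("checkpointLocation", "/checkpoints/positions/")
    .start())
query.awaitTermination()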
Must Have Skills:
- 8+ years’ experience as a developer
- 5+ years’ designing, building, automating and optimizing complex data ETL/ELT processes
- 8+ years’ creating denormalized data assets from highly normalized Oracle databases and loading them into another type of DBMS (MS SQL, PostgreSQL, etc.), with demonstrated advanced SQL skills (PL/SQL, T-SQL) and performance tuning (see the sketch after this list)
- 5+ years’ experience with scripting and programming languages (Java, Python, Unix shell)
- 5+ years’ experience with data warehouse design and development, with a focus on extracting and transforming data from various data sources and loading it into star/snowflake schemas
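For illustration only – a hedged sketch of the denormalization pattern above: flatten several normalized Oracle tables into one wide data asset and bulk-load it into PostgreSQL. Connection details and all table/column names are hypothetical placeholders; the python-oracledb and psycopg2 libraries are one possible toolchain, not one specified by the role:

# Illustrative sketch, not a prescribed implementation: denormalize in
# Oracle via joins, then chunk-load the result into PostgreSQL.
import oracledb
import psycopg2
from psycopg2.extras import execute_values

DENORMALIZE_SQL = """
    SELECT a.account_id,
           c.customer_name,
           p.product_code,
           b.balance_amt
      FROM accounts a
      JOIN customers c ON c.customer_id = a.customer_id
      JOIN products  p ON p.product_id  = a.product_id
      JOIN balances  b ON b.account_id  = a.account_id
"""

src = oracledb.connect(user="etl", password="...", dsn="orahost/ORCLPDB1")  # placeholder DSN
tgt = psycopg2.connect("dbname=prestage user=etl host=pghost")              # placeholder DSN

with src.cursor() as read_cur, tgt.cursor() as write_cur:
    read_cur.execute(DENORMALIZE_SQL)
    while True:
        rows = read_cur.fetchmany(10_000)  # stream in chunks to keep memory flat
        if not rows:
            break
        execute_values(
            write_cur,
            "INSERT INTO account_asset "
            "(account_id, customer_name, product_code, balance_amt) VALUES %s",
            rows,
        )
    tgt.commit()

src.close()
tgt.close()

The chunked fetchmany/execute_values loop avoids pulling the full extract into memory; this is also where the performance-tuning skill set (indexes on join keys, execution plans) typically comes into play.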
Nice-To-Have Skills:
- Familiarity with one or more ETL/ELT tools (DataStage, Talend, NiFi, Spark)
- Working knowledge of issue-tracking systems (Jira, Azure DevOps)
- Experience using GitHub for version control and data model changes
- Cloud technologies (Azure, Google Cloud, AWS)
- Oracle Exadata
- Experience in FI/banking is an asset
Soft Skills Required:
- Excellent communication skills
Education:
Post-secondary degree in computer science, engineering, or a related technical field
Candidate Review & Selection
Up to 2 rounds – preferably on-site/in-person panel interviews – up to 1 hour each – with the hiring manager and senior developers on the team. There will be a technical component; no prep is needed, but candidates may be asked to work through small case samples to demonstrate technical skills during the interview.