
Int. Data Engineer to support & advise on enterprise Kafka platform for large fintech company - JR102367

Job Type: Permanent
Positions to fill: 1
Start Date: Feb 06, 2023
Job End Date: Feb 06, 2023
Pay Rate: Salary: Negotiable
Job ID: 126832
Location: Calgary, Edmonton, Halifax, London, Montreal, Ottawa, Regina, Toronto, Vancouver, Victoria, Winnipeg


Location: REMOTE (anywhere in Canada, EST hours)
 
Job Description:
Reporting to the Manager, Site Reliability Engineering and DevOps, you will support and advise on the enterprise Kafka platform, ensuring message delivery, security, retention, and recoverability of topics and messages. You will handle all Kafka environments end to end and serve as the subject matter expert in systems configuration, application integration, capacity planning, performance analysis, automation, and infrastructure planning.
 
Job Responsibilities:
  • Act as an SME for Kafka environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring (using the Confluent Platform)
  • Provide expertise on Kafka Brokers, Connectors, ZooKeeper, Schema Registry, KSQL, and Confluent Control Center to internal and external project teams as required
  • Develop and deploy converters (Avro/JSON) and Kafka connectors
  • Perform cluster administration functions, including topic management, cluster redundancy, certificate/key management, monitoring, and alerting
  • Manage critical 24/7 Kafka application escalations
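As a rough illustration of the converter and connector work described above (the connector name, topic, file path, and Schema Registry URL are assumptions for the sketch, not details from this posting), a minimal Kafka Connect sink configuration using the Confluent Avro converter might look like:

```json
{
  "name": "example-filestream-sink",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "tasks.max": "1",
    "topics": "payments",
    "file": "/tmp/payments.out",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
```

A configuration like this is typically submitted to the Connect cluster's REST API (e.g. a POST to `/connectors`); swapping the converter class to `org.apache.kafka.connect.json.JsonConverter` is the usual way to handle JSON rather than Avro payloads.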
Must Have Skills:
  • 5+ years professional experience as a Data Engineer
  • 3+ years administering Kafka components: Apache Kafka brokers, ZooKeeper, Schema Registry, Connect clusters, REST Proxy, Replicator, etc.
  • 3+ years of hands-on production experience with Kafka clusters, including capacity planning, installation, and administration/platform management, with a deep understanding of Kafka architecture and internals
  • Expert experience in Kafka cluster security, disaster recovery, data pipelines, data replication, partitioning, and performance optimization
  • Linux experience
  • Java experience
Nice to Have Skills:
  • Azure cloud
  • DevOps knowledge (Ansible)
  • Python & Shell scripting experience
  • Confluent Kafka admin or developer certification