Sr. Data Engineer to support a wireless migration to GCP and Big Data solution deliveries for our large Telecom client - 3417
Location: Hybrid (3 days in office), Toronto or Montreal
Duration: 1 year (Possible extension/FTE)
As a member of the Network Big Data team, reporting to the Network Big Data CoE Manager, the Big Data DevOps Engineer will play a leading role in developing new products, capabilities, and standardized practices using Big Data technologies. Working closely with our business partners, this person will be part of a team that advocates the use of Big Data technologies to solve business problems and will act as a thought partner in the Big Data space.
Responsibilities:
· Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support.
· Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis.
· Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality).
· Create formal written deliverables and other documentation, and ensure designs, code, and documentation are aligned with enterprise direction, principles, and standards.
· Train and mentor teams in the use of the fundamental components in the Hadoop stack.
· Assist in the development of comprehensive and strategic business cases used at management and executive levels for funding and scoping decisions on Big Data solutions.
· Troubleshoot production issues within the Hadoop environment.
· Performance-tune Hadoop processes and applications.
MUST HAVES:
- Proficiency with SQL, NoSQL, and relational database design and methods.
- Proficiency in GCP/AWS.
- Experience building and coding applications using Hadoop components: HDFS, Hive, Impala, Sqoop, Flume, Kafka, StreamSets, HBase, etc.
- Experience building Java applications.
- Experience coding in Scala/Spark, Spark Streaming, Java, Python, and HiveQL.
- Hands-on Linux/Unix expertise and scripting skills are required.
- Experience with ETL tools and data warehousing architecture.
- Experience working in an Agile environment.
Nice to Have:
- Wireless/telecom operations and engineering business knowledge, including a basic understanding of radio access, core network, and value-added services technologies and configurations.
- Experience with Exadata and other RDBMSs.
- Experience using Maven, Git, Jenkins, Se, Ansible, or other continuous integration tools.
- Ability to identify requirements for applying design patterns such as self-documenting data vs. schema-on-read.
- Knowledge of predictive analytics techniques (e.g. predictive modeling, statistical programming, machine learning, data mining, data visualization).
- Bilingual (English/French).