CTC004163 - Big Data Developer - Senior

Industry sector: Telecommunications
Employment type: Contract
Duration:
Work mode: On Site

Description

*****

Top 3 skills:

1. Extensive experience reading complex real-time data from a secured Kafka cluster using Scala/Spark Streaming and writing it in real time to Hive/Impala and Solr in a secured Hadoop environment.

2. Must be capable of independently working through new and complex technical challenges and providing thought leadership to junior and intermediate developers on the team.

3. Candidate must have excellent organization and communication skills, as they will be working directly with the business in an Agile environment to understand and refine technical requirements.

*****

Our Network Big Data team is seeking an experienced Senior developer capable of owning the technical delivery of complex development projects.

The suitable candidate will have advanced development skills in a real-time Big Data environment, including experience with the following tools:

Tool/Environment | Skill level required | Minimum years of work experience

  • Scala | Expert | 3 yrs
  • Kafka | Expert | 3 yrs
  • HDFS | Expert | 3 yrs
  • Spark Streaming | Expert | 3 yrs
  • Oozie | Expert | 3 yrs
  • Docker | Strong working knowledge | 1 yr
  • Kubernetes | Strong working knowledge | 1 yr
  • Solr | Expert | 2 yrs
  • Linux Scripting | Expert | 3 yrs
  • Sqoop | Expert | 2 yrs
  • sshfs | Strong working knowledge | 1 yr
  • Networking and connectivity | Expert | 3 yrs
  • sftp, ssh, ssl | Expert | 3 yrs
  • Hadoop and Kafka environments with Kerberos enabled | Expert | 3 yrs

The developer must have extensive experience reading real-time data from a Kerberized Kafka cluster using Scala/Spark Streaming and making it available in real time to Kerberized Hadoop cluster data stores, including the following (a rough pipeline sketch appears after this list):

  • Hive tables accessible via Impala
  • Solr
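
Purely as an illustrative sketch, not part of the posting's requirements: a minimal Scala job of roughly this shape, written against Spark Structured Streaming rather than the older DStream API. The broker address, topic, database/table names, and checkpoint path are placeholder assumptions, the Hive table is assumed to already exist, and the spark-sql-kafka connector is assumed to be on the classpath. A Solr sink would typically be added in the same foreachBatch through a separate connector such as spark-solr.

import org.apache.spark.sql.{DataFrame, SparkSession}

object KafkaToHiveSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-hive-sketch")
      .enableHiveSupport() // so insertInto resolves tables in the Hive metastore
      .getOrCreate()

    // Subscribe to a placeholder topic on a Kerberized cluster; the full
    // SASL_SSL/Kerberos option set is sketched after the security paragraph below.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1.example.com:9093") // placeholder
      .option("subscribe", "network.events")                         // placeholder
      .option("kafka.security.protocol", "SASL_SSL")
      .load()
      .selectExpr("CAST(value AS STRING) AS payload", "timestamp")

    // Append each micro-batch to a pre-existing Hive table, which Impala can
    // query once its metadata is refreshed.
    val query = events.writeStream
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.write.mode("append").insertInto("network_db.events_raw") // placeholder table
      }
      .option("checkpointLocation", "hdfs:///checkpoints/network-events") // placeholder
      .start()

    query.awaitTermination()
  }
}
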
The developer must be capable of independently working through technical challenges and providing thought leadership to junior and intermediate developers on the team.

The developer must have experience creating solutions consistent with security best practices to ensure that sensitive data is properly secured in flight and at rest.
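
As an illustration of the in-flight side of this requirement, the sketch below shows the kind of client options commonly used to authenticate and encrypt the Kafka leg on a Kerberized cluster; the keytab path, principal, truststore path, and password are placeholder assumptions. At-rest protection is typically handled at the platform level (for example, HDFS transparent encryption zones) rather than in application code.

object KafkaSecurityOptionsSketch {
  // Options carry the "kafka." prefix so Spark's Kafka source/sink passes
  // them through to the underlying Kafka client.
  val kafkaSecurityOptions: Map[String, String] = Map(
    // Kerberos (SASL/GSSAPI) authentication over a TLS-encrypted channel
    "kafka.security.protocol" -> "SASL_SSL",
    "kafka.sasl.kerberos.service.name" -> "kafka",
    // Placeholder keytab and principal for the application's service account
    "kafka.sasl.jaas.config" ->
      ("com.sun.security.auth.module.Krb5LoginModule required " +
        "useKeyTab=true keyTab=\"/etc/security/keytabs/app.keytab\" " +
        "principal=\"app@EXAMPLE.COM\";"),
    // Placeholder truststore used to verify the brokers' TLS certificates
    "kafka.ssl.truststore.location" -> "/etc/security/truststore.jks",
    "kafka.ssl.truststore.password" -> "changeit"
  )
}

These would be applied with .options(KafkaSecurityOptionsSketch.kafkaSecurityOptions) on the readStream or writeStream builder from the earlier sketch.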

The candidate must have excellent communication skills, as they will be working directly with the business in an Agile environment to understand and refine technical requirements.

Other attributes that are valuable for the role include:

  • Proven skills in developing high-quality, highly optimized, high-performance, and maintainable software for big data solutions, specifically in the Hadoop ecosystem
  • Experience in architecture, design, software development, testing, deployment, maintenance, production and operation of data solutions
  • Experience building and testing code in non-production environments. This includes unit, regression, performance and end-to-end testing
  • Working experience developing projects in IntelliJ or Eclipse with Maven and integrating with GitHub
  • Able to follow software development life cycle (SDLC), development and security standards
  • Ability to measure software performance in non-production and production environments and improve its efficiency
  • Ability to troubleshoot connectivity issues between source and target systems, including problems with routing and firewall rules
  • Ability to support customer issues and incidents on the big data platforms through to resolution
  • Experience automating repetitive yet complex tasks to streamline operations
  • Exposure to Continuous Improvement methods
  • Proficient understanding of distributed computing principles
  • The candidate should possess a degree in Engineering, Mathematics, Science, or Computer Science, or alternatively a diploma in software development with a focus on Big Data languages/tools.
