CTC007201 - Data Engineer, DevOps, Junior

Industry sector: Telecommunications
Employment type: Contract
Duration: Eleven months
Work mode: Hybrid

Description

We currently work in a hybrid mode (3 days working from home and 2 days in the office in Mississauga). This may change as per organizational guidelines.


Hours per week: 37.5


What the contractor will be doing on a day-to-day basis


Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support


What are the top three functions of this role? (Top duties the contractor will be accountable for)

1. Code and build ETL pipelines, and assist with migrating the platform to GCP (see the illustrative sketch after this list)

2. Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis

3. Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality)
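
As a rough illustration of the ETL and GCP-migration work in item 1, the sketch below shows a minimal PySpark batch job: extract from a Hive table, apply basic cleansing, and land the curated output in Cloud Storage. The table name sales_raw, the bucket gs://example-bucket, and the column names are hypothetical placeholders, not details taken from this posting.

    # Minimal batch-ETL sketch (illustrative only; all names are placeholders).
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, to_date

    spark = (
        SparkSession.builder
        .appName("etl-to-gcp-sketch")
        .enableHiveSupport()          # read source data from Hive-managed tables
        .getOrCreate()
    )

    # Extract: pull raw records from an on-prem Hive table (hypothetical name).
    raw = spark.table("sales_raw")

    # Transform: drop malformed rows, normalize the date column, keep valid amounts.
    clean = (
        raw.dropna(subset=["order_id", "order_date"])
           .withColumn("order_date", to_date(col("order_date"), "yyyy-MM-dd"))
           .filter(col("amount") > 0)
    )

    # Load: write curated Parquet to Cloud Storage, partitioned by date.
    # (Writing to gs:// assumes the GCS Hadoop connector is installed on the cluster.)
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "gs://example-bucket/curated/sales"
    )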


What will the contractor need to deliver in their initial 1–6 months? Deliver one data ingestion project independently


What are the top 3 skill sets you want to see on a candidate’s resume?

1. Hadoop Developer

2. Knowledge of GCP and cloud-native solutions

3. Analyst in Business Intelligence


Which qualifications are essential, if any, for the role? Knowledge of SQL and a couple of programming languages


Any testing in the interviews? Yes, we have a 30-minute HackerRank test followed by at least two rounds of interviews


Any potential to hire full time? Yes



JOB DESCRIPTION:


Primary Responsibilities:


Participate in all aspects of Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support

Develop standardized practices for delivering new products and capabilities using Big Data technologies, including data acquisition, transformation, and analysis

Ensure Big Data practices integrate into overall data architectures and data management principles (e.g. data governance, data security, metadata, data quality)

Create formal written deliverables and other documentation, and ensure designs, code, and documentation are aligned with enterprise direction, principles, and standards

Train and mentor teams in the use of the fundamental components in the Hadoop stack

Assist in the development of comprehensive and strategic business cases used at management and executive levels for funding and scoping decisions on Big Data solutions

Troubleshoot production issues within the Hadoop environment

Performance tuning of Hadoop processes and applications

Proven experience as a Hadoop Developer/Analyst in Business Intelligence

Strong communication skills, technology awareness, and the ability to interact and work with senior technology leaders are a must

Good knowledge of Agile methodology and the Scrum process

Delivery of high-quality work, on time and with little supervision

Critical thinking and analytical abilities


Basic Qualifications:

Bachelor’s degree in Computer Science, Management Information Systems, or Computer Information Systems is required.

Minimum of 4 years building Java applications

Minimum of 2 years of building and coding applications using Hadoop components - HDFS, Hive, Impala, Sqoop, Flume, Kafka, StreamSets, HBase, etc.

Minimum of 2 years of coding in Scala/Spark, Spark Streaming, Java, Python, and HiveQL (a minimal streaming-ingestion sketch follows this list)

Minimum of 4 years of experience with traditional ETL tools and data warehousing architecture.

Strong personal leadership and collaborative skills, combined with comprehensive, practical experience and knowledge in end-to-end delivery of Big Data solutions.

Experience in Exadata and other RDBMS is a plus.

Must be proficient in SQL/HiveQL

Hands-on expertise in Linux/Unix and scripting skills are required.
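
To make the Spark and Kafka expectations above more concrete, here is a minimal sketch of a streaming ingestion job in PySpark (referenced from the Scala/Spark item). The broker address, the topic name events, and the output paths are hypothetical placeholders, and the job assumes the spark-sql-kafka connector is on the classpath; it is not a description of the team’s actual pipelines.

    # Minimal streaming-ingestion sketch (illustrative only; all names are placeholders).
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

    # Read a stream from Kafka (requires the spark-sql-kafka connector).
    stream = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
        .option("subscribe", "events")                       # placeholder topic
        .load()
    )

    # Kafka delivers the payload as bytes; cast it to a string before downstream use.
    parsed = stream.select(col("value").cast("string").alias("payload"))

    # Append each micro-batch as Parquet; checkpointing makes the job restartable.
    query = (
        parsed.writeStream
        .format("parquet")
        .option("path", "/data/raw_events")                  # placeholder HDFS path
        .option("checkpointLocation", "/checkpoints/raw_events")
        .start()
    )
    query.awaitTermination()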


Preferred Qualifications:

Strong in-memory database and Apache Hadoop distribution knowledge (e.g. HDFS, MapReduce, Hive, Pig, Flume, Oozie, Spark)

Past experience using Maven, Git, Jenkins, Se, Ansible or other continuous integration tools is a plus

Proficiency with SQL, NoSQL, relational database design and methods

Deep understanding of techniques used in creating and serving schemas at the time of consumption

Identify requirements to apply design patterns like self-documenting data vs. schema-on-read.

Played a leading role in the delivery of multiple end-to-end projects using Hadoop as the data platform.

Successful track record in solution development and growing technology partnerships

Ability to clearly communicate complex technical ideas, regardless of the technical capacity of the audience.

Strong interpersonal and communication skills, including written, verbal, and technology illustrations.

Experience working with multiple clients and projects at a time.

Knowledge of predictive analytics techniques (e.g. predictive modeling, statistical programming, machine learning, data mining, data visualization).

Familiarity with different development methodologies (e.g. waterfall, agile, XP, scrum).

Demonstrated capability with business development in the big data infrastructure business



Preferred Skills:

Bilingual (French/English), with the capacity to adapt their communication to most situations and audiences; proficient in planning communications and in facilitating a meeting or workshop;

Able to proactively plan their work over multiple timeframes (week, month, year), juggle multiple priorities, and deliver as per commitments;

Able to plan and execute complex tasks without supervision, identify potential roadblocks, and mobilize resources to remove them and achieve goals;

Able to identify and analyze complex problems, identify root causes, provide a detailed description and plan, and design and deliver a workaround or solution;

Able to evaluate, without supervision, the effort and time required to complete a deliverable and/or task, through collaboration, teamwork, honesty, commitment, and respect;

Comfortable interviewing non-technical people to gather/discuss requirements;

Gets easily acquainted with new technologies, e.g. a new programming language within 2-3 days;

Wireless/telecom operations and engineering business knowledge, including a basic understanding of radio access, core network, and value-added services technologies and configurations.
