CTC007190 - Big Data Advanced Visualization, Network Big Data CoE

Industry: Telecommunications
Employment type: Contract
Duration: Eight months
Work mode: Hybrid

Description

IMPORTANT


This DevOps candidate should have 5+ years of experience in data analytics and visualization across cloud (GCP) and on-premises systems such as MicroStrategy (MSTR).


- What locations would you consider? Creekbank and Montreal.

- If bilingualism is required, will interviews be held in French? No, English only is sufficient.

- Are there any required days in office? Three days in office.


- Provide a description of the typical day-to-day in this role. The candidate will do software development work in the data analytics space, such as GCP Looker, ESD, Trino, and MicroStrategy dashboard development.


- What are the top 3 skill sets and qualifications you want to see on a candidate’s resume? MicroStrategy, Python, and SQL; also Trino, GCP, Looker, Power BI, and ML.


- What will the interview process look like (how many interviews, assessments, virtual or in person)? One interview (45 minutes technical, 15 minutes non-technical).


- What specific projects will be worked on? Data Cloud Migration


- What is the expected hiring date? April 3


- Any potential for extension or to hire full time? Yes.



Big Data is one of the fastest-growing business and technology disciplines. In telecom specifically, there are great opportunities to use Big Data and advanced analytics solutions to evolve network planning and operations and to prepare for the future network.


This is a rare opportunity for a seasoned data and analytics professional to join a fast-growing team of data engineering, cloud, Hadoop, and analytics experts to design and develop data solutions that support the entire Bell Network community.


Job Description:

As a member of the Enterprise Data Platform team, reporting to the EDP AI/ML Senior Manager, the Cloud Analytic Engineer and DevOps will play a leading role in the development of new products, capabilities, and standardized practices using cloud and data technologies. Working closely with our business partners, this person will be part of a team that advocates the use of advanced data technologies to solve business problems and will be a thought partner in the data space.


Key Responsibilities:

Data Analysis and Visualization:

• Ability to own and lead your projects through all phases, including identifying business requirements, technical design and implementation, and final delivery and refinement with business teams

• Elicit, analyze, and interpret business and data requirements to develop complete analytic solutions, including business process diagrams, data mapping, data models (entity-relationship diagrams, dimensional data models), ETL and business rules, data lifecycle management, governance, lineage, reporting, and dashboarding

• Facilitate data discovery workshops and downstream impact analyses, and proactively manage stakeholder expectations

• Demonstrate analytical thought leadership

• Advanced expertise in SQL (BigQuery, Trino, Impala, or similar); an illustrative query sketch follows this list

• Comfortable with data visualization and data strategies; experience with BI analytics tools such as Looker, MicroStrategy, Tableau, Kibana, or similar.

• Communicate analysis results with effective storytelling.

• Comfortable working in complex and constantly changing environments, with multidisciplinary teams.

• Able to effectively challenge the status quo and set standards and direction for solutions

• Well organized, able to multitask and manage priorities.
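
To give a concrete flavour of the SQL and dashboard-feed work described above, here is a minimal sketch using the google-cloud-bigquery client library; the project, dataset, and column names are hypothetical placeholders, not actual project details. A roll-up like this is the kind of result set a Looker or MicroStrategy dashboard might consume.

```python
# Minimal sketch: an analytic roll-up of the kind a Looker or MicroStrategy
# dashboard might consume. All table and column names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses default project and credentials

sql = """
SELECT
  region,
  DATE_TRUNC(event_date, MONTH) AS month,
  AVG(latency_ms) AS avg_latency_ms,
  APPROX_QUANTILES(latency_ms, 100)[OFFSET(95)] AS p95_latency_ms
FROM `my-project.network_telemetry.daily_kpis`  -- hypothetical table
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 6 MONTH)
GROUP BY region, month
ORDER BY region, month
"""

for row in client.query(sql).result():
    print(row.region, row.month, row.avg_latency_ms, row.p95_latency_ms)
```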

Data Engineering:
• Proven experience building and deploying data analytic workflows in a big data environment (Hadoop, Google cloud platform or similar)

• Design and develop ETL workflows based on business requirements, using multiple sources of data in various formats, within the Hadoop platform or Google Cloud Platform

• Clean, manipulate, and analyze large, complex data sets spanning a wide variety of sources.

• Develop scalable and robust analytic solutions that can support the growing volumes of our data environments.

• Build data models, implement business rules, and engineer responsive and scalable data analytics pipelines.

• Design and implement component execution orchestration in Cloud Composer, Cloud Data Fusion, Oozie, or Airflow (a minimal DAG sketch follows this list)

• Promote code to different environments using GitLab CI/CD

• Produce well-documented, quality code
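
As an illustration of the orchestration responsibilities above, the following is a minimal Cloud Composer (Airflow 2.x) DAG sketch; the DAG id, schedule, table names, and SQL are hypothetical placeholders rather than actual project details. In practice, code like this would be promoted across environments through a GitLab CI/CD pipeline, as noted above.

```python
# Minimal sketch of an Airflow DAG for Cloud Composer. All ids, schedules, and
# table names are hypothetical; assumes the apache-airflow-providers-google
# package. Uses `schedule` (Airflow 2.4+); older versions use `schedule_interval`.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_kpi_rollup",      # hypothetical pipeline name
    schedule="0 6 * * *",           # run daily at 06:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # One ETL step: materialize a monthly aggregate for downstream dashboards.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_daily_kpis",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.analytics.kpi_monthly` AS
                    SELECT region,
                           DATE_TRUNC(event_date, MONTH) AS month,
                           AVG(latency_ms) AS avg_latency_ms
                    FROM `my-project.network_telemetry.daily_kpis`
                    GROUP BY region, month
                """,
                "useLegacySql": False,
            }
        },
    )
```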


Qualifications and Skills:

• Master's degree in Data Science and Analytics, Mathematics, Statistics, Computer Science, or a related field

• 5+ years of experience in an analytics engineering role, working across data management disciplines including data integration, modelling, optimization, and quality

• 5+ years of experience coding with Python, SQL, or Scala

• 2+ years of experience with CI/CD deployment and code management platforms such as GitHub, and with building applications using Hadoop components: HDFS, Hive, Impala, Sqoop, Kafka, HBase, etc.

• 2+ years of experience working in big data analytics environments/technologies (Hadoop, Hive, Spark), with a deep understanding of data mining and analytical techniques.

• 1+ year of experience with cloud platforms such as Google Cloud Platform, AWS, Azure, or Databricks

• 1+ year of experience working with Google Cloud services, including Dataflow, Cloud Composer, Airflow, Cloud Run, and Pub/Sub

• 5+ years of experience building reports and visualizations in Tableau, MicroStrategy, Looker, Kibana, or equivalent.

• 5+ years of experience with traditional data warehousing and ETL tools

• Comfortable with version control tools such as Git.


Preferred Qualifications:

• Deep understanding of techniques used in creating and serving schemas at the time of data consumption (schema-on-read)

• Ability to use AI, ML, and other big data techniques to provide a competitive edge to the business.

• Advanced cloud data technologies in the ML space (Vertex AI, BigQuery ML, etc.)

• Experience in AWS/Azure data platforms

• Past experience using Maven, Git, Jenkins, Se, Ansible, or other CI tools is a plus

• Experience with Exadata and other RDBMSs is a plus.

• Knowledge of predictive analytics techniques (e.g. predictive modeling, statistical programming, machine learning, data mining, data visualization).

• Familiarity with different development methodologies (e.g. waterfall, agile, XP, scrum)

• Strong interpersonal and communication skills, including written, verbal, and technical illustration.

• Experience working with multiple clients and projects at a time
