CTC006481 - Data Engineer - DSI

Industry sector: Media
Employment type: Contract
Duration: One year
Work mode: On Site

Description

IMPORTANT

Work from home, with limited time in office (2 days per week) once the return to work begins

Hours per week – 37.5

Day-to-day basis -

  • Build and maintain data pipelines, provision infrastructure, and secure reliable data-centric services and applications in AWS and Google Cloud Platform
  • Assist in extracting, loading, transforming, cleaning and validating data
  • Support data pipelines feeding into ML algorithms and models
  • Support ad-hoc data analysis requests to advance our digital analytics practice

    Top 3 functions -

  • Build and maintain data pipelines, provision infrastructure
  • Extracting, loading, transforming, cleaning and validating data
  • Aggregating data into pre-defined templates and model table structures
  • As part of a team, the engineers will be responsible for ingesting a number of data sources from end to end during their first 6 months

    Top 3 skill sets -

  • 2 years of hands-on experience designing and implementing data ingestion solutions on AWS / GCP using custom approaches.
  • 2 years of hands-on experience architecting and designing data lakes on AWS / GCP cloud serving analytics and BI application integrations.
  • 2 years of experience in designing and optimizing data models on AWS / GCP.

    Qualifications - AWS Professional Data Engineer certification or GCP Professional Data Engineer certification

    Testing in interviews - Yes, candidates will be tested by our data engineering lead

    The client's Data Science and Infrastructure team is building out the next generation of experiences for customers across Digital and Linear properties. We lead strategic development and execution of day-to-day operations, develop tools and processes to drive performance enhancements, manage customer loyalty and retention, and leverage big data and artificial intelligence to create intellectual property.

    The client's Data Science & Infrastructure team is responsible for the management and optimization of the BI systems used to analyze customer behavior, automate business-insight processes, target marketing and provide insight to drive optimal business decisions.

    Our Data Ops & Engineering group builds and maintains the platform that delivers accessible data to power decision-making.

    Key Responsibilities

  • Build and maintain data pipelines, provision infrastructure, and secure reliable data-centric services and applications in AWS and Google Cloud Platform
  • Assist in extracting, loading, transforming, cleaning and validating data
  • Support data pipelines feeding into ML algorithms and models
  • Support ad-hoc data analysis requests to advance our digital analytics practice

    Minimum Qualifications

  • AWS Professional Solutions Architect certification or GCP equivalent.
  • Minimum 2 years of hands-on experience analyzing, re-architecting and re-platforming on-premises data warehouses to data platforms on AWS / GCP.
  • Minimum 2 years of experience designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
  • Minimum 2 years of experience architecting and implementing next-generation data and analytics platforms on AWS / GCP.
  • Minimum 2 years of experience designing and implementing data engineering, ingestion and curation functions on AWS / GCP using native or custom programming.
  • Minimum 2 years of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to AWS / GCP.

    Preferred Qualifications

  • Bachelor's degree in Computer Science, Engineering or Technical Science, or 5+ years of technical architecture and build experience with large-scale solutions.
  • Experience with CDPs and Looker for data visualization is an asset.
  • Minimum 2 years of experience in architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative in collaboration with IT and business stakeholders.
  • 2 years of hands-on experience designing and implementing data ingestion solutions on AWS / GCP using custom approaches.
  • 2 years of hands-on experience architecting and designing data lakes on AWS / GCP cloud serving analytics and BI application integrations.
  • 2 years of experience in designing and optimizing data models on AWS / GCP.
  • 2 years of experience integrating security services with AWS / GCP data services for building secure data solutions.
  • 2 years of experience architecting and implementing data governance and security for data platforms on AWS / GCP.
  • Designing operations architecture and conducting performance engineering for large-scale data lakes in a production environment.