CTC006445 - Senior Manager, Data Services Operations and Engineering

Industry: Media
Job type: Contract
Duration: One year
Work mode: On site

Description

IMPORTANT

We are moving to two days in the office in January; we are open to remote candidates.

40 hours/week

Day-to-day: the contractor will be responsible for completing Data Science and ML models and projects as they relate to the client's existing data initiatives for digital, linear, OOH, ad ops, and others.

As prioritized by their manager, the data scientist will be responsible for delivering 1-2 data science projects from inception to launch and monitoring.

Top 3 skill sets:

  • 3-5 years of Python for data science and prior experience working as a data scientist
  • Experience developing and delivering machine learning models on cloud systems such as AWS SageMaker
  • Hands-on experience with AI/ML frameworks
  • Ideal candidate: multi-year experience delivering ML models as a data scientist (has previously performed a similar role), with strong technical skills

Any potential to hire full time? Yes, there may be an option in the future to convert to full time for the right candidate.

The Data Science and Customer Infrastructure team builds the next generation of customer experience for digital and linear properties. We lead strategic development and implementation of day-to-day operations, develop tools and processes for performance improvement, manage customer retention, and leverage big data and artificial intelligence to create intellectual property.

The Data Science and Customer Infrastructure team is responsible for managing and optimizing business intelligence systems that are used to analyze customer behaviors, automate business data analytics processes, target marketing, and provide insights to drive optimal business decisions.

Our Data Services Operations and Engineering group builds and maintains the platform that provides accessible data to drive decision making.

Key responsibilities

  • Design and maintain data pipelines, provision infrastructure, and deliver secure, reliable data-centric services and applications on AWS and Google Cloud Platform (GCP).
  • Own the data engineering and operations master plan.
  • Participate in data extraction, loading, transformation, cleansing and validation.
  • Support data pipelines feeding machine learning algorithms and models.
  • Support ad hoc data analysis requests to advance our digital analytics practice.

Minimum qualifications

  • AWS Certified Solutions Architect - Professional certification.
  • Minimum of five years of hands-on experience in analytics, including re-architecting and re-platforming on-premises data warehouses to data platforms on AWS and GCP.
  • Minimum of three to five years of experience designing and developing production data pipelines, from integration to consumption, within a hybrid big data architecture using Java, Python, Scala, etc.
  • Minimum of three to five years of experience architecting and implementing next generation data and analytics platforms on AWS and GCP.
  • Minimum of three to five years of experience designing and implementing data engineering, integration and curation functions on AWS and GCP or using custom programming.
  • Minimum of three to five years of experience in detailed evaluation of current data platforms and creating an appropriate transition path to AWS and GCP.

Desired skills

  • Bachelor's degree in Computer Science, Engineering, or Technical Science, or at least five years of experience in technical architecture and large-scale solution development.
  • Experience with CDP and Looker for data visualization is a plus.
  • At least five years of experience architecting large-scale data solutions, conducting architecture assessments, developing architectural options and analyses, and finalizing a preferred alternative solution in collaboration with IT and business partners.
  • Five years of hands-on experience designing and implementing data integration solutions on AWS and GCP using custom approaches.
  • Three to five years of hands-on experience architecting and designing data lakes on AWS and GCP cloud services, including analytics and BI application integrations.
  • Three to five years of experience designing and optimizing data models on AWS and GCP.
  • Three to five years of experience integrating security services with AWS and GCP data services to create secure data solutions.
  • Three to five years of experience designing and implementing data governance and security for data platforms on AWS and GCP.
  • Operational architecture design and execution of performance engineering tasks for large scale data lakes in a production environment.