More specifically, the person sought will assume the following responsibilities:
• Participate in the support activities of the big data platform.
• Participate in the administration of Hadoop clusters (setup and tuning).
• Participate in the development of Linux shell scripts for Hadoop cluster resource and security management.
• Communicate effectively with technical and non-technical members of the project team.
• Manage multiple tasks and projects simultaneously.
• Work in an agile/iterative context.
The ideal candidate will meet the following requirements:
• Have at least 5 years of experience in Business Intelligence or another field relevant to data processing.
• Experience in Hadoop cluster administration (mandatory).
• Experience in advanced Linux administration (Red Hat 6.x and later).
• Development of Bash and Python scripts (an asset).
• Installation, configuration, and debugging of the Hadoop ecosystem (HDFS, MapReduce, YARN, ZooKeeper, HBase, Hive, Spark, Kafka, Zeppelin).
• Installation, configuration, and debugging of the HDF ecosystem (NiFi, NiFi Registry, Kafka, Schema Registry).
• Installation, configuration, and debugging of MySQL/MariaDB databases.
• Advanced knowledge of Hadoop administration tools (Cloudera Manager, Ambari).
• Advanced knowledge of Hadoop security (Ranger, Hue, ACL, Knox, Atlas).
• Knowledge of related technologies: LDAP/AD, Kerberos, SSL.
• Have good technical documentation skills.
• Have prior experience working on an agile project.
• Have good communication skills, including the ability to explain complex concepts in accessible terms.
• Have strong problem-solving skills.
• Be a good team contributor.
• Stay abreast of technological developments and keep skills up to date.
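For illustration, the shell-scripting duties mentioned above (Linux scripts for Hadoop cluster resource management) might resemble the following minimal sketch. It parses the "DFS Used%" line from `hdfs dfsadmin -report`-style output and warns past a threshold; the 80% threshold and the canned report text are hypothetical examples, not values from this posting.

```shell
#!/usr/bin/env bash
# Illustrative sketch only, showing the kind of cluster-housekeeping
# script the role involves.

# parse_dfs_used_pct extracts the "DFS Used%" value from report text
# fed in on stdin, so the function can be exercised without a cluster.
parse_dfs_used_pct() {
  awk -F': ' '/^DFS Used%/ { sub(/%$/, "", $2); print $2; exit }'
}

# check_capacity prints a warning when usage exceeds a threshold
# (80% here is an arbitrary example value).
check_capacity() {
  local pct="$1" threshold="${2:-80}"
  if awk -v p="$pct" -v t="$threshold" 'BEGIN { exit !(p > t) }'; then
    echo "WARN: HDFS usage at ${pct}% (threshold ${threshold}%)"
  else
    echo "OK: HDFS usage at ${pct}%"
  fi
}

# Canned report output for demonstration; on a cluster node a real
# script would instead run: hdfs dfsadmin -report | parse_dfs_used_pct
report="Configured Capacity: 1099511627776 (1 TB)
DFS Used%: 84.2%
DFS Remaining%: 15.8%"
pct="$(printf '%s\n' "$report" | parse_dfs_used_pct)"
check_capacity "$pct" 80
```

In practice such checks would typically be scheduled via cron and feed into the cluster's monitoring tooling (e.g. Cloudera Manager or Ambari alerts, both listed among the required skills).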