My client is looking for a GDPR Hadoop (Big Data) specialist, and I wanted to know whether this role would be of interest to you.
Job: GDPR Hadoop (Big Data)
Job Type: Contract OR Permanent
Big Data (Hadoop, Hive), Kafka, Chef, Rundeck, Linux, Public Cloud (AWS, Microsoft Azure, GCP)
Work in an agile environment to manage and operationalize a multi-datacenter (MDC) Hortonworks Hadoop deployment whose main components include HDFS, MapReduce2, YARN, Tez, Hive, Flink, Oozie, Kafka, and ZooKeeper.
QUALIFICATIONS / SKILLS
Experience in troubleshooting and supporting enterprise applications.
Hands-on experience in managing, monitoring, troubleshooting, and scaling Hortonworks HDP/HDF, open-source Kafka, or other Hadoop platforms.
Experience working with Kerberos and Ranger in Hadoop clusters.
Exposure to Operation Intelligence and experience with tools like Splunk and ElasticSearch.
Experience with public clouds such as AWS, Microsoft Azure, and GCP.
Experience with Apache Flink and troubleshooting data pipeline issues.
Experience with Hadoop encryption technologies like Ranger KMS and Ranger KMS with Key Trustee Server.
Job Reference: CR/066018
Salary per: Annum
Job Duration: 6 months
Job Start Date: