  • 10 - 15 Years
  • Posted : 22 days ago

Job Description:

Warm greetings from Career Trackers & Consulting.

Our client is a US-based product development MNC looking for a Hadoop Platform Administrator for Noida & Bangalore.

Please apply if you have worked as a Hadoop Platform Administrator for at least 7 years, have more than 10 years of overall experience, and have strong experience with the Hadoop ecosystem.

If you are interested, kindly email your resume as soon as possible, along with the details requested below, to itrecruiter@careertrackers.in.

Company: A very prestigious product development MNC
Role: Hadoop Platform Administrator
Expertise required: please refer to the detailed JD below
Experience: 10 - 18 yrs
Work Location: Noida & Bangalore
Package: Best in the industry
Nature of Job: Permanent & Full-time

PLEASE SHARE THE DETAILS BELOW FOR FURTHER PROCESSING:

1. Full Name
2. Email ID
3. Mobile
4. Current Designation
5. Current Organization
6. Explanation of Gap in Employment (if any)
7. Total Experience
8. Relevant Experience as:

a) Hadoop Platform Administrator

b) Number of Nodes Handled

c) Largest Data Size Handled

9. Highest Qualification
10. Current CTC
11. Expected CTC (in figures)
12. Notice Period
13. Current Location
14. Open for Noida / Bangalore

We are looking for a passionate, high-energy Application Engineer to build pipelines using multiple software tools in a hybrid-cloud business environment. You will be responsible for understanding requirements for tools and dashboards, using programming languages, and contributing to application development that meets clients' data requirements. Engineers are expected to demonstrate skill in one or more programming languages and multiple operating systems, along with attention to detail, time management, accuracy, and good communication and documentation skills.

The successful candidate will work in a high-availability environment. This role provides a unique opportunity to plan, design, and build data engineering software through frameworks and across a breadth of cloud technologies. It is a highly visible role involving engagement with global IT and engineering teams and integration with a variety of other applications, platforms, and services, found in very few other opportunities in a high-performing team. It will be a huge plus if the candidate has core development experience in one or more languages such as Java, Python, or Scala. As a Data Engineer, he/she is expected to have experience with the relevant tools and full-stack development; hands-on experience with multi-cluster environments such as Hadoop and with cloud technologies will be a plus.

What you will do

Design, develop, monitor, tune, optimize, and govern large-scale Hadoop clusters and Hadoop components in a 24x7 team
Design and implement new components and emerging technologies in the Hadoop ecosystem, and drive successful execution of various Proof-of-Technology (PoT) efforts
Monitor and analyze job performance, file system/disk-space usage, cluster and database connectivity, and log files
Perform platform administration activities on Hadoop services such as YARN, the ResourceManager, ZooKeeper, Cloudera Manager, etc.
Harden the cluster to support use cases and self-service in a 24x7 model, and apply advanced troubleshooting techniques to critical, highly complex customer problems
Automate deployment and management of Hadoop services, including implementing monitoring (see the sketch after this list)
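As a concrete illustration of the monitoring automation mentioned in the last item, the sketch below polls the YARN ResourceManager REST API for cluster metrics and flags unhealthy nodes and low memory headroom. This is a minimal sketch, not part of the job description: the ResourceManager address and the alert thresholds are assumptions chosen for the example.

import json
from urllib.request import urlopen

# Assumed ResourceManager address and alert thresholds (placeholders for illustration).
RM_METRICS_URL = "http://resourcemanager.example.com:8088/ws/v1/cluster/metrics"
MIN_FREE_MEMORY_PCT = 10.0
MAX_UNHEALTHY_NODES = 0

def check_cluster_health(url=RM_METRICS_URL):
    """Return a list of human-readable alerts derived from RM cluster metrics."""
    with urlopen(url, timeout=10) as resp:
        metrics = json.load(resp)["clusterMetrics"]

    alerts = []
    free_pct = 100.0 * metrics["availableMB"] / max(metrics["totalMB"], 1)
    if free_pct < MIN_FREE_MEMORY_PCT:
        alerts.append("Low memory headroom: %.1f%% of cluster memory free" % free_pct)
    if metrics["unhealthyNodes"] > MAX_UNHEALTHY_NODES:
        alerts.append("%d unhealthy NodeManager(s)" % metrics["unhealthyNodes"])
    if metrics["appsPending"] > 0 and metrics["availableVirtualCores"] == 0:
        alerts.append("%d application(s) pending with no free vcores" % metrics["appsPending"])
    return alerts

if __name__ == "__main__":
    for alert in check_cluster_health():
        print("ALERT:", alert)

In practice a script like this would be scheduled (for example via cron or Oozie) and wired into an alerting channel rather than printing to stdout.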

What you need to succeed
10 - 15 years of strong Linux/Java/Big Data experience, along with enterprise data warehousing experience
Must have 5 years of Big Data administration experience, including upgrades, cluster parameter tuning, configuration changes, etc.
Well versed with Hadoop challenges related to scaling and self-service analytics
Well versed with Cloudera distributions; hands-on experience required for both distributions
Well versed with Kafka, Hive, Spark, HBase, and the latest developments in the Hadoop ecosystem
Excellent knowledge of Hadoop integration points with enterprise BI and EDW tools
Strong experience with Hadoop cluster management/administration/operations using Oozie, YARN, Ambari, ZooKeeper, Tez, Slider
Strong experience with Hadoop ETL/data ingestion: Sqoop, Flume, Hive, Spark, HBase (see the sketch after this list)
Strong experience with SQL
Good to have: experience in real-time data ingestion using Kafka, Storm, Spark, or Complex Event Processing (CEP)
Experience with Hadoop data consumption and other components: Hive, Hue, HBase, Phoenix, Spark, Mahout, Pig, Impala, Presto
Experience monitoring, troubleshooting, and tuning services and applications, plus operational expertise such as good troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networks
Should be able to take the lead and interact with US team members with minimal supervision
Bachelor's degree in Computer Science, Information Science, Information Technology, or Engineering/a related field
Good communication skills across a distributed team environment
Must be self-motivated, responsive, professional and dedicated to customer success
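To make the ETL/data ingestion and SQL expectations above more concrete, here is a minimal PySpark sketch that reads a Hive table, builds a daily aggregate, and writes it back to Hive. The database, table, and column names are hypothetical placeholders, not details from this posting, and the session assumes a cluster configured with access to the Hive metastore.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-event-rollup")
    .enableHiveSupport()   # requires Hive metastore configuration on the cluster/client
    .getOrCreate()
)

# Read raw events from an assumed Hive table (hypothetical name).
events = spark.table("raw_db.web_events")

# Aggregate events per day and event type using Spark SQL functions.
daily_counts = (
    events
    .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the rollup back to Hive, partitioned by date (overwrite for idempotent re-runs).
(
    daily_counts.write
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics_db.daily_event_counts")
)

spark.stop()

The same rollup could equally be expressed as a Hive or Impala SQL statement; Spark is used here only because the posting lists it alongside Hive.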

If you are not interested in exploring this opportunity, we would request you to kindly share this email with your friends.

Please refer to the Job description above

Company Profile

Career Trackers and Consulting

A very prestigious US-based product development MNC

Profile Summary:

Employment Type : Full Time
Eligibility : Any Graduate
Industry : Recruitment/Placement Agencies, Consulting Services
Functional Area : HR / Administration / IR
Role : Software Engineer
Salary : As per Industry Standards
Deadline : 19th May 2020
