Big Data Administrator Jobs in Navi Mumbai, India - 25313030

  • Experience: 3 - 10 Years
  • Posted: more than 1 month ago

Job Description:

Job Location - Airoli, Navi Mumbai/ Powai, Mumbai/ Magarpatta City, Pune

Role Summary

Providing administrative support for Think Big customers on Hadoop platforms. Typically, these customers have 24/7 contracts, and the successful applicant must be prepared to work in shifts and to be on-call to support customer sites as per contractual obligations.

The Hadoop Administrator manages and controls the Hadoop system environment for Teradata customers. This requires specific technical knowledge of the administration and control of the Hadoop system, including the associated operating system, related tools, network, and hardware.

Experience Requirement


Minimum of 3-10 years of experience managing and supporting large-scale production Hadoop environments (configuration management, monitoring, and performance tuning) on any of the Hadoop distributions (Apache, Hortonworks, Cloudera, MapR, IBM BigInsights, Pivotal HD)
3-10 years of experience with scripting languages (Linux shell, SQL, Python); should be proficient in shell scripting (a minimal example follows this list)
3-10 years of experience in administrative activities such as:

Administration, maintenance, control, and optimization of Hadoop capacity, security, configuration, process scheduling, and errors

Management of data, users, and job execution on the Hadoop System
Experience in Backup, Archival and Recovery (BAR) and High availability (HA)
Plan for and support hardware and software installation and upgrades
3-10 years of experience with Hadoop monitoring tools (Cloudera Manager, Ambari, Nagios, Ganglia, etc.)
Experience may include (but is not limited to) build and support, including design, configuration, installation and upgrades, monitoring, and performance tuning of any of the Hadoop distributions
Hadoop software installation and upgrades
Experience of workload / performance management
Automation: experience with CI/CD (Continuous Integration/Deployment) tools such as Jenkins, Ansible, Terraform, Puppet, Chef
Implementing standards and best practices to manage and support data platforms as per distribution
Proficiency in Hive internals (including HCatalog), Sqoop, Pig, Oozie, and Flume/Kafka
Experience in MySQL & PostgreSQL databases
ITIL Knowledge
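
As the scripting requirement above suggests, much of the day-to-day work is small operational scripts. The sketch below is a minimal illustration, not part of the job description: it parses the output of the standard `hdfs dfsadmin -report` command and flags DataNodes above a usage threshold. The 80% threshold is illustrative, and the script assumes the `hdfs` CLI is on the PATH of the account running it and that the report keeps its usual "Name:" / "DFS Used%:" layout.

```python
#!/usr/bin/env python3
"""Minimal HDFS capacity check: flag DataNodes above a usage threshold.

Illustrative sketch only; assumes the `hdfs` CLI is installed and the
invoking user may run `hdfs dfsadmin -report`.
"""
import re
import subprocess
import sys

USAGE_THRESHOLD_PCT = 80.0  # illustrative alert threshold


def datanode_usage(report: str) -> dict[str, float]:
    """Parse 'Name:' / 'DFS Used%:' pairs from dfsadmin -report output."""
    usage = {}
    current = None
    for line in report.splitlines():
        line = line.strip()
        if line.startswith("Name:"):
            current = line.split("Name:", 1)[1].strip()
        elif line.startswith("DFS Used%:") and current:
            match = re.search(r"([\d.]+)%", line)
            if match:
                usage[current] = float(match.group(1))
    return usage


def main() -> int:
    try:
        report = subprocess.run(
            ["hdfs", "dfsadmin", "-report"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (OSError, subprocess.CalledProcessError) as exc:
        print(f"could not run hdfs dfsadmin -report: {exc}", file=sys.stderr)
        return 2

    # Report any DataNode whose DFS usage meets or exceeds the threshold.
    hot = {node: pct for node, pct in datanode_usage(report).items()
           if pct >= USAGE_THRESHOLD_PCT}
    for node, pct in sorted(hot.items()):
        print(f"WARNING: {node} at {pct:.1f}% DFS used")
    return 1 if hot else 0


if __name__ == "__main__":
    sys.exit(main())
```

In practice, a check like this would usually be wired into the monitoring tools listed above (for example as a Nagios check or an Ambari alert) rather than run ad hoc.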

Preferred Skills

Experience with DR (Disaster Recovery) strategies and principles
Development or administration experience with NoSQL technologies like HBase, MongoDB, Cassandra, Accumulo, etc.
Development or administration experience with web or cloud platforms like Amazon S3, EC2, Redshift, Rackspace, OpenShift, etc.
Development/scripting experience with configuration management and provisioning tools, e.g. Puppet, Chef
Web/application server & SOA administration (Tomcat, JBoss, etc.)
Development, implementation, or deployment experience on the Hadoop ecosystem (HDFS, MapReduce, Hive, HBase)
Experience with any one of the following will be an added advantage:
Hadoop integration with large-scale distributed data platforms like Teradata, Teradata Aster, Vertica, Greenplum, Netezza, DB2, Oracle, etc.
Proficiency with at least one of the following: Java, Python, Perl, Ruby, C, or web-related development
Knowledge of Business Intelligence and/or Data Integration (ETL) operations delivery techniques, processes, and methodologies
Exposure to data acquisition, transformation & integration tools like Talend, Informatica, etc., and BI tools like Tableau, Pentaho, etc.
Linux Administrator certification

Profile Summary:

Employment Type : Full Time
Eligibility : Any Graduate
Industry : Software Services, Internet/Dot com/ISP
Functional Area : IT Software : Software Products & Services
Role : Software Engineer
Salary : As per Industry Standards
Deadline : 03rd Jun 2020
