Cloudera Administrator Training; Bangalore (21-24 Feb 2015)

 

  • Regular Price-Non Certification Training

    Sale Date Ended

    INR 62000
    Sold Out
  • Regular Price - Certification Training

    Sale Date Ended

    INR 74400
    Sold Out
  • Early Bird Price-Certification Training

    Sale Date Ended

    INR 66960
    Sold Out
  • Early Bird Price-Non Certification Training

    Sale Date Ended

    INR 55800
    Sold Out

About The Event

 

This four-day administrator training course for Apache Hadoop provides a comprehensive understanding of all the steps necessary to operate and maintain Hadoop clusters.

From installation and configuration, through load balancing and tuning your cluster, this Administration course has you covered.

Xebia is an official training partner of Cloudera, the leader in Apache Hadoop-based software and services.

Please note that you need to bring your own laptop for this training.


Programme and Course Overview

Through lectures and interactive, hands-on exercises, this certified training covers topics such as:

 

• The internals of MapReduce and HDFS and how to build Hadoop architecture;

• Proper cluster configuration and deployment to integrate with systems and hardware in the data center;

• How to load data into the cluster from dynamically-generated files using Flume and from RDBMS using Sqoop;

• Configuring the FairScheduler to provide service-level agreements for multiple users of a cluster;

• Installing and implementing Kerberos-based security for your cluster;

• Best practices for preparing and maintaining Apache Hadoop in production;

• Troubleshooting, diagnosing, tuning and solving Hadoop issues.
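
As a taste of the scheduler configuration covered above, a minimal FairScheduler allocation file might look like the following sketch. The queue names, weights, and resource figures are illustrative placeholders, not values taken from the course material:

```xml
<?xml version="1.0"?>
<!-- fair-scheduler.xml: illustrative allocation file for the Hadoop FairScheduler -->
<allocations>
  <!-- A queue for ad-hoc analyst jobs, capped so it cannot starve production -->
  <queue name="analytics">
    <weight>1.0</weight>
    <maxRunningApps>10</maxRunningApps>
  </queue>
  <!-- Production ETL jobs get a larger guaranteed share of the cluster -->
  <queue name="etl">
    <weight>3.0</weight>
    <minResources>10000 mb,10 vcores</minResources>
  </queue>
</allocations>
```

Weights control how spare capacity is divided between queues, while minResources expresses the service-level guarantee a queue receives even when the cluster is busy.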


Trainer's Profile
  

 

Swapnil Dubey

Swapnil has around 6 years of professional experience in the software industry. He has around 4 years of experience in object-oriented analysis, requirements analysis, system design, development, and production support of JavaScript/J2EE-based applications, and more than 2 years of experience on Big Data engineering (Hadoop) projects. Swapnil is a Cloudera Trainer for both the Hadoop Developer and Hadoop Administrator courses, and is himself a Cloudera Certified Hadoop Developer and Cloudera Certified Hadoop Administrator. He likes to speak about and explore new technological advancements in the Hadoop ecosystem. Swapnil has worked in all phases of the software development life cycle, in the capacity of both developer and Scrum Master, and has a good understanding of Agile methodologies, specifically Scrum and Test-Driven Development (TDD).

 

He has a good understanding of the Hadoop environment, covering both development and administration, as well as of JavaScript/Java/J2EE design patterns, Domain-Driven Design, and Enterprise Application Architecture design patterns. He has contributed to a variety of projects across business domains including data mining, e-commerce, travel & tourism, and healthcare.

 

Target Group & Prerequisites:

This course is best suited to system administrators and IT managers who have basic Linux experience. 

Prior knowledge of Apache Hadoop is not required.


You Will Learn


• How the Hadoop Distributed File System and MapReduce work
• What hardware configurations are optimal for Hadoop clusters
• What network considerations to take into account when building out your cluster
• How to configure Hadoop's options for best cluster performance
• How to configure NameNode High Availability
• How to configure NameNode Federation
• How to configure the FairScheduler to provide service-level agreements for multiple users of a cluster
• How to install and implement Kerberos-based security for your cluster
• How to maintain and monitor your cluster
• How to load data into the cluster from dynamically-generated files using Flume and from relational database management systems using Sqoop
• What system administration issues exist with other Hadoop projects such as Hive, Pig, and HBase
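
The Flume ingestion topic above can be illustrated with a minimal agent configuration that watches a directory for dynamically generated files and ships them to HDFS. The agent name, spool directory, and HDFS path here are placeholder assumptions, not values from the course:

```properties
# agent1 picks up files dropped into a spool directory and writes them to HDFS
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1

# Spooling-directory source: ingests files as they appear
agent1.sources.src1.type     = spooldir
agent1.sources.src1.spoolDir = /var/log/incoming
agent1.sources.src1.channels = ch1

# In-memory channel buffering events between source and sink
agent1.channels.ch1.type     = memory
agent1.channels.ch1.capacity = 10000

# HDFS sink writing into a date-partitioned directory
agent1.sinks.sink1.type                  = hdfs
agent1.sinks.sink1.hdfs.path             = hdfs://namenode:8020/flume/events/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType         = DataStream
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
agent1.sinks.sink1.channel               = ch1
```

The equivalent RDBMS path covered in the course uses Sqoop, which imports tables over JDBC directly into HDFS rather than streaming files through an agent.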


Outline


• Introduction
• The Case for Apache Hadoop
• HDFS
• Getting Data into HDFS
• MapReduce
• Planning Your Hadoop Cluster
• Hadoop Installation and Initial Configuration
• Installing and Configuring Hive, Impala, and Pig
• Hadoop Clients
• Cloudera Manager
• Advanced Cluster Configuration
• Hadoop Security
• Managing and Scheduling Jobs
• Cluster Maintenance
• Cluster Monitoring and Troubleshooting
• Conclusion