Cloudera Hadoop Developer Training; Hyderabad (26 Feb-1 Mar, 2015)


  • Regular Price- Non Certification Training

    Sale Date Ended

    INR 62000
    Sold Out
  • Regular Price- Certification Training

    Sale Date Ended

    INR 74400
    Sold Out
  • Early Bird Price-Certification Training

    Sale Date Ended

    INR 66960
    Sold Out
  • Early Bird Price-Non Certification Training

    Sale Date Ended

    INR 55800
    Sold Out


About The Event



Course Overview


This four-day training course is for developers who want to learn to use Apache Hadoop to build powerful data processing applications.

 

Prerequisites: This course is appropriate for developers who will be writing, maintaining and/or optimizing Hadoop jobs. Participants should have programming experience; knowledge of Java is highly recommended. Understanding of common computer science concepts is a plus. Prior knowledge of Hadoop is not required.

 

Hands-On Exercises: Throughout the course, students write Hadoop code and perform other hands-on exercises to solidify their understanding of the concepts being presented.

 

Optional Certification Exam: Following successful completion of the training class, attendees can receive a Cloudera Certified Developer for Apache Hadoop (CCDH) practice test. Cloudera training and the practice test together provide the best resources to prepare for the certification exam. An exam voucher can be acquired in combination with the training.

 

Target Group: This session is appropriate for developers who will be writing, maintaining, or optimizing Hadoop jobs. Participants should have programming experience, preferably with Java. An understanding of algorithms and other computer science topics is a plus.


Information on Cloudera Trainer
 

Sunil Yadav

 

Sunil has many years of experience developing Java and JEE based applications, with deep expertise in the Enterprise Search and Enterprise Content Management domains.

He has worked for clients in domains such as healthcare, publishing, insurance, and government. Having worked with ECM and Enterprise Search products, he understands how customers store their data and what value they want to extract from structured and unstructured data, which led him to explore alternatives such as Hadoop and NoSQL databases for the varied use cases his clients have. Sunil has mastered technologies such as Hadoop, Pig, Hive, Sqoop, and Oozie and has used them in various projects. He has also explored some of the newest technologies in the big data world, such as Apache Spark and Kafka.

Sunil has delivered many in-house trainings on various Big Data technologies.

About Xebia

Xebia is an international IT services organization with offices in Hilversum (NL), Paris, Gurgaon, Santa Monica, and Boston. Started in 2002, we now employ 350 people worldwide, with revenues of over 45 million USD. We offer services and solutions clustered around our key competences: Continuous Delivery & DevOps, Agile Consulting & Training, Big Data, Enterprise Web Apps, and Enterprise Mobile.

GoDataDriven is a leading consultancy firm in Big Data technologies. GDD offers end-to-end Big Data implementations, from infrastructure through data science to working software. GDD has been a Cloudera training and SI partner since 2011 and has offices in India and Europe.


Key Promises of this Training
 

  1. The core technologies of Hadoop.
  2. How HDFS and MapReduce work.
  3. How to develop MapReduce applications.
  4. How to unit test MapReduce applications.
  5. How to use MapReduce combiners, partitioners and the distributed cache.
  6. Best practices for developing and debugging MapReduce applications.
  7. How to implement data input and output in MapReduce applications.
  8. Algorithms for common MapReduce tasks.
  9. How to join data sets in MapReduce.
  10. How Hadoop integrates into the data center.
  11. How to use Mahout's machine learning algorithms.
  12. How Hive and Pig can be used for rapid application development.
  13. How to create large workflows using Oozie.
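As a taste of the MapReduce concepts the list above covers (how MapReduce works, developing and testing MapReduce applications), here is a minimal, self-contained Java sketch of the classic word-count pattern, expressed as the map, shuffle/sort, and reduce phases. It deliberately avoids the real Hadoop API (no org.apache.hadoop classes are used), so all class and method names here are illustrative, not part of Hadoop itself.

```java
import java.util.*;

// Conceptual sketch of MapReduce word count (not the Hadoop API):
// map emits (word, 1) pairs, shuffle/sort groups them by key,
// reduce sums each group.
public class WordCountSketch {

    // Map phase: emit a (word, 1) pair for every word in a line.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(word, 1));
            }
        }
        return pairs;
    }

    // Shuffle/sort + reduce phase: group emitted pairs by key
    // (TreeMap keeps keys sorted, like Hadoop's sort), then sum
    // the values of each group.
    static Map<String, Integer> mapReduce(List<String> lines) {
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String line : lines) {
            for (Map.Entry<String, Integer> pair : map(line)) {
                grouped.computeIfAbsent(pair.getKey(), k -> new ArrayList<>())
                       .add(pair.getValue());
            }
        }
        Map<String, Integer> result = new TreeMap<>();
        grouped.forEach((word, ones) ->
            result.put(word, ones.stream().mapToInt(Integer::intValue).sum()));
        return result;
    }

    public static void main(String[] args) {
        System.out.println(mapReduce(Arrays.asList("to be or not to be")));
    }
}
```

In real Hadoop code the same logic lives in a Mapper and a Reducer class, and the framework performs the shuffle/sort across the cluster; the course's hands-on exercises work with those actual APIs.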

Agenda
  1. Introduction
  2. The Motivation for Hadoop
  3. Hadoop: Basic Concepts
  4. Writing a MapReduce Program
  5. Unit Testing MapReduce Programs
  6. Delving Deeper into the Hadoop API
  7. Practical Development Tips and Techniques
  8. Data Input and Output
  9. Common MapReduce Algorithms
  10. Joining Data Sets in MapReduce Jobs
  11. Integrating Hadoop into the Enterprise Workflow
  12. Machine Learning and Mahout
  13. An Introduction to Hive and Pig
  14. An Introduction to Oozie
  15. Conclusion
  16. Appendix: Graph Processing in MapReduce

 

Contact: Xebia 9871237360