HDP Certified Administrator (HDPCA) Certification

What's Inside

  • 5 hours 12 minutes of training videos covering all exam objectives (100% of the syllabus)
  • Unlimited Access

HDP Certified Administrator (HDPCA) Certification

$29.95 (regular price $69.95)
  • (Limited Period Offer)
  • 100% Syllabus covered: All exam objectives
  • Accessible on PC, Mac, iPhone®, iPad®, Android™ devices


Topic-wise content distribution

Topic Video Duration
1. Overview and Introduction 10 mins
2. Big Data Introduction – The Big Data Problem 10 mins
3. Cluster Basics 12 mins
4. MapReduce 11 mins
5. Hadoop Architecture 11 mins
5.1 Hadoop Components – Studying core components of Hadoop 15 mins
6. Bash Tutorial 11 mins
6.1 Installation of Hadoop – Installation Modes 32 mins
6.2 Installation – Creating a New User 16 mins
6.3 Installation – Setting up Local Repositories 22 mins
6.4 Installation – Setup Apache Server (Local Web Server) 11 mins
6.5 Installation – Automated Ambari Installation and Capacity Planning 30 mins
7.1 Administration (HDPCA Tasks) – Getting to know the Ambari Server Dashboard 11 mins
7.2 Administration (HDPCA Tasks) – Provisioning and installing new services after the cluster is set up 34 mins
8. Resource Management 33 mins
9.1 HDFS Operations – HDFS User Creation and Permissions 12 mins
9.2 HDFS Operations – Creation of ACLs 21 mins
10. High Availability – NameNode 10 mins

What is HDP Certified Administrator (HDPCA) Certification?

Hortonworks has redesigned its certification program to create an industry-recognized certification in which individuals prove their Hadoop knowledge by performing actual hands-on tasks on a Hortonworks Data Platform (HDP) cluster, rather than answering multiple-choice questions. The HDP Certified Administrator (HDPCA) exam is designed for Hadoop system administrators and operators responsible for installing, configuring, and supporting an HDP cluster.


Are there multiple-choice questions in the HDP Certified Administrator (HDPCA) certification exam?

No. There are no multiple-choice questions in the HDPCA certification exam. Instead, you must successfully complete 5 of the 7 hands-on tasks assigned to you during the exam.


How do I register for this certification?

Create an account at www.examslocal.com. Once registered and logged in, select “Schedule an Exam”, then enter “Hortonworks” in the “Search Here” field to locate and select the Hortonworks HDP Certified Administrator exam.


What is the duration of this HDP Certified Administrator exam?

2 hours


Who should take this course?

Big Data and Hadoop skills are in growing demand, and learning them helps professionals stay relevant in their job profiles. This course is especially beneficial for Hadoop system administrators and operators, and for anyone preparing for the HDPCA exam.


What are the prerequisites for taking this Hadoop Certification Training?

There are no prerequisites for taking this Big Data Hadoop training. Anyone interested in building a career in Big Data technologies can choose this course. However, a basic knowledge of UNIX, SQL, and Java would be helpful.


Will I get placement assistance?

At Whizlabs, we are committed to providing world-class training on various technologies. The course content and training materials are created by industry experts who have carefully analyzed market demands. We can assure you that after completing our self-study training on Big Data Hadoop, you will be able to work with this technology.


What if I have more queries?

If you have any queries related to this course, payments, etc., please feel free to write here. A member of our support staff will respond as soon as possible.


What will you learn in this HDP Certified Administrator (HDPCA) Certification Self-Study Training Course?

1. Overview and Introduction (10 mins)
An introduction to the course and the topics covered in this series. This section familiarizes you with the contents and gives a basic outline of the course.

2. Big Data Introduction – The Big Data Problem (10 mins)
What is this Big Data problem that companies are so worried about? Why did applications switch from RDBMS to the Hadoop framework? This section answers these questions and builds a solid foundation for understanding why Apache Hadoop was developed.

3. Cluster Basics (12 mins)
An explanation of the Apache Hadoop cluster in detail, introducing the basic concepts for a beginner: HDFS, cluster nodes such as the NameNode and DataNodes, the ResourceManager, and how Hadoop works.

4. MapReduce (11 mins)
An introduction to MapReduce from an Administrator's point of view: how a job is divided into map and reduce phases and executed across the cluster.

5. Hadoop Architecture (11 mins)
This section covers what an Administrator must understand about the cluster's structure. Administrators should be able to clearly differentiate between the different working levels of the cluster and how these levels interoperate. Cluster components such as the NodeManager, process lifecycles, and resource allocation are detailed thoroughly. The second part of the section covers what is needed to weigh the advantages and disadvantages offered by each Hadoop service. Developers are also recommended to follow this video.

5.1 Hadoop Components – Studying the core components of Hadoop (15 mins)
This video describes the various components and services that make up the Hadoop application stack, with brief introductions to technologies such as Hive, Spark, and HBase. These are essentially the Hadoop services used mostly by developers to build applications. Since the Hadoop application stack is so feature-rich, administrators need to ensure they select only the components that best suit their requirements.

6. Bash Tutorial (11 mins)
A basic introduction to bash and a few file-system and related shell commands. The better an Administrator's grasp of Bash, the smoother the debugging and fault-resolution process, so this is a quick tutorial on the Bourne Again Shell in RHEL for bash novices aspiring to be Big Data Administrators. By the end of this section, you should be able to write your own scripts and use them for several purposes, as shown in the tutorials.

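For a flavour of what the tutorial builds up to, here is a minimal bash sketch (the default path is an arbitrary placeholder):

    #!/bin/bash
    # Report the disk usage of each top-level directory under a target path.
    target=${1:-/var/log}
    for dir in "$target"/*/ ; do
        du -sh "$dir"
    done
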
6.1 Installation of Hadoop – Installation Modes (32 mins)
Setting up the user environment before deploying a cluster. There are several user- and system-specific operations that the System Administrator must perform, and Hadoop requires these steps before installation. This section is a tutorial on the various cluster deployment options available to an Administrator, and we initiate the installation in Distributed Mode.

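As an illustrative sketch only (hostnames and addresses are placeholders; the video covers the exact steps), typical pre-installation host preparation on RHEL/CentOS looks like this:

    # Give every host a fully qualified hostname known to all the other hosts
    hostnamectl set-hostname node1.cluster.local
    echo "192.168.1.11 node1.cluster.local node1" >> /etc/hosts

    # Relax security settings that commonly block the installer
    setenforce 0                    # disable SELinux enforcement for the install
    systemctl stop firewalld && systemctl disable firewalld

    # Disable transparent huge pages, as recommended for Hadoop hosts
    echo never > /sys/kernel/mm/transparent_hugepage/enabled
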
6.2 Installation – Creating a New User (16 mins)
Creating the user and an SSH tunnel. Most administrators face difficulty setting up appropriate keys and a proper SSH tunnel for relaying messages between cluster hosts. This video illustrates the steps an administrator must perform for effective communication via SSH between the hosts.

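The general shape of a passwordless SSH setup is sketched below (hostnames are placeholders):

    # On the Ambari server host, as the installing user:
    ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa   # generate a key pair, no passphrase

    # Copy the public key to every cluster host (repeat per host)
    ssh-copy-id root@node2.cluster.local

    # Verify that login now works without a password prompt
    ssh root@node2.cluster.local hostname
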
6.3 Installation – Setting up Local Repositories (22 mins)
Creating a local Apache HTTP web server to host the repositories for Apache Hadoop. We perform the installation presuming that no Internet access is available in the cluster's operating environment. Such a configuration is especially recommended for closed-door organizations or projects, and it enhances both the speed and productivity of the cluster.

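A rough sketch of the idea on RHEL/CentOS (the directory layout and hostnames are placeholders; the video shows the real repository structure):

    # Install and start the Apache HTTP server and repository tooling
    yum install -y httpd createrepo
    systemctl start httpd

    # Serve the separately downloaded HDP/Ambari repository files
    mkdir -p /var/www/html/hdp
    # ...extract the repository tarballs under /var/www/html/hdp...
    createrepo /var/www/html/hdp

    # Each cluster host then points a .repo file at this server, e.g.:
    #   baseurl=http://repo-host.cluster.local/hdp/
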
6.4 Installation – Setup Apache Server (Local Web Server) (11 mins)
Installing Hortonworks HDP 2.4 (Ambari 2.2) from a local HTTP web server, without any Internet access, using the Automated Ambari Installer.

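Once the local repository is reachable, the Ambari server installation itself is short; a sketch, assuming the ambari.repo file already points at the local web server:

    yum install -y ambari-server     # pulled from the local repository
    ambari-server setup -s           # silent setup with the default options
    ambari-server start

    # The cluster install wizard is then available at:
    #   http://<ambari-host>:8080   (default login: admin / admin)
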
6.5 Installation – Automated Ambari Installation and Capacity Planning (30 mins)
The final installation video specifies the service configuration that must be set while installing core Hadoop services, such as the Hive Metastore, Knox secret, and Ambari dashboard permissions. These are runtime configurations that the administrator must explicitly declare while installing the cluster.

7.1 Administration (HDPCA Tasks) – Getting to know the Ambari Server Dashboard (11 mins)
An introduction to the web console: navigating the console, the different widgets and charts, services, hosts, and a basic explanation of what you are currently working on.

7.2 Administration (HDPCA Tasks) – Provisioning and installing new services after the cluster is set up (34 mins)
Performing cluster operations such as provisioning new services, removing installed services, adding services on new hosts, and removing hosts, along with many other administrative tasks. The day-to-day work of a Big Data System Administrator entails these operations.

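Alongside the web console, Ambari exposes a REST API for the same operations; as an illustrative example (the cluster name and host are placeholders), listing a cluster's installed services looks like this:

    curl -u admin:admin -H 'X-Requested-By: ambari' \
         http://ambari-host.cluster.local:8080/api/v1/clusters/mycluster/services
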
8. Resource Management (33 mins)
Resource management in action: application states and the ResourceManager UI are discussed at length, including how to view the logs of a running or failed job and the application state and queue of a failed job. The different scheduling styles that Hadoop provides are also covered, with a thorough explanation of Capacity Scheduling and how to use its queues.

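Much of this can also be inspected from the command line; a few illustrative YARN commands (the application ID is a placeholder):

    yarn application -list                           # applications and their current states
    yarn application -status application_1234_0001   # state and queue of one application
    yarn logs -applicationId application_1234_0001   # aggregated logs of a finished or failed job
    yarn queue -status default                       # capacity and usage of a scheduler queue
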
9.1 HDFS Operations – HDFS User Creation and Permissions (12 mins)
This section covers administration tasks that every Administrator needs to perform, such as creating user home directories, snapshots, and ACLs. It also targets how to interact with HDFS and leverage it to the organisation's advantage.

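The core of the user-creation task looks roughly like this (the username alice is a placeholder; the commands run as the hdfs superuser):

    # Create a home directory for the new user and hand ownership over
    sudo -u hdfs hdfs dfs -mkdir /user/alice
    sudo -u hdfs hdfs dfs -chown alice:alice /user/alice
    sudo -u hdfs hdfs dfs -chmod 750 /user/alice
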
9.2 HDFS Operations – Creation of ACLs (21 mins)
The second video illustrates the steps needed to create and restore snapshots and to create Access Control Lists on HDFS. We demonstrate how to create and restore data in the face of a DataNode failure, and how to keep permissions provisioned in line with changing organisational rules.

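For orientation, the commands involved look roughly like this (the path /data/project and user bob are placeholders):

    # Grant an extra user read/execute access via an ACL entry, then inspect it
    hdfs dfs -setfacl -m user:bob:r-x /data/project
    hdfs dfs -getfacl /data/project

    # Enable and take a snapshot to protect the data against accidental loss
    sudo -u hdfs hdfs dfsadmin -allowSnapshot /data/project
    hdfs dfs -createSnapshot /data/project snap1

    # Restore a file by copying it back out of the read-only .snapshot directory
    hdfs dfs -cp /data/project/.snapshot/snap1/somefile /data/project/
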
10. High Availability – NameNode (10 mins)
How do you save your organisation in the face of an emergency? This section answers questions about the contingency and failover mechanisms that Apache Hadoop has in place, and describes how things have changed in the latest Hadoop distributions compared with previous ones.

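Once NameNode HA is configured, its state can be checked, and a manual failover triggered (when automatic failover is disabled), from the command line; a sketch, where nn1 and nn2 are the configured NameNode IDs:

    hdfs haadmin -getServiceState nn1    # reports 'active' or 'standby'
    hdfs haadmin -failover nn1 nn2       # hand the active role from nn1 to nn2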