|1. Overview and Introduction||10 mins|
|2. Big Data Introduction- Big Data Problem||10 mins|
|3. Cluster Basics||12 mins|
|4. MapReduce||11 mins|
|5. Hadoop Architecture||11 mins|
|5.1. Hadoop Components- Studying core components of Hadoop||15 mins|
|6. Bash Tutorial||11 mins|
|6.1 Installation of Hadoop – Installation Modes||32 mins|
|6.2 Installation – Creating New User||16 mins|
|6.3 Installation- Setting up Local Repositories||22 mins|
|6.4 Installation- Setup Apache Server(Local WebServer)||11 mins|
|6.5 Installation- Automated Ambari Installation and Capacity Planning||30 mins|
|7.1 Administration(HDPCA Tasks)- Getting to know the Ambari Server Dashboard||11 mins|
|7.2 Administration(HDPCA Tasks)- Provisioning and installing new services after the cluster is setup||34 mins|
|8. Resource Management||33 mins|
|9.1 HDFS Operations- HDFS User Creation and Permissions||12 mins|
|9.2 HDFS Operations- Creation of ACLs||21 mins|
|10. High Availability-NameNode||10 mins|
Hortonworks has redesigned its certification program to create an industry-recognized certification where individuals prove their Hadoop knowledge by performing actual hands-on tasks on a Hortonworks Data Platform (HDP) cluster, as opposed to answering multiple-choice questions. The HDP Certified Administrator (HDPCA) exam is designed for Hadoop system administrators and operators responsible for installing, configuring, and supporting an HDP cluster.
No. There are no multiple-choice questions in the HDPCA certification exam. You must complete 5 of the 7 tasks assigned to you during the exam.
Note that the exam vouchers are valid for 1 year from the date of purchase.
Big Data and Hadoop technology is growing in popularity, and demand for related jobs is growing with it. Learning Big Data helps professionals stay relevant in their roles. This course is beneficial for the following professionals:
There are no prerequisites for this Big Data Hadoop certification exam. Anyone interested in building a career in Big Data technologies can pursue this certification. However, a basic knowledge of UNIX, SQL, and Java is helpful.
Yes. We write frequently about certification preparation tips on our blog. Here's our guide on how to prepare for the HDPCA Certification Exam.
If you have any queries related to this course, payments, etc., please feel free to contact us at Whizlabs Helpdesk. A member of our support staff will respond to you as soon as possible.
|1. Overview and Introduction||An introduction to the type of course and the topics covered in this series. This section familiarizes you with the contents and a basic outline of the course.||10 mins|
|2. Big Data Introduction- Big Data Problem||What is this Big Data problem that companies are so worried about? Why did applications switch from RDBMS to the Hadoop framework? This section aims to answer these questions and build a solid foundation for understanding why Apache Hadoop was developed.||10 mins|
|3. Cluster Basics||Explaining the Apache Hadoop cluster in detail: an introduction for beginners to basic concepts such as HDFS, cluster nodes like the NameNode and DataNodes, the ResourceManager, and how Hadoop works.||12 mins|
|4. MapReduce||This section introduces the MapReduce processing model from an Administrator's point of view: how a job is split into map and reduce tasks, how intermediate data is shuffled between them, and how these tasks are scheduled and monitored across the cluster.||11 mins|
|5. Hadoop Architecture||This section covers the topics an Administrator must understand: the different working levels of the cluster and how these levels interoperate. Cluster components such as the NodeManager, process lifecycles, and resource allocation are detailed thoroughly. The second part of this section covers how to weigh the advantages and disadvantages offered by each Hadoop service. Developers are also encouraged to follow this video.||11 mins|
|5.1. Hadoop Components- Studying the core components of Hadoop||This video describes the various components and services that make up the Hadoop application stack, with brief introductions to technologies such as Hive, Spark, and HBase. These components are the Hadoop services most often used by developers to build applications. Since the stack is so feature-rich, administrators need to ensure that they select only the components that best suit their requirements.||15 mins|
|6. Bash Tutorial||A basic introduction to Bash and a few file system and related shell commands. The better an Administrator's grasp of Bash, the smoother the debugging and fault-resolution process. Hence, this is a quick tutorial on the Bourne Again Shell in RHEL for Bash novices aspiring to be Big Data Administrators. By the end of this section, you should be able to create your own scripts and use them for the purposes shown in the tutorials.||11 mins|
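The kind of Bash covered in this section can be sketched in a few lines. The log files, directory, and patterns below are invented purely for illustration:

```shell
# A minimal Bash sketch: variables, loops, and simple text filtering --
# the bread and butter of day-to-day cluster debugging.

logdir=$(mktemp -d)                     # scratch directory for the demo
printf 'INFO start\nWARN disk low\nERROR oom\n' > "$logdir/node1.log"
printf 'INFO start\nINFO ok\n'                  > "$logdir/node2.log"

# Count ERROR/WARN lines per log file, the way an admin might triage nodes.
for f in "$logdir"/*.log; do
  bad=$(grep -c -E 'ERROR|WARN' "$f")
  echo "$(basename "$f"): $bad problem line(s)"
done
```

Commands like `mktemp`, `grep`, and `basename` recur constantly in fault resolution, which is why the tutorial drills them early.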
|6.1 Installation of Hadoop – Installation Modes||Setting up the user environment before deploying a cluster. There are several user- and system-specific operations that the System Administrator must perform, and these steps are mandatory for Hadoop. Hence, this first section covers the various cluster deployment options available to an Administrator, after which we initiate the installation in Distributed Mode.||32 mins|
|6.2 Installation – Creating New User||Creation of a user SSH tunnel. Most administrators face difficulty setting up the appropriate keys and a proper SSH tunnel for relaying messages between the cluster hosts. This video illustrates the steps an administrator must perform for effective communication via SSH between the hosts.||16 mins|
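As a rough sketch of what passwordless SSH setup between hosts involves (the hostnames, user name, and key material below are placeholders, not taken from the course):

```shell
# On a real cluster you would generate a keypair on the Ambari host and
# push the public key to every node, e.g.:
#
#   ssh-keygen -t rsa -b 2048 -N '' -f ~/.ssh/id_rsa
#   ssh-copy-id hadoop@node1.example.com     # repeat for each host
#
# What ssh-copy-id does under the hood is essentially an append plus
# permission fixes, simulated here against a temp directory:

home=$(mktemp -d)
mkdir -p "$home/.ssh" && chmod 700 "$home/.ssh"
pubkey="ssh-rsa AAAAB3...placeholder... hadoop@ambari"   # fake key material
echo "$pubkey" >> "$home/.ssh/authorized_keys"
chmod 600 "$home/.ssh/authorized_keys"
grep -q 'hadoop@ambari' "$home/.ssh/authorized_keys" && echo "key installed"
```

The strict `700`/`600` permissions matter: sshd refuses keys in world-readable locations, a common source of "still asks for a password" confusion.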
|6.3 Installation- Setting up Local Repositories||Creation of a local Apache HTTP WebServer to host the repositories for Apache Hadoop. We will be performing the installation presuming that no Internet access would be available in the operating environment of the cluster. Such a configuration is especially recommended for enthusiasts working for closed-door organizations or projects and enhances both speed and productivity of the cluster.||22 mins|
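A local repository definition for an offline install might look like the following sketch; the hostname and HDP version in `baseurl` are placeholders:

```shell
# Write an illustrative yum repo definition pointing at a local mirror.
cat > /tmp/hdp.repo <<'EOF'
[HDP-2.4]
name=HDP-2.4 local mirror
baseurl=http://repo.internal.example.com/hdp/HDP/centos7/2.x/updates/2.4.0.0
gpgcheck=0
enabled=1
EOF

# On a real node this file would live in /etc/yum.repos.d/ so that
# "yum install" resolves packages from the local mirror instead of the Internet.
grep -q '^baseurl=http://repo.internal' /tmp/hdp.repo && echo "repo defined"
```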
|6.4 Installation- Setup Apache Server(Local WebServer)||Installation of Ambari Hortonworks HDP 2.4 (Ambari 2.2) using a local HTTP WebServer and without any internet access using the Automated Ambari Installer.||11 mins|
|6.5 Installation- Automated Ambari Installation and Capacity Planning||The final video is used to specify the configuration of services which need to be set while installing core Hadoop Services such as Hive Metastore, Knox Secret, Ambari dashboard permissions etc. These are runtime configurations which need to be explicitly declared by the administrator while installing the cluster.||30 mins|
|7.1 Administration(HDPCA Tasks)- Getting to know the Ambari Server Dashboard||Introduction to the web console- navigation and movement in the console, different widgets and charts, services, hosts and the basic explanation of what you are currently working on is discussed in this section.||11 mins|
|7.2 Administration(HDPCA Tasks)- Provisioning and installing new services after the cluster is setup||Performing cluster operations such as provisioning new services, removing installed services, adding services on new hosts, removing hosts, and many other administrator-related tasks are covered in this section. These operations make up the day-to-day work of a Big Data System Administrator.||34 mins|
|8. Resource Management||Resource management in action: application states and the ResourceManager UI are discussed in detail, including how to view the logs of a running or failed job and the application state/queue of a failed job. The different scheduling styles that Hadoop provides are also covered, with a thorough explanation of Capacity Scheduling and how to use its queues.||33 mins|
|9.1 HDFS Operations- HDFS User Creation and Permissions||This section covers administration tasks such as user home directory creation, snapshots, and ACLs that every Administrator needs to perform. It also addresses how to interact with HDFS and leverage it to the organization's advantage.||12 mins|
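A typical user-onboarding sequence might look like the sketch below. The user `alice` and group `hadoop` are placeholder names, and these commands require a running HDFS cluster with superuser rights, so they are illustrative only:

```shell
# Create a home directory for a new user and hand ownership to them.
# Run as (or via sudo to) the hdfs superuser.
sudo -u hdfs hdfs dfs -mkdir -p /user/alice
sudo -u hdfs hdfs dfs -chown alice:hadoop /user/alice
sudo -u hdfs hdfs dfs -chmod 750 /user/alice

# Verify the result.
hdfs dfs -ls /user
```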
|9.2 HDFS Operations- Creation of ACLs||The second video illustrates the steps needed to create and restore snapshots and to create Access Control Lists (ACLs) on HDFS. We demonstrate how to create snapshots and restore data in the face of a DataNode failure. Provisioning permissions and adjusting them to match organizational rules is also covered.||21 mins|
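The ACL and snapshot operations described above might be sketched as follows; the paths, file name, and user name are placeholders, and a running HDFS cluster is required:

```shell
# Grant read/execute on a shared dataset to user "bob" via an ACL entry.
hdfs dfs -setfacl -m user:bob:r-x /data/shared
hdfs dfs -getfacl /data/shared        # inspect the resulting ACL

# Snapshots: the directory must first be marked snapshottable (admin op),
# then point-in-time snapshots can be created and restored from.
hdfs dfsadmin -allowSnapshot /data/shared
hdfs dfs -createSnapshot /data/shared before-cleanup

# Recover a deleted file from the read-only .snapshot directory:
hdfs dfs -cp /data/shared/.snapshot/before-cleanup/file.csv /data/shared/
```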
|10. High Availability-NameNode||How do you save your organization in the face of an emergency? This section answers questions about the contingency and failover mechanisms that Apache Hadoop has in place. It also describes how things have changed in the latest Hadoop distributions compared with previous ones.||10 mins|
What is Spark Developer Certification (HDPCD)?
The HDPCD Spark Developer Certification is a hands-on, performance-based certification for Apache Spark developers on the Hortonworks Data Platform. Apache Spark is a fast, in-memory data computation engine with expressive APIs that facilitate Data Science, Machine Learning, streaming applications, and iterative data access. It is an extremely sought-after technology, currently used by data-driven companies such as Samsung, TripAdvisor, Yahoo!, eBay, and many others. HDPCD Spark Certified Developers have an edge because examinees perform a specific number of tasks on a live installation platform provided by Hortonworks rather than simply answering questions. Memorizing and reciting concepts by heart doesn't work with HDPCD: these are developers who get work done, and the world sees them in a different light altogether.
Because Spark has such a wide application base, Hortonworks recognizes that covering every capability with live-cluster tasks would be daunting. Hence, they mandate that aspirants work only on Spark Core and Spark SQL applications before appearing for this certification. Developers may choose Scala and/or Python as the programming language for their applications. This Whizlabs course recommends Scala as the preferred analytics language due to its simple, LINQ-like syntax.
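An exam application is typically packaged as a jar and launched with `spark-submit`. The class name, jar, resource settings, and HDFS paths below are hypothetical, and a live YARN cluster is required:

```shell
# Submit a (hypothetical) Scala Spark application to a YARN cluster.
spark-submit \
  --class com.example.WordCount \
  --master yarn \
  --deploy-mode client \
  --executor-memory 2g \
  --num-executors 4 \
  wordcount_2.11-0.1.jar \
  hdfs:///data/input hdfs:///data/output
```

For interactive Spark SQL practice, the same cluster can be explored from `spark-shell` (Scala) or `pyspark` (Python).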
Are There Multiple Choice Questions in the Spark Developer (HDPCD) Certification Exam?
No, there are no MCQs; instead, a live, performance-based test is conducted to gauge the application of concepts. Usually, 7-8 tasks are provided, of which a candidate must perform at least 6. The exam lasts 2 hours and costs 250 USD per attempt.
How to Register for HDPCD Spark Developer Certification exam?
Note that the exam vouchers are valid for 1 year from the date of purchase.
What is the validity of the HDPCD Spark Developer Certification?
The HDPCD Spark Certification is valid for a particular version of Spark. For example, if you took the exam on Spark v2.2 (current), your certification remains valid as long as Spark v2.2 is in use.
What are the prerequisites for the HDPCD Spark Developer Exam?
Does Whizlabs Offer any Subscription Plan?
Yes, we offer the Annual Subscription in which you get all the Whizlabs training courses worth $2500+ at $99 only. With Whizlabs subscription, you will get unlimited access to all the courses and hands-on labs with Premium Support for one year.
Do you provide a course completion certificate?
Yes, we provide a course completion certificate for online training courses. Once you have watched 100% of a video course or completed an online course, you receive a course completion certificate signed by our CEO.
How long is the license valid after the purchase?
Our simulators and video courses have a lifetime license/validity. Once you have purchased, you can access it for a lifetime.
Who should go for HDPCD Spark Developer Certification?
This certification is open to all; aspirants who wish to make Data Science their career path should pursue it. Going by usual trends, both Analysts and Developers appear for this certification.
Do you have any preparation guidance for this certification exam?
Yes. We write frequently about certification preparation tips on our blog. Here's our guide on how to prepare for the Spark Developer Certification (HDPCD) Exam.
Apart from mock exams/video courses, is there any further assistance I can get from Whizlabs?
Yes, you will get full support for any query related to the certification while preparing through our mock exams/video courses. Your query will be handled by the certified SME (Subject Matter Expert) & response will be provided in due course.
Do you offer a Money-Back Guarantee for Whizlabs training courses?
Yes, we offer a 100% unconditional money-back guarantee for our training courses. If you don’t clear the exam for any reason, you can apply for a full refund.
Please note that we only refund the amount paid for the Whizlabs training course, not the certification exam fee. For more details, please check our Refund Policy.
Do you provide any discount on the bulk purchase?
Yes, you can avail a discount of up to 50% on the purchase of more than 10 products at a time. For more details, please feel free to write to us here. A member of our support staff will respond as soon as possible.
What are the payment gateways you provide?
We accept payments through different gateways like CCAvenue, Stripe, etc.
After purchasing the Training Course from Whizlabs, how can I clarify my technical questions?
We have a dedicated team of subject-matter experts (SME) who will answer all of your queries that are submitted through our Learning Management System (LMS) interface. You will receive responses within 24 hours of the submission of the question.
What if I have more queries?
If you have any queries related to the Certification exam, Whizlabs training courses, payments, etc. please feel free to contact us. A member of our support staff will respond to you as soon as possible.