{"id":66879,"date":"2018-07-31T05:05:48","date_gmt":"2018-07-31T05:05:48","guid":{"rendered":"https:\/\/www.whizlabs.com\/blog\/?p=66879"},"modified":"2018-07-31T05:05:48","modified_gmt":"2018-07-31T05:05:48","slug":"apache-hadoop-in-cloud","status":"publish","type":"post","link":"https:\/\/www.whizlabs.com\/blog\/apache-hadoop-in-cloud\/","title":{"rendered":"How to Enable Apache Hadoop in Cloud?"},"content":{"rendered":"<p class=\"p5\" style=\"text-align: justify;\"><span class=\"s1\">Big Data and Cloud Computing combination is the latest trend nowadays, and Hadoop is another name for Big Data. So, it&#8217;s the right time to understand the enabling of Apache Hadoop in Cloud Computing.<\/span><\/p>\n<p class=\"p5\" style=\"text-align: justify;\"><span class=\"s1\">Hadoop has spawned the foundation for many other big data technologies and tools. When industries need virtually unlimited scalability for large data, Hadoop supports a diverse range of workload types. However, with the exponential growth of big data, storage cost and maintainability is a prime question considering the budget of the companies.<\/span><\/p>\n<blockquote><p>Are you a fresher who wants to start a career in Big Data Hadoop? Read our previous blog on how to start\u00a0<a href=\"https:\/\/www.whizlabs.com\/blog\/learning-hadoop-for-beginners\/\" target=\"_blank\" rel=\"noopener\">Learning Hadoop for Beginners.<\/a><\/p><\/blockquote>\n<p class=\"p6\" style=\"text-align: justify;\"><span class=\"s1\">Hence, Apache Hadoop in Cloud computing is the latest trend that today\u2019s industry follow. 
Big data processing on cloud platforms is especially effective when an enterprise wants to run analytic jobs quickly and then shut the cluster down to save cost.<\/span><\/p>\n<p class=\"p5\" style=\"text-align: justify;\"><span class=\"s1\">In this blog, we will discuss how to enable Apache Hadoop in a Cloud computing instance.<\/span><\/p>\n<h2 class=\"p7\" style=\"text-align: justify;\"><span class=\"ez-toc-section\" id=\"Key_considerations_for_Enabling_Apache_Hadoop_in_Cloud_Computing_Environment\"><\/span><span class=\"s1\">Key considerations for Enabling Apache Hadoop in Cloud Computing Environment<\/span><span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p class=\"p8\" style=\"text-align: justify;\"><span class=\"s1\">Before you configure 
the Apache Hadoop environment in the cloud, the following points should be considered:<\/span><\/p>\n<ul class=\"ul1\" style=\"text-align: justify;\">\n<li class=\"li9\"><span class=\"s4\">Security in the public cloud is a concern for Apache Hadoop cloud deployment. Hence, every enterprise must evaluate the security criteria before moving Hadoop cluster data, as Hadoop itself provides very limited security.<\/span><\/li>\n<li class=\"li9\"><span class=\"s4\">The main purpose of Apache Hadoop is data analysis. Hence, Apache Hadoop in cloud computing deployment must support the tools associated with the Hadoop ecosystem, especially analytics and data visualization tools.<\/span><\/li>\n<li class=\"li9\"><span class=\"s4\">Data transmission to a cloud is chargeable. Hence, the location from which data is loaded to the cloud is an important factor. The cost differs depending on whether the data is loaded from an internal system that is not on the cloud or is already in the cloud.<\/span><\/li>\n<\/ul>\n<h2 class=\"p11\" style=\"text-align: justify;\"><span class=\"ez-toc-section\" id=\"How_to_Configure_Apache_Hadoop_Environment_in_the_Cloud\"><\/span><span class=\"s6\">How to C<\/span><span class=\"s4\">onfigure Apache Hadoop Environment in the Cloud?<\/span><span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p class=\"p12\" style=\"text-align: justify;\"><span class=\"s7\">To start with <\/span><span class=\"s4\">Apache Hadoop cloud configuration, you must have a Linux platform installed in the cloud. Here we will discuss how to configure a single-node Apache Hadoop cluster in the cloud in pseudo-distributed mode. 
We have considered <a href=\"https:\/\/www.whizlabs.com\/blog\/aws-csaa-ec2\/\" target=\"_blank\" rel=\"noopener\"><b>AWS EC2<\/b><\/a> as the cloud environment here.<\/span><\/p>\n<h4 class=\"p1\" style=\"text-align: justify;\"><span class=\"s1\">Pre-requisites to configure the Apache Hadoop environment in the cloud<\/span><\/h4>\n<ul class=\"ul1\" style=\"text-align: justify;\">\n<li class=\"li12\"><span class=\"s4\">An AWS account in an active state\u00a0<\/span><\/li>\n<li class=\"li12\"><span class=\"s4\">Available private and public keys for the EC2 instance<\/span><\/li>\n<li class=\"li12\"><span class=\"s4\">A running Linux instance<\/span><\/li>\n<li class=\"li12\"><span class=\"s4\">PuTTY installed and set up<\/span><\/li>\n<\/ul>\n<blockquote><p>Want to enhance your enterprise capabilities? Start using <a href=\"https:\/\/www.whizlabs.com\/blog\/big-data-and-cloud-computing\/\" target=\"_blank\" rel=\"noopener\">Big Data and Cloud Computing<\/a> together; they make a perfect combination.<\/p><\/blockquote>\n<p class=\"p14\" style=\"text-align: justify;\"><span class=\"s1\">Next, to enable Apache Hadoop in a Cloud computing environment, the following steps need to be executed in two phases.<\/span><\/p>\n<h3 class=\"p16\" style=\"text-align: justify;\"><span class=\"ez-toc-section\" id=\"Phase1_Connect_to_EC2_instance_using_PuTTy\"><\/span><span class=\"s1\">Phase 1: Connect to EC2 instance using PuTTy<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<h5 class=\"p16\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 1: Generate the private key for PuTTy<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">As you need to connect the EC2 instance with PuTTY to configure the Apache Hadoop environment, you need a private key in PuTTY\u2019s own format, since PuTTY does not directly accept the AWS private key format (.pem). 
Using tools like PuTTYgen, we can convert the .pem format into the .ppk format that PuTTY supports. Once the PuTTY private key is generated, we can connect to the EC2 instance using an SSH client.<\/span><\/p>\n<h5 class=\"p18\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 2: Start PuTTy session for EC2 instance<\/i><\/span><\/h5>\n<p class=\"p19\" style=\"text-align: justify;\"><span class=\"s1\">Once you start your PuTTy session to connect with EC2, you need to authenticate the connection first. To do so, select Connection in the category panel and expand SSH. Next, select Auth, browse for the .ppk file, and open it.<\/span><\/p>\n<h5 class=\"p18\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 3: Provide the permission<\/i><\/span><\/h5>\n<p class=\"p19\" style=\"text-align: justify;\"><span class=\"s1\">On first use of the instance, it will ask for permission. Provide the login name as <b>ec2-user<\/b>. Once you press enter, it will start your session. 
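As a rough sketch, the Phase 1 key handling can also be done from a Linux terminal. This assumes the putty-tools package is available; the key file names and host address are illustrative placeholders, not values from this article:

```shell
# Convert the AWS .pem key into PuTTY's .ppk format (puttygen ships with putty-tools).
puttygen my-aws-key.pem -O private -o my-aws-key.ppk

# Alternatively, a plain OpenSSH client can use the original .pem key directly:
chmod 400 my-aws-key.pem
ssh -i my-aws-key.pem ec2-user@<your-ec2-public-dns>
```

The GUI steps in PuTTY (Connection > SSH > Auth, browse for the .ppk) achieve the same authentication as the `ssh -i` invocation above.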
<\/span><\/p>\n<h3 class=\"p21\" style=\"text-align: justify;\"><span class=\"ez-toc-section\" id=\"Phase_2_Configuring_Apache_Hadoop_in_cloud_computing\"><\/span><span class=\"s8\">Phase 2:<\/span><span class=\"s1\"> Configuring Apache Hadoop in cloud computing<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p class=\"p22\" style=\"text-align: justify;\"><span class=\"s9\">Before you proceed to <\/span><span class=\"s4\">configure Apache Hadoop environment in EC2 instance make sure you have downloaded the following software beforehand:<\/span><\/p>\n<ol class=\"ol1\" style=\"text-align: justify;\">\n<li class=\"li23\"><span class=\"s4\">\u00a0Java Package<\/span><\/li>\n<li class=\"li23\"><span class=\"s4\">\u00a0Hadoop Package<\/span><\/li>\n<\/ol>\n<p class=\"p25\" style=\"text-align: justify;\"><span class=\"s1\">Next, follow the below steps:<\/span><\/p>\n<h5 class=\"p26\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 1: Create a Hadoop User on EC2 instance<\/i><\/span><\/h5>\n<p class=\"p25\" style=\"text-align: justify;\"><span class=\"s1\">You need to add a new Hadoop user in your EC2 instance. To do so, you need to have root access. 
It can be obtained by using the following command:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66935 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/1.jpg\" alt=\"Creating Hadoop User on EC2\" width=\"297\" height=\"44\" \/><\/p>\n<p class=\"p25\" style=\"text-align: justify;\"><span class=\"s1\">Once you have root access, you can create a new user by using the below command:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66936 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/2.jpg\" alt=\"Creating Hadoop User on EC2\" width=\"318\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/2.jpg 318w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/2-300x70.jpg 300w\" sizes=\"(max-width: 318px) 100vw, 318px\" \/><\/p>\n<p class=\"p25\" style=\"text-align: justify;\"><span class=\"s1\">Now add a password for the new user:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66937 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/3.jpg\" alt=\"Creating Hadoop User on EC2\" width=\"318\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/3.jpg 318w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/3-300x70.jpg 300w\" sizes=\"(max-width: 318px) 100vw, 318px\" \/><\/p>\n<p class=\"p25\" style=\"text-align: justify;\"><span class=\"s1\">Next, to make the new user a sudo user, type <b><i>visudo<\/i><\/b> and add an entry for the user whizlabs to the visudo file below the line <\/span><span class=\"s11\"><b>&#8220;Allow root to run any commands anywhere.&#8221;<\/b><\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66938 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/4.jpg\" alt=\"Creating Hadoop User on EC2\" 
width=\"318\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/4.jpg 318w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/4-300x70.jpg 300w\" sizes=\"(max-width: 318px) 100vw, 318px\" \/><\/p>\n<h5 class=\"p28\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 2: Exit from the root <\/i><\/span><\/h5>\n<p class=\"p28\" style=\"text-align: justify;\"><span class=\"s1\">Go back to the EC2 user and log in to the new user by entering the below commands<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66939 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/5.jpg\" alt=\"Creating Hadoop User on EC2\" width=\"317\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/5.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/5-300x70.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66962 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/sudo-l.jpg\" alt=\"Creating Hadoop User on EC2\" width=\"317\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-l.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-l-300x70.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<h5 class=\"p26\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 3: Transfer the Hadoop and Java dump on the EC2 instance<\/i><\/span><\/h5>\n<p class=\"p25\" style=\"text-align: justify;\"><span class=\"s1\">Java should be installed before you install Hadoop on EC2 instance. Hence, you need to copy the downloaded zip versions of Hadoop and Java from Windows machine to EC2 instance using file transfer tools like WinSCP or FileZilla. 
To copy the files through these tools \u2013<\/span><\/p>\n<ol class=\"ol1\" style=\"text-align: justify;\">\n<li class=\"li23\"><span class=\"s4\">Launch the tool.<\/span><\/li>\n<li class=\"li23\"><span class=\"s4\">Enter the hostname and username of the EC2 instance and port number. Here the default port number will be 22.<\/span><\/li>\n<li class=\"li23\"><span class=\"s4\">Expand the ssh category and authenticate it using the PuTTy private key (.ppk) file.<\/span><\/li>\n<li class=\"li23\"><span class=\"s4\">Once you are logged in to the EC2 instance locate the Hadoop and Java .zip files from your local system and transfer them.<\/span><\/li>\n<li class=\"li23\"><span class=\"s4\">Once you get the files available into the instance use the command <i>ls<\/i> to confirm their availability.<\/span><\/li>\n<li class=\"li23\"><span class=\"s4\">You need to install Hadoop and Java using the new user you have created for Hadoop. Hence copy the files with <i>the whizlabs<\/i> user through below commands \u2013<\/span><\/li>\n<\/ol>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66941 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/7.jpg\" alt=\"Transfer Hadoop and Java dump on EC2\" width=\"317\" height=\"103\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/7.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/7-300x97.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66942 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/8.jpg\" alt=\"Transfer Hadoop and Java dump on EC2\" width=\"317\" height=\"103\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/8.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/8-300x97.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<ol class=\"ol1\" style=\"text-align: 
justify;\">\n<li class=\"li23\"><span class=\"s4\">Next, unzip the files using the below commands<\/span><\/li>\n<\/ol>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66943 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/9.jpg\" alt=\"Transfer Hadoop and Java dump on EC2\" width=\"317\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/9.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/9-300x70.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66944 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/10.jpg\" alt=\"Transfer Hadoop and Java dump on EC2\" width=\"317\" height=\"103\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/10.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/10-300x97.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<h5 class=\"p30\" style=\"text-align: justify;\"><span class=\"s12\"><i>Step 4:<\/i><\/span><span class=\"s4\"><i> Configure Apache Hadoop environment<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">To set the necessary environment variables for Hadoop and Java, you need to update .<b><i>bashrc<\/i><\/b> file in the Linux. 
From \/home\/whizlabs, which is the home directory here, type the below command<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66945 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/11.jpg\" alt=\"Configure Apache Hadoop Environment\" width=\"317\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/11.jpg 317w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/11-300x70.jpg 300w\" sizes=\"(max-width: 317px) 100vw, 317px\" \/><\/p>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">To make the environment variables take effect, use the command<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66947 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/13.jpg\" alt=\"Configure Apache Hadoop Environment\" width=\"295\" height=\"45\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 5: Create NameNode and DataNode directories<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Use the below commands for the NameNode and DataNode storage locations:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66948 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/14.jpg\" alt=\"Create Namenode Directory\" width=\"416\" height=\"45\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/14.jpg 416w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/14-300x32.jpg 300w\" sizes=\"(max-width: 416px) 100vw, 416px\" \/><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66949 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/15.jpg\" alt=\"Create Datanode Directory\" width=\"416\" height=\"45\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/15.jpg 416w, 
https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/15-300x32.jpg 300w\" sizes=\"(max-width: 416px) 100vw, 416px\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 6: Modify the directory permissions to 755<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Use the below commands<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66950 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/16.jpg\" alt=\"Modify Namenode Directory\" width=\"431\" height=\"45\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/16.jpg 431w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/16-300x31.jpg 300w\" sizes=\"(max-width: 431px) 100vw, 431px\" \/><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66951 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/17.jpg\" alt=\"Modify Datanode Directory\" width=\"417\" height=\"45\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/17.jpg 417w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/17-300x32.jpg 300w\" sizes=\"(max-width: 417px) 100vw, 417px\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 7: Change the directory location to the Hadoop installation directory<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Use the below command<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66952 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/18.jpg\" alt=\"Change Directory Location\" width=\"544\" height=\"45\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/18.jpg 544w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/18-300x25.jpg 300w, 
https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/18-537x45.jpg 537w\" sizes=\"(max-width: 544px) 100vw, 544px\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 8: Add the Hadoop home and Java Home path in Hadoop-env.sh<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Use the below command to open hadoop-env.sh file<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66963 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/sudo.jpg\" alt=\"Change Directory Location\" width=\"416\" height=\"45\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo.jpg 416w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-300x32.jpg 300w\" sizes=\"(max-width: 416px) 100vw, 416px\" \/><\/p>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">[Update Java classpath and version as per your installed version].<\/span><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 9: Update Hadoop configuration details<\/i><\/span><\/h5>\n<p class=\"p5\" style=\"text-align: justify;\"><span class=\"s1\">Add the configuration properties in the below-mentioned files:<\/span><\/p>\n<p class=\"p33\" style=\"text-align: justify;\"><span class=\"s13\"><b>core-site.xml file<\/b><\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66953 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/19.jpg\" alt=\"Update Hadoop Configuration Details\" width=\"544\" height=\"190\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/19.jpg 544w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/19-300x105.jpg 300w\" sizes=\"(max-width: 544px) 100vw, 544px\" \/><\/p>\n<p class=\"p21\" style=\"text-align: justify;\"><span class=\"s13\"><b>hdfs-site.xml file<\/b><\/span><\/p>\n<p><img 
decoding=\"async\" class=\"alignnone wp-image-66954 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/20.jpg\" alt=\"Update Hadoop Configuration Details\" width=\"544\" height=\"481\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/20.jpg 544w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/20-300x265.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/20-475x420.jpg 475w\" sizes=\"(max-width: 544px) 100vw, 544px\" \/><\/p>\n<p class=\"p33\" style=\"text-align: justify;\"><span class=\"s13\"><b>yarn-site.xml<\/b><\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66955 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/21.jpg\" alt=\"Update Hadoop Configuration Details\" width=\"725\" height=\"336\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/21.jpg 725w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/21-300x139.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/21-640x297.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/21-681x316.jpg 681w\" sizes=\"(max-width: 725px) 100vw, 725px\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 10: Change the configuration properties of mapred-site.xml<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Using the below command copy the content of mapred-site.xml template into mapred-site.xml<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66961 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/sudo-mapred.jpg\" alt=\"Change Configuration Properties\" width=\"725\" height=\"74\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-mapred.jpg 725w, 
https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-mapred-300x31.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-mapred-640x65.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/sudo-mapred-681x70.jpg 681w\" sizes=\"(max-width: 725px) 100vw, 725px\" \/><\/p>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Add the below property in the file:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66956 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/22.jpg\" alt=\"Change Configuration Properties\" width=\"725\" height=\"220\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/22.jpg 725w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/22-300x91.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/22-640x194.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/22-681x207.jpg 681w\" sizes=\"(max-width: 725px) 100vw, 725px\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 11: ssh key generation for the Hadoop user<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Follow the below steps for ssh key generations:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66957 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/23.jpg\" alt=\"ssh Key Generation for Hadoop User\" width=\"725\" height=\"162\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/23.jpg 725w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/23-300x67.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/23-640x143.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/23-681x152.jpg 681w\" sizes=\"(max-width: 725px) 100vw, 725px\" \/><\/p>\n<h5 class=\"p21\" 
style=\"text-align: justify;\"><span class=\"s1\"><i>Step 12: Format the Namenode<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">Before you start the daemons, you need to format the namenode and change its location to the Hadoop location using the below commands:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66958 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/24.jpg\" alt=\"Format the Namenode\" width=\"725\" height=\"133\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/24.jpg 725w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/24-300x55.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/24-640x117.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/24-681x125.jpg 681w\" sizes=\"(max-width: 725px) 100vw, 725px\" \/><\/p>\n<h5 class=\"p21\" style=\"text-align: justify;\"><span class=\"s1\"><i>Step 13: Start all the daemons in Hadoop<\/i><\/span><\/h5>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">To bring up Hadoop in the running state you need to start the daemons in the below orders:<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66959 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/25.jpg\" alt=\"Start Daemons in Hadoop\" width=\"725\" height=\"336\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/25.jpg 725w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/25-300x139.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/25-640x297.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/25-681x316.jpg 681w\" sizes=\"(max-width: 725px) 100vw, 725px\" \/><\/p>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">And you are all set with Apache Hadoop in the cloud.<\/span><\/p>\n<h2 
class=\"p36\" style=\"text-align: justify;\"><span class=\"ez-toc-section\" id=\"Benefits_of_Apache_Hadoop_in_Cloud_Computing\"><\/span><span class=\"s1\">Benefits of Apache Hadoop in Cloud Computing<\/span><span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">The primary benefit of Cloud computing is &#8211; it enables business agility for enterprises, data scientists, and developers with unlimited scalability. It provides effective cost measurement with no upfront hardware costs. Besides that, it is cost-effective as it follows a pay-as-you-go model.<\/span><\/p>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">Apache Hadoop in cloud computing leverages many benefits as follows:<\/span><\/p>\n<h4 class=\"p38\" style=\"text-align: justify;\"><span class=\"s1\">1. The scale of data analytics needs<\/span><\/h4>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">With the enhanced data analysis requirement within the enterprises,7 to expand the capacity of Hadoop clusters is the need of the hour. While setting up the hardware for this could take weeks or months, the deployment in the cloud takes a few days. As a result, overall data processing scales fast and meets the business needs. <\/span><\/p>\n<h4 class=\"p39\" style=\"text-align: justify;\"><span class=\"s1\">2. Lower cost for innovation<\/span><\/h4>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">To configure Apache Hadoop environment in the cloud is a low capacity investment which does not charge for any upfront hardware cost. Apache Hadoop in cloud computing is specifically an ideal solution for the startups with big data analytics.<\/span><\/p>\n<h4 class=\"p37\" style=\"text-align: justify;\">3. 
Pay only for what you need<\/h4>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">Apache Hadoop in the cloud is suitable for use cases where the requirement is to spin up a cluster, run the job to get the results, and then shut the system down. This is the flexible spending feature of the cloud, as you pay only for the compute and usage you actually consume.<\/span><\/p>\n<h4 class=\"p37\" style=\"text-align: justify;\"><span class=\"s4\">4. Use the optimum infrastructure for the job<\/span><\/h4>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">Not all big data processing needs the same hardware infrastructure in terms of memory, I\/O, compute resources, and so on. In cloud computing, you can select the instance type that provides the optimal infrastructure for the solution.<\/span><\/p>\n<p><img decoding=\"async\" class=\"alignnone wp-image-66960 size-full\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/sites\/2\/2018\/07\/iotbig-datacloud.jpg\" alt=\"Apache Hadoop in Cloud Computing\" width=\"886\" height=\"504\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/iotbig-datacloud.jpg 886w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/iotbig-datacloud-300x171.jpg 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/iotbig-datacloud-768x437.jpg 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/iotbig-datacloud-738x420.jpg 738w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/iotbig-datacloud-640x364.jpg 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/iotbig-datacloud-681x387.jpg 681w\" sizes=\"(max-width: 886px) 100vw, 886px\" \/><\/p>\n<h4 class=\"p37\" style=\"text-align: justify;\"><span class=\"s4\">5. 
Using cloud data as the source<\/span><\/h4>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">Nowadays, with the increasing usage of IoT and cloud, most enterprises prefer to store data in the cloud, making it the primary source of data. As big data is all about large volumes of data, Apache Hadoop in cloud computing makes a lot of sense<\/span><span class=\"s7\">.<\/span><\/p>\n<h4 class=\"p43\" style=\"text-align: justify;\"><span class=\"s4\">6. Makes your operations simple<\/span><\/h4>\n<p class=\"p37\" style=\"text-align: justify;\"><span class=\"s1\">Cloud computing provisions various types of Apache Hadoop clusters with different characteristics and configurations, each suitable for a particular set of jobs. This lowers an enterprise&#8217;s administrative burden of managing multiple clusters or implementing multi-tenant policies.<\/span><\/p>\n<h3 class=\"p17\" style=\"text-align: justify;\"><span class=\"ez-toc-section\" id=\"Final_Verdict\"><\/span><span class=\"s1\">Final Verdict<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p class=\"p17\" style=\"text-align: justify;\"><span class=\"s1\">To conclude, setting up Apache Hadoop in cloud computing requires hands-on skills in both Apache Hadoop and AWS. You must be comfortable with, and aware of, both technologies and their techniques. <\/span><\/p>\n<p class=\"p14\" style=\"text-align: justify;\"><span class=\"s1\">Whizlabs delivers both theoretical and hands-on knowledge through its <span class=\"s16\"><a href=\"https:\/\/www.whizlabs.com\/hdpca-certification\/\" target=\"_blank\" rel=\"noopener\">Hadoop certification training<\/a><\/span>\u00a0and <\/span><a href=\"https:\/\/www.whizlabs.com\/aws-certifications\/\" target=\"_blank\" rel=\"noopener\"><span class=\"s14\">Cloud Computing Certification training<\/span><\/a><span class=\"s1\">. 
Browse through our courses and build up your technical ground with us!<\/span><\/p>\n<p><strong><em>Have any questions? Just mention them in the comment section below or submit them<a href=\"https:\/\/help.whizlabs.com\/hc\/en-us\/requests\/new\" target=\"_blank\" rel=\"noopener\"> here,<\/a>\u00a0and we&#8217;ll be happy to respond.<\/em><\/strong><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Big Data and Cloud Computing combination is the latest trend nowadays, and Hadoop is another name for Big Data. So, it&#8217;s the right time to understand the enabling of Apache Hadoop in Cloud Computing. Hadoop has spawned the foundation for many other big data technologies and tools. When industries need virtually unlimited scalability for large data, Hadoop supports a diverse range of workload types. However, with the exponential growth of big data, storage cost and maintainability is a prime question considering the budget of the companies. Are you a fresher who wants to start a career in Big Data Hadoop? 
[&hellip;]<\/p>\n","protected":false},"author":220,"featured_media":66933,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[6],"tags":[144,663],"class_list":["post-66879","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-big-data","tag-apache-hadoop-cloud","tag-configure-apache-hadoop-environment"],"uagb_featured_image_src":{"full":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",600,315,false],"thumbnail":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud-150x150.png",150,150,true],"medium":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud-300x158.png",300,158,true],"medium_large":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",600,315,false],"large":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",600,315,false],"1536x1536":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",600,315,false],"2048x2048":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",600,315,false],"profile_24":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",24,13,false],"profile_48":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",48,25,false],"profile_96":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",96,50,false],"profile_150":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",150,79,false],"profile_300":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",300,158,false],"tptn_thumbnail":["https:\/\/www.whizlabs.com\/blog\/wp-content
\/uploads\/2018\/07\/apache-hadoop-in-cloud-250x250.png",250,250,true],"web-stories-poster-portrait":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",600,315,false],"web-stories-publisher-logo":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",96,50,false],"web-stories-thumbnail":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2018\/07\/apache-hadoop-in-cloud.png",150,79,false]},"uagb_author_info":{"display_name":"Aditi Malhotra","author_link":"https:\/\/www.whizlabs.com\/blog\/author\/aditi\/"},"uagb_comment_info":0,"uagb_excerpt":"Big Data and Cloud Computing combination is the latest trend nowadays, and Hadoop is another name for Big Data. So, it&#8217;s the right time to understand the enabling of Apache Hadoop in Cloud Computing. Hadoop has spawned the foundation for many other big data technologies and tools. When industries need virtually unlimited scalability for large&hellip;","_links":{"self":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts\/66879","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/users\/220"}],"replies":[{"embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/comments?post=66879"}],"version-history":[{"count":0,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts\/66879\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/media\/66933"}],"wp:attachment":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/media?parent=66879"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/categories?post=66879"},{"taxonomy":"post_tag","embedd
able":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/tags?post=66879"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}