{"id":84141,"date":"2022-08-29T23:29:09","date_gmt":"2022-08-30T04:59:09","guid":{"rendered":"https:\/\/www.whizlabs.com\/blog\/?p=84141"},"modified":"2023-06-23T00:51:50","modified_gmt":"2023-06-23T06:21:50","slug":"snowpro-advanced-architect-exam-questions","status":"publish","type":"post","link":"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/","title":{"rendered":"30+ Free Question on Snowflake Snowpro Advanced Architect Certification"},"content":{"rendered":"<p>Are you looking for free questions and answers to prepare for the <a href=\"https:\/\/www.whizlabs.com\/snowflake-snowpro-advanced-architect-certification\/\" target=\"_blank\" rel=\"noopener\">Snowflake Snowpro Advanced Architect exam?<\/a><\/p>\n<p>Here are our\u00a0<strong>newly updated 30+ Free questions<\/strong> on the Snowflake Snowpro Advanced Architect exam which are very similar to the practice test as well as the real exam.<\/p>\n<div id=\"ez-toc-container\" class=\"ez-toc-v2_0_76 ez-toc-wrap-left counter-hierarchy ez-toc-counter ez-toc-custom ez-toc-container-direction\">\n<div class=\"ez-toc-title-container\">\n<p class=\"ez-toc-title\" style=\"cursor:inherit\">Table of Contents<\/p>\n<span class=\"ez-toc-title-toggle\"><a href=\"#\" class=\"ez-toc-pull-right ez-toc-btn ez-toc-btn-xs ez-toc-btn-default ez-toc-toggle\" aria-label=\"Toggle Table of Content\"><span class=\"ez-toc-js-icon-con\"><span class=\"\"><span class=\"eztoc-hide\" style=\"display:none;\">Toggle<\/span><span class=\"ez-toc-icon-toggle-span\"><svg style=\"fill: #ea7e02;color:#ea7e02\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" class=\"list-377408\" width=\"20px\" height=\"20px\" viewBox=\"0 0 24 24\" fill=\"none\"><path d=\"M6 6H4v2h2V6zm14 0H8v2h12V6zM4 11h2v2H4v-2zm16 0H8v2h12v-2zM4 16h2v2H4v-2zm16 0H8v2h12v-2z\" fill=\"currentColor\"><\/path><\/svg><svg style=\"fill: #ea7e02;color:#ea7e02\" class=\"arrow-unsorted-368013\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" width=\"10px\" height=\"10px\" 
viewBox=\"0 0 24 24\" version=\"1.2\" baseProfile=\"tiny\"><path d=\"M18.2 9.3l-6.2-6.3-6.2 6.3c-.2.2-.3.4-.3.7s.1.5.3.7c.2.2.4.3.7.3h11c.3 0 .5-.1.7-.3.2-.2.3-.5.3-.7s-.1-.5-.3-.7zM5.8 14.7l6.2 6.3 6.2-6.3c.2-.2.3-.5.3-.7s-.1-.5-.3-.7c-.2-.2-.4-.3-.7-.3h-11c-.3 0-.5.1-.7.3-.2.2-.3.5-.3.7s.1.5.3.7z\"\/><\/svg><\/span><\/span><\/span><\/a><\/span><\/div>\n<nav><ul class='ez-toc-list ez-toc-list-level-1 ' ><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-1\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Why_do_we_provide_Snowflake_Snowpro_Advanced_Architect_exam_questions_for_free\" >Why do we provide Snowflake Snowpro Advanced Architect exam questions for free?<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-2\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Snowflake_Snowpro_Advanced_Architect_Exam_Questions\" >Snowflake Snowpro Advanced Architect Exam Questions<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-3\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Snowflake_Snowpro_Advanced_Architect_exam_questions\" >Snowflake Snowpro Advanced Architect exam questions<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-4\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-5\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#%E2%80%8BDomain_Snowflake_Architecture\" >\u200bDomain : Snowflake Architecture\u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0<\/a><\/li><li 
class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-6\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Account_and_Security\" >Domain : Account and Security<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-7\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-2\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-8\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Account_and_Security-2\" >Domain : Account and Security<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-9\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Data_Engineering\" >Domain : Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-10\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Data_Engineering-2\" >Domain : Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-11\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-3\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-12\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-4\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-13\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Account_and_Security-3\" >Domain : Account and Security<\/a><\/li><li 
class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-14\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-5\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-15\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-6\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-16\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-7\" >Domain : Snowflake Architecture\u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-17\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Account_and_Security-4\" >Domain : Account and Security<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-18\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Account_and_Security-5\" >Domain : Account and Security<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-19\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Data_Engineering-3\" >Domain : Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-20\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Performance_Optimization\" >Domain : Performance Optimization<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-21\" 
href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Performance_Optimization-2\" >Domain : Performance Optimization<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-22\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Performance_Optimization-3\" >Domain : Performance Optimization<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-23\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Accounts_and_Security\" >Domain : Accounts and Security<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-24\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Data_Engineering-4\" >Domain : Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-25\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-8\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-26\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-9\" >Domain : Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-27\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Data_Engineering-5\" >Domain : Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-28\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Performance_Optimization-4\" >Domain : Performance Optimization<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link 
ez-toc-heading-29\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering\" >\u200bDomain\u200b \u200b:\u200b Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-30\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering-2\" >\u200bDomain\u200b \u200b:\u200b Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-31\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering-3\" >\u200bDomain\u200b \u200b:\u200b Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-32\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering\" >Domain\u200b \u200b:\u200b Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-33\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Performance_and_Optimization\" >\u200bDomain\u200b \u200b:\u200b Performance and Optimization<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-34\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_%E2%80%8B_%E2%80%8B_Data_Engineering\" >Domain \u200b:\u200b Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-35\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-10\" >Domain: Snowflake Architecture<\/a><\/li><li 
class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-36\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Snowflake_Architecture-11\" >Domain: Snowflake Architecture<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-37\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Data_Engineering-6\" >Domain: Data Engineering<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-38\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Domain_Performance_Optimization-5\" >Domain: Performance Optimization<\/a><\/li><li class='ez-toc-page-1 ez-toc-heading-level-3'><a class=\"ez-toc-link ez-toc-heading-39\" href=\"https:\/\/www.whizlabs.com\/blog\/snowpro-advanced-architect-exam-questions\/#Summary\" >Summary<\/a><\/li><\/ul><\/nav><\/div>\n<h3><span class=\"ez-toc-section\" id=\"Why_do_we_provide_Snowflake_Snowpro_Advanced_Architect_exam_questions_for_free\"><\/span>Why do we provide Snowflake Snowpro Advanced Architect exam questions for free?<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>Snowflake Snowpro Advanced Architect exams are designed to test and recognize your skills on\u00a0Snowflake architecture and can design and optimize Snowflake solutions for their organizations<\/p>\n<p>We are giving it for free to help you in passing the Snowflake Snowpro Advanced Architect exam just like your colleagues. 
It\u2019s a free takeaway from the Whizlabs team for Snowflake certification job seekers in 2022.<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Snowflake_Snowpro_Advanced_Architect_Exam_Questions\"><\/span>Snowflake Snowpro Advanced Architect Exam Questions<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>You can find many\u00a0video courses on the <a href=\"https:\/\/www.whizlabs.com\/blog\/snowflake-snowpro-advanced-architect-certification\/\" target=\"_blank\" rel=\"noopener\">Snowflake Snowpro Advanced Architect exam<\/a> to learn the exam objectives. Now it\u2019s time to test your hard-earned Snowflake Snowpro Advanced Architect skills by studying these exam simulator questions.<\/p>\n<p>Our Snowflake certified experts carefully curated these Snowpro simulator questions based on the latest syllabus, so they are highly relevant to the real exam. This list of free questions on the Snowflake Snowpro Advanced Architect exam can help you identify and close knowledge gaps.<\/p>\n<p>Once you have spent some time working through these Snowflake Snowpro Advanced Architect exam questions, you can face the real exam with more confidence and improve your chances of passing on the first attempt.<\/p>\n<p><em>Let\u2019s get started!<\/em><\/p>\n<h3 style=\"text-align: center;\"><span class=\"ez-toc-section\" id=\"Snowflake_Snowpro_Advanced_Architect_exam_questions\"><\/span><span style=\"font-weight: 400; color: #ff6600;\">Snowflake Snowpro Advanced Architect exam questions<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q1 : Which of the following statements are TRUE concerning a data consumer account in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 
<strong>A. <\/strong>All objects">
400;\"><strong>A. <\/strong>All objects in the shared database are always read-only for the consumer of the share<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>A consumer of a share does not necessarily need an account with Snowflake to be able to consume data<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>A consumer account must be in the same region and on the same cloud provider as the data provider account<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>A consumer needs to pay for the compute and storage of the shared data<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>A database \u2018share\u2019 can be \u2018imported\u2019 by at most 10 consumer accounts<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers: A and B<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A<\/b> <b>is correct<\/b><span style=\"font-weight: 400;\">. A consumer can access a share only in <\/span><b>read-only <\/b><span style=\"font-weight: 400;\">mode. The consumer cannot change information or modify\/drop any object in the shared database.<\/span><\/p>\n<p><b>Option B is correct. <\/b><span style=\"font-weight: 400;\">If a consumer of your share does not have a Snowflake account, Snowflake provides a feature called \u2018reader accounts\u2019, which are special read-only Snowflake accounts created within the provider account for the sole purpose of accessing a share.<\/span><\/p>\n<p><b>Option C is incorrect. <\/b><span style=\"font-weight: 400;\">Snowflake allows a consumer account to be located in any (supported) public cloud region\/cloud provider. Snowflake data replication functionality is used to securely share data in such scenarios. 
Cross-region data sharing is supported for Snowflake accounts hosted on AWS, MS-Azure, or GCP.<\/span><\/p>\n<p><b>Option D is incorrect.<\/b><span style=\"font-weight: 400;\"> A data consumer doesn\u2019t pay for the storage. The consumer only pays for the compute if the consumer is also a Snowflake account. If the consumer is not a Snowflake account, the provider pays for the compute used (by the reader account).<\/span><br \/>\n<b>Option E is incorrect.<\/b><span style=\"font-weight: 400;\"> Snowflake does not have restrictions on the number of shares a provider can create or the number of shares that can be imported by a consumer.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-sharing-intro.html#introduction-to-secure-data-sharing\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Introduction to Secure Data Sharing<\/span><\/a><span style=\"font-weight: 400;\"> \u2013 Snowflake Documentation<\/span><\/p>\n<h3><span class=\"ez-toc-section\" id=\"%E2%80%8BDomain_Snowflake_Architecture\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q2 : Which of the following statements are FALSE concerning a data consumer account in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>A single consumer account can contain objects from different providers<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>A consumer account can create clones of a shared database<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>A consumer account can perform time travel on a table within the shared database<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. 
<\/strong>A consumer account cannot forward (i.e. reshare) the shared databases and objects<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers: B and C<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<figure id=\"attachment_84496\" aria-describedby=\"caption-attachment-84496\" style=\"width: 913px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84496 size-full\" title=\"snowflake architecture\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture.webp\" alt=\"snowflake architecture\" width=\"913\" height=\"488\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture.webp 913w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture-300x160.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture-768x410.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture-786x420.webp 786w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture-640x342.webp 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture-681x364.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-architecture-150x80.webp 150w\" sizes=\"(max-width: 913px) 100vw, 913px\" \/><figcaption id=\"caption-attachment-84496\" class=\"wp-caption-text\">Image source: Snowflake documentation (https:\/\/docs.snowflake.com\/en\/user-guide\/data-sharing-intro.html)<\/figcaption><\/figure>\n<p><b>Option A<\/b> <b>is incorrect<\/b><span style=\"font-weight: 400;\"> as it is the true statement.\u00a0 A single consumer account can contain objects from different providers. Also, a provider account can share objects with different consumers. This architecture enables the creation of an interconnected network of data providers and consumers.<\/span><\/p>\n<p><b>Option B is correct. 
<\/b><span style=\"font-weight: 400;\">\u00a0Creating a clone of a shared database or any schemas\/tables in the database is not supported.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is correct<\/b><span style=\"font-weight: 400;\">.<\/span> <span style=\"font-weight: 400;\">\u00a0Time Travel on a shared database or any schemas\/tables in the database is not supported.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is incorrect <\/b><span style=\"font-weight: 400;\">as it is the true statement.\u00a0 A database created from an Inbound share cannot be shared further with other accounts. Remember that \u2018You cannot share a share\u2019.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-sharing-intro.html#introduction-to-secure-data-sharing\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Introduction to Secure Data Sharing<\/span><\/a><span style=\"font-weight: 400;\"> \u2013 Snowflake Documentation<\/span><\/p>\n<p>&nbsp;<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Account_and_Security\"><\/span><span style=\"font-weight: 400;\">Domain : Account and Security<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q3 : Which of the following objects will NOT be cloned when cloning a schema?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>A permanent Table<\/span><br \/>\n<strong>B. <\/strong>A temporary Table<br \/>\n<strong>C. <\/strong>A transient Table<br \/>\n<strong>D. <\/strong>An external table<br \/>\n<strong>E. <\/strong>Views<br \/>\n<strong>F. <\/strong>Internal (named) Stage<br \/>\n<span style=\"font-weight: 400;\"><strong>G. 
<\/strong>Stored Procedures<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers: B, D and F<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Cloning a schema clones all the contained objects in the schema, except the following object types:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Temporary Tables<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">External tables<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Internal (Snowflake) stages\u00a0<\/span><\/li>\n<\/ul>\n<p><b>Option B, D and F are appropriate choices<\/b><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/sql\/create-clone.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Create <\/span><i><span style=\"font-weight: 400;\">&lt;object&gt;<\/span><\/i><span style=\"font-weight: 400;\">\u2026CLONE \u2013 Snowflake Documentation<\/span><\/a><\/p>\n<p>&nbsp;<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-2\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q4 : Select appropriate database objects which can NOT be part of a direct share.<\/strong><\/p>\n<p><strong>A. <\/strong>Tables<br \/>\n<strong>B. <\/strong>External Tables<br \/>\n<strong>C. <\/strong>External Stages<br \/>\n<strong>D. <\/strong>Secure views<br \/>\n<strong>E. <\/strong>Secure Materialized views<br \/>\n<strong>F. 
<\/strong>Stored Procedure<\/p>\n<p><b>Correct\u200b \u200bAnswers: C and F<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The following Snowflake database objects can be included in a share:<\/span><\/p>\n<table>\n<tbody>\n<tr>\n<td><b>Tables<\/b><\/td>\n<td><b>Views<\/b><\/td>\n<td><b>Other Objects<\/b><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400;\">Table<\/span><br \/>\n<span style=\"font-weight: 400;\">External table<\/span><\/td>\n<td><span style=\"font-weight: 400;\">Secure view<\/span><br \/>\n<span style=\"font-weight: 400;\">Secure Materialized view<\/span><\/td>\n<td><span style=\"font-weight: 400;\">Secure UDF<\/span><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p><span style=\"font-weight: 400;\">No other database object (e.g. regular views, stored procedures, tasks, streams, stages etc.) can be included in a share. <\/span><b>Therefore Option C and Option F are appropriate answer choices.<\/b><\/p>\n<p><b>Reference: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-sharing-intro.html#introduction-to-secure-data-sharing\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Introduction to Secure Data Sharing<\/span><\/a><\/p>\n<p>&nbsp;<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Account_and_Security-2\"><\/span><span style=\"font-weight: 400;\">Domain : Account and Security<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q5 : To list all privileges and roles that a role named PYTHON_DEV_ROLE has, which of the following commands is the most appropriate one to use?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>show grants<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>show grants in role PYTHON_DEV_ROLE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>show grants of role PYTHON_DEV_ROLE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. 
<\/strong>show grants to role PYTHON_DEV_ROLE<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer: D<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">This SQL command lists all roles granted to the current user executing this command.<\/span><\/p>\n<p><b>Option B is incorrect. <\/b><span style=\"font-weight: 400;\">This SQL command is invalid and will result in the following compilation error.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">SQL compilation error: syntax error line &lt;m&gt; at position &lt;n&gt; unexpected &#8216;in&#8217;.<\/span><\/p>\n<p><b>Option C is incorrect. <\/b><span style=\"font-weight: 400;\">This SQL command lists all users and roles to which the role PYTHON_DEV_ROLE has been granted.<\/span><br \/>\n<b>Option D is correct. <\/b><span style=\"font-weight: 400;\">This SQL command lists all privileges and roles granted to role PYTHON_DEV_ROLE.<\/span><\/p>\n<blockquote><p><b>\u27a4 Exam Tip:<\/b><span style=\"font-weight: 400;\"> The SHOW GRANTS command (with no arguments) is equivalent to SHOW GRANTS TO USER &lt;current_user&gt;<\/span><\/p><\/blockquote>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/sql\/show-grants.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Show Grants \u2013 Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Data_Engineering\"><\/span><span style=\"font-weight: 400;\">Domain : Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><span style=\"font-weight: 400;\"><strong>Q6 : Which of the following programming languages are supported in Snowflake to write 
UDFs?<\/strong><\/span><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>SQL<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Java<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>.NET<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>C++<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>JavaScript<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>F. <\/strong>Python<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers: A, B, E and F<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Options A, B, E and F are correct.<\/b><span style=\"font-weight: 400;\"> UDFs allow developers to extend Snowflake to perform operations that are not natively available through built-in functions. 
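<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As a minimal illustration (the function name below is hypothetical, adapted from common documentation examples), a SQL UDF is created once and can then be invoked like any built-in function:<\/span><\/p>\n<pre>-- Sketch: a simple SQL UDF\ncreate or replace function area_of_circle(radius float)\n  returns float\n  as 'pi() * radius * radius';\n\n-- Invoked inline, like a built-in function\nselect area_of_circle(4.5);<\/pre>\n<p><span style=\"font-weight: 400;\">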
UDFs (together with Stored Procedures) enable database-level programming in Snowflake.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">At present, Snowflake supports four programming languages to write UDFs: SQL, Java, JavaScript, and Python.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/udf-overview.html#supported-programming-languages-for-creating-udfs\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Supported Programming Languages for Creating UDFs<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Data_Engineering-2\"><\/span><span style=\"font-weight: 400;\">Domain : Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q7 : In which of the following languages can you write a stored procedure in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>SQL<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Java<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Scala<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>JavaScript<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>Python<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>F. <\/strong>Snowflake Scripting<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers:<\/b> <b>B, C, D, E and F<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Stored procedures in Snowflake allow a developer to write business logic at the database level by using procedural programming methods. 
Stored procedures can be written in one of the following languages:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Java (using Snowpark)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">JavaScript<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Python (using Snowpark)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Scala (using Snowpark)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Snowflake Scripting<\/span><\/li>\n<\/ul>\n<p><b>Options B, C, D, E and F are correct.<\/b><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/stored-procedures-overview.html#overview-of-stored-procedures\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Overview of Stored Procedures<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-3\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q8 : Which REST API endpoints are supported by SnowPipe?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>SnowPipe does not support REST<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>\u2018insertFiles\u2019 endpoint. Provides a POST method. Informs Snowflake about the files to be ingested into a table<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>\u2018insertTuples\u2019 endpoint. Provides a POST method. Informs Snowflake about the tuples which need to be ingested from a specific file<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>\u2018insertReport\u2019 endpoint. Provides GET method. 
Fetches a report of ingested files via \u2018insertFiles\u2019 whose contents were recently added to a table<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>\u2018loadHistoryScan\u2019 endpoint. Provides GET method. Fetches a report of ingested files whose contents have been added to a table between two points in time<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers: B, D and E<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">SnowPipe provides public REST endpoints to load data and retrieve load history reports.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is correct. <\/b><span style=\"font-weight: 400;\">\u2018insertFiles\u2019 is a valid REST endpoint. This endpoint exposes a POST method. Calling this API informs Snowflake about the files to be ingested into a table. A successful response from this endpoint means that Snowflake has recorded the list of files to be added to the table.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is incorrect. <\/b><span style=\"font-weight: 400;\">\u2018insertTuples\u2019 is not a valid REST endpoint.\u00a0<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is correct. <\/b><span style=\"font-weight: 400;\">\u2018insertReport\u2019 is a valid REST endpoint. This endpoint exposes a GET method. Calling this API retrieves a report of the files whose content has been recently ingested into a table.\u00a0<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option E is correct. <\/b><span style=\"font-weight: 400;\">\u2018loadHistoryScan\u2019 is a valid REST endpoint. This endpoint exposes a GET method. Calling this API retrieves a report of the files whose content has been recently ingested into a table. 
The difference between \u2018insertReport\u2019<\/span> <span style=\"font-weight: 400;\">and \u2018loadHistoryScan\u2019 is that the latter retrieves the history between two points in time.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-load-snowpipe-rest-apis.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Snowflake REST API &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-4\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q9 : To access a database created from a share, the role must be granted which of the following privileges?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>GRANT REFERENCES PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>GRANT IMPORTED PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>GRANT USAGE PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>GRANT SELECT PRIVILEGE\u00a0<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>GRANT USAGE_REFERENCES PRIVILEGE<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer: B<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option B is correct.<\/b><span style=\"font-weight: 400;\"> On a database created from a share, one can only use GRANT IMPORTED PRIVILEGES and REVOKE IMPORTED PRIVILEGES to grant\/revoke access. No other privileges can be granted on a shared object.<\/span><\/p>\n<p><b>\u27a4 Practical Info: <\/b><span style=\"font-weight: 400;\">By default, all inbound shares are accessible to the ACCOUNTADMIN role. 
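<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As a minimal sketch, the typical consumer-side sequence looks like this (the share, database and role names are illustrative, not from the question):<\/span><\/p>

```sql
-- Run as ACCOUNTADMIN in the consumer account.
-- Names (sales_db, provider_acct.sales_share, analyst_role) are illustrative.
CREATE DATABASE sales_db FROM SHARE provider_acct.sales_share;

-- Only IMPORTED PRIVILEGES can be granted on a database created from a share.
GRANT IMPORTED PRIVILEGES ON DATABASE sales_db TO ROLE analyst_role;
```

<p><span style=\"font-weight: 400;\">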
The ACCOUNTADMIN role will need to create a database from the share and grant access to the users using GRANT IMPORTED PRIVILEGES.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-share-consumers.html#granting-privileges-on-a-shared-database\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Granting Privileges on Shared Database<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Account_and_Security-3\"><\/span><span style=\"font-weight: 400;\">Domain : Account and Security<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q10 : When a database is cloned, which privileges of the (original) database are replicated in the cloned database?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>ALL privileges associated with the original database and the child objects<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Child object-level privileges only<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Database level privileges only<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>Future privileges only<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer\u200b: B<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">When cloning a database or a schema, the cloned object replicates <\/span><b><i>all<\/i><\/b><span style=\"font-weight: 400;\"> granted privileges on the child objects. <\/span><b>Option B is correct.<\/b><\/p>\n<p><span style=\"font-weight: 400;\">For example, assume a database D1 which has a schema S1 and a table T1. A role R1 has usage privilege on D1 and S1 and selects privilege on T1.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Now the database D1 is cloned to create a new (cloned) database object D1_CLONE. 
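<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This scenario can be sketched as follows (a minimal illustration, using the object names from the example):<\/span><\/p>

```sql
-- Grants in place before cloning (role R1, database D1, schema S1, table T1).
GRANT USAGE  ON DATABASE d1       TO ROLE r1;
GRANT USAGE  ON SCHEMA   d1.s1    TO ROLE r1;
GRANT SELECT ON TABLE    d1.s1.t1 TO ROLE r1;

-- Clone the database.
CREATE DATABASE d1_clone CLONE d1;

-- Child-object grants are replicated; the database-level grant is not.
SHOW GRANTS ON DATABASE d1_clone;
SHOW GRANTS ON TABLE d1_clone.s1.t1;
```

<p><span style=\"font-weight: 400;\">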
As a result, the role R1 will NOT have a usage privilege on D1_CLONE but will have a usage privilege on D1_CLONE.S1 and a select privilege on D1_CLONE.S1.T1.<\/span><\/p>\n<p><b>Further Reading:<\/b> <a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/object-clone.html#access-control-privileges-for-cloned-objects\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Access Control Privileges on Cloned Objects<\/span><\/a><\/p>\n<p>&nbsp;<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-5\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q11 : You are working as a Snowflake architect for an FMCG company. The company produces a variety of Ayurveda products. The products are sold through a network of 500+ B2B retailers (online and offline). The company and its retailers are using Snowflake.<\/strong><br \/>\n<strong>Each retailer sends weekly POS (Point of Sale) information to the company. The company collates, aggregates, and reports the sales data to the senior management. The aggregated data is further analyzed by the business analysts and anonymized sales insights are shared with the select set of retailers that have subscribed to the insights.<\/strong><br \/>\n<strong>Given this scenario, what would be the BEST architecture that you (as a Snowflake Architect) would recommend, keeping in mind the security, robustness and resilience of your design?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>Set up a data exchange. Invite B2B retailers as providers and consumers. Have them create and publish a data listing of their respective POS information. Secure the POS listing to allow access to the company&#8217;s Snowflake account as a consumer. Have the company publish the aggregated data to the exchange through shares with the B2B retailers. 
Secure the data to allow access only to the B2B retailers subscribing to the insights<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Have the B2B retailers send their data over using Email, FTP, etc. channels into the company\u2019s Snowflake staging area. From there, use SnowPipe to ingest the data into the Snowflake tables. Have the company send the aggregated data back to the B2B retailers using a direct data share<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Have the B2B retailers share their POS data with the company using direct data sharing. Have the company send the aggregated data back to the B2B retailers using a direct data share<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>Have the B2B retailers share their POS data with the company using direct data sharing. Have the company publish the data as a personalized listing in the public marketplace where the B2B retailers can request access<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer: A<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is correct.\u00a0 <\/b><span style=\"font-weight: 400;\">Data Exchange<\/span> <span style=\"font-weight: 400;\">is a data-sharing product offered by Snowflake that is specifically designed to make data sharing seamless across your ecosystem of suppliers, partners, and customers, as well as business units at your own company. It allows you to control who can join, publish, consume, and access data. You can combine the power of Data Exchange with Secure Views that include the logic to limit the data access to specific consumer accounts (e.g. id = current_account()) and share the secure view with all consumers.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is incorrect. <\/b><span style=\"font-weight: 400;\">Although feasible, this is not the BEST design solution. 
Given that both \u2013 the company and its B2B retailer network \u2013 are using Snowflake, sharing data using data exchange is a more efficient and robust approach. It is more efficient compared to the email\/FTP option because setting up a data exchange is fast and shared data is available instantaneously. This reduces the manual effort of emailing and FTPing every time. It is more robust because you are not introducing an additional point of failure (Email\/FTP etc.) in your architecture. It is also more secure as Snowflake&#8217;s secure views provide mechanisms to share data with specific consumer accounts.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is incorrect. <\/b><span style=\"font-weight: 400;\">Although feasible, this is also not the BEST architecture. <\/span><span style=\"font-weight: 400;\"><br \/>\n<\/span><span style=\"font-weight: 400;\">Given that there are 500+ B2B retailers, setting up individual data shares for each retailer is effort-intensive and has significant maintenance overheads.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">Although feasible, this is also not the BEST architecture. Snowflake Marketplace&#8217;s offering is geared to sharing data with third parties and the world at large. 
Snowflake Data Exchange makes more sense for data sharing within a closed group of business partners and data consumers (as in this case).<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-sharing-product-offerings.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Data Sharing Product Offerings in Snowflake<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-6\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q12 : As a data provider, a role owns the objects contained in a share but does not own the share itself. In this situation, how can the data provider remove the object from the share?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>Revoking usage and\/or select privileges from the share owner role<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Revoking usage and\/or select privileges from the share owner role with CASCADE option<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>User must request the share owner to remove the object from the share<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>Revoking usage and\/or select privileges from the share owner role with RESTRICT option<\/span><b>\u00a0<\/b><\/p>\n<p><b>Correct\u200b \u200bAnswer:<\/b> <b>B<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option B is correct. 
<\/b><span style=\"font-weight: 400;\">If the data provider role owns the objects that are included in the share but does not own the share itself (share is owned by some other non-ACCOUNTADMIN role), the data provider can remove objects from the share by revoking the \u2018USAGE or SELECT privilege with CASCADE\u2019 option on the objects.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Syntax:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">REVOKE USAGE ON &lt;schema_name&gt; FROM ROLE &lt;role_name&gt; CASCADE<\/span><\/p>\n<p><span style=\"font-weight: 400;\">REVOKE SELECT ON &lt;table_name&gt; FROM ROLE &lt;role_name&gt; CASCADE<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/security-access-privileges-shares.html#blocking-access-to-objects-in-a-share\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Enabling non-ACCOUNTADMIN roles to Perform Data Sharing Tasks<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-7\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q13 : Determine whether each of the following statements is TRUE or FALSE:<\/strong><br \/>\n<strong>A. It is possible for data consumers to enable change tracking on a share by creating streams in their own databases created from a share<\/strong><br \/>\n<strong>B. A new object created in a database that is already included in a share is automatically available to consumers of the share<\/strong><br \/>\n<strong>C. If a non-ACCOUNTADMIN role has the privilege to create a share (CREATE SHARE privilege) and has the USAGE privilege on a database DB1, it can add DB1 into the share<\/strong><br \/>\n<strong>D. 
<a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#label-simulated-data-sharing-consumer\" target=\"_blank\" rel=\"nofollow noopener\">SIMULATED_DATA_SHARING_CONSUMER<\/a> session parameter can be used to simulate the access to a share by a specific consumer account<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>T, F, F, T<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>T, F, T, F<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>F, T, F, F<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>F, T, T, T<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer\u200b: A<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is TRUE. <\/b><span style=\"font-weight: 400;\">It is<\/span> <span style=\"font-weight: 400;\">possible to create streams on shared objects (secure views or tables) to track DML changes made in those objects. This is akin to creating and using streams on \u201clocal\u201d objects.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is FALSE. <\/b><span style=\"font-weight: 400;\">A new object created in a database that is part of a share will not become automatically available to the consumers of the share. The owner of the share must explicitly add the new object to the share.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is FALSE. 
<\/b><span style=\"font-weight: 400;\">To add an object to a share, the non-ACCOUNTADMIN role must have the following privileges:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">OWNERSHIP of the share, <\/span><b><i>and<\/i><\/b><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">OWNERSHIP <\/span><b><i>or<\/i><\/b><span style=\"font-weight: 400;\"> USAGE\/SELECT WITH GRANT OPTION on <\/span><b><i>each<\/i><\/b><span style=\"font-weight: 400;\"> of the objects to be added to the share.<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">In the scenario given in the question, the role does not have USAGE WITH GRANT OPTION on DB1 and therefore the grant operation will fail.<\/span><\/p>\n<p><b>Option D is TRUE.<\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#label-simulated-data-sharing-consumer\" target=\"_blank\" rel=\"nofollow noopener\"> <span style=\"font-weight: 400;\">SIMULATED_DATA_SHARING_CONSUMER<\/span><\/a><span style=\"font-weight: 400;\"> session parameter can be set to the name of the consumer account. This will simulate the access for that consumer account within the provider account itself. 
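<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example (the account name and secure view name below are illustrative):<\/span><\/p>

```sql
-- 'consumer_acct' and the secure view name are illustrative.
ALTER SESSION SET SIMULATED_DATA_SHARING_CONSUMER = 'consumer_acct';

-- This query now returns only the rows that consumer_acct would see.
SELECT * FROM sales_db.public.secure_sales_v;

-- Return to normal provider-side behaviour.
ALTER SESSION UNSET SIMULATED_DATA_SHARING_CONSUMER;
```

<p><span style=\"font-weight: 400;\">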
This allows the data provider to test and ensure the data is properly filtered by account in the secure view.<\/span><\/p>\n<p><b>Further Reading:<\/b> <a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/data-sharing-secure-views.html#step-4-create-a-share\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Using Secure Objects to Control Data Access \u2014 Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Account_and_Security-4\"><\/span><span style=\"font-weight: 400;\">Domain : Account and Security<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q14 : A user can change object parameters using which of the following roles?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>ACCOUNTADMIN, SECURITYADMIN, USER with PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>ACCOUNTADMIN, USER with PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>SECURITYADMIN, USER with PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>ACCOUNTADMIN, SECURITYADMIN<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer\u200b: B<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Snowflake parameters are name-value pairs that control the behaviour of your account, user sessions, and objects. 
There are 3 types of parameters in Snowflake:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Account Parameters<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Session Parameters<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Object Parameters<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">The following diagram shows these parameter types and how the parameters can be overridden at each level:<\/span><\/p>\n<figure id=\"attachment_84497\" aria-describedby=\"caption-attachment-84497\" style=\"width: 934px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84497 size-full\" title=\"snowflake account &amp; security\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security.webp\" alt=\"snowflake account &amp; security\" width=\"934\" height=\"639\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security.webp 934w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security-300x205.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security-768x525.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security-614x420.webp 614w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security-640x438.webp 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security-681x466.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/snowflake-account-security-150x103.webp 150w\" sizes=\"(max-width: 934px) 100vw, 934px\" \/><figcaption id=\"caption-attachment-84497\" class=\"wp-caption-text\">Image Source: www.snowflake.com<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400;\">Image source: 
Snowflake documentation (<\/span><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#parameter-hierarchy-and-types\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Parameters \u2014 Snowflake Documentation<\/span><\/a><span style=\"font-weight: 400;\">)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The top blue box in the diagram (Account box) shows all three types of parameters. This indicates that account administrators (i.e. users with ACCOUNTADMIN role) can change all three types of parameters.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The blue box on the right side shows that object parameters (warehouse level or database level) can be overridden by users with CREATE OBJECT or ALTER OBJECT privileges.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">These two scenarios are depicted in the boxes with green borders.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Hence, we can understand from this discussion that account administrators (i.e. ACCOUNTADMIN role), and individual users with privileges at object level can set\/override object parameters. Therefore <\/span><b>Option B is correct<\/b><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#object-parameters\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Parameters \u2014 Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Account_and_Security-5\"><\/span><span style=\"font-weight: 400;\">Domain : Account and Security<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q15 : A user can change session parameters using which of the following roles:<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>ACCOUNTADMIN, SECURITYADMIN, USER with PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. 
<\/strong>ACCOUNTADMIN, USER with PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>SECURITYADMIN, USER with PRIVILEGE<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>ACCOUNTADMIN, SECURITYADMIN<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer: A<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Snowflake parameters are name-value pairs that control the behaviour of your account, user sessions, and objects. There are 3 types of parameters in Snowflake:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Account Parameters<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Session Parameters<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Object Parameters<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">The following diagram shows the relationship between these parameter types and how the parameters can be overridden at each level:<\/span><\/p>\n<figure id=\"attachment_84498\" aria-describedby=\"caption-attachment-84498\" style=\"width: 975px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84498 size-full\" title=\"snowflake snowpro advanced architecture user\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33.webp\" alt=\"snowflake snowpro advanced architecture exam\" width=\"975\" height=\"705\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33.webp 975w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33-300x217.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33-768x555.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33-581x420.webp 581w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33-640x463.webp 640w, 
https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33-681x492.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/33-150x108.webp 150w\" sizes=\"(max-width: 975px) 100vw, 975px\" \/><figcaption id=\"caption-attachment-84498\" class=\"wp-caption-text\">Image Source: www.snowflake.com<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400;\">Image source: Snowflake documentation (<\/span><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#parameter-hierarchy-and-types\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Parameters \u2014 Snowflake Documentation<\/span><\/a><span style=\"font-weight: 400;\">)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The top blue box in the diagram (Account box) shows all three types of parameters. This indicates that account administrators (i.e. users with ACCOUNTADMIN role) can change all three types of parameters.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The blue box on the middle-left side shows that session parameters can be overridden by users with CREATE USER or ALTER USER privileges. These privileges are granted to the SECURITYADMIN role.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The blue box on the bottom-left side shows that session parameters can be overridden by users with ALTER SESSION privilege. This privilege is available to logged-in users.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">These three scenarios are depicted in boxes with green borders.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Hence, we can understand from this discussion that account administrators (i.e. ACCOUNTADMIN role), administrators with CREATE USER and ALTER USER privilege (typically SECURITYADMIN role) and individual users with ALTER SESSION privilege can set\/override session parameters. 
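<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The hierarchy can be exercised with a session parameter such as TIMEZONE, which is settable at each level (the user name below is illustrative):<\/span><\/p>

```sql
-- Account-level default (requires ACCOUNTADMIN).
ALTER ACCOUNT SET TIMEZONE = 'UTC';

-- User-level override (requires CREATE USER or ALTER USER, e.g. SECURITYADMIN).
ALTER USER analyst1 SET TIMEZONE = 'UTC';

-- Session-level override (any logged-in user, via ALTER SESSION).
ALTER SESSION SET TIMEZONE = 'UTC';
```

<p><span style=\"font-weight: 400;\">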
Therefore <\/span><b>Option A is correct<\/b><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#object-parameters\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Parameters \u2014 Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Data_Engineering-3\"><\/span><span style=\"font-weight: 400;\">Domain : Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q16 : A table is created and 9 rows are inserted as given below:<\/strong><br \/>\n<strong>create or replace table null_count_test(col1 integer, col2 integer);<\/strong><br \/>\n<strong>insert into null_count_test(col1, col2) values<\/strong><br \/>\n<strong>\u00a0 \u00a0 (null, null), \u00a0 &#8212; two all NULL values<\/strong><br \/>\n<strong>\u00a0 \u00a0 (null, null),<\/strong><br \/>\n<strong>\u00a0 \u00a0 (null, 1),\u00a0 \u00a0 \u00a0 &#8212; one NULL value<\/strong><br \/>\n<strong>\u00a0 \u00a0 (1, null),\u00a0 \u00a0 \u00a0 &#8212; one NULL value<\/strong><br \/>\n<strong>\u00a0 \u00a0 (1, 1),<\/strong><br \/>\n<strong>\u00a0 \u00a0 (2, 2),<\/strong><br \/>\n<strong>\u00a0 \u00a0 (3, 3),<\/strong><br \/>\n<em><span style=\"font-weight: 400;\">\u00a0 \u00a0 (4, 4),<\/span><\/em><br \/>\n<em><span style=\"font-weight: 400;\">\u00a0 \u00a0 (5, 5);<\/span><\/em><br \/>\n<em><span style=\"font-weight: 400;\">What would be the output of the following query?<\/span><\/em><br \/>\n<em><span style=\"font-weight: 400;\">select count(*), count (nct.*), count(col1), count(distinct col1), count(distinct col1, col2), approx_count_distinct(*) from null_count_test nct;<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>Syntax error: Invalid function approx_count_distinct(*)<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. 
<\/strong>8, 5, 6, 6, 6, 5<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>9, 5, 6, 6, 6, 7<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>9, 5, 6, 5, 5, 5<\/span><\/p>\n<p><b>Correct Answer: D<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">count (*) returns a total number of records which is 9 in this table.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">count (alias.*) returns the total number of rows that do not contain any null value in any of the columns which is \u20185\u2019 in this case.\u00a0\u00a0<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">count() function when used with a column name returns the count of non-null records for the specified column. Therefore count (col1) returns 6.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">count(distinct col1) returns the count of rows that contain only unique and non-null values for the specified column. Therefore count (distinct col1) returns 5.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">count (distinct col1, col2) returns a count of rows that contain only unique combinations for the specified columns where none of the two columns should have null values. Therefore count (distinct col1, col2) returns 5.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">approx_count_distinct(*) returns the same output as count (distinct col1, col2,&#8230;.) except that it is an approximation. 
Therefore, approx_count_distinct(*) returns 5.<\/span><\/li>\n<\/ul>\n<blockquote><p><span style=\"font-weight: 400;\">\u00a0\u27bd <\/span><b>Exam Tip\u00a0<\/b><\/p>\n<p><span style=\"font-weight: 400;\">approx_count_distinct() uses HyperLogLog algorithm to estimate the cardinality. This function can be used when you just need an approximation of the cardinality (and not a precise value). This function is cheaper to execute than count(distinct). <\/span><b>\u00a0<\/b><\/p><\/blockquote>\n<p><b>Further Readings: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/functions\/count.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">COUNT() function &#8211; Snowflake Documentation<\/span><\/a>,\u00a0<a href=\"https:\/\/community.snowflake.com\/s\/article\/COUNT-DISTINCT-and-NULLs\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">COUNT(), DISTINCT and NULL &#8211; Snowflake knowledge base article<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Performance_Optimization\"><\/span><span style=\"font-weight: 400;\">Domain : Performance Optimization<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q17 : Select the best practices of adding Search Optimization service to a table.<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>It is best to register all tables in a schema to search optimization<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Before adding search optimization to a large table, get an estimate of costs involved<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Batching of DML operations on the table<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>The table should not be clustered or the clustering key is different from the columns frequently used in queries<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. 
<\/strong>At most one of the columns in the table used in the filter operation has at least 100k-200k distinct values<\/span><\/p>\n<p><b>Correct Answers: B, C and D<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">Search optimization is a table-level property. Search optimization service consumes storage and compute credits. For every table you register to the service, the credit consumption increases. Therefore it is recommended to add search optimization to only a few tables initially and monitor the costs and benefits before registering more tables to the service.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is correct.<\/b><span style=\"font-weight: 400;\"> It is best to get an estimate of the costs involved and weigh that against the optimization benefits before enabling search optimization. To estimate the cost of adding search optimization to a table, Snowflake provides the SYSTEM$ESTIMATE_SEARCH_OPTIMIZATION_COSTS function.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">Snowflake recommends batching of DML operations on the table to reduce the cost of the search optimization service.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is correct. <\/b><span style=\"font-weight: 400;\">Search optimization delivers the best results when the table registered to the service is not clustered or the clustering key is different from the columns frequently used in the query.\u00a0<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option E is incorrect. <\/b><span style=\"font-weight: 400;\">Search optimization service is most useful when the columns used in the filter operation have at least 100k-200k distinct values. 
Note the wording: at most one means one or none, whereas the recommendation calls for at least one (one or more) such high-cardinality column.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/search-optimization-service.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Using Search Optimization Service &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Performance_Optimization-2\"><\/span><span style=\"font-weight: 400;\">Domain : Performance Optimization<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q18 : Which of the following SQL statements removes the search optimization service from a table?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>ALTER TABLE [IF EXISTS] &lt;table_name&gt; drop SEARCH OPTIMIZATION;<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>ALTER TABLE [IF EXISTS] &lt;table_name&gt; disable SEARCH OPTIMIZATION;<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>ALTER TABLE [IF EXISTS] &lt;table_name&gt; remove SEARCH OPTIMIZATION;<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>ALTER TABLE [IF EXISTS] &lt;table_name&gt; delete SEARCH OPTIMIZATION;<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>ALTER TABLE [IF EXISTS] &lt;table_name&gt; stop SEARCH OPTIMIZATION;<\/span><\/p>\n<p><b>Correct Answer: A<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is correct. 
<\/b><span style=\"font-weight: 400;\">To remove the search optimization property from a table, you use the DROP keyword:\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">ALTER TABLE [IF EXISTS] &lt;table_name&gt; drop SEARCH OPTIMIZATION;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Note that to add or remove search optimization for a table, you must have the following privileges:\u00a0<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">You must have OWNERSHIP privilege on the table.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains the table.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">\u27bd <\/span><b>Exam Tip &#8211;\u00a0<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Kindly note point 2 above. To add or remove the search optimization property on a table, you need ADD SEARCH OPTIMIZATION <\/span><b>on the schema where the table resides<\/b><span style=\"font-weight: 400;\">.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/search-optimization-service.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Using Search Optimization Service &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Performance_Optimization-3\"><\/span><span style=\"font-weight: 400;\">Domain : Performance Optimization<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q19 : A table is created with biometric data of the citizens of Genovia (as given below). 
There are 230 million records inserted in this table.\u00a0<\/strong><br \/>\n<strong>Create table people_master (<\/strong><br \/>\n<strong>UniqueID number(11),\u00a0<\/strong><br \/>\n<strong>FirstName varchar(50),<\/strong><br \/>\n<strong>LastName varchar(50),<\/strong><br \/>\n<strong>DoB timestamp, -- format is DDMMYYYY hh:mi:ss<\/strong><br \/>\n<strong>Address varchar(200),<\/strong><br \/>\n<strong>City varchar(50),<\/strong><br \/>\n<strong>State char(2),<\/strong><br \/>\n<strong>Current_Country char(2), -- populated for non-resident citizens<\/strong><br \/>\n<strong>PassportNo string);<\/strong><br \/>\n<strong>Following are some observations on the query patterns of this table.\u00a0<\/strong><br \/>\n<strong>A. UniqueID, FirstName, LastName, and DOB are frequently retrieved columns (i.e. used in the SELECT clause) and City is frequently used for sorting data.\u00a0<\/strong><br \/>\n<strong>B. The table is queried based on State and Birth Year for demographic categorization (WHERE clause)<\/strong><br \/>\n<strong>C. The table is heavily joined with another table called TRAVEL_LOG with the join column being PassportNo.<\/strong><br \/>\n<strong>D. The data is frequently grouped by the Current_Country column<\/strong><br \/>\n<strong>You, as a Snowflake Architect, are asked to review the clustering status and recommend reclustering (if required). How would you approach this problem?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>Check SYSTEM$CLUSTERING_DEPTH. The smaller the number, the better clustered the table is<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>UniqueID, FirstName, LastName, and DOB are preferred candidates for a clustering key over the PassportNo column<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>State and Birth Year would be an appropriate combination for the clustering key.<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. 
<\/strong>Do not use DOB in the clustering key<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>E. <\/strong>Once a clustering key is chosen, carry out reclustering using alter table &lt;table name&gt; recluster<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswers:\u200b A and C<\/b><\/p>\n<p><b>Option A is correct. <\/b><span style=\"font-weight: 400;\">SYSTEM$CLUSTERING_DEPTH<\/span> <span style=\"font-weight: 400;\">computes the average depth of the table according to the specified clustering key. The smaller the average depth, the better clustered the table is with regards to the specified columns.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is incorrect. <\/b><span style=\"font-weight: 400;\">The columns used in the SELECT clause are not helpful in a clustering key. Instead, you should choose the join column (PassportNo) as it is an excellent candidate for a clustering key to speed up join queries.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">\u00a0Columns used in the WHERE clauses are worthy candidates to be in the clustering key.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">\u00a0If you have queries that filter the data based on date criteria, you can use the date column in the clustering key. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">However, you will need to check the cardinality of the date column before using it in the clustering key. Using expressions may help reduce cardinality. For example, you can use the SUBSTR expression to get a relevant portion of a string column for use in the clustering key.\u00a0<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option E is incorrect. <\/b><span style=\"font-weight: 400;\">Reclustering happens automatically once a clustering key is defined for a table. 
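<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As an illustration, using the people_master table from the question and the columns suggested in option C, the clustering key could be defined as follows (this is one reasonable sketch, not the only valid choice of expressions):<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">alter table people_master cluster by (State, year(DoB));<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">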
Manual reclustering is not required.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/tables-clustering-keys.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Clustering Keys and Clustered Tables &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Accounts_and_Security\"><\/span><span style=\"font-weight: 400;\">Domain : Accounts and Security<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q20 : <em>A database DEV_DB has two schemas: the STAGING schema and the DATAMART schema. There is a warehouse named COMPUTE_WH of size M. There are two groups of developers in the project \u2013 junior developers and senior developers. There are two functional roles associated with these developers &#8211; JR_DEV and SR_DEV.<\/em><\/strong><br \/>\n<strong><em>Following are the security design requirements:<\/em><\/strong><br \/>\n<strong><em>A. Junior developers require read-only access to all objects within the DATAMART schema.<\/em><\/strong><br \/>\n<strong><em>B. Senior developers require DML access on STAGING schema and read-only access to DATAMART schema.<\/em><\/strong><br \/>\n<strong><em>C. 
Both groups require access to a virtual warehouse for computational requirements.<\/em><\/strong><br \/>\n<strong><em>Which of the following 4 diagrams BEST represents the RBAC design you will recommend?<\/em><\/strong><\/p>\n<p><b>Option A: Diagram A<\/b><\/p>\n<figure id=\"attachment_84499\" aria-describedby=\"caption-attachment-84499\" style=\"width: 1230px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84499 size-full\" title=\"snowpro advanced architect exam Q&amp;A\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44.webp\" alt=\"snowpro advanced architect exam questions\" width=\"1230\" height=\"1006\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44.webp 1230w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-300x245.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-1024x838.webp 1024w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-768x628.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-514x420.webp 514w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-640x523.webp 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-681x557.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/44-150x123.webp 150w\" sizes=\"(max-width: 1230px) 100vw, 1230px\" \/><figcaption id=\"caption-attachment-84499\" class=\"wp-caption-text\">Image Source: www.snowflake.com<\/figcaption><\/figure>\n<p><b>Option B: Diagram B<\/b><\/p>\n<figure id=\"attachment_84500\" aria-describedby=\"caption-attachment-84500\" style=\"width: 1230px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84500 size-full\" title=\"snowpro account &amp; security\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55.webp\" alt=\"snowpro account &amp; security\" width=\"1230\" height=\"1015\" 
srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55.webp 1230w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-300x248.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-1024x845.webp 1024w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-768x634.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-509x420.webp 509w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-640x528.webp 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-681x562.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/55-150x124.webp 150w\" sizes=\"(max-width: 1230px) 100vw, 1230px\" \/><figcaption id=\"caption-attachment-84500\" class=\"wp-caption-text\">Image Source: www.snowflake.com<\/figcaption><\/figure>\n<p><b>Option C: Diagram C<\/b><\/p>\n<figure id=\"attachment_84501\" aria-describedby=\"caption-attachment-84501\" style=\"width: 1230px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84501 size-full\" title=\"snowpro advanced architect exam\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66.webp\" alt=\"snowpro advanced architect exam\" width=\"1230\" height=\"1015\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66.webp 1230w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-300x248.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-1024x845.webp 1024w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-768x634.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-509x420.webp 509w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-640x528.webp 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-681x562.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/66-150x124.webp 150w\" 
sizes=\"(max-width: 1230px) 100vw, 1230px\" \/><figcaption id=\"caption-attachment-84501\" class=\"wp-caption-text\">Image Source: www.snowflake.com<\/figcaption><\/figure>\n<p><b>Option D: Diagram D<\/b><\/p>\n<figure id=\"attachment_84502\" aria-describedby=\"caption-attachment-84502\" style=\"width: 1230px\" class=\"wp-caption alignnone\"><img decoding=\"async\" class=\"wp-image-84502 size-full\" title=\"snowflake certifications exam\" src=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77.webp\" alt=\"snowflake certifications exam\" width=\"1230\" height=\"1014\" srcset=\"https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77.webp 1230w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-300x247.webp 300w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-1024x844.webp 1024w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-768x633.webp 768w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-509x420.webp 509w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-640x528.webp 640w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-681x561.webp 681w, https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/77-150x124.webp 150w\" sizes=\"(max-width: 1230px) 100vw, 1230px\" \/><figcaption id=\"caption-attachment-84502\" class=\"wp-caption-text\">Image Source: www.snowflake.com<\/figcaption><\/figure>\n<p><b>Correct Answer: C<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><span style=\"font-weight: 400;\">This question tests your understanding of three key principles of access control in Snowflake<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Principle of least privilege<\/span><span style=\"font-weight: 400;\"> &#8211; this principle states that a role should be assigned the least required privileges to perform the activities.<\/span><\/li>\n<li style=\"font-weight: 
400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Aligning privileges with business functions<\/span><span style=\"font-weight: 400;\"> &#8211; Snowflake recommends creating a combination of object access roles with different permissions on objects and assigning them as appropriate to functional roles.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Using role hierarchy and privilege inheritance to build an access control model<\/span><span style=\"font-weight: 400;\"> &#8211; This refers to granting lower-level functional roles to higher-level functional roles in a parent-child relationship where the parent roles map to business functions that should subsume the permissions of the child roles.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">Applying these rules to the scenario in the question, you would need to\u00a0<\/span><\/p>\n<ol>\n<li><span style=\"font-weight: 400;\">create a combination of object access roles i.e. JRDEV and SRDEV to mirror the functional roles played by junior developers and senior developers,\u00a0<\/span><\/li>\n<li><span style=\"font-weight: 400;\">assign each role the least set of privileges necessary to carry out its tasks,\u00a0<\/span><\/li>\n<li><span style=\"font-weight: 400;\">create a role hierarchy between JRDEV and SRDEV so that SRDEV inherits, through privilege inheritance, the read-only privileges on the DATAMART schema and the USAGE privilege on the warehouse.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">In diagram C, this model is implemented. You can see that there\u2019s a role hierarchy set up between SRDEV as the parent role and JRDEV as the child role. Consequently, SRDEV inherits the USAGE privilege on the warehouse and the database and also inherits USAGE privilege on the DATAMART schema and read-only privilege i.e. SELECT privilege on DATAMART schema tables. 
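<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In SQL terms (using the role names shown in the diagrams), this hierarchy is established with a single grant:<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">grant role JRDEV to role SRDEV;<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">With this grant in place, every privilege granted to JRDEV is automatically inherited by SRDEV. 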
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">SRDEV additionally needs DML privileges on the STAGING schema, and these are directly assigned to the SRDEV role. As this object access model satisfies all three key principles of object access in Snowflake, this is the BEST design. <\/span><\/p>\n<p><b>Therefore Option C is the correct answer.<\/b><\/p>\n<p><span style=\"font-weight: 400;\">In diagram A, you can observe that the SRDEV role gets USAGE privilege on the warehouse and the database, USAGE privilege on the STAGING schema, and SELECT privilege on STAGING schema tables directly assigned. This model does not take advantage of role hierarchy and privilege inheritance. <\/span><\/p>\n<p><b>Therefore Option A is incorrect.<\/b><\/p>\n<p><span style=\"font-weight: 400;\">In diagram B, you can observe that the JRDEV role gets SELECT privilege on the STAGING schema. This is an invalid privilege. One can assign USAGE privilege on the schema and SELECT privilege on the tables of that schema.<\/span><\/p>\n<p><b>Therefore Option B is incorrect.<\/b><\/p>\n<p><span style=\"font-weight: 400;\">In diagram D, you can observe that the role hierarchy is set up but the SRDEV role has duplicate privileges assigned to it through direct assignment and through privilege inheritance. There are clear redundancies in this object access model. 
<\/span><\/p>\n<p><b>Therefore Option D is incorrect.<\/b><\/p>\n<p><b>Further Reading:<\/b> <a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/security-access-control-considerations.html#aligning-object-access-with-business-functions\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Aligning object access with business functions &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Data_Engineering-4\"><\/span><span style=\"font-weight: 400;\">Domain : Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q21 : Which of the following are TRUE statements with respect to the Snowflake Connector for Kafka?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>The Kafka connector relies on both \u2013 key pair authentication and username\/password authentication<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>It is recommended to run your Kafka Connect instance in the same cloud provider region as your Snowflake account<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Kafka connect cluster is a separate cluster from the Kafka cluster<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>The typical pattern with Kafka connect cluster in Snowflake is that one topic supplies messages (rows) for many Snowflake tables and many Kafka topics supply messages to one Snowflake table (i.e. many-to-many relationship)<\/span><\/p>\n<p><b>Correct Answers: B and C<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">The Kafka connector relies on key pair authentication rather than basic authentication (i.e. username and password).<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is correct. 
<\/b><span style=\"font-weight: 400;\">While it is not mandatory, it is highly recommended that, to improve throughput, the Kafka Connect instance run in the same cloud provider and the same region as your Snowflake account.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">A Kafka Connect cluster is a separate cluster from the Kafka cluster. A Kafka cluster hosts the topics (message queues) that connect publishers to subscribers. The Kafka Connect cluster supports running and scaling out connectors that connect Kafka with external systems.\u00a0<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">The typical pattern with the Kafka connector in Snowflake is that one topic supplies messages (rows) for one Snowflake table.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/kafka-connector-install.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Installing and Configuring the Kafka Connector &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-8\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q22 : Which of the following are TRUE statements with respect to database replication in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>It works across regions of the same cloud provider. Cross-cloud-provider replication is not supported<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>It is available in Snowflake Business Critical editions and above<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>It is supported for databases only. Other types of objects in an account are not supported<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. 
<\/strong>A database created from a share cannot be replicated<\/span><\/p>\n<p><b>Correct Answers: C and D<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">Database replication is supported from any (supported) cloud region to any other (supported) cloud region AS WELL AS from any cloud provider (AWS, Azure, GCP) to any other cloud provider (AWS, Azure, GCP).<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is incorrect. <\/b><span style=\"font-weight: 400;\">Replication is available for all Snowflake editions. However, database failover\/failback (a related feature) is available for the Business critical edition onwards.\u00a0<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">Currently, Snowflake has announced replication support for databases only (data and DDL). Other types of objects in an account cannot be replicated, including Users, Roles, Warehouses, Resource monitors, and Shares.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is correct. <\/b><span style=\"font-weight: 400;\">A database created from a share cannot be replicated.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/database-replication-intro.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Introduction to Database Replication &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-9\"><\/span><span style=\"font-weight: 400;\">Domain : Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q23 : Which of the following are TRUE statements with respect to database replication in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. 
<\/strong>Data transfer costs are an additional cost component of database replication in addition to the compute and storage costs<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>Data replication, if enabled, is billed at a fixed monthly rate irrespective of the amount of data replicated<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Replication fails if the primary database contains an external table<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. <\/strong>Privileges given to the primary database objects are replicated to a secondary database<\/span><\/p>\n<p><b>Correct Answers: A and C<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A\u00a0 is correct. <\/b><span style=\"font-weight: 400;\">Database replication involves transferring data from one cloud region to another cloud region or across cloud providers. This is known as data egress. Cloud providers charge for data egress. To recover this expense, Snowflake charges the customer back for data transfer. This is charged on a per-byte basis, and the rate depends on the region where your primary database is hosted.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B\u00a0 is incorrect. <\/b><span style=\"font-weight: 400;\">Data replication is not billed at a fixed charge. Instead, the cost is variable and depends upon the amount of data replicated and the frequency of synchronization between the primary and secondary databases.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C\u00a0 is correct. <\/b><span style=\"font-weight: 400;\">The replication operation fails if there\u2019s an external table present in the primary database. To work around this limitation, Snowflake suggests creating external tables outside the replicated database.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D\u00a0 is incorrect. <\/b><span style=\"font-weight: 400;\">When a database is replicated, the privileges associated with the replicated objects are NOT copied over to the secondary database. 
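<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As an illustration (the role and object names below are purely hypothetical), typical re-grants on the secondary database would look like:<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">grant usage on database sales_db to role analyst;<\/span><\/em><\/p>\n<p><em><span style=\"font-weight: 400;\">grant select on all tables in schema sales_db.public to role analyst;<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">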
They have to be granted explicitly.<\/span><\/p>\n<blockquote><p><span style=\"font-weight: 400;\">\u27bd <\/span><b>Exam Tip <\/b><span style=\"font-weight: 400;\">&#8211; Snowflake provides the following ways to view the actual replication costs incurred.\u00a0<\/span><\/p><\/blockquote>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\">\n<blockquote><p><span style=\"font-weight: 400;\">REPLICATION_USAGE_HISTORY table function (in the Snowflake INFORMATION_SCHEMA)<\/span><\/p><\/blockquote>\n<\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\">\n<blockquote><p><span style=\"font-weight: 400;\">REPLICATION_USAGE_HISTORY view (in Account Usage)<\/span><\/p><\/blockquote>\n<\/li>\n<\/ol>\n<p><b>Further Readings: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/database-replication-billing.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Understanding the cost model for replication &#8211; Snowflake Documentation<\/span><\/a>,\u00a0<a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/database-replication-considerations.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Database replication considerations &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<p>&nbsp;<\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Data_Engineering-5\"><\/span><span style=\"font-weight: 400;\">Domain : Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q24 : Which of the following is NOT a valid choice for ON_ERROR while loading data into Snowflake using the COPY INTO &lt;table&gt; command?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"> \u00a0 <strong>A. <\/strong>ON_ERROR = CONTINUE<\/span><br \/>\n<span style=\"font-weight: 400;\"> \u00a0 <strong>B. <\/strong>ON_ERROR = ABORT_STATEMENT<\/span><br \/>\n<span style=\"font-weight: 400;\"> \u00a0 <strong>C. 
<\/strong>ON_ERROR = SKIP_FILE_&lt;num&gt;<\/span><br \/>\n<span style=\"font-weight: 400;\"> \u00a0 <strong>D. <\/strong>ON_ERROR = ROLLBACK<\/span><\/p>\n<p><b>Correct Answer: D<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect.<\/b><span style=\"font-weight: 400;\"> ON_ERROR = CONTINUE is a valid copy option while loading data into Snowflake. This option instructs Snowflake to continue loading the data even if an error is encountered while copying.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is incorrect.<\/b><span style=\"font-weight: 400;\"> ON_ERROR = ABORT_STATEMENT is a valid copy option while loading data into Snowflake. This option instructs Snowflake to abort the loading operation if any error is encountered while copying.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is incorrect.<\/b><span style=\"font-weight: 400;\"> ON_ERROR = SKIP_FILE_&lt;num&gt; is a valid copy option while loading data into Snowflake. This option instructs Snowflake to skip loading from a file if the number of errors encountered in it is greater than or equal to the number specified in &lt;num&gt;.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is correct. <\/b><span style=\"font-weight: 400;\">ON_ERROR = ROLLBACK is invalid. 
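<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For instance (the stage and table names here are purely illustrative), a load that aborts as soon as an error is encountered would be written as:<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">copy into customers from @landing_stage on_error = abort_statement;<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">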
There is no ROLLBACK copy option; to stop a load when an error occurs, use ABORT_STATEMENT.<\/span><\/p>\n<blockquote><p><span style=\"font-weight: 400;\">\u27bd <\/span><b>Exam Tip \u2013\u00a0<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Note the default ON_ERROR values given below.<\/span><\/p>\n<p><b>Bulk loading using COPY: <\/b><span style=\"font-weight: 400;\">ABORT_STATEMENT<\/span><\/p>\n<p><b>Snowpipe: <\/b><span style=\"font-weight: 400;\">SKIP_FILE<\/span><\/p><\/blockquote>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/sql\/copy-into-table.html#copy-options-copyoptions\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Copy Options in COPY INTO <\/span><i><span style=\"font-weight: 400;\">&lt;table&gt; &#8211; <\/span><\/i><span style=\"font-weight: 400;\">Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Performance_Optimization-4\"><\/span><span style=\"font-weight: 400;\">Domain : Performance Optimization<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q25 : Which of the following is TRUE about the Search Optimization service in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\"><strong>A. <\/strong>You must have ADD SEARCH OPTIMIZATION privilege on the table which you want to register for search optimization<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>B. <\/strong>A search access path becomes invalid if you add, drop, or rename a column<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>C. <\/strong>Search optimization service does not support materialized views<\/span><br \/>\n<span style=\"font-weight: 400;\"><strong>D. 
<\/strong>Search optimization service does not support VARCHAR and TIMESTAMP data types<\/span><\/p>\n<p><b>Correct Answer: C<\/b><\/p>\n<p><b>Explanation<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains the table that you want to register for search optimization.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option B is incorrect.<\/b><span style=\"font-weight: 400;\"> A search access path remains valid if you add, drop, or rename a column. The search optimization background service automatically updates the search access path to reflect such changes.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">Search optimization service does not support materialized views and external tables.<\/span><br \/>\n<b><\/b><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">Search optimization service supports VARCHAR and TIMESTAMP data types.<\/span><\/p>\n<p><b>Further Reading: <\/b><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/search-optimization-service.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Using Search Optimization Service &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering\"><\/span>\u200bDomain\u200b \u200b:\u200b Data Engineering<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 26: Row-level security within a Snowflake account can be implemented using\u2026 (Select TWO)<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">A. Masking policy<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. Row access policy<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. Tri-secret Secure<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. Secure data sharing<\/span><\/p>\n<p><span style=\"font-weight: 400;\">E. 
Secure views<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer\u200b <\/b><span style=\"font-weight: 400;\">\u200b\u2013 B and E<\/span><\/p>\n<p><b>Explanation<\/b><span style=\"font-weight: 400;\">:<\/span><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">Masking policy is a column-level security object (not a row-level security object).<\/span><\/p>\n<p><b>Option B is correct.<\/b><span style=\"font-weight: 400;\"> Snowflake Enterprise edition and above supports a schema-level object called row access policy that helps you control whether a given row in a table or a view shows up in the query result. The policy is evaluated at query runtime, and only the rows that pass the access test defined in the policy will be visible to the user. It is important to remember that this policy also applies to the object owner, who normally has full access to the underlying data.\u00a0<\/span><\/p>\n<p><b>Option C is incorrect. <\/b><span style=\"font-weight: 400;\">Tri-secret secure is a data encryption mechanism that combines the customer-managed key with a Snowflake-managed key to create a composite master key.\u00a0<\/span><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">Secure Data Sharing enables sharing selected objects in a database in your account with other Snowflake accounts. It does not aid in controlling row-level security in your account.<\/span><\/p>\n<p><b>Option E is correct. 
<\/b><span style=\"font-weight: 400;\">You can use Snowflake context functions<\/span><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/functions\/current_role.html\" target=\"_blank\" rel=\"noopener\"> <span style=\"font-weight: 400;\">CURRENT_ROLE<\/span><\/a><span style=\"font-weight: 400;\">() and<\/span><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/functions\/current_user.html\" target=\"_blank\" rel=\"noopener\"> <span style=\"font-weight: 400;\">CURRENT_USER<\/span><\/a><span style=\"font-weight: 400;\">() in the secure view query to implement row-level security. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">See an example below:<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">create or replace secure view movies_for_you as<\/span><\/em><\/p>\n<p><em><span style=\"font-weight: 400;\">select * from movies_table m where upper(m.role_name) = current_role();<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">Here, the secure view returns the list of movies that the logged-in is allowed to see by using the CURRENT_ROLE() context function to filter the movies list.<\/span><\/p>\n<p><b>Reference:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/security-row-intro.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Understanding row access policies in Snowflake<\/span><\/a><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/views-secure.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Working with Secure Views \u2014 Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering-2\"><\/span>\u200bDomain\u200b \u200b:\u200b Data Engineering<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 27: Which of the following statements are TRUE concerning Secure View objects? 
(Select TWO)<\/strong><\/p>\n<ol type=\"A\">\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">The following statement will change a secure view into a standard view.<\/span><\/li>\n<\/ol>\n<p><em><span style=\"font-weight: 400;\">\u00a0 \u00a0 \u00a0 \u00a0 \u00a0 ALTER VIEW &lt;view name&gt; UNSET SECURE;<\/span><\/em><\/p>\n<ol type=\"A\" start=\"2\">\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">A secure view may have slower performance compared to a standard view.\u00a0<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">To add a view in your outbound share, it has to be created as a secure view.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">A materialized view cannot be created as a secure view.<\/span><\/li>\n<\/ol>\n<p><b>Correct\u200b \u200bAnswer\u200b : <\/b><span style=\"font-weight: 400;\">\u00a0A and B<\/span><\/p>\n<p><b>Explanation<\/b><span style=\"font-weight: 400;\">:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">When you create a secure view, Snowflake makes sure that <\/span><b>a)<\/b><span style=\"font-weight: 400;\"> the view definition is not visible to the users of the view and <\/span><b>b)<\/b><span style=\"font-weight: 400;\"> the users have no access to the underlying data of the base tables. This is the differentiating feature of a secure view.\u00a0<\/span><\/p>\n<p><b>Option A is correct<\/b><span style=\"font-weight: 400;\">. 
A secure view can be converted to a regular view and vice versa using ALTER VIEW command as given below:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0<\/span><span style=\"font-weight: 400;\">&#8212; to convert a regular view into a secure view<\/span><\/p>\n<p><span style=\"font-weight: 400;\">ALTER VIEW &lt;name&gt; SET SECURE<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8212; to convert a secure view into a regular view<\/span><\/p>\n<p><span style=\"font-weight: 400;\">ALTER VIEW &lt;name&gt; UNSET SECURE\u00a0<\/span><\/p>\n<p><b>Option B is correct. <\/b><span style=\"font-weight: 400;\">When you create a secure view, Snowflake bypasses certain query optimizations. These optimizations are available for other regular views. As a result, a secure view has slower performance compared to a regular view.<\/span><\/p>\n<p><b>Option C is incorrect. <\/b><span style=\"font-weight: 400;\">To add a view in an outbound share, you may create the view as a materialized view or a secure view.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The following Snowflake database objects can be included in a share:\u00a0<\/span><\/p>\n<table>\n<tbody>\n<tr>\n<td><b>Tables<\/b><\/td>\n<td><b>Views<\/b><\/td>\n<td><b>Other Objects<\/b><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400;\">Table<\/span><\/p>\n<p><span style=\"font-weight: 400;\">External table<\/span><\/td>\n<td><b>Secure view<\/b><\/p>\n<p><b>Materialized view<\/b><\/td>\n<td><span style=\"font-weight: 400;\">Secure UDF<\/span><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p><span style=\"font-weight: 400;\"> No other database objects (e.g. regular views, stored procedures, tasks, streams, etc.) can be included in a share<\/span><\/p>\n<p><b>Option D is incorrect. 
<\/b><span style=\"font-weight: 400;\">A materialized view can be created as a secure view.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Reference:<\/span><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/views-secure.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Working with Secure Views \u2014 Snowflake Documentation<\/span><\/a><\/p>\n<h1><\/h1>\n<h3><span class=\"ez-toc-section\" id=\"%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering-3\"><\/span>\u200bDomain\u200b \u200b:\u200b Data Engineering<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><b>Q 28<\/b><span style=\"font-weight: 400;\">\u200b \u200b<\/span><b>: <\/b><strong>An external table ext_weather_source is created in Snowflake. This table points to a data lake created on Amazon S3.\u00a0 The following 7 events occur in the data lake in sequence.<\/strong><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">File 1 with 7 rows added.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">A stream object ext_weather_stream is created on the external table.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">File 2 with 4 rows added.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">File 3 with 3 rows added.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">File 2 deleted.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">File 3 is appended with 2 more rows.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">ext_weather _source table is refreshed manually<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400;\">What will be the output of the following queries based on the above sequence?<\/span><\/p>\n<p><span 
style=\"font-weight: 400;\">Select count(1) from ext_weather_source;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Select count(1) from ext_weather_stream;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A. 12, 9<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. 16, 9<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. 9, 16<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. 9, 12<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer\u200b :<\/b><span style=\"font-weight: 400;\">\u00a0A<\/span><\/p>\n<p><b>Explanation<\/b><span style=\"font-weight: 400;\">:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">An external table in Snowflake is a reference to your data files in external cloud storage (e.g. data lake). Each external table has at least three columns as shown below:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">External table structure :<\/span><\/p>\n<table>\n<tbody>\n<tr>\n<td><span style=\"font-weight: 400;\">METADATA$FILENAME<\/span><\/td>\n<td><span style=\"font-weight: 400;\">A pseudocolumn column that identifies the name of each data file in the datalake, including its path.<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400;\">METADATA$FILE_ROW_NUMBER<\/span><\/td>\n<td><span style=\"font-weight: 400;\">A pseudocolumn that shows the row number for each record in the data file.<\/span><\/td>\n<\/tr>\n<tr>\n<td><span style=\"font-weight: 400;\">VALUE<\/span><\/td>\n<td><span style=\"font-weight: 400;\">A VARIANT column that represents a single row in the data file.<\/span><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p><span style=\"font-weight: 400;\">The external table needs to be refreshed either automatically or manually. 
When you refresh an external table, Snowflake synchronizes the metadata with the latest set of files in the external stage and data within these files, i.e.:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">New files in the path are added to the table metadata.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Changes to files in the path are updated in the table metadata.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Files no longer in the path are removed from the table metadata.<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">In the given sequence of events in the data lake (after the manual refresh), the external table ext_weather_source will have 7 rows for File 1, 0 rows for File 2, and 5 rows for File 3. This will make a total of 12 rows in the external table.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Snowflake supports insert-only stream objects on external tables. These objects record insertions only. They do not record deletions. Append operations are treated as new file insertions.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In the given question, the stream object is created in the 2nd step after adding 7 rows in File 1. Therefore, this stream object starts recording any changes to the external table after the 1st step. 
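<\/span><\/p>\n<p><span style=\"font-weight: 400;\">As a sketch, an insert-only stream like the one in this question can be created as shown below (INSERT_ONLY = TRUE is required for streams on external tables):<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">CREATE OR REPLACE STREAM ext_weather_stream ON EXTERNAL TABLE ext_weather_source INSERT_ONLY = TRUE;<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">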
The ext_weather_stream will record 4 rows inserted in File 2 and 5 rows inserted in File 3 for a total row count of 9.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Based on the above discussion, we can see that <\/span><b>Option A is correct.<\/b><\/p>\n<p><b>Reference:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/tables-external-intro.html#introduction-to-external-tables\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Introduction to external tables<\/span><\/a><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/streams-intro.html#types-of-streams\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Insert-only streams on external table<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain%E2%80%8B_%E2%80%8B_%E2%80%8B_Data_Engineering\"><\/span>Domain<span style=\"font-weight: 400;\">\u200b \u200b<\/span>:<span style=\"font-weight: 400;\">\u200b Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 29\u200b \u200b: Which of the following statements are TRUE concerning cancellation of a long-running query in Snowflake? (Select all that apply)<\/strong><\/p>\n<ol type=\"A\">\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">It is not possible to selectively cancel long-running queries in Snowflake without aborting the session.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Only SYSADMIN or a higher role has the privileges to cancel a long-running query using SnowSQL.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">The recommended way for a user to cancel a long-running query is to use the application interface (e.g. 
Snowflake web interface) or the cancellation API provided by the Snowflake ODBC or JDBC driver.\u00a0<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">A long-running query will be auto-canceled by default by Snowflake after 2 days.\u00a0<\/span><\/li>\n<\/ol>\n<p><b>Correct\u200b \u200bAnswer\u200b :<\/b><span style=\"font-weight: 400;\">\u00a0C and D<\/span><\/p>\n<p><b>Explanation<\/b><span style=\"font-weight: 400;\">:<\/span><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">It is possible for a user to cancel a specific long-running query without aborting the session.<\/span><\/p>\n<p><b>Option B is incorrect.<\/b><span style=\"font-weight: 400;\"> Snowflake provides two system functions (callable from SnowSQL or any other SQL client) for canceling long-running queries. Any user can use these functions to cancel their own long-running queries. These functions are:\u00a0<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">SYSTEM$CANCEL_ALL_QUERIES<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">SYSTEM$CANCEL_QUERY<\/span><\/li>\n<\/ul>\n<p><b>Option C is correct.<\/b><span style=\"font-weight: 400;\"> The recommended way for a user to cancel a long-running query is to use the application interface (e.g. Snowflake web interface) or the cancellation API provided by the Snowflake ODBC or JDBC driver.<\/span><\/p>\n<p><b>Option D is correct.<\/b> <span style=\"font-weight: 400;\">Snowflake provides a parameter\u00a0 STATEMENT_TIMEOUT_IN_SECONDS at an account, session, object, or warehouse level. 
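<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, this timeout can be lowered for the current session as follows (a sketch; 3600 seconds is an arbitrary illustrative value):<\/span><\/p>\n<p><em><span style=\"font-weight: 400;\">ALTER SESSION SET STATEMENT_TIMEOUT_IN_SECONDS = 3600;<\/span><\/em><\/p>\n<p><span style=\"font-weight: 400;\">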
The value of this parameter controls how long a query may run before being automatically timed out by Snowflake.\u00a0 The default value for this parameter is 2 days (though it can be modified to a lower value).<\/span><\/p>\n<p><b>Reference:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/querying-cancel-statements.html\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">Cancelling a long-running query<\/span><\/a><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/parameters.html#statement-timeout-in-seconds\" target=\"_blank\" rel=\"nofollow noopener\"><span style=\"font-weight: 400;\">STATEMENT_TIMEOUT_IN_SECONDS<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"%E2%80%8BDomain%E2%80%8B_%E2%80%8B_%E2%80%8B_Performance_and_Optimization\"><\/span>\u200bDomain\u200b \u200b:\u200b Performance and Optimization<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 30\u200b \u200b: A table is created in the Snowflake database as below:<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">CREATE TABLE\u00a0 &#8220;AGENTS&#8221;\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0(<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0\u00a0&#8220;AGENT_CODE&#8221; CHAR(6) NOT NULL PRIMARY KEY,\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8220;AGENT_NAME&#8221; CHAR(40),\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8220;WORKING_AREA&#8221; CHAR(35),\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8220;COMMISSION_PERCENT&#8221; NUMBER(10,2),\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8220;CONTACT_NO&#8221; CHAR(15),\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8220;COUNTRY_CODE&#8221; VARCHAR(25)\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u00a0\u00a0\u00a0\u00a0);<\/span><\/p>\n<p><strong>There are 2150 agent records inserted into this 
table.\u00a0<\/strong><\/p>\n<p><strong>Next, the following query is executed on this table.\u00a0<\/strong><\/p>\n<p><strong>Select * from agents;<\/strong><\/p>\n<p><strong>Which of the following queries can be completed subsequently without the use of a virtual warehouse? (Select TWO)<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">A. Select * from agents;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. select * from agents limit 100;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. Select count(1) from agents;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. Select country_code, count(1) from agents group by country_code;<\/span><\/p>\n<p><b>Correct\u200b \u200bAnswer\u200b <\/b><span style=\"font-weight: 400;\">\u200b\u2013 A and C<\/span><\/p>\n<p><b>Explanation<\/b><span style=\"font-weight: 400;\">:<\/span><\/p>\n<p><b>Option A is correct<\/b><span style=\"font-weight: 400;\">. The exact same query has already been issued earlier by the same user role. Therefore, Snowflake will be able to retrieve this information from the results cache. Hence, a virtual warehouse will not be used.<\/span><\/p>\n<p><b>Option B is incorrect<\/b><span style=\"font-weight: 400;\">. This query will require the use of a virtual warehouse to limit the result set to the first 100 rows.<\/span><\/p>\n<p><b>Option C is correct.<\/b><span style=\"font-weight: 400;\"> The count aggregate function that returns the total number of rows in a table is a metadata-only operation and will not require the use of a virtual warehouse or local cache.\u00a0<\/span><\/p>\n<p><b>Option D is incorrect.<\/b><span style=\"font-weight: 400;\"> In this query, the agent data is to be grouped by country code. 
This requires additional processing and will need a virtual warehouse.<\/span><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_%E2%80%8B_%E2%80%8B_Data_Engineering\"><\/span>Domain \u200b:\u200b <span style=\"font-weight: 400;\">Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 31: Which of the following are TRUE statements with respect to the Snowflake Connector for Kafka? (Select TWO)<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">A. The Kafka connector relies on both key pair authentication and username\/password authentication.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. It is recommended to run your Kafka Connect instance in the same cloud provider region as your Snowflake account.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. A Kafka Connect cluster is a separate cluster from the Kafka cluster.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. The typical pattern with a Kafka Connect cluster in Snowflake is that one topic supplies messages (rows) for many Snowflake tables and many Kafka topics supply messages to one Snowflake table (i.e. many-to-many relationship).<\/span><\/p>\n<p><b>Correct Answer &#8211; <\/b><span style=\"font-weight: 400;\">B, C<\/span><\/p>\n<p><b>Explanation:<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">The Kafka connector relies on key pair authentication rather than basic authentication (i.e. username and password).<\/span><\/p>\n<p><b>Option B is correct. <\/b><span style=\"font-weight: 400;\">While it is not mandatory, it is highly recommended that, to improve throughput, the Kafka Connect instance run in the same cloud provider and the same region as your Snowflake account.<\/span><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">A Kafka Connect cluster is a separate cluster from the Kafka cluster. A Kafka cluster hosts queues that connect the publisher to the subscriber. 
The Kafka Connect cluster supports running and scaling out connectors that connect Kafka with external systems.\u00a0<\/span><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">The typical pattern with the Kafka connect cluster in Snowflake is that one topic supplies messages (rows) for one Snowflake table.<\/span><\/p>\n<p><b>Further Reading:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/kafka-connector-install.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Installing and Configuring the Kafka Connector &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-10\"><\/span>Domain: <span style=\"font-weight: 400;\">Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><b>Q 32\u200b \u200b: <\/b><span style=\"font-weight: 400;\">Which of the following are TRUE statements with respect to database replication in Snowflake? (Select TWO)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A. It works across regions of the same cloud provider. Cross-cloud provider data sharing is not supported.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. It is available in Snowflake Business Critical editions and above.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. It is supported for databases only. Other types of objects in an account are not supported.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. A database created from a share cannot be replicated<\/span><\/p>\n<p><b>Correct Answer &#8211; C and D<\/b><\/p>\n<p><b>Explanation:<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">Database replication is supported from any (supported) cloud region to any other (supported) cloud region AS WELL AS from any cloud provider (AWS, Azure, GCP) to any other cloud provider (AWS, Azure, GCP).<\/span><\/p>\n<p><b>Option B is incorrect. 
<\/b><span style=\"font-weight: 400;\">Replication is available for all Snowflake editions. However, database failover\/failback (a related feature) is available for the Business critical edition onwards.\u00a0<\/span><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">Currently, Snowflake has announced replication support for databases only (data and DDL). Other types of objects in an account cannot be replicated, including Users, Roles, Warehouses, Resource monitors, and Shares.<\/span><\/p>\n<p><b>Option D is correct. <\/b><span style=\"font-weight: 400;\">A database created from a share cannot be replicated.<\/span><\/p>\n<p><b>Further Reading:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/database-replication-intro.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Introduction to Database Replication &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Snowflake_Architecture-11\"><\/span>Domain: <span style=\"font-weight: 400;\">Snowflake Architecture<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 33: Which of the following are TRUE statements with respect to database replication in Snowflake? (Select TWO)<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">A. Data transfer costs are an additional cost component of database replication in addition to the compute and storage costs.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. Data replication, if enabled, is billed at fixed monthly billing irrespective of the amount of data replicated.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. Replication fails if the primary database contains an external table.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. 
Privileges given to the primary database objects are replicated to a secondary database.<\/span><\/p>\n<p><b>Correct Answer &#8211; A and C<\/b><\/p>\n<p><b>Explanation:<\/b><\/p>\n<p><b>Option A\u00a0 is correct. <\/b><span style=\"font-weight: 400;\">Database replication involves transferring data from one cloud region to another cloud region or across cloud providers. This is known as data egress. Cloud providers charge for data egress. To recover this expense, Snowflake charges back the customer for data transfer. This is charged on a per-byte basis. The rate depends on the region where your primary database is hosted.<\/span><\/p>\n<p><b>Option B\u00a0 is incorrect. <\/b><span style=\"font-weight: 400;\">Data replication bills are not based on a fixed charge. Instead, the cost is variable and depends upon the amount of data replicated and the frequency of synchronization between primary and secondary.<\/span><\/p>\n<p><b>Option C\u00a0 is correct. <\/b><span style=\"font-weight: 400;\">The replication operation fails if there\u2019s an external table present in the primary database. To work around this limitation, Snowflake suggests creating external tables outside the replicated database.<\/span><\/p>\n<p><b>Option D\u00a0 is incorrect. <\/b><span style=\"font-weight: 400;\">When a database is replicated, the privileges associated with the replicated objects are NOT copied over to the secondary database. 
They have to be granted explicitly.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u27bd <\/span><b>Exam Tip <\/b><span style=\"font-weight: 400;\">&#8211; Snowflake provides the following ways to view the actual replication costs incurred.\u00a0<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">REPLICATION_USAGE_HISTORY table function (in the Snowflake INFORMATION_SCHEMA)<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">REPLICATION_USAGE_HISTORY view (in Account Usage)<\/span><\/li>\n<\/ol>\n<p><b>Further Reading:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/database-replication-billing.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Understanding costs model for replication &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/database-replication-considerations.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Database replication considerations &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Data_Engineering-6\"><\/span><span style=\"font-weight: 400;\">Domain: Data Engineering<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 34 : Which of the following is NOT a valid choice for ON_ERROR while loading data into Snowflake using the COPY INTO <i>&lt;table&gt;<\/i> command?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">A.\u00a0 ON_ERROR = CONTINUE<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. ON_ERROR = ABORT_STATEMENT<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. ON_ERROR = SKIP_FILE_&lt;num&gt;<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. 
ON_ERROR = ROLLBACK<\/span><\/p>\n<p><b>Correct Answer \u2013 D<\/b><\/p>\n<p><b>Explanation:<\/b><\/p>\n<p><b>Option A is incorrect.<\/b><span style=\"font-weight: 400;\"> ON_ERROR = CONTINUE is a valid copy option while loading data into Snowflake. This option instructs Snowflake to continue loading the data even if an error is encountered while copying.<\/span><\/p>\n<p><b>Option B is incorrect.<\/b><span style=\"font-weight: 400;\"> ON_ERROR = ABORT_STATEMENT is a valid copy option while loading data into Snowflake. This option instructs Snowflake to abort the loading operation if any error is encountered while copying.<\/span><\/p>\n<p><b>Option C is incorrect.<\/b><span style=\"font-weight: 400;\"> ON_ERROR = SKIP_FILE_&lt;num&gt; is a valid copy option while loading data into Snowflake. This option instructs Snowflake to skip loading a file if the number of errors encountered is greater than or equal to the number specified in &lt;num&gt;.<\/span><\/p>\n<p><b>Option D is correct. <\/b><span style=\"font-weight: 400;\">ON_ERROR = ROLLBACK is invalid. 
In order to perform a rollback, we need to use ABORT_STATEMENT.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u27bd <\/span><b>Exam Tip \u2013\u00a0<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Note the default options given below for the ON_ERROR clause.\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0<\/span><\/p>\n<p><b>Bulk loading using COPY\u00a0 \u00a0 <\/b> <span style=\"font-weight: 400;\">ABORT_STATEMENT<\/span><\/p>\n<p><b>Snowpipe \u00a0 \u00a0 <\/b> <b>\u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 <\/b> <span style=\"font-weight: 400;\">SKIP_FILE<\/span><\/p>\n<p><b>Further Reading:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/sql-reference\/sql\/copy-into-table.html#copy-options-copyoptions\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Copy Options in COPY INTO <\/span><i><span style=\"font-weight: 400;\">&lt;table&gt; &#8211; <\/span><\/i><span style=\"font-weight: 400;\">Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Domain_Performance_Optimization-5\"><\/span><span style=\"font-weight: 400;\">Domain: Performance Optimization<\/span><span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p><strong>Q 35 : Which of the following is TRUE about the Search Optimization service in Snowflake?<\/strong><\/p>\n<p><span style=\"font-weight: 400;\">A. You must have ADD SEARCH OPTIMIZATION privilege on the table which you want to register for search optimization.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">B. A search access path becomes invalid if you add, drop, or rename a column.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">C. Search optimization service does not support materialized views.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">D. 
Search optimization service does not support VARCHAR and TIMESTAMP data types.<\/span><\/p>\n<p><b>Correct Answer<\/b><span style=\"font-weight: 400;\"> &#8211; C<\/span><\/p>\n<p><b>Explanation:<\/b><\/p>\n<p><b>Option A is incorrect. <\/b><span style=\"font-weight: 400;\">You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains the table that you want to register for search optimization.<\/span><\/p>\n<p><b>Option B is incorrect.<\/b><span style=\"font-weight: 400;\"> A search access path remains valid if you add, drop, or rename a column. Search optimization background service automatically updates the search access path when you add, drop, or rename a column.<\/span><\/p>\n<p><b>Option C is correct. <\/b><span style=\"font-weight: 400;\">Search optimization service does not support materialized views and external tables.<\/span><\/p>\n<p><b>Option D is incorrect. <\/b><span style=\"font-weight: 400;\">Search optimization service supports VARCHAR and TIMESTAMP data types.<\/span><\/p>\n<p><b>Further Reading:<\/b><\/p>\n<p><a href=\"https:\/\/docs.snowflake.com\/en\/user-guide\/search-optimization-service.html\" target=\"_blank\" rel=\"noopener\"><span style=\"font-weight: 400;\">Using Search Optimization Service &#8211; Snowflake Documentation<\/span><\/a><\/p>\n<h3><span class=\"ez-toc-section\" id=\"Summary\"><\/span>Summary<span class=\"ez-toc-section-end\"><\/span><\/h3>\n<p>We hope the above list of questions on the Snowflake Snowpro Advanced Architect exam is helpful to you. The Snowflake SnowPro Advanced Architect Certification is the highest level of certification offered by Snowflake and is intended for experienced professionals.<\/p>\n<p>It is strongly recommended that you cover all the objectives of the certification exam so that you can pass it with ease on your first attempt. Hence, keep practicing until you are confident enough to take the real exam. 
You can also try Whizlabs&#8217; newly updated <a href=\"https:\/\/www.whizlabs.com\/snowflake-snowpro-advanced-architect-certification\/\" target=\"_blank\" rel=\"noopener\">practice test on the Snowflake SnowPro Advanced Architect<\/a> exam.<\/p>\n<div id=\"wpautbox-below\" class=\"a-tabs\">\n<div class=\"a-tab-container\"><\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Are you looking for free questions and answers to prepare for the Snowflake Snowpro Advanced Architect exam? Here are our\u00a0newly updated 30+ Free questions on the Snowflake Snowpro Advanced Architect exam which are very similar to the practice test as well as the real exam. Why do we provide Snowflake Snowpro Advanced Architect exam questions for free? Snowflake Snowpro Advanced Architect exams are designed to test and recognize your skills on\u00a0Snowflake architecture and can design and optimize Snowflake solutions for their organizations. We are giving it for free to help you in passing the Snowflake Snowpro Advanced Architect exam just 
[&hellip;]<\/p>\n","protected":false},"author":359,"featured_media":84509,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"default","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"default","adv-header-id-meta":"","stick-header-meta":"default","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"var(--ast-global-color-4)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[4881],"tags":[4916],"class_list":["post-84141","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-snowflake","tag-snowpro-advanced-architect-exam-questions"],"uagb_featured_image_src":{"full":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",1280,720,false],"thumbnail":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-150x150.webp",150,150,true],"medium":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-300x169.webp",300,169,true],"medium_large":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-768x432.webp",768,432,true],"large":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-1024x576.webp",1024,576,true],"1536x1536":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",1280,720,false],"2048x2048":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",1280,720,false],"profile_24":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",24,14,false],"profile_48":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",48,27,false],"profile_96":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certific
ation-Exam.webp",96,54,false],"profile_150":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",150,84,false],"profile_300":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam.webp",300,169,false],"tptn_thumbnail":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-250x250.webp",250,250,true],"web-stories-poster-portrait":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-640x720.webp",640,720,true],"web-stories-publisher-logo":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-96x96.webp",96,96,true],"web-stories-thumbnail":["https:\/\/www.whizlabs.com\/blog\/wp-content\/uploads\/2022\/08\/Snowflake-Snowpro-Advanced-Architect-certification-Exam-150x84.webp",150,84,true]},"uagb_author_info":{"display_name":"Abilesh Premkumar","author_link":"https:\/\/www.whizlabs.com\/blog\/author\/abilesh\/"},"uagb_comment_info":3,"uagb_excerpt":"Are you looking for free questions and answers to prepare for the Snowflake Snowpro Advanced Architect exam? Here are our\u00a0newly updated 30+ Free questions on the Snowflake Snowpro Advanced Architect exam which are very similar to the practice test as well as the real exam. 
Why do we provide Snowflake Snowpro Advanced Architect exam questions&hellip;","_links":{"self":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts\/84141","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/users\/359"}],"replies":[{"embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/comments?post=84141"}],"version-history":[{"count":20,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts\/84141\/revisions"}],"predecessor-version":[{"id":89702,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/posts\/84141\/revisions\/89702"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/media\/84509"}],"wp:attachment":[{"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/media?parent=84141"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/categories?post=84141"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.whizlabs.com\/blog\/wp-json\/wp\/v2\/tags?post=84141"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}