
ARA-R01 SnowPro Advanced: Architect Recertification Exam Questions and Answers

Question 4

A Snowflake Architect is designing a multi-account strategy.

In which scenarios will this strategy be MOST cost-effective? (Select TWO).

Options:

A.

The company wants to clone a production database that resides on AWS to a development database that resides on Azure.

B.

The company needs to share data between two databases, where one must support Payment Card Industry Data Security Standard (PCI DSS) compliance but the other one does not.

C.

The company needs to support different role-based access control features for the development, test, and production environments.

D.

The company security policy mandates the use of different Active Directory instances for the development, test, and production environments.

E.

The company must use a specific network policy for certain users to allow and block given IP addresses.

Question 5

A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

1. Deployment of Snowflake accounts on two different cloud providers.

2. Selection of cloud provider regions that are geographically far apart.

3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

4. Implementation of Snowflake client redirect.

What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

Options:

A.

Connect the applications using the <organization_name>-<connection_name> URL. Use the Business Critical Snowflake edition.

B.

Connect the applications using the <organization_name>-<connection_name> URL. Use the Virtual Private Snowflake (VPS) edition.

C.

Connect the applications using the -<accountLocator> URL. Use the Enterprise Snowflake edition.

D.

Connect the applications using the -<accountLocator> URL. Use the Business Critical Snowflake edition.
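For reference, client redirect (step 4 in the plan) is configured through a connection object, and clients connect through a connection URL of the form <organization_name>-<connection_name>.snowflakecomputing.com. A minimal sketch, with hypothetical names (prod_conn, myorg, dr_account):

-- On the primary account; names are illustrative
create connection if not exists prod_conn;
alter connection prod_conn enable failover to accounts myorg.dr_account;
-- Clients keep using myorg-prod_conn.snowflakecomputing.com across a failover.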

Question 6

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

Options:

A.

Database

B.

Schema

C.

Table

D.

Stage

E.

Role

F.

Warehouse

Question 7

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Note: Option E does not use Snowpipe; it relies on the standard COPY command, which is a batch loading method. The COPY command also does not support ingesting files from Amazon S3 Glacier storage [7].

References:
  • 1: SnowPro Advanced: Architect | Study Guide
  • 2: Snowflake Documentation | Snowpipe Overview
  • 3: Snowflake Documentation | Using the Snowpipe REST API
  • 4: Snowflake Documentation | Loading Data Using Snowpipe and AWS Lambda
  • 5: Snowflake Documentation | Supported File Formats and Compression for Staged Data Files
  • 6: Snowflake Documentation | Using Cloud Notifications to Trigger Snowpipe
  • 7: Snowflake Documentation | Loading Data Using COPY into a Table
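For background on the REST-based options above: a pipe intended to be triggered through the Snowpipe insertFiles REST endpoint is simply defined without AUTO_INGEST. A minimal sketch, assuming hypothetical object names:

create or replace pipe ingest_db.raw.events_pipe as
  copy into ingest_db.raw.events
  from @ingest_db.raw.s3_stage
  file_format = (type = 'JSON');
-- A client application or AWS Lambda function then calls the Snowpipe REST API
-- (insertFiles) to tell the pipe which newly arrived files to load.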
Question 8

A Snowflake Architect created a new data share and would like to verify that consumers of the data share can see only the intended records in the secure views.

    What is the recommended way to validate data accessibility by the consumers?

    Options:

    A.

    Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44T', type = reader;

    B.

    Create a row access policy as shown below and assign it to the data share.

create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;

    C.

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

alter session set simulated_data_sharing_consumer = 'Consumer Acct1';

    D.

    Alter the share settings as shown below, in order to impersonate a specific consumer account.

alter share sales_share set accounts = 'Consumer1' share_restrictions = true;

Question 9

    Which of the following are characteristics of Snowflake’s parameter hierarchy?

    Options:

    A.

    Session parameters override virtual warehouse parameters.

    B.

    Virtual warehouse parameters override user parameters.

    C.

    Table parameters override virtual warehouse parameters.

    D.

    Schema parameters override account parameters.

Question 10

    The following table exists in the production database:

    A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data, while ensuring the masking is applied to views created using the table and to clones of the table?

    Options:

    A.

Use a masking policy on the username column using an entitlement table with valid dates.

    B.

Use a row level policy on the user_events table using an entitlement table with valid dates.

    C.

    Use a masking policy on the username column with event_timestamp as a conditional column.

    D.

    Use a secure view on the user_events table using a case statement on the username column.
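For reference, a masking policy with a conditional column (option C) receives both the column to mask and the condition column. A minimal sketch against the user_events table from the question, with a hypothetical policy name and masked value:

create or replace masking policy mask_old_usernames
  as (username string, event_timestamp timestamp) returns string ->
  case
    when event_timestamp < dateadd(month, -6, current_timestamp()) then '*** MASKED ***'
    else username
  end;

alter table user_events modify column username
  set masking policy mask_old_usernames using (username, event_timestamp);

Because the policy travels with the column, it also applies to views created on the table and to clones of it.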

Question 11

    A company has several sites in different regions from which the company wants to ingest data.

    Which of the following will enable this type of data ingestion?

    Options:

    A.

    The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

    B.

    The company must replicate data between Snowflake accounts.

    C.

    The company should provision a reader account to each site and ingest the data through the reader accounts.

    D.

    The company should use a storage integration for the external stage.
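For reference, a storage integration (option D) lets stages in one Snowflake account read cloud storage in any region without embedding credentials. A minimal sketch, assuming an S3 bucket and a hypothetical IAM role ARN:

create storage integration s3_int
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::123456789012:role/snowflake_access'  -- hypothetical
  storage_allowed_locations = ('s3://company-site-data/');

create stage site_stage
  url = 's3://company-site-data/'
  storage_integration = s3_int;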

Question 12

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

    Options:

    A.

    All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

    B.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

    C.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

    D.

    All rows loaded using a specific COPY statement will have the same timestamp value.
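Background for this question: a column default such as CURRENT_TIMESTAMP() is evaluated once per DML statement, not once per row. A minimal sketch, with hypothetical names:

create or replace table load_audit (
  id number,
  payload string,
  load_ts timestamp_ltz default current_timestamp()  -- evaluated per statement
);

copy into load_audit (id, payload) from @my_stage;  -- load_ts comes from the default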

Question 13

    Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

    Options:

    A.

    Graph model

    B.

    Dimensional/Kimball

    C.

    Data lake

    D.

Inmon/3NF

    E.

    Bayesian hierarchical model

    F.

    Data vault

Question 14

When loading data from a stage using COPY INTO, what options can be specified for the ON_ERROR clause?

    Options:

    A.

    CONTINUE

    B.

    SKIP_FILE

    C.

    ABORT_STATEMENT

    D.

    FAIL
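For reference, ON_ERROR is given as a copy option on the COPY INTO statement. A minimal sketch with hypothetical names; SKIP_FILE is shown, and the documented alternatives include CONTINUE and ABORT_STATEMENT:

copy into my_table
  from @my_stage/data/
  file_format = (type = 'CSV' skip_header = 1)
  on_error = 'SKIP_FILE';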

Question 15

    An Architect is implementing a CI/CD process. When attempting to clone a table from a production to a development environment, the cloning operation fails.

    What could be causing this to happen?

    Options:

    A.

    The table is transient.

    B.

    The table has a masking policy.

    C.

    The retention time for the table is set to zero.

    D.

    Tables cannot be cloned from a higher environment to a lower environment.

Question 16

A user, analyst_user, has been granted the analyst_role, and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access from the required IP addresses? (Select TWO).

    Options:

    A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

    B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

    C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

    D.

    USE ROLE SECURITYADMIN;

    CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

    E.

    USE ROLE USERADMIN;

    CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY

    ALLOWED_IP_LIST = ('10.1.1.20');

Question 17

    A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.

    After setting up the DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.

    What would cause this to occur? (Choose two.)

    Options:

    A.

The staging schema has not been set up for MANAGED ACCESS.

    B.

    The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.

    C.

    The tables exceed the 1 TB limit for data recovery.

    D.

    The staging tables are of the TRANSIENT type.

    E.

    The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.
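Relevant background: schema- and table-level settings override the database-level default, and transient tables cannot retain Time Travel data beyond 1 day. A minimal sketch, with hypothetical names:

create transient table staging.events_stg (id number, payload string);
-- 1 is the maximum retention a transient table supports:
alter table staging.events_stg set data_retention_time_in_days = 1;
show parameters like 'DATA_RETENTION_TIME_IN_DAYS' in table staging.events_stg;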

Question 18

    A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

    Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

    Options:

    A.

    Use, at minimum, the Business Critical edition of Snowflake.

    B.

    Create Dynamic Data Masking policies and apply them to columns that contain PHI.

    C.

    Use the Internal Tokenization feature to obfuscate sensitive data.

    D.

    Use the External Tokenization feature to obfuscate sensitive data.

    E.

    Rewrite SQL queries to eliminate projections of PHI data based on current_role().

    F.

    Avoid sharing data with partner organizations.

Question 19

    A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

    The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

    What step can be taken to improve the pruning of the reporting tables?

    Options:

    A.

    Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

    B.

    Increase the size of the virtual warehouse to a size 5X-Large.

    C.

    Use an ORDER BY command to load the reporting tables.

    D.

    Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.
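For context on option C: sorting rows during the load co-locates similar values within micro-partitions, which is what enables effective pruning. A minimal sketch, with hypothetical names:

insert overwrite into reporting.sales
select *
from staging.sales
order by sale_date;  -- order by the column(s) most often used in filters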

Question 20

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

    Options:

    A.

    Developers create their own datasets to work against transformed versions of the live data.

    B.

    Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

    C.

    Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

    D.

    Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

    E.

    The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Question 21

A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one, and '0' on the other. What would be the execution behavior of these virtual warehouses?

    Options:

    A.

    Setting a '0' or NULL value means the warehouses will never suspend.

    B.

    Setting a '0' or NULL value means the warehouses will suspend immediately.

    C.

    Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

    D.

    Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.
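For reference, AUTO_SUSPEND takes a number of seconds, 0, or NULL. A minimal sketch with hypothetical warehouse names:

create warehouse wh_a with warehouse_size = 'MEDIUM' auto_suspend = 0;
create warehouse wh_b with warehouse_size = 'MEDIUM' auto_suspend = null;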

Question 22

    What integration object should be used to place restrictions on where data may be exported?

    Options:

    A.

    Stage integration

    B.

    Security integration

    C.

    Storage integration

    D.

    API integration

Question 23

    An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am-11:00am when many users check the sales reports.

The size of the group has increased from 4 to 8 users. Waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE, AUTO_SUSPEND = 60, SIZE = Medium

    What is the MOST cost-effective way to increase the availability of the reports?

    Options:

    A.

    Use materialized views and pre-calculate the data.

    B.

    Increase the warehouse to size Large and set auto_suspend = 600.

    C.

    Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

    D.

    Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.
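The configuration described in option D would look like this; a minimal sketch with a hypothetical warehouse name:

alter warehouse report_wh set
  min_cluster_count = 1
  max_cluster_count = 4
  scaling_policy = 'STANDARD';  -- auto-scale mode: min and max cluster counts differ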

Question 24

    Company A would like to share data in Snowflake with Company B. Company B is not on the same cloud platform as Company A.

    What is required to allow data sharing between these two companies?

    Options:

    A.

    Create a pipeline to write shared data to a cloud storage location in the target cloud provider.

    B.

    Ensure that all views are persisted, as views cannot be shared across cloud platforms.

    C.

Set up data replication to the region and cloud platform where the consumer resides.

    D.

    Company A and Company B must agree to use a single cloud platform: Data sharing is only possible if the companies share the same cloud provider.
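For context on the replication approach: the provider replicates the database to its own account on the consumer's cloud platform and shares from there. A minimal sketch, with hypothetical organization and account names:

-- On Company A's primary account:
alter database shared_db enable replication to accounts myorg.companya_on_other_cloud;

-- On Company A's account in Company B's cloud/region:
create database shared_db as replica of myorg.companya_primary.shared_db;
alter database shared_db refresh;
-- A share for Company B is then created from this account.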

Question 25

    A Data Engineer is designing a near real-time ingestion pipeline for a retail company to ingest event logs into Snowflake to derive insights. A Snowflake Architect is asked to define security best practices to configure access control privileges for the data load for auto-ingest to Snowpipe.

    What are the MINIMUM object privileges required for the Snowpipe user to execute Snowpipe?

    Options:

    A.

    OWNERSHIP on the named pipe, USAGE on the named stage, target database, and schema, and INSERT and SELECT on the target table

    B.

OWNERSHIP on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

    C.

CREATE on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

    D.

    USAGE on the named pipe, named stage, target database, and schema, and INSERT and SELECT on the target table
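For reference, grants of this general shape are involved when provisioning a Snowpipe role; a minimal sketch with hypothetical names (which privileges form the true minimum is what the question tests):

grant usage on database ingest_db to role snowpipe_role;
grant usage on schema ingest_db.raw to role snowpipe_role;
grant usage on stage ingest_db.raw.events_stage to role snowpipe_role;
grant read on stage ingest_db.raw.events_stage to role snowpipe_role;  -- internal stages
grant insert, select on table ingest_db.raw.events to role snowpipe_role;
grant ownership on pipe ingest_db.raw.events_pipe to role snowpipe_role;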

Question 26

    An Architect has a design where files arrive every 10 minutes and are loaded into a primary database table using Snowpipe. A secondary database is refreshed every hour with the latest data from the primary database.

    Based on this scenario, what Time Travel query options are available on the secondary database?

    Options:

    A.

    A query using Time Travel in the secondary database is available for every hourly table version within the retention window.

    B.

    A query using Time Travel in the secondary database is available for every hourly table version within and outside the retention window.

    C.

    Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) in the retention window.

    D.

    Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) and outside the retention window.

Question 27

Consider the following COPY command, which loads CSV-formatted data into a Snowflake table from an internal stage through a data transformation query.

[Image: ARA-R01 Question 27]

    This command results in the following error:

    SQL compilation error: invalid parameter 'validation_mode'

    Assuming the syntax is correct, what is the cause of this error?

    Options:

    A.

    The VALIDATION_MODE parameter supports COPY statements that load data from external stages only.

    B.

    The VALIDATION_MODE parameter does not support COPY statements with CSV file formats.

    C.

    The VALIDATION_MODE parameter does not support COPY statements that transform data during a load.

    D.

    The value return_all_errors of the option VALIDATION_MODE is causing a compilation error.

Question 28

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

    Options:

    A.

    All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

    B.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

    C.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

    D.

    All rows loaded using a specific COPY statement will have the same timestamp value.

Question 29

    How does a standard virtual warehouse policy work in Snowflake?

    Options:

    A.

    It conserves credits by keeping running clusters fully loaded rather than starting additional clusters.

    B.

    It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 6 minutes.

    C.

It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 2 minutes.

    D.

    It prevents or minimizes queuing by starting additional clusters instead of conserving credits.

Question 30

    What are characteristics of Dynamic Data Masking? (Select TWO).

    Options:

    A.

A masking policy that is currently set on a table can be dropped.

    B.

    A single masking policy can be applied to columns in different tables.

    C.

    A masking policy can be applied to the value column of an external table.

    D.

The role that creates the masking policy will always see unmasked data in query results.

    E.

    A masking policy can be applied to a column with the GEOGRAPHY data type.
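To illustrate reuse of a single policy across tables (option B), a minimal sketch with hypothetical names:

create or replace masking policy mask_email as (val string) returns string ->
  case when current_role() in ('PII_ANALYST') then val else '*** MASKED ***' end;

-- One policy object, attached to columns in different tables:
alter table customers modify column email set masking policy mask_email;
alter table prospects modify column contact_email set masking policy mask_email;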

Question 31

    What is a characteristic of event notifications in Snowpipe?

    Options:

    A.

The load history is stored in the metadata of the target table.

    B.

    Notifications identify the cloud storage event and the actual data in the files.

    C.

Snowflake can process all older notifications when a paused pipe is resumed.

    D.

When a pipe is paused, event messages received for the pipe enter a limited retention period.

Question 32

    When activating Tri-Secret Secure in a hierarchical encryption model in a Snowflake account, at what level is the customer-managed key used?

[Image: ARA-R01 Question 32]

    Options:

    A.

    At the root level (HSM)

    B.

    At the account level (AMK)

    C.

    At the table level (TMK)

    D.

    At the micro-partition level

Question 33

A user is executing the following commands sequentially within a timeframe of 10 minutes from start to finish:

[Image: ARA-R01 Question 33]

    What would be the output of this query?

    Options:

    A.

    Table T_SALES_CLONE successfully created.

    B.

    Time Travel data is not available for table T_SALES.

    C.

The offset => is not a valid clause in the clone operation.

    D.

Syntax error line 1 at position 58 unexpected 'at'.
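For reference, a clone that uses Time Travel combines CLONE with an AT or BEFORE clause; a minimal sketch with hypothetical names:

create table t_sales_clone clone t_sales
  at (offset => -600);  -- table state as of 600 seconds ago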

Question 34

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

[Image: ARA-R01 Question 34]

    Sample data for the CREDITCARDINFO table is as follows:

    NAME EXPIRYDATE CREDITCARDNO

    JOHN DOE 2022-07-23 4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

    Options:

    A.

The sysadmin can see the CREDITCARDNO column data in clear text.

    B.

The owner of the table will see the CREDITCARDNO column data in clear text.

    C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

    D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '***MASKED***'.

Question 35

A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

    What will happen to the consumer account if a new table (table_6) is added to the provider schema?

    Options:

    A.

    The consumer role will automatically see the new table and no additional grants are needed.

    B.

    The consumer role will see the table only after this grant is given on the consumer side:

    grant imported privileges on database PSHARE_EDW_4TEST_DB to DEV_ROLE;

    C.

    The consumer role will see the table only after this grant is given on the provider side:

    use role accountadmin;

    Grant select on table EDW.ACCOUNTING.Table_6 to share PSHARE_EDW_4TEST;

    D.

    The consumer role will see the table only after this grant is given on the provider side:

    use role accountadmin;

    grant usage on database EDW to share PSHARE_EDW_4TEST ;

    grant usage on schema EDW.ACCOUNTING to share PSHARE_EDW_4TEST ;

    Grant select on table EDW.ACCOUNTING.Table_6 to database PSHARE_EDW_4TEST_DB ;

Question 36

A group of Data Analysts have been granted the role ANALYST_ROLE. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

    How should these requirements be met?

    Options:

    A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

    B.

    Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

    C.

    Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

    D.

Grant ANALYST_ROLE ownership on the database, but grant the OWNERSHIP ON FUTURE [object type]S IN DATABASE privilege to SYSADMIN.
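For context on option C, a managed access schema moves grant decisions to the schema owner; a minimal sketch with hypothetical names:

use role sysadmin;
create schema analytics.sandbox with managed access;
grant create table, create view on schema analytics.sandbox to role analyst_role;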

Question 37

    A retail company has 2000+ stores spread across the country. Store Managers report that they are having trouble running key reports related to inventory management, sales targets, payroll, and staffing during business hours. The Managers report that performance is poor and time-outs occur frequently.

    Currently all reports share the same Snowflake virtual warehouse.

    How should this situation be addressed? (Select TWO).

    Options:

    A.

    Use a Business Intelligence tool for in-memory computation to improve performance.

    B.

    Configure a dedicated virtual warehouse for the Store Manager team.

    C.

    Configure the virtual warehouse to be multi-clustered.

    D.

Configure the virtual warehouse to size 4X-Large.

    E.

    Advise the Store Manager team to defer report execution to off-business hours.

Question 38

    What is a characteristic of loading data into Snowflake using the Snowflake Connector for Kafka?

    Options:

    A.

    The Connector only works in Snowflake regions that use AWS infrastructure.

    B.

The Connector works with all file formats, including text, JSON, Avro, ORC, Parquet, and XML.

    C.

    The Connector creates and manages its own stage, file format, and pipe objects.

    D.

    Loads using the Connector will have lower latency than Snowpipe and will ingest data in real time.

Question 39

    An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.

    What design should be used?

    Options:

    A.

    Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.

    B.

    Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.

    C.

Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes.

    D.

    Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.

Question 40

    A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

    Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transaction files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

    How can the near real-time results be provided to the category managers? (Select TWO).

    Options:

    A.

    All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

    B.

    A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

    C.

    A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

    D.

    An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

    E.

The COPY INTO command with a task scheduled to run every second should be used to achieve the near real-time requirement.
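The pattern described in options B and C combines an auto-ingest pipe, a stream, and a task; a minimal sketch with hypothetical names and a placeholder transformation:

create pipe raw.pos_pipe auto_ingest = true as
  copy into raw.pos_transactions
  from @raw.pos_stage
  file_format = (type = 'CSV');

create stream raw.pos_stream on table raw.pos_transactions append_only = true;

create task raw.pos_task
  warehouse = etl_wh
  schedule = '1 MINUTE'
  when system$stream_has_data('RAW.POS_STREAM')
as
  insert into analytics.sales_results (store_id, amount)
  select store_id, amount from raw.pos_stream;
-- alter task raw.pos_task resume;  -- tasks start suspended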

Question 41

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data, the other has CSV-formatted data.

    How should the data be joined and aggregated to produce a final result set?

    Options:

    A.

    Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

    B.

    Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

    C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

    D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

Question 42

    What are characteristics of the use of transactions in Snowflake? (Select TWO).

    Options:

    A.

    Explicit transactions can contain DDL, DML, and query statements.

    B.

    The autocommit setting can be changed inside a stored procedure.

    C.

A transaction can be started explicitly by executing a begin work statement and ended explicitly by executing a commit work statement.

    D.

A transaction can be started explicitly by executing a begin transaction statement and ended explicitly by executing an end transaction statement.

    E.

    Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.
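For reference, an explicit transaction looks like this (BEGIN WORK and BEGIN TRANSACTION are synonyms, as are COMMIT and COMMIT WORK); hypothetical table name:

begin transaction;
insert into t1 (id) values (1);
update t1 set id = 2 where id = 1;
commit;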

Question 43

    A user has the appropriate privilege to see unmasked data in a column.

    If the user loads this column data into another column that does not have a masking policy, what will occur?

    Options:

    A.

    Unmasked data will be loaded in the new column.

    B.

    Masked data will be loaded into the new column.

    C.

    Unmasked data will be loaded into the new column but only users with the appropriate privileges will be able to see the unmasked data.

    D.

    Unmasked data will be loaded into the new column and no users will be able to see the unmasked data.

Question 44

A table for IoT devices that measures water usage is created. The table quickly becomes large and contains more than 2 billion rows.

[Image: ARA-R01 Question 44]

    The general query patterns for the table are:

1. DeviceId, IOT_timestamp, and CustomerId are frequently used in the filter predicate for the select statement

2. The columns City and DeviceManufacturer are often retrieved

3. There is often a count on UniqueId

    Which field(s) should be used for the clustering key?

    Options:

    A.

IOT_timestamp

    B.

City and DeviceManufacturer

    C.

DeviceId and CustomerId

    D.

UniqueId
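Whichever columns are chosen, a clustering key is declared as follows; a minimal sketch with a hypothetical table name:

alter table water_usage cluster by (deviceid, customerid);  -- column choice illustrative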

Question 45

    What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

    Options:

    A.

    The MERGE command

    B.

    The UPSERT command

    C.

    The CHANGES clause

    D.

    A STREAM object

    E.

    The CHANGE_DATA_CAPTURE command
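Both features in question read a table's change tracking metadata; a minimal sketch with hypothetical names:

alter table t1 set change_tracking = true;

-- A stream consumes changes transactionally:
create stream t1_stream on table t1;

-- The CHANGES clause reads changes ad hoc over a Time Travel window:
select * from t1 changes (information => default) at (offset => -3600);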

Question 46

An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

    What should be considered when sharing the unstructured data within Snowflake?

    Options:

    A.

    A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

    B.

    A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

    C.

    A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

    D.

    A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.
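For reference, a scoped URL is produced with the BUILD_SCOPED_FILE_URL function; a minimal sketch with a hypothetical stage and path:

select build_scoped_file_url(@unstructured_stage, 'images/device42.jpg');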

Question 47

    A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

    What would be the MOST efficient solution?

    Options:

    A.

    Ask the partner to create a share and add the company's account.

    B.

    Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

    C.

    Keep the current structure but request that the partner stop changing files, instead only appending new files.

    D.

    Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.
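For context on the share-based approach in option A, the partner (provider) side would look roughly like this; a minimal sketch with hypothetical names:

create share partner_share;
grant usage on database extracts to share partner_share;
grant usage on schema extracts.public to share partner_share;
grant select on table extracts.public.daily_data to share partner_share;
alter share partner_share add accounts = myorg.company_account;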

    Exam Code: ARA-R01
    Exam Name: SnowPro Advanced: Architect Recertification Exam
    Last Update: Sep 15, 2025
    Questions: 162

    PDF + Testing Engine

    $134.99

    Testing Engine

    $99.99

    PDF (Q&A)

    $84.99