Topic 2: Exam pool B
A user has unloaded data from a Snowflake table to an external stage. Which command can be used to verify whether the data has arrived in the external stage named my_stage?
A.
view @my_stage
B.
list @my_stage
C.
show @my_stage
D.
display @my_stage
list @my_stage
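For reference, a minimal sketch of the LIST command (the stage name comes from the question; the path and pattern are illustrative):
-- List every file currently in the external stage
LIST @my_stage;
-- Optionally restrict the listing to a path prefix and a filename pattern
LIST @my_stage/unload/ PATTERN = '.*[.]csv[.]gz';
LIST returns one row per staged file, including its name, size, MD5 hash, and last-modified timestamp, which is enough to confirm the unload succeeded.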
Which Snowflake architectural layer is responsible for a query execution plan?
A.
Compute
B.
Data storage
C.
Cloud services
D.
Cloud provider
Cloud services
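Because the cloud services layer compiles the execution plan, the plan can be inspected with EXPLAIN without a running warehouse. A minimal sketch, assuming a hypothetical customer table:
-- Show the compiled query plan (metadata-only; no warehouse is used)
EXPLAIN SELECT c_name FROM customer WHERE c_acctbal > 1000;
-- JSON output is also available
EXPLAIN USING JSON SELECT COUNT(*) FROM customer;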
What is the MINIMUM edition of Snowflake that is required to use a SCIM security integration?
A.
Business Critical Edition
B.
Standard Edition
C.
Virtual Private Snowflake (VPS)
D.
Enterprise Edition
Standard Edition
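A minimal sketch of creating a SCIM security integration for Azure AD user provisioning (the integration and role names are illustrative):
-- Role that the identity provider will act as when provisioning users and roles
CREATE ROLE IF NOT EXISTS aad_provisioner;
GRANT CREATE USER ON ACCOUNT TO ROLE aad_provisioner;
GRANT CREATE ROLE ON ACCOUNT TO ROLE aad_provisioner;
-- The SCIM integration itself
CREATE SECURITY INTEGRATION aad_scim
  TYPE = SCIM
  SCIM_CLIENT = 'AZURE'
  RUN_AS_ROLE = 'AAD_PROVISIONER';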
What actions will prevent leveraging of the ResultSet cache? (Choose two.)
A.
Removing a column from the query SELECT list
B.
Stopping the virtual warehouse that the query is running against
C.
Clustering of the data used by the query
D.
Executing the RESULT_SCAN() table function
E.
Changing a column that is not in the cached query
Removing a column from the query SELECT list
Clustering of the data used by the query
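For contrast, RESULT_SCAN consumes the cached result of an earlier query rather than invalidating it. A minimal sketch:
SELECT CURRENT_DATE;
-- Re-process the previous statement's persisted result as if it were a table
SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));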
Network policies can be set at which Snowflake levels? (Choose two.)
A.
Role
B.
Schema
C.
User
D.
Database
E.
Account
F.
Tables
User
Account
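A minimal sketch of the two supported levels (the policy name, CIDR range, and user name are illustrative):
CREATE NETWORK POLICY corp_policy ALLOWED_IP_LIST = ('192.168.1.0/24');
-- Account level: applies to everyone who connects
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
-- User level: overrides the account-level policy for that user
ALTER USER jsmith SET NETWORK_POLICY = corp_policy;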
What is the minimum Snowflake edition that has column-level security enabled?
A.
Standard
B.
Enterprise
C.
Business Critical
D.
Virtual Private Snowflake
Enterprise
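Column-level security is implemented with masking policies. A minimal sketch, assuming a hypothetical employees table with an email column:
-- Only the HR_ADMIN role sees the real value; everyone else sees a masked string
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('HR_ADMIN') THEN val ELSE '*** MASKED ***' END;
ALTER TABLE employees MODIFY COLUMN email SET MASKING POLICY email_mask;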
What are the responsibilities of Snowflake's Cloud Service layer? (Choose three.)
A.
Authentication
B.
Resource management
C.
Virtual warehouse caching
D.
Query parsing and optimization
E.
Query execution
F.
Physical storage of micro-partitions
Authentication
Query parsing and optimization
Resource management
What features that are part of the Continuous Data Protection (CDP) feature set in Snowflake do not require additional configuration? (Choose two.)
A.
Row level access policies
B.
Data masking policies
C.
Data encryption
D.
Time Travel
E.
External tokenization
Data encryption
Time Travel
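Time Travel works immediately with the default 1-day retention. A minimal sketch against a hypothetical table t:
-- Query the table as it existed five minutes ago
SELECT * FROM t AT (OFFSET => -60*5);
-- Query the table as of a specific point in time
SELECT * FROM t AT (TIMESTAMP => '2024-06-01 12:00:00'::TIMESTAMP_LTZ);
-- Restore a dropped table within the retention window
UNDROP TABLE t;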
A table needs to be loaded. The input data is in JSON format and is a concatenation of multiple JSON documents. The file size is 3 GB, and a small warehouse is being used.
The following COPY INTO command was executed:
COPY INTO SAMPLE FROM @~/SAMPLE.JSON FILE_FORMAT = (TYPE = JSON)
The load failed with this error:
Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.
How can this issue be resolved?
A.
Compress the file and load the compressed file.
B.
Split the file into multiple files in the recommended size range (100 MB - 250 MB).
C.
Use a larger-sized warehouse.
D.
Set STRIP_OUTER_ARRAY=TRUE in the COPY INTO command.
Set STRIP_OUTER_ARRAY=TRUE in the COPY INTO command.
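A minimal sketch of the corrected load, assuming the documents are wrapped in a single outer JSON array; STRIP_OUTER_ARRAY makes the parser load each array element as its own row, so no single parsed value hits the 16 MB limit:
COPY INTO SAMPLE
  FROM @~/SAMPLE.JSON
  FILE_FORMAT = (TYPE = JSON STRIP_OUTER_ARRAY = TRUE);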
What is the purpose of multi-cluster virtual warehouses?
A.
To create separate data warehouses to increase query optimization
B.
To allow users the ability to choose the type of compute nodes that make up a virtual warehouse cluster
C.
To eliminate or reduce queuing of concurrent queries
D.
To allow the warehouse to resize automatically
To eliminate or reduce queuing of concurrent queries
Explanation: Multi-cluster warehouses enable you to scale compute resources to handle changing user and query concurrency needs, such as during peak and off hours. See https://docs.snowflake.com/en/user-guide/warehouses-multicluster.html
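A minimal sketch of a multi-cluster warehouse in Auto-scale mode (the warehouse name and cluster counts are illustrative; the feature requires Enterprise Edition or higher):
CREATE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD';  -- add clusters as queries queue, retire them as load drops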