Snowflake SnowPro Advanced Architect Practice Test

Last exam update: Oct 11, 2024
Questions 1-10 of 61

Question 1

An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect's highest priority is to configure the connector to stream data in the MOST cost-effective manner.
Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?

  • A. Utilize a higher buffer.flush.time in the connector configuration.
  • B. Utilize a higher buffer.size.bytes in the connector configuration.
  • C. Utilize a lower buffer.size.bytes in the connector configuration.
  • D. Utilize a lower buffer.count.records in the connector configuration.
Answer: A
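Note: Snowflake's cost guidance for the connector is that fewer, larger files pushed through Snowpipe cost less, so raising buffer.flush.time (letting records accumulate longer between flushes) is the standard lever. A sketch of the relevant Kafka Connect properties, with illustrative values that are assumptions rather than tuned recommendations:

```properties
# Seconds between buffer flushes; a higher value means fewer,
# larger files for Snowpipe and generally lower ingestion cost.
buffer.flush.time=300
# Caps on buffered records and bytes per partition before a flush:
buffer.count.records=10000
buffer.size.bytes=20000000
```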


Question 2

A company has a Snowflake account named ACCOUNTA in the AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company's business partners has an account named PARTNERB in the Azure East US 2 region. For marketing purposes, the company has agreed to share the database MARKET_DB with the partner account.
Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

  • A. Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.
  • B. From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.
  • C. Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.
  • D. Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partners account PARTNERB.
Answer: C
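Note: data cannot be shared directly across cloud platforms or regions; the provider must first replicate the database to one of its own accounts in the consumer's region and cloud, then share from there. A sketch of the steps (MYORG is an assumed organization name, the share name is illustrative, and the per-schema and per-table grants are omitted for brevity):

```sql
-- On ACCOUNTA (AWS us-east-1): allow MARKET_DB to replicate to the
-- provider's new account in the partner's region and cloud.
ALTER DATABASE MARKET_DB ENABLE REPLICATION TO ACCOUNTS MYORG.AZABC123;

-- On AZABC123 (Azure East US 2): create and refresh the secondary database.
CREATE DATABASE MARKET_DB AS REPLICA OF MYORG.ACCOUNTA.MARKET_DB;
ALTER DATABASE MARKET_DB REFRESH;

-- Still on AZABC123: share the replicated database with the partner.
CREATE SHARE MARKET_SHARE;
GRANT USAGE ON DATABASE MARKET_DB TO SHARE MARKET_SHARE;
-- (grants on schemas and tables omitted)
ALTER SHARE MARKET_SHARE ADD ACCOUNTS = PARTNERB;
```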


Question 3

What integration object should be used to place restrictions on where data may be exported?

  • A. Stage integration
  • B. Security integration
  • C. Storage integration
  • D. API integration
Answer: C
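Note: a storage integration carries STORAGE_ALLOWED_LOCATIONS and STORAGE_BLOCKED_LOCATIONS lists, which constrain the cloud locations that any stage built on it can read from or unload to. A minimal sketch (the integration name, role ARN, and bucket paths are placeholders):

```sql
CREATE STORAGE INTEGRATION my_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
  -- Stages using this integration may only point at the allowed paths:
  STORAGE_ALLOWED_LOCATIONS = ('s3://approved-bucket/exports/')
  STORAGE_BLOCKED_LOCATIONS = ('s3://approved-bucket/exports/restricted/');
```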


Question 4

A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.
After setting DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.
What would cause this to occur? (Choose two.)

  • A. The staging schema has not been set up for MANAGED ACCESS.
  • B. The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.
  • C. The tables exceed the 1 TB limit for data recovery.
  • D. The staging tables are of the TRANSIENT type.
  • E. The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.
Answer: B, D
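Note: a DATA_RETENTION_TIME_IN_DAYS value set at the schema level overrides the database-level value, and transient tables are capped at 1 day of Time Travel regardless of any setting. A quick way to diagnose the effective retention (database, schema, and table names are assumed for illustration):

```sql
-- Show the effective retention parameter for the staging schema:
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN SCHEMA pipeline_db.staging;

-- A schema-level override like this caps recovery at 1 day for
-- tables that inherit it, regardless of the database setting:
ALTER SCHEMA pipeline_db.staging SET DATA_RETENTION_TIME_IN_DAYS = 1;

-- Transient tables never retain more than 1 day, even when the
-- schema and database retention are set higher:
CREATE TRANSIENT TABLE pipeline_db.staging.events_stg (id INT);
```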


Question 5

A company has an inbound share set up with eight tables and five secure views. The company plans to make the share part of its production data pipelines.
Which actions can the company take with the inbound share? (Choose two.)

  • A. Clone a table from a share.
  • B. Grant modify permissions on the share.
  • C. Create a table from the shared database.
  • D. Create additional views inside the shared database.
  • E. Create a table stream on the shared table.
Answer: C, E
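Note: objects in an inbound share are read-only, so consumers cannot clone them, modify grants on the share, or create objects inside the shared database; they can, however, query shared objects into their own tables and, if the provider has enabled change tracking, create streams on shared tables. A sketch under those assumptions, with all names illustrative:

```sql
-- Persist shared data into a local database with CTAS:
CREATE TABLE local_db.public.customers_copy AS
  SELECT * FROM shared_db.public.customers;

-- Create a stream on a shared table (requires the provider to have
-- enabled change tracking on the table):
CREATE STREAM local_db.public.customers_stream
  ON TABLE shared_db.public.customers;
```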


Question 6

What Snowflake features should be leveraged when modeling using Data Vault?

  • A. Snowflake's support of multi-table inserts into the data model's Data Vault tables
  • B. Data needs to be pre-partitioned to obtain a superior data access performance
  • C. Scaling up the virtual warehouses will support parallel processing of new source loads
  • D. Snowflake's ability to hash keys so that hash key joins can run faster than integer joins
Answer: A
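Note: a multi-table insert lets a single pass over staged source data populate a hub and its satellite (and any links) together, which is the Snowflake feature usually highlighted for Data Vault loading. (Scaling up a warehouse improves single-query performance, not load parallelism.) A minimal sketch, with all table and column names assumed for illustration:

```sql
INSERT ALL
  INTO hub_customer (customer_hk, customer_id, load_dts, record_src)
    VALUES (customer_hk, customer_id, load_dts, record_src)
  INTO sat_customer (customer_hk, name, email, load_dts, record_src)
    VALUES (customer_hk, name, email, load_dts, record_src)
SELECT
  MD5(customer_id)    AS customer_hk,
  customer_id,
  name,
  email,
  CURRENT_TIMESTAMP() AS load_dts,
  'CRM'               AS record_src
FROM stg_customer;
```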


Question 7

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

  • A. Choose columns that are frequently used in join predicates.
  • B. Choose lower cardinality columns to support clustering keys and cost effectiveness.
  • C. Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.
  • D. Choose cluster columns that are most actively used in selective filters.
  • E. Choose cluster columns that are actively used in the GROUP BY clauses.
Answer: A, D
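Note: Snowflake's guidance is to cluster on columns most actively used in selective filters first, then on columns frequently used in join predicates. A sketch of defining and checking a clustering key (table and column names are assumptions):

```sql
-- Cluster on a selective filter column, then a join column:
ALTER TABLE sales_db.public.orders
  CLUSTER BY (order_date, customer_id);

-- Check how well the table is clustered on that key:
SELECT SYSTEM$CLUSTERING_INFORMATION(
  'sales_db.public.orders', '(order_date, customer_id)');
```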


Question 8

What are purposes for creating a storage integration? (Choose three.)

  • A. Control access to Snowflake data using a master encryption key that is maintained in the cloud providers key management service.
  • B. Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.
  • C. Support multiple external stages using one single Snowflake object.
  • D. Avoid supplying credentials when creating a stage or when loading or unloading data.
  • E. Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.
  • F. Manage credentials from multiple cloud providers in one single Snowflake object.
Answer: B, C, D
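Note: once an integration exists, any number of stages can reference it, and none of them embeds credentials. A sketch reusing the hypothetical my_s3_int integration from Question 3 (stage names and paths are placeholders):

```sql
-- Two stages backed by one integration, neither holding credentials:
CREATE STAGE raw_stage
  URL = 's3://approved-bucket/exports/raw/'
  STORAGE_INTEGRATION = my_s3_int;

CREATE STAGE curated_stage
  URL = 's3://approved-bucket/exports/curated/'
  STORAGE_INTEGRATION = my_s3_int;
```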


Question 9

At which object type level can the APPLY MASKING POLICY, APPLY ROW ACCESS POLICY and APPLY SESSION POLICY privileges be granted?

  • A. Global
  • B. Database
  • C. Schema
  • D. Table
Answer: A
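Note: APPLY MASKING POLICY, APPLY ROW ACCESS POLICY, and APPLY SESSION POLICY are global privileges, so they are granted at the account level. A sketch of the grants (the role name is an assumption):

```sql
GRANT APPLY MASKING POLICY    ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY ROW ACCESS POLICY ON ACCOUNT TO ROLE governance_admin;
GRANT APPLY SESSION POLICY    ON ACCOUNT TO ROLE governance_admin;
```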


Question 10

The following DDL command was used to create a task based on a stream:

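(The DDL statement itself is not reproduced in this copy. A representative definition, consistent with the answer given below, might look like the following; the task name, target table, columns, and stream name are assumptions, and the key detail is a 5-minute schedule with no WHEN clause, so every scheduled run resumes the warehouse.)

```sql
CREATE OR REPLACE TASK process_stream_task
  WAREHOUSE = MY_WH
  SCHEDULE = '5 MINUTE'
AS
  INSERT INTO target_table (id, payload)
    SELECT id, payload FROM my_stream;
```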
Assuming MY_WH is configured with AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

  • A. The warehouse MY_WH will be made active every five minutes to check the stream.
  • B. The warehouse MY_WH will only be active when there are results in the stream.
  • C. The warehouse MY_WH will never suspend.
  • D. The warehouse MY_WH will automatically resize to accommodate the size of the stream.
Answer: A
