Topic 6: Misc. Questions
You have an on-premises multi-tier application named App1 that includes a web tier, an application tier, and a Microsoft SQL Server tier. All the tiers run on Hyper-V virtual machines. Your new disaster recovery plan requires that all business-critical applications can be recovered to Azure. You need to recommend a solution to fail over the database tier of App1 to Azure. The solution must provide the ability to test failover to Azure without affecting the current environment.
What should you include in the recommendation?
A. Azure Backup
B. Azure Information Protection
C. Windows Server Failover Cluster
D. Azure Site Recovery
You are developing an application that uses Azure Data Lake Storage Gen2.
You need to recommend a solution to grant permissions to a specific application for a limited time period. What should you include in the recommendation?
A. role assignments
B. account keys
C. shared access signatures (SAS)
D. Azure Active Directory (Azure AD) identities
Explanation:
A shared access signature (SAS) provides secure delegated access to resources in your storage account. With a SAS, you have granular control over how a client can access your data, for example which resources the client may access and for how long the SAS is valid.
You have an Azure Synapse Analytics dedicated SQL pool named Pool1 and a database named DB1. DB1 contains a fact table named Table1.
You need to identify the extent of the data skew in Table1.
What should you do in Synapse Studio?
A. Connect to Pool1 and query sys.dm_pdw_nodes_db_partition_stats.
B. Connect to the built-in pool and run DBCC CHECKALLOC.
C. Connect to Pool1 and run DBCC CHECKALLOC.
D. Connect to the built-in pool and query sys.dm_pdw_nodes_db_partition_stats.
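A query against sys.dm_pdw_nodes_db_partition_stats, as named in option A, surfaces skew by comparing per-distribution row counts. The sketch below is a simplified illustration; production skew queries usually join the PDW mapping views to resolve the table name reliably:

```sql
-- Row count per distribution for the fact table. Uneven counts across
-- the 60 distributions indicate data skew.
SELECT
    ps.distribution_id,
    SUM(ps.row_count) AS row_count
FROM sys.dm_pdw_nodes_db_partition_stats AS ps
WHERE ps.object_id = OBJECT_ID('dbo.Table1')  -- simplified name resolution
GROUP BY ps.distribution_id
ORDER BY row_count DESC;
```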
You have an Azure SQL Database managed instance.
The instance starts experiencing performance issues.
You need to identify which query is causing the issue and retrieve the execution plan for the query. The solution must minimize administrative effort.
What should you use?
A. the Azure portal
B. Extended Events
C. Query Store
D. dynamic management views
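If Query Store is used for this, a query like the following returns the most expensive queries together with their stored execution plans. The view names come from the sys.query_store_* catalog; the TOP value and the choice of CPU time as the ranking metric are illustrative:

```sql
-- Top queries by average CPU time, with their stored execution plans.
SELECT TOP (5)
    qt.query_sql_text,
    CAST(p.query_plan AS xml) AS query_plan,
    rs.avg_cpu_time
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q
    ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p
    ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON p.plan_id = rs.plan_id
ORDER BY rs.avg_cpu_time DESC;
```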
Note: This question is part of a series of questions that present the same scenario.
Each question in the series contains a unique solution that might meet the stated
goals. Some question sets might have more than one correct solution, while others
might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a
result, these questions will not appear in the review screen.
You have an Azure Data Lake Storage account that contains a staging zone.
You need to design a daily process to ingest incremental data from the staging zone,
transform the data by executing an R script, and then insert the transformed data into a
data warehouse in Azure Synapse Analytics.
Solution: You use an Azure Data Factory schedule trigger to execute a pipeline that
executes mapping data flow, and then inserts the data into the data warehouse.
Does this meet the goal?
A. Yes
B. No
You have an Azure Synapse Analytics workspace named WS1 that contains an Apache Spark pool named Pool1.
You plan to create a database named DB1 in Pool1.
You need to ensure that when tables are created in DB1, the tables are available
automatically as external tables to the built-in serverless SQL pool.
Which format should you use for the tables in DB1?
A. JSON
B. CSV
C. Parquet
D. ORC
Explanation:
Serverless SQL pool can automatically synchronize metadata from Apache Spark. A
serverless SQL pool database will be created for each database existing in serverless
Apache Spark pools.
For each Spark external table based on Parquet and located in Azure Storage, an external
table is created in a serverless SQL pool database. As such, you can shut down your
Spark pools and still query Spark external tables from serverless SQL pool.
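As a sketch of the scenario, a Parquet-backed table created from the Spark pool is picked up by metadata sync automatically. The table name, storage account, and path below are illustrative:

```sql
-- Run in Spark pool Pool1. Because the table is Parquet and stored in
-- Azure Storage, metadata sync exposes it automatically as an external
-- table in the serverless SQL pool database DB1.
CREATE TABLE DB1.FactSales
USING PARQUET
LOCATION 'abfss://data@contosostorage.dfs.core.windows.net/factsales';
```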
You have an Azure SQL managed instance named SQL1 and two Azure web apps named
App1 and App2.
You need to limit the number of IOPs that App2 queries generate on SQL1.
Which two actions should you perform on SQL1? Each correct answer presents part of the
solution.
NOTE: Each correct selection is worth one point.
A. Enable query optimizer fixes.
B. Enable Resource Governor.
C. Enable parameter sniffing.
D. Create a workload group.
E. Configure In-memory OLTP.
F. Run the Database Engine Tuning Advisor.
G. Reduce the Max Degree of Parallelism value.
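The two actions, enabling Resource Governor and creating a workload group, fit together roughly as follows. This is a hedged sketch: the pool, group, function, and login names are invented, the IOPS cap value is illustrative, and the classifier function must be created in master:

```sql
-- Resource pool that caps I/O for App2's sessions.
CREATE RESOURCE POOL PoolApp2
    WITH (MAX_IOPS_PER_VOLUME = 500);

-- Workload group bound to the capped pool.
CREATE WORKLOAD GROUP GroupApp2
    USING PoolApp2;
GO

-- Classifier function routes sessions from App2's login (assumed name)
-- into the workload group.
CREATE FUNCTION dbo.fnClassifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    IF SUSER_SNAME() = N'App2Login'
        RETURN N'GroupApp2';
    RETURN N'default';
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fnClassifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```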
Your company uses Azure Stream Analytics to monitor devices.
The company plans to double the number of devices that are monitored.
You need to monitor a Stream Analytics job to ensure that there are enough processing resources to handle the additional load.
Which metric should you monitor?
A. Input Deserialization Errors
B. Late Input Events
C. Early Input Events
D. Watermark delay
Explanation:
The Watermark delay metric is computed as the wall clock time of the processing node
minus the largest watermark it has seen so far.
The watermark delay metric can rise due to:
1. Not enough processing resources in Stream Analytics to handle the volume of input
events.
2. Not enough throughput within the input event brokers, so they are throttled.
You have an Azure Databricks workspace named workspace1 in the Standard pricing tier.
Workspace1 contains an all-purpose cluster named cluster1.
You need to reduce the time it takes for cluster1 to start and scale up. The solution must minimize costs. What should you do first?
A. Upgrade workspace1 to the Premium pricing tier.
B. Configure a global init script for workspace1.
C. Create a pool in workspace1.
D. Create a cluster policy in workspace1.
You are designing an anomaly detection solution for streaming data from an Azure IoT hub. The solution must meet the following requirements:
Send the output to Azure Synapse Analytics.
Identify spikes and dips in time series data.
Minimize development and configuration effort.
What should you include in the solution?
A. Azure SQL Database
B. Azure Databricks
C. Azure Stream Analytics
You deploy a database to an Azure SQL Database managed instance. You need to prevent read queries from blocking queries that are trying to write to the database. Which database option should you set?
A. PARAMETERIZATION to FORCED
B. PARAMETERIZATION to SIMPLE
C. Delayed Durability to Forced
D. READ_COMMITTED_SNAPSHOT to ON
Explanation:
In SQL Server, you can also minimize locking contention while protecting transactions from
dirty reads of uncommitted data modifications using either:
The READ COMMITTED isolation level with the
READ_COMMITTED_SNAPSHOT database option set to ON.
The SNAPSHOT isolation level.
If READ_COMMITTED_SNAPSHOT is set to ON (the default in Azure SQL Database),
the Database Engine uses row versioning to present each statement with a transactionally
consistent snapshot of the data as it existed at the start of the statement. Locks are not
used to protect the data from updates by other transactions.
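Setting the option is a single statement; DB1 is an assumed database name:

```sql
-- Enable row versioning for READ COMMITTED so readers no longer block
-- writers (and vice versa). The statement may need exclusive access to
-- the database before it can complete.
ALTER DATABASE DB1
SET READ_COMMITTED_SNAPSHOT ON;
```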
You are designing an enterprise data warehouse in Azure Synapse Analytics that will
contain a table named Customers. Customers will contain credit card information.
You need to recommend a solution to provide salespeople with the ability to view all the
entries in Customers.
The solution must prevent all the salespeople from viewing or inferring the credit card information.
What should you include in the recommendation?
A. row-level security
B. data masking
C. Always Encrypted
D. column-level security
Explanation:
Azure SQL Database, Azure SQL Managed Instance, and Azure Synapse Analytics
support dynamic data masking. Dynamic data masking limits sensitive data exposure by
masking it to non-privileged users.
The Credit card masking method exposes the last four digits of the designated fields and
adds a constant string as a prefix in the form of a credit card.
Example:
XXXX-XXXX-XXXX-1234
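The mask described above can be applied with the documented partial() masking function; the table and column names here are assumptions:

```sql
-- Expose only the last four digits, prefixed with a constant string.
ALTER TABLE dbo.Customers
ALTER COLUMN CreditCardNumber ADD MASKED
    WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)');
```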