DP-700 Practice Test Questions

96 Questions


Topic 3: Misc. Questions Set

You have a Fabric workspace that contains a warehouse named Warehouse1.
While monitoring Warehouse1, you discover that query performance has degraded during the last 60 minutes.
You need to isolate all the queries that were run during the last 60 minutes. The results must include the usernames of the users who submitted the queries, as well as the query statements. What should you use?


A. the Microsoft Fabric Capacity Metrics app


B. views from the queryinsights schema


C. Query activity


D. the sys.dm_exec_requests dynamic management view





B.
  views from the queryinsights schema
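The queryinsights views can be queried directly with T-SQL. A minimal sketch, assuming the documented queryinsights.exec_requests_history view with its login_name and command columns:

```sql
-- Completed requests from the last 60 minutes, with the submitting
-- user and the executed statement (exec_requests_history is part of
-- the built-in queryinsights schema in a Fabric warehouse).
SELECT start_time,
       login_name,            -- user who submitted the query
       command,               -- the query statement text
       total_elapsed_time_ms
FROM   queryinsights.exec_requests_history
WHERE  start_time >= DATEADD(MINUTE, -60, GETUTCDATE())
ORDER BY total_elapsed_time_ms DESC;
```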

You have a Fabric warehouse named DW1 that loads data by using a data pipeline named Pipeline1. Pipeline1 uses a Copy data activity with a dynamic SQL source. Pipeline1 is scheduled to run every 15 minutes.
You discover that Pipeline1 keeps failing.
You need to identify which SQL query was executed when the pipeline failed.
What should you do?


A. From Monitoring hub, select the latest failed run of Pipeline1, and then view the output JSON.


B. From Monitoring hub, select the latest failed run of Pipeline1, and then view the input JSON.


C. From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemReadFailed.


D. From Real-time hub, select Fabric events, and then review the details of Microsoft.Fabric.ItemUpdateFailed.





B.
  From Monitoring hub, select the latest failed run of Pipeline1, and then view the input JSON.

Explanation: The input JSON contains the configuration details and parameters passed to the Copy data activity during execution, including the dynamically generated SQL query.
Viewing the input JSON for the failed pipeline run provides direct insight into what query was executed at the time of failure.
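For reference, the input JSON of the failed Copy data activity run contains the resolved source settings, which is where the dynamically generated query appears. A hedged sketch of the relevant fragment (the property names follow the Copy activity source definition; the query text itself is purely illustrative):

```json
{
  "source": {
    "type": "DataWarehouseSource",
    "sqlReaderQuery": "SELECT * FROM dbo.Sales WHERE LoadDate = '2025-01-15'"
  }
}
```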

You have a Fabric workspace that contains a lakehouse named Lakehouse1. Data is ingested into Lakehouse1 as one flat table. The table contains the following columns.



You plan to load the data into a dimensional model and implement a star schema. From the original flat table, you create two tables named FactSales and DimProduct. You will track changes in DimProduct.
You need to prepare the data.
Which three columns should you include in the DimProduct table? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.


A. Date


B. ProductName


C. ProductColor


D. TransactionID


E. SalesAmount


F. ProductID





B.
  ProductName

C.
  ProductColor

F.
  ProductID

You have three users named User1, User2, and User3. You have the Fabric workspaces shown in the following table.



You have a security group named Group1 that contains User1 and User3. The Fabric admin creates the domains shown in the following table.



User1 creates a new workspace named Workspace3.
You add Group1 to the default domain of Domain1.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.






You have a Fabric workspace that contains a warehouse named Warehouse1. Data is loaded daily into Warehouse1 by using data pipelines and stored procedures.
You discover that the daily data load takes longer than expected.
You need to monitor Warehouse1 to identify the names of users that are actively running queries.
Which view should you use?


A. sys.dm_exec_connections


B. sys.dm_exec_requests


C. queryinsights.long_running_queries


D. queryinsights.frequently_run_queries


E. sys.dm_exec_sessions





E.
  sys.dm_exec_sessions

Explanation: sys.dm_exec_sessions provides real-time information about all active sessions, including the user, session ID, and status of the session. You can filter on session status to see users actively running queries.
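A minimal T-SQL sketch of that filter (column names as documented for sys.dm_exec_sessions):

```sql
-- Active sessions and the users that opened them; filtering on
-- status = 'running' shows users actively executing queries.
SELECT session_id,
       login_name,   -- the user
       status,
       login_time
FROM   sys.dm_exec_sessions
WHERE  status = 'running';
```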

You have a Fabric warehouse named DW1. DW1 contains a table that stores sales data and is used by multiple sales representatives.
You plan to implement row-level security (RLS).
You need to ensure that the sales representatives can see only their respective data.
Which warehouse object is required to implement RLS?


A. STORED PROCEDURE


B. CONSTRAINT


C. SCHEMA


D. FUNCTION





D.
  FUNCTION

Explanation: To implement Row-Level Security (RLS) in a Fabric warehouse, you need to use a function that defines the security logic for filtering the rows of data based on the user's identity or role. This function can be used in conjunction with a security policy to control access to specific rows in a table.
In the case of sales representatives, the function would define the filtering criteria (e.g., based on a column such as SalesRepID or SalesRepName), ensuring that each representative can only see their respective data.
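The standard T-SQL pattern pairs an inline table-valued function with a security policy. A minimal sketch, assuming a dbo.Sales table with a SalesRepEmail column (both names are hypothetical):

```sql
-- Inline table-valued function that returns a row only when the
-- current user matches the row's sales representative.
CREATE FUNCTION dbo.fn_SalesRepFilter (@SalesRepEmail AS varchar(256))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @SalesRepEmail = USER_NAME();
GO

-- Security policy that applies the function as a filter predicate
-- on the sales table, so each representative sees only their rows.
CREATE SECURITY POLICY dbo.SalesFilter
ADD FILTER PREDICATE dbo.fn_SalesRepFilter(SalesRepEmail)
ON dbo.Sales
WITH (STATE = ON);
```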

You have a Fabric workspace that contains a lakehouse named Lakehouse1.
In an external data source, you have data files that are 500 GB each. A new file is added every day.
You need to ingest the data into Lakehouse1 without applying any transformations. The solution must meet the following requirements:

• Trigger the process when a new file is added.
• Provide the highest throughput.
Which type of item should you use to ingest the data?


A. Event stream


B. Dataflow Gen2


C. Streaming dataset


D. Data pipeline





D.
  Data pipeline

Explanation: A data pipeline is the best fit for ingesting very large files (500 GB each) without transformation. The Copy data activity in a pipeline is optimized for bulk, high-throughput file movement, and the pipeline can be invoked by a storage event trigger so that it runs whenever a new file is added to the external source.
An eventstream and a streaming dataset are designed for continuous streams of small event records rather than bulk 500 GB files, and Dataflow Gen2 is intended for transformation scenarios and offers lower copy throughput, so none of them meet both requirements as well as a data pipeline does.

You have a Fabric workspace named Workspace1 that contains the following items:

• A Microsoft Power BI report named Report1
• A Power BI dashboard named Dashboard1
• A semantic model named Model1
• A lakehouse named Lakehouse1

Your company requires that specific governance processes be implemented for the items.
Which items can you endorse in Fabric?


A. Lakehouse1, Model1, and Dashboard1 only


B. Lakehouse1, Model1, Report1, and Dashboard1


C. Report1 and Dashboard1 only


D. Model1, Report1, and Dashboard1 only


E. Lakehouse1, Model1, and Report1 only





B.
  Lakehouse1, Model1, Report1, and Dashboard1

You have a Fabric workspace that contains a warehouse named Warehouse1. Warehouse1 contains a table named Customer. Customer contains the following data.



You have an internal Microsoft Entra user named User1 that has an email address of user1@contoso.com.
You need to provide User1 with access to the Customer table. The solution must prevent User1 from accessing the CreditCard column.
How should you complete the statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
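Although the answer area is not reproduced here, column-level security in a Fabric warehouse is expressed with a GRANT that lists only the permitted columns. A hedged sketch of one such statement (the visible column names are hypothetical, since the table definition is not shown; only CreditCard is taken from the question):

```sql
-- Grant SELECT only on the columns User1 may see; CreditCard is
-- omitted from the column list, so User1 cannot read it.
GRANT SELECT ON dbo.Customer
      (CustomerID, Name, Email)   -- hypothetical visible columns
TO [user1@contoso.com];
```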






You have a Fabric workspace that contains an eventhouse and a KQL database named Database1. Database1 has the following:

• A table named Table1
• A table named Table2
• An update policy named Policy1

Policy1 sends data from Table1 to Table2.
The following is a sample of the data in Table2.



Recently, the following actions were performed on Table1:

An additional element named temperature was added to the StreamData column.
The data type of the Timestamp column was changed to date.
The data type of the DeviceId column was changed to string.
You plan to load additional records to Table2.
Which two records will load from Table1 to Table2? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.


A. Option A


B. Option B


C. Option C


D. Option D





B.
  Option B

D.
  Option D

Explanation: Changes to Table1 Structure:

StreamData column: An additional temperature element was added.
Timestamp column: Data type changed from datetime to date.
DeviceId column: Data type changed from guid to string.

Impact of Changes:

Only records that comply with Table2’s structure will load.
Records that deviate from Table2’s column data types or structure will be rejected.

Record B:

Timestamp: Matches Table2 (datetime format).
DeviceId: Matches Table2 (guid format).
StreamData: Contains only the index and eventid, which matches Table2.
Accepted because it fully matches Table2’s structure and data types.

Record D:

Timestamp: Matches Table2 (datetime format).
DeviceId: Matches Table2 (guid format).
StreamData: Matches Table2’s structure.
Accepted because it fully matches Table2’s structure and data types.

You have a Fabric workspace named Workspace1 that contains a warehouse named Warehouse1.
You plan to deploy Warehouse1 to a new workspace named Workspace2.
As part of the deployment process, you need to verify whether Warehouse1 contains invalid references. The solution must minimize development effort.
What should you use?


A. a database project


B. a deployment pipeline


C. a Python script


D. a T-SQL script





B.
  a deployment pipeline

Explanation: A deployment pipeline in Fabric allows you to deploy assets like warehouses, datasets, and reports between different workspaces (such as from Workspace1 to Workspace2). One of the key features of a deployment pipeline is the ability to check for invalid references before deployment. This can help identify issues with assets, such as broken links or dependencies, ensuring the deployment is successful without introducing errors. This is the most efficient way to verify references and manage the deployment with minimal development effort.

You have a Fabric workspace that contains a Real-Time Intelligence solution and an eventhouse.
Users report that from OneLake file explorer, they cannot see the data from the eventhouse.
You enable OneLake availability for the eventhouse.
What will be copied to OneLake?


A. only data added to new databases that are added to the eventhouse


B. only the existing data in the eventhouse


C. no data


D. both new data and existing data in the eventhouse


E. only new data added to the eventhouse





D.
  both new data and existing data in the eventhouse

Explanation: When you enable OneLake availability for an eventhouse, both new and existing data in the eventhouse will be copied to OneLake. This feature ensures that data, whether newly ingested or already present, becomes available for access through OneLake, making it easier for users to interact with and explore the data directly from OneLake file explorer.

