You have a Fabric tenant that contains a warehouse. The warehouse uses row-level security (RLS). You create a Direct Lake semantic model that uses the Delta tables and RLS of the warehouse. When users interact with a report built from the model, which mode will be used by the DAX queries?
You have a Microsoft Power BI semantic model that contains measures. The measures use multiple CALCULATE functions and a FILTER function.
You are evaluating the performance of the measures.
In which use case will replacing the FILTER function with the KEEPFILTERS function reduce execution time?
You have source data in a CSV file that has the following fields:
• SalesTransactionID
• SaleDate
• CustomerCode
• CustomerName
• CustomerAddress
• ProductCode
• ProductName
• Quantity
• UnitPrice
You plan to implement a star schema for the tables in WH1. The dimension tables in WH1 will implement Type 2 slowly changing dimension (SCD) logic.
You need to design the tables that will be used for sales transaction analysis and load the source data.
Which type of target table should you specify for the CustomerName, CustomerCode, and SaleDate fields? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
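To make the Type 2 SCD behavior referenced above concrete, here is a minimal plain-Python sketch (the `scd2_upsert` helper and the sample customer values are illustrative assumptions, not part of the question): a change to a tracked attribute expires the current dimension row and inserts a new version, so CustomerCode repeats across rows while history is preserved.

```python
from datetime import date

def scd2_upsert(dim_rows, customer_code, customer_name, effective):
    """Apply a Type 2 change to an in-memory DimCustomer table (sketch)."""
    for row in dim_rows:
        if row["CustomerCode"] == customer_code and row["IsCurrent"]:
            if row["CustomerName"] == customer_name:
                return dim_rows              # attribute unchanged: no new version
            row["ValidTo"] = effective       # expire the current version
            row["IsCurrent"] = False
            break
    dim_rows.append({
        "CustomerCode": customer_code,       # business key, repeats per version
        "CustomerName": customer_name,       # tracked attribute
        "ValidFrom": effective,
        "ValidTo": None,
        "IsCurrent": True,
    })
    return dim_rows

dim = []
scd2_upsert(dim, "C001", "Contoso", date(2024, 1, 1))
scd2_upsert(dim, "C001", "Contoso Ltd", date(2024, 6, 1))  # rename creates version 2
print(len(dim))  # → 2
```

The fact table would join to this dimension on a surrogate key per version rather than on CustomerCode, which is what makes point-in-time sales analysis possible.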
You have a Fabric tenant that contains two workspaces named Workspace1 and Workspace2.
Workspace1 is used as the development environment.
Workspace2 is used as the production environment.
Each environment uses a different storage account.
Workspace1 contains a Dataflow Gen2 named Dataflow1. The data source of Dataflow1 is a CSV file in blob storage.
You plan to implement a deployment pipeline to deploy items from Workspace1 to Workspace2.
You need to ensure that the data source references the correct location in the production environment.
What should you do?
You have a Microsoft Power BI project that contains a file named definition.pbir. definition.pbir contains the following JSON.

For each of the following statements, select Yes if the statement is true. Otherwise, select No. NOTE: Each correct selection is worth one point.

You have a Fabric tenant that contains a new semantic model in OneLake.
You use a Fabric notebook to read the data into a Spark DataFrame.
You need to evaluate the data to calculate the min, max, mean, and standard deviation values for all the string and numeric columns.
Solution: You use the following PySpark expression:
df.explain()
Does this meet the goal?
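For context on what the evaluation should actually produce: in PySpark, `df.summary("min", "max", "mean", "stddev")` returns these statistics for string and numeric columns, whereas `df.explain()` only prints the query plan and computes no statistics. Below is a plain-Python sketch of the same four statistics over one sample column (the values are illustrative assumptions, not data from the question):

```python
import statistics

# Sample numeric column, e.g. a Quantity field (illustrative values only).
quantity = [3, 7, 2, 10, 8]

summary = {
    "min": min(quantity),
    "max": max(quantity),
    "mean": statistics.mean(quantity),
    # Sample standard deviation, matching Spark's stddev (stddev_samp).
    "stddev": statistics.stdev(quantity),
}
print(summary["min"], summary["max"], summary["mean"])  # → 2 10 6
```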