
DP-700 Exam Dumps - Microsoft Certified: Fabric Data Engineer Associate Questions and Answers

Question # 14

You have a Fabric workspace named Workspace1 that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables:

Orders

Customer

Employee

The Employee table contains Personally Identifiable Information (PII).

A data engineer is building a workflow that requires writing data to the Customer table; however, the engineer does NOT have the elevated permissions required to view the contents of the Employee table.

You need to ensure that the data engineer can write data to the Customer table without reading data from the Employee table.

Which three actions should you perform? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A. Share Lakehouse1 with the data engineer.
B. Assign the data engineer the Contributor role for Workspace2.
C. Assign the data engineer the Viewer role for Workspace2.
D. Assign the data engineer the Contributor role for Workspace1.
E. Migrate the Employee table from Lakehouse1 to Lakehouse2.
F. Create a new workspace named Workspace2 that contains a new lakehouse named Lakehouse2.
G. Assign the data engineer the Viewer role for Workspace1.
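For context on options E and F: moving a table between lakehouses can be done from a Fabric notebook. The sketch below is a minimal, illustrative example only; the OneLake paths are hypothetical, and `spark` is the session a Fabric notebook preconfigures.

```python
# Minimal sketch: move the PII table to a lakehouse in a separate
# workspace. The OneLake ABFS paths are hypothetical examples;
# `spark` is the session preconfigured in a Fabric notebook.
src = ("abfss://Workspace1@onelake.dfs.fabric.microsoft.com/"
       "Lakehouse1.Lakehouse/Tables/Employee")
dst = ("abfss://Workspace2@onelake.dfs.fabric.microsoft.com/"
       "Lakehouse2.Lakehouse/Tables/Employee")

# Copy the Delta table to the destination lakehouse...
spark.read.format("delta").load(src) \
     .write.format("delta").mode("overwrite").save(dst)

# ...then drop it from the source lakehouse, so access to
# Workspace1 no longer exposes the PII.
spark.sql("DROP TABLE IF EXISTS Lakehouse1.Employee")
```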

Question # 15

You need to ensure that processes for the bronze and silver layers run in isolation. How should you configure the Apache Spark settings?

Options:

A. Modify the number of executors.
B. Disable high concurrency.
C. Create a custom pool.
D. Set the default environment.

Question # 16

You need to recommend a method to populate the lakehouse medallion layers with the POS1 data.

What should you recommend for each layer? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question # 17

You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

Options:

A. Schedule a data pipeline that calls other data pipelines.
B. Schedule a notebook.
C. Schedule an Apache Spark job.
D. Schedule multiple data pipelines.

Question # 18

You need to recommend a solution for handling old files. The solution must meet the technical requirements. What should you include in the recommendation?

Options:

A. a data pipeline that includes a Copy data activity
B. a notebook that runs the VACUUM command
C. a notebook that runs the OPTIMIZE command
D. a data pipeline that includes a Delete data activity
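As background on options B and C: on a Delta table, VACUUM deletes data files that are no longer referenced by the table log and are older than the retention threshold, whereas OPTIMIZE only compacts small files and removes nothing. A minimal notebook sketch, assuming a hypothetical table named `bronze.sales`:

```python
# Minimal sketch, assuming a hypothetical Delta table named
# `bronze.sales`; `spark` is the session preconfigured in a
# Fabric notebook.

# VACUUM physically deletes files that are no longer referenced
# and are older than the retention window (168 hours = 7 days,
# the Delta Lake default).
spark.sql("VACUUM bronze.sales RETAIN 168 HOURS")

# OPTIMIZE, by contrast, compacts small files into larger ones
# for read performance; it does not delete old files.
spark.sql("OPTIMIZE bronze.sales")
```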

Question # 19

You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must minimize development effort. What should you recommend?

Options:

A. Add a ForEach activity to the data pipeline.
B. Configure retries for the Copy data activity.
C. Configure Fault tolerance for the Copy data activity.
D. Call a notebook from the data pipeline.

Question # 20

You need to ensure that the data engineers are notified if any step in populating the lakehouses fails. The solution must meet the technical requirements and minimize development effort.

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question # 21

You need to populate the bronze layer with the MAR1 data.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A. ForEach
B. Copy data
C. WebHook
D. Stored procedure

Question # 22

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:
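The answer-area code is not reproduced in this dump. As background, a common Spark SQL pattern for building a product dimension is a CREATE TABLE ... AS SELECT over the cleansed layer. The sketch below is illustrative only; all table and column names are hypothetical, and `spark` is the session a Fabric notebook preconfigures.

```python
# Illustrative sketch only: all table and column names are
# hypothetical; `spark` is the session preconfigured in a
# Fabric notebook.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.dim_product AS
    SELECT DISTINCT
        ProductID   AS product_key,
        ProductName AS product_name,
        Category    AS category
    FROM silver.products
""")
```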

Question # 23

You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

Options:

A. Add the DataAnalyst group to the Viewer role for WorkspaceA.
B. Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.
C. Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.
D. Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Exam Code: DP-700
Exam Name: Implementing Data Engineering Solutions Using Microsoft Fabric
Last Update: Nov 17, 2025
Questions: 109