Implementing Analytics Solutions Using Microsoft Fabric (Exam DP-600) Practice Test

Page: 1 / 11
Total Questions: 55
  • You have a Fabric tenant that contains a lakehouse. You plan to query sales data files by using the SQL endpoint. The files will be in an Amazon Simple Storage Service (Amazon S3) storage bucket. You need to recommend which file format to use and where to create a shortcut. Which two actions should you include in the recommendation? Each correct answer presents part of the solution. NOTE: Each correct answer is worth one point.

    Answer: A, B
  • You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a Delta table named Customer. When you query Customer, you discover that the query is slow to execute. You suspect that maintenance was NOT performed on the table. You need to identify whether maintenance tasks were performed on Customer. Solution: You run the following Spark SQL statement: DESCRIBE HISTORY customer. Does this meet the goal?

    Answer: A
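As background for the question above, a minimal sketch of how DESCRIBE HISTORY surfaces maintenance activity on a Delta table (the table name `customer` is taken from the question):

```sql
-- Sketch (Fabric Spark SQL): inspect the Delta transaction log for the table.
DESCRIBE HISTORY customer;
-- The result includes an `operation` column for each committed transaction;
-- values such as OPTIMIZE or VACUUM indicate that maintenance tasks were run,
-- which is why this statement can answer the question posed.
```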
  • You have a Fabric tenant that contains 30 CSV files in OneLake. The files are updated daily. You create a Microsoft Power BI semantic model named Model1 that uses the CSV files as a data source. You configure incremental refresh for Model1 and publish the model to a Premium capacity in the Fabric tenant. When you initiate a refresh of Model1, the refresh fails after running out of resources. What is a possible cause of the failure?

    Answer: E
  • You have source data in a folder on a local computer. You need to create a solution that will use Fabric to populate a data store. The solution must meet the following requirements: support the use of dataflows to load and append data to the data store, and ensure that Delta tables are V-Order optimized and compacted automatically. Which type of data store should you use?

    Answer: A
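For context on the V-Order requirement above, a hedged sketch of how V-Order optimization and compaction can be applied to a Delta table in Fabric Spark (the table name `sales` is illustrative, not from the question):

```sql
-- Sketch (Fabric Spark SQL): compact a Delta table's small files and
-- rewrite them with V-Order applied. `sales` is a hypothetical table name.
OPTIMIZE sales VORDER;
-- Fabric lakehouses can also perform this kind of write optimization
-- automatically for managed Delta tables, which is what the question's
-- "compacted automatically" requirement refers to.
```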
  • You have a Fabric workspace that contains a DirectQuery semantic model. The model queries a data source that has 500 million rows. You have a Microsoft Power BI report named Report1 that uses the model. Report1 contains visuals on multiple pages. You need to reduce the query execution time for the visuals on all the pages. What are two features that you can use? Each correct answer presents a complete solution. NOTE: Each correct answer is worth one point.

    Answer: A, B
  • What should you recommend using to ingest the customer data into the data store in the AnalyticsPOC workspace?

    Answer: D
  • You have a Fabric tenant that uses a Microsoft Power BI Premium capacity. You need to enable scale-out for a semantic model. What should you do first?

    Answer: C
  • You have a Fabric tenant that contains a complex semantic model. The model is based on a star schema and contains many tables, including a fact table named Sales. You need to create a diagram of the model. The diagram must contain only the Sales table and related tables. What should you use from Microsoft Power BI Desktop?

    Answer: C
  • You are the administrator of a Fabric workspace that contains a lakehouse named Lakehouse1. Lakehouse1 contains the following tables: Table1, a Delta table created by using a shortcut; Table2, an external table created by using Spark; and Table3, a managed table. You plan to connect to Lakehouse1 by using its SQL endpoint. What will you be able to do after connecting to Lakehouse1?

    Answer: D
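As background for the SQL endpoint question above, a minimal sketch of what a connection to a lakehouse SQL endpoint allows (the table name `Table3` is taken from the question):

```sql
-- Sketch (T-SQL via the lakehouse SQL analytics endpoint): the endpoint
-- provides read-only access to the lakehouse's Delta tables.
SELECT TOP 10 * FROM Table3;
-- Writes (INSERT/UPDATE/DELETE) against lakehouse tables are not supported
-- through this endpoint; data changes must be made through Spark, dataflows,
-- or pipelines instead.
```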
  • You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a table named Table1. You are creating a new data pipeline. You plan to copy external data to Table1. The schema of the external data changes regularly. You need the copy operation to meet the following requirements: replace Table1 with the schema of the external data, and replace all the data in Table1 with the rows in the external data. You add a Copy data activity to the pipeline. What should you do for the Copy data activity?

    Answer: B