You have a Fabric workspace that contains a lakehouse named Lakehouse1. In an external data source, you have data files that are 500 GB each. A new file is added every day. You need to ingest the data into Lakehouse1 without applying any transformations. The solution must meet the following requirements:
- Trigger the process when a new file is added.
- Provide the highest throughput.
Which type of item should you use to ingest the data?
You have a Fabric workspace that contains an eventhouse and a KQL database named Database1. Database1 has the following:
- A table named Table1
- A table named Table2
- An update policy named Policy1
Policy1 sends data from Table1 to Table2. The following is a sample of the data in Table2. Recently, the following actions were performed on Table1:
- An additional element named temperature was added to the StreamData column.
- The data type of the Timestamp column was changed to date.
- The data type of the DeviceId column was changed to string.
You plan to load additional records to Table2. Which two records will load from Table1 to Table2? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.
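For context on how Policy1 behaves, a KQL update policy like the one described can be sketched as follows. The function body, column names, and policy settings here are assumptions for illustration; the question does not show the actual definitions:

```kql
// Hypothetical transformation function that Policy1 might reference.
// It extracts fields from Table1's dynamic StreamData column into
// Table2's schema; names and types are assumed for this sketch.
.create-or-alter function ExpandStream() {
    Table1
    | extend DeviceId = tolong(StreamData.DeviceId),
             Timestamp = todatetime(StreamData.Timestamp)
    | project DeviceId, Timestamp
}

// Attach the update policy: whenever records are ingested into Table1,
// ExpandStream() runs and its output is appended to Table2.
.alter table Table2 policy update
@'[{"IsEnabled": true, "Source": "Table1", "Query": "ExpandStream()", "IsTransactional": false}]'
```

The relevance to the question is that values which can no longer be converted to the types Table2 expects (for example, after the Timestamp and DeviceId type changes) produce nulls or fail the policy query, which determines which records still load.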
HOTSPOT - You have a Fabric workspace. You are debugging a statement and discover the following issues:
- Sometimes, the statement fails to return all the expected rows.
- The PurchaseDate output column is NOT in the expected format of mmm dd, yy.
You need to resolve the issues. The solution must ensure that the data types of the results are retained. The results can contain blank cells. How should you complete the statement? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
You are developing a data pipeline named Pipeline1. You need to add a Copy data activity that will copy data from a Snowflake data source to a Fabric warehouse. What should you configure?
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution. After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.

You have a KQL database that contains two tables named Stream and Reference. Stream contains streaming data in the following format. Reference contains reference data in the following format. Both tables contain millions of rows.

You have the following KQL queryset. You need to reduce how long it takes to run the KQL queryset.

Solution: You change the join type to kind=outer.

Does this meet the goal?
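For context on the optimization being evaluated: KQL join performance depends on the join strategy and the order of the operands rather than on the join kind, so switching to kind=outer changes which rows are returned, not how much work the join does. A minimal sketch of the usual tuning approach, assuming a join column named DeviceId since the actual queryset is not shown:

```kql
// Assumed queryset shape; the table and column names are illustrative.
// With millions of rows on both sides, shuffling the join on its key
// spreads the work across nodes, which is a common way to cut runtime.
Stream
| join kind=inner hint.shufflekey = DeviceId (Reference) on DeviceId
```

When one side is small enough to fit in memory, putting it on the left and using hint.strategy=broadcast is the other common option; neither technique changes the logical result the way altering kind= does.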