aiotestking uk

70-776 Exam Questions - Online Test


Free 70-776 Demo Online For Microsoft Certification:

NEW QUESTION 1
You have a Microsoft Azure Data Lake Analytics service and an Azure Data Lake Store.
You need to use Python to submit a U-SQL job. Which Python module should you install?

• A. azure-mgmt-datalake-store
• B. azure-mgmt-datalake-analytics
• C. azure-datalake-store
• D. azure-mgmt-resource

Answer: B

Explanation:
References:
https://docs.microsoft.com/en-us/azure/data-lake-analytics/data-lake-analytics-manage-use-python-sdk

NEW QUESTION 2
HOTSPOT
You have a Microsoft Azure Data Lake Analytics service.
You have a tab-delimited file named UserActivity.tsv that contains logs of user sessions. The file does not have a header row.
You need to create a table and to load the logs to the table. The solution must distribute the data by a column named SessionId.
How should you complete the U-SQL statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
[Exhibit omitted]

Answer:

Explanation:
References:
https://msdn.microsoft.com/en-us/library/mt706197.aspx
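
The completed statement presumably pairs a hash-distributed U-SQL table with a Tsv extractor. A minimal sketch follows; the column names are hypothetical, since the exhibit that defines the real schema is not included.

```sql
// U-SQL. The column list here is illustrative only.
CREATE TABLE dbo.UserActivity
(
    SessionId int,
    UserId    string,
    EventTime DateTime,
    Action    string,
    INDEX idx_UserActivity CLUSTERED (SessionId ASC)
    DISTRIBUTED BY HASH (SessionId)   // distributes the data by SessionId
);

@logs =
    EXTRACT SessionId int,
            UserId    string,
            EventTime DateTime,
            Action    string
    FROM "/logs/UserActivity.tsv"
    USING Extractors.Tsv();           // tab-delimited; no header row expected

INSERT INTO dbo.UserActivity
SELECT * FROM @logs;
```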

NEW QUESTION 3
You plan to deploy a Microsoft Azure Stream Analytics job to filter multiple input streams from IoT devices that have a total data flow of 30 MB/s.
You need to calculate how many streaming units you require for the job. The solution must prevent lag.
What is the minimum number of streaming units required?

• A. 3
• B. 10
• C. 30
• D. 300

Answer: C

Explanation:
As a rule of thumb, a single streaming unit can process roughly 1 MB/s of input, so a 30-MB/s flow needs at least 30 streaming units to keep up without lag.

NEW QUESTION 4
You have an on-premises data warehouse that uses Microsoft SQL Server 2016. All the data in the data warehouse comes from text files stored in Azure Blob storage. The text files are imported into the data warehouse by using SQL Server Integration Services (SSIS). The text files are not transformed.
You need to migrate the data to an Azure SQL data warehouse in the least amount of time possible.
Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

• A. Use SSIS to upload the files in Azure Blob storage to tables in the Azure SQL data warehouse.
• B. Execute the CREATE EXTERNAL TABLE AS SELECT statement to export the data.
• C. Use AzCopy to transfer the data from the on-premises data warehouse to Azure SQL data warehouse.
• D. Execute the CREATE TABLE AS SELECT statement to load the data.
• E. Define external tables in the Azure SQL data warehouse that map to the existing files in Azure Blob storage.

Answer: DE

Explanation:
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-blob-storage-with-polybase
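
Because the files already sit in Blob storage, the fast path is to map them with external tables (E) and then load them with CTAS (D). A sketch, assuming a PolyBase external data source and file format have already been defined; every object and column name below is hypothetical.

```sql
-- Hypothetical external table over the existing text files.
CREATE EXTERNAL TABLE ext.FactSales
(
    SaleId     int,
    CustomerId int,
    SaleAmount money
)
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureBlobStorage,
      FILE_FORMAT = TextFileFormat);

-- CTAS reads the files in parallel through PolyBase, which is why it is
-- faster than pushing rows through SSIS.
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM ext.FactSales;
```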

NEW QUESTION 5
Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
Start of repeated scenario
You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transform, and load (ETL) functions.
For each table in LocalDW, you create a table in AzureDW.

• A. a dataset
• B. a gateway
• C. a pipeline
• D. a linked service

Answer: A

NEW QUESTION 6
You have a Microsoft Azure SQL data warehouse that has a fact table named FactOrder. FactOrder contains three columns named CustomerId, OrderId, and OrderDateKey. FactOrder is hash distributed on CustomerId. OrderId is the unique identifier for FactOrder. FactOrder contains 3 million rows.
Orders are distributed evenly among different customers from a table named dimCustomers that contains 2 million rows.
You often run queries that join FactOrder and dimCustomers by selecting and grouping by the OrderDateKey column.
You add 7 million rows to FactOrder. Most of the new records have a more recent OrderDateKey value than the previous records.
You need to reduce the execution time of queries that group on OrderDateKey and that join dimCustomers and FactOrder.
What should you do?

• A. Change the distribution for the FactOrder table to round robin.
• B. Update the statistics for the OrderDateKey column.
• C. Change the distribution for the FactOrder table to be based on OrderId.
• D. Change the distribution for the dimCustomers table to OrderDateKey.

Answer: B

Explanation:
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-statistics
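
After a large load skewed toward recent dates, the optimizer's statistics on OrderDateKey no longer reflect the data. A minimal sketch (the statistics object name is hypothetical):

```sql
-- Create the statistics object once if it does not already exist,
-- then refresh it after each large load so the optimizer sees the
-- newly added OrderDateKey range.
CREATE STATISTICS stats_OrderDateKey ON dbo.FactOrder (OrderDateKey);

UPDATE STATISTICS dbo.FactOrder (stats_OrderDateKey);
```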

NEW QUESTION 7
DRAG DROP
You are building a data pipeline that uses Microsoft Azure Stream Analytics.
Alerts are generated when the aggregate of data streaming in from devices during a minute-long window matches the values in a rule.
You need to retrieve the following information:
* The event ID
* The device ID
* The application ID that runs the service
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
[Exhibit omitted]

Answer:

Explanation: [Exhibit omitted]

NEW QUESTION 8
You have an extract, transformation, and load (ETL) process for a Microsoft Azure SQL data warehouse.
You run the following statements to create the login and user for an account that will run the nightly data load for the data warehouse.
CREATE LOGIN LoaderLogin WITH PASSWORD = 'mypassword';
CREATE USER LoaderUser FOR LOGIN LoaderLogin;
You connect to the data warehouse.
You need to ensure that the user can access the highest resource class.
Which statement should you execute?

• A. ALTER SERVER ROLE xLargeRC ADD MEMBER LoaderLogin;
• B. EXEC sp_addrolemember 'xlargerc', 'LoaderUser'
• C. ALTER SERVER ROLE LargeRC ADD MEMBER LoaderUser;
• D. EXEC sp_addrolemember 'largerc', 'LoaderLogin'

Answer: B

Explanation:
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-develop-concurrency
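
The correct statement in context:

```sql
-- Resource classes are exposed as database roles, so membership is granted
-- to the database user (LoaderUser), not to the server login.
EXEC sp_addrolemember 'xlargerc', 'LoaderUser';
```

Note that largerc is a valid resource class but not the highest; xlargerc grants the largest memory allocation per query.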

NEW QUESTION 9
You have a Microsoft Azure Data Lake Store and an Azure Active Directory tenant.
You are developing an application that will access the Data Lake Store by using end-user credentials.
You need to ensure that the application uses end-user authentication to access the Data Lake Store.
What should you create?

• A. a Native Active Directory app registration
• B. a policy assignment that uses the Allowed resource types policy definition
• C. a Web app/API Active Directory app registration
• D. a policy assignment that uses the Allowed locations policy definition

Answer: A

Explanation:
References:
https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-end-user-authenticate-using-active-directory

NEW QUESTION 10
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a table named Table1 that contains 3 billion rows. Table1 contains data from the last 36 months.
At the end of every month, the oldest month of data is removed based on a column named DateTime.
You need to minimize how long it takes to remove the oldest month of data.
Solution: You specify DateTime as the hash distribution column.
Does this meet the goal?

• A. Yes
• B. No

Answer: B
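
Hash distribution on DateTime does not make deleting a month any cheaper; the rows still have to be deleted one by one. The design that does meet the goal is partitioning on the date column and switching the oldest partition out. A sketch, with hypothetical names, columns, and boundary values:

```sql
-- Partition Table1 by month so a month can be aged out with a
-- metadata-only partition switch instead of a 3-billion-row delete.
CREATE TABLE dbo.Table1
(
    Id        bigint,
    EventTime datetime2
)
WITH
(
    DISTRIBUTION = HASH (Id),
    PARTITION (EventTime RANGE RIGHT FOR VALUES
        ('2017-01-01', '2017-02-01', '2017-03-01'))   -- one boundary per month
);

-- Switch the oldest partition into an identically partitioned staging
-- table (dbo.Table1_Stage, not shown), then drop or truncate the stage.
ALTER TABLE dbo.Table1 SWITCH PARTITION 1 TO dbo.Table1_Stage PARTITION 1;
```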

NEW QUESTION 11
You have a Microsoft Azure SQL data warehouse that contains information about community events. An Azure Data Factory job writes an updated CSV file in Azure Blob storage to Community/{date}/events.csv daily.
You plan to consume a Twitter feed by using Azure Stream Analytics and to correlate the feed to the community events.
You plan to use Stream Analytics to retrieve the latest community events data and to correlate the data to the Twitter feed data.
You need to ensure that when updates to the community events data are written to the CSV files, the Stream Analytics job can access the latest community events data.
What should you configure?

• A. an output that uses a blob storage sink and has a path pattern of Community/{date}
• B. an output that uses an event hub sink and the CSV event serialization format
• C. an input that uses a reference data source and has a path pattern of Community/{date}/events.csv
• D. an input that uses a reference data source and has a path pattern of Community/{date}

Answer: C
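
A sketch of the resulting Stream Analytics query; the input aliases and column names are hypothetical. CommunityEvents is the input configured as reference data with the path pattern Community/{date}/events.csv.

```sql
-- Correlate the live Twitter stream with the daily-refreshed reference CSV.
SELECT
    t.TweetText,
    e.EventName,
    e.EventDate
FROM TwitterFeed t
JOIN CommunityEvents e
    ON t.EventId = e.EventId
```

Unlike stream-to-stream joins, joins against reference data do not require a DATEDIFF time bound.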

NEW QUESTION 12
You have a Microsoft Azure SQL data warehouse. The following statements are used to define file formats in the data warehouse.
[Exhibit omitted]
You have an external PolyBase table named file_factPowerMeasurement that uses the FileFormat_ORC file format.
You need to change file_factPowerMeasurement to use the FileFormat_PARQUET file format.
Which two statements should you execute? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

• A. CREATE EXTERNAL TABLE
• B. ALTER TABLE
• C. CREATE EXTERNAL TABLE AS SELECT
• D. ALTER EXTERNAL DATA SOURCE
• E. DROP EXTERNAL TABLE

Answer: AE
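
An external table's file format cannot be altered in place, so the table has to be dropped and recreated. A sketch; the column list, location, and data source name are hypothetical.

```sql
DROP EXTERNAL TABLE file_factPowerMeasurement;

-- Recreate the table against the same files, now using the Parquet format.
CREATE EXTERNAL TABLE file_factPowerMeasurement
(
    MeterId     int,
    MeasureTime datetime2,
    Kilowatts   float
)
WITH (LOCATION = '/power/',
      DATA_SOURCE = AzureStorage,
      FILE_FORMAT = FileFormat_PARQUET);
```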

NEW QUESTION 13
DRAG DROP
You have IoT devices that produce the following output.
[Exhibit omitted]
You need to use Microsoft Azure Stream Analytics to convert the output into the tabular format described in the following table.
[Exhibit omitted]
How should you complete the Stream Analytics query? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
[Exhibit omitted]

Answer:

Explanation: [Exhibit omitted]
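
The usual pattern for flattening nested device output in Stream Analytics combines GetArrayElements, which turns a nested array into rows, with GetRecordPropertyValue, which pulls named fields out of each record. The sketch below is generic, since the exhibits that define the real schema are missing; every input and property name is hypothetical.

```sql
-- Flatten a hypothetical nested Readings array into one row per reading.
SELECT
    i.DeviceId,
    GetRecordPropertyValue(r.ArrayValue, 'SensorName') AS SensorName,
    GetRecordPropertyValue(r.ArrayValue, 'Reading')    AS Reading
FROM DeviceInput i
CROSS APPLY GetArrayElements(i.Readings) AS r
```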

NEW QUESTION 14
You have a Microsoft Azure Data Lake Analytics service.
You need to store a list of multiple-character string values in a single column and to use a cross apply explode expression to output the values.
Which type of data should you use in a U-SQL query?

• A. SQL.MAP
• B. SQL.ARRAY
• C. string
• D. byte[]

Answer: B
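
A sketch of SQL.ARRAY with CROSS APPLY EXPLODE in U-SQL; the file layout (one row per user with a ';'-delimited tag list) and all names are hypothetical.

```sql
// U-SQL. Build an array column, then explode it into one row per value.
@rows =
    EXTRACT UserId string,
            Tags   string
    FROM "/input/tags.tsv"
    USING Extractors.Tsv();

@withArray =
    SELECT UserId,
           new SQL.ARRAY<string>(Tags.Split(';')) AS TagList
    FROM @rows;

@exploded =
    SELECT UserId, Tag
    FROM @withArray
         CROSS APPLY EXPLODE(TagList) AS T(Tag);

OUTPUT @exploded TO "/output/tags_exploded.csv" USING Outputters.Csv();
```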

NEW QUESTION 15
DRAG DROP
You need to load data from Microsoft Azure Data Lake Store to Azure SQL Data Warehouse by using Transact-SQL.
In which sequence should you perform the actions? To answer, move all actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
[Exhibit omitted]

Answer:

Explanation: [Exhibit omitted]
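
The typical PolyBase load sequence from Data Lake Store runs credential, then data source, then file format, then external table, then CTAS. A sketch; every name, secret, and column below is a placeholder.

```sql
CREATE MASTER KEY;

CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = '<client-id>@<oauth-2.0-token-endpoint>',
     SECRET   = '<service-principal-key>';

CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (TYPE = HADOOP,
      LOCATION = 'adl://mydatalake.azuredatalakestore.net',
      CREDENTIAL = ADLSCredential);

CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

CREATE EXTERNAL TABLE ext.Orders
(
    OrderId int,
    Amount  money
)
WITH (LOCATION = '/orders/',
      DATA_SOURCE = AzureDataLakeStore,
      FILE_FORMAT = TextFileFormat);

-- Load the external data into an internal, hash-distributed table.
CREATE TABLE dbo.Orders
WITH (DISTRIBUTION = HASH (OrderId))
AS SELECT * FROM ext.Orders;
```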

NEW QUESTION 16
You have sensor devices that report data to Microsoft Azure Stream Analytics. Each sensor reports data several times per second.
You need to create a live dashboard in Microsoft Power BI that shows the performance of the sensor devices. The solution must minimize lag when visualizing the data.
Which function should you use for the time-series data element?

• A. LAG
• B. SlidingWindow
• C. System.TimeStamp
• D. TumblingWindow

Answer: D
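
A tumbling window emits exactly one result per device per window, keeping the payload sent to the Power BI dashboard small and bounded. A sketch; the input name, columns, and window size are hypothetical.

```sql
-- Aggregate each device over non-overlapping 10-second windows.
SELECT
    DeviceId,
    AVG(Reading)     AS AvgReading,
    System.TimeStamp AS WindowEnd
FROM SensorInput TIMESTAMP BY EventTime
GROUP BY DeviceId, TumblingWindow(second, 10)
```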
