
Microsoft DP-200 Practice Questions

Implementing an Azure Data Solution Exam

Last updated: 2020/11/16. 162 questions in total.

Good Buy Day promotion: when you purchase the latest DP-200 exam questions, you receive both the Japanese and English versions at the same time.

You can practice the actual exam questions, get a grasp of the exam's key points, and then decide whether to register for the test.

To save a further 35% of your exam preparation time, use the DP-200 question set.


Question No : 1
HOTSPOT
Which masking functions should you implement for each column to meet the data masking requirements? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



Answer:


Explanation:
Box 1: Default
Default uses a zero value for numeric data types (bigint, bit, decimal, int, money, numeric, smallint, smallmoney, tinyint, float, real).
- Only show a zero value for the values in a column named ShockOilWeight.
Box 2: Credit Card
The Credit Card Masking method exposes the last four digits of the designated fields and adds a constant string as a prefix in the form of a credit card.
Example: XXXX-XXXX-XXXX-1234
- Only show the last four digits of the values in a column named SuspensionSprings.
Scenario:
The company identifies the following data masking requirements for the Race Central data that will be stored in SQL Database:
- Only show a zero value for the values in a column named ShockOilWeight.
- Only show the last four digits of the values in a column named SuspensionSprings.
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/dynamic-data-masking-overview
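
For reference, the two masking rules above can be declared in T-SQL. A minimal sketch follows; the scenario does not name the SQL Database table, so dbo.RaceData is a placeholder:

-- Default mask: returns a zero value for the numeric ShockOilWeight column.
ALTER TABLE dbo.RaceData
ALTER COLUMN ShockOilWeight ADD MASKED WITH (FUNCTION = 'default()');

-- Credit-card mask: exposes only the last four characters of SuspensionSprings.
ALTER TABLE dbo.RaceData
ALTER COLUMN SuspensionSprings ADD MASKED WITH (FUNCTION = 'partial(0,"XXXX-XXXX-XXXX-",4)');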

Question No : 2
HOTSPOT
Which masking functions should you implement for each column to meet the data masking requirements? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



Answer:


Explanation:
Box 1: Credit Card
The Credit Card Masking method exposes the last four digits of the designated fields and adds a constant string as a prefix in the form of a credit card.
Example: XXXX-XXXX-XXXX-1234
- Only show the last four digits of the values in a column named SuspensionSprings.
Box 2: Default
Default uses a zero value for numeric data types (bigint, bit, decimal, int, money, numeric, smallint, smallmoney, tinyint, float, real).
- Only show a zero value for the values in a column named ShockOilWeight.
Scenario:
The company identifies the following data masking requirements for the Race Central data that will be stored in SQL Database:
- Only show a zero value for the values in a column named ShockOilWeight.
- Only show the last four digits of the values in a column named SuspensionSprings.
Reference: https://docs.microsoft.com/en-us/azure/azure-sql/database/dynamic-data-masking-overview

Question No : 3
HOTSPOT
You are building the data store solution for Mechanical Workflow.
How should you configure Table1? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



Answer:


Explanation:
Table Type: Hash distributed.
Hash-distributed tables improve query performance on large fact tables.
Index type: Clustered columnstore
Scenario:
Mechanical Workflow has a table named Table1 that is 1 TB. Large aggregations are performed on a single column of Table1.
References: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-distribute
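
As an illustration, the selected options correspond to a CREATE TABLE statement such as the following in the data warehouse. The column names are assumptions, since the case study does not list the schema of Table1:

-- Hash-distributed fact table with a clustered columnstore index.
CREATE TABLE dbo.Table1
(
    CarId   INT NOT NULL,
    PartId  INT NOT NULL,
    Measure DECIMAL(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH(CarId),  -- spreads the 1 TB table across all 60 distributions
    CLUSTERED COLUMNSTORE INDEX  -- suited to large scans and single-column aggregations
);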

Question No : 4
On which data store should you configure TDE to meet the technical requirements?

Answer:
Explanation:
Scenario: Transparent data encryption (TDE) must be enabled on all data stores, whenever possible. The database for Mechanical Workflow must be moved to Azure Synapse Analytics.
Incorrect Answers:
A: Cosmos DB does not support TDE.
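
For illustration, TDE can be enabled on the data warehouse with T-SQL; the database name below is a placeholder:

-- Run in the master database of the logical server that hosts the data warehouse.
ALTER DATABASE MechanicalWorkflowDW SET ENCRYPTION ON;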

Question No : 5
Testlet 3

Case Study
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.

To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.

At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview
General Overview
Litware, Inc. is an international car racing and manufacturing company that has 1,000 employees. Most employees are located in Europe. The company supports racing teams that compete in a worldwide racing series.

Physical Locations
Litware has two main locations: a main office in London, England, and a manufacturing plant in Berlin, Germany.

During each race weekend, 100 engineers set up a remote portable office by using a VPN to connect to the datacenter in the London office. The portable office is set up and torn down in approximately 20 different countries each year.

Existing environment
Race Central
During race weekends, Litware uses a primary application named Race Central. Each car has several sensors that send real-time telemetry data to the London datacenter. The data is used for real-time tracking of the cars.
Race Central also sends batch updates to an application named Mechanical Workflow by using Microsoft SQL Server Integration Services (SSIS).

The telemetry data is sent to a MongoDB database. A custom application then moves the data to databases in SQL Server 2017. The telemetry data in MongoDB has more than 500 attributes. The application changes the attribute names when the data is moved to SQL Server 2017.
The database structure contains both OLAP and OLTP databases.

Mechanical Workflow
Mechanical Workflow is used to track changes and improvements made to the cars during their lifetime.
Currently, Mechanical Workflow runs on SQL Server 2017 as an OLAP system.
Mechanical Workflow has a table named Table1 that is 1 TB. Large aggregations are performed on a single column of Table1.

Requirements
Planned Changes
Litware is in the process of rearchitecting its data estate to be hosted in Azure. The company plans to decommission the London datacenter and move all its applications to an Azure datacenter.

Technical Requirements
Litware identifies the following technical requirements:
- Data collection for Race Central must be moved to Azure Cosmos DB and Azure SQL Database. The data must be written to the Azure datacenter closest to each race and must converge in the least amount of time.
- The query performance of Race Central must be stable, and the administrative time it takes to perform optimizations must be minimized.
- The database for Mechanical Workflow must be moved to Azure SQL Data Warehouse.
- Transparent data encryption (TDE) must be enabled on all data stores, whenever possible.
- An Azure Data Factory pipeline must be used to move data from Cosmos DB to SQL Database for Race Central. If the data load takes longer than 20 minutes, configuration changes must be made to Data Factory.
- The telemetry data must migrate toward a solution that is native to Azure.
- The telemetry data must be monitored for performance issues. You must adjust the Cosmos DB Request Units per second (RU/s) to maintain a performance SLA while minimizing the cost of the RU/s.

Data Masking Requirements
During race weekends, visitors will be able to enter the remote portable offices. Litware is concerned that some proprietary information might be exposed.
The company identifies the following data masking requirements for the Race Central data that will be stored in SQL Database:
- Only show the last four digits of the values in a column named SuspensionSprings.
- Only show a zero value for the values in a column named ShockOilWeight.

HOTSPOT
You need to build a solution to collect the telemetry data for Race Central.
What should you use? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



Answer:


Explanation:
API: Table
Azure Cosmos DB provides native support for wire protocol-compatible APIs for popular databases, including MongoDB, Apache Cassandra, Gremlin, and Azure Table storage.
Scenario: The telemetry data must migrate toward a solution that is native to Azure.
Consistency level: Strong
Use the strongest consistency level, Strong, to minimize convergence time.
Scenario: The data must be written to the Azure datacenter closest to each race and must converge in the least amount of time.
References: https://docs.microsoft.com/en-us/azure/cosmos-db/consistency-levels
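
These two selections can also be expressed in an ARM template. The fragment below is a sketch; the account name is a placeholder:

{
  "type": "Microsoft.DocumentDB/databaseAccounts",
  "apiVersion": "2020-04-01",
  "name": "racecentral-telemetry",
  "location": "[resourceGroup().location]",
  "kind": "GlobalDocumentDB",
  "properties": {
    "databaseAccountOfferType": "Standard",
    "capabilities": [ { "name": "EnableTable" } ],
    "consistencyPolicy": { "defaultConsistencyLevel": "Strong" }
  }
}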

Question No : 6
DRAG DROP
You need to provision the polling data storage account.
How should you configure the storage account? To answer, drag the appropriate Configuration Value to the correct Setting. Each Configuration Value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. NOTE: Each correct selection is worth one point.



Answer:


Explanation:
Account type: StorageV2
You must create new storage accounts as type StorageV2 (general-purpose V2) to take advantage of Data Lake Storage Gen2 features.
Scenario: Polling data is stored in one of two locations:
- An on-premises Microsoft SQL Server 2019 database named PollingData
- Azure Data Lake Gen 2
Data in Data Lake is queried by using PolyBase
Replication type: RA-GRS
Scenario: All services and processes must be resilient to a regional Azure outage.
Geo-redundant storage (GRS) is designed to provide at least 99.99999999999999% (16 9's) durability of objects over a given year by replicating your data to a secondary region that is hundreds of miles away from the primary region. If your storage account has GRS enabled, then your data is durable even in the case of a complete regional outage or a disaster in which the primary region isn't recoverable.
If you opt for GRS, you have two related options to choose from:
- GRS replicates your data to another data center in a secondary region, but that data is available to be read only if Microsoft initiates a failover from the primary to secondary region.
- Read-access geo-redundant storage (RA-GRS) is based on GRS. RA-GRS replicates your data to another data center in a secondary region, and also provides you with the option to read from the secondary region. With RA-GRS, you can read from the secondary region regardless of whether Microsoft initiates a failover from the primary to secondary region.
References:
https://docs.microsoft.com/bs-cyrl-ba/azure/storage/blobs/data-lake-storage-quickstart-create-account
https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-grs
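
For illustration, the two settings map to a storage account resource such as the following ARM template fragment. The account name is a placeholder; isHnsEnabled turns on the hierarchical namespace that Data Lake Storage Gen2 requires:

{
  "type": "Microsoft.Storage/storageAccounts",
  "apiVersion": "2019-06-01",
  "name": "pollingdatalake",
  "location": "[resourceGroup().location]",
  "kind": "StorageV2",
  "sku": { "name": "Standard_RAGRS" },
  "properties": { "isHnsEnabled": true }
}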

Question No : 7
Testlet 2

Background
Proseware, Inc. develops and manages a product named Poll Taker. The product is used for delivering public opinion polling and analysis.

Polling data comes from a variety of sources, including online surveys, house-to-house interviews, and booths at public events.

Polling data
Polling data is stored in one of two locations:
- An on-premises Microsoft SQL Server 2019 database named PollingData
- Azure Data Lake Gen 2

Data in Data Lake is queried by using PolyBase

Poll metadata
Each poll has associated metadata with information about the poll including the date and number of respondents. The data is stored as JSON.

Phone-based polling
Security
- Phone-based poll data must only be uploaded by authorized users from authorized devices
- Contractors must not have access to any polling data other than their own
- Access to polling data must be set on a per-Active Directory user basis

Data migration and loading
- All data migration processes must use Azure Data Factory
- All data migrations must run automatically during non-business hours
- Data migrations must be reliable and retry when needed

Performance
After six months, raw polling data should be moved to a storage account. The storage must be available in the event of a regional disaster. The solution must minimize costs.

Deployments
- All deployments must be performed by using Azure DevOps. Deployments must use templates used in multiple environments
- No credentials or secrets should be used during deployments

Reliability
All services and processes must be resilient to a regional Azure outage.

Monitoring
All Azure services must be monitored by using Azure Monitor. On-premises SQL Server performance must be monitored.

DRAG DROP
You need to ensure that phone-based polling data can be analyzed in the PollingData database.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.



Answer:


Explanation:
Scenario: All deployments must be performed by using Azure DevOps. Deployments must use templates that can be used in multiple environments, and no credentials or secrets should be used during deployments.

Question No : 8
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage account that contains 100 GB of files. The files contain text and numerical values. 75% of the rows contain description data that has an average length of 1.1 MB.
You plan to copy the data from the storage account to an enterprise data warehouse in Azure Synapse Analytics.
You need to prepare the files to ensure that the data copies quickly.
Solution: You convert the files to compressed delimited text files.
Does this meet the goal?

Answer: Yes
Explanation:
All file formats have different performance characteristics. For the fastest load, use compressed delimited text files.
Reference: https://docs.microsoft.com/en-us/azure/sql-data-warehouse/guidance-for-loading-data
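
As a sketch of such a load, the COPY statement in Azure Synapse Analytics ingests gzip-compressed delimited files directly; the storage path and table name below are placeholders:

-- Load compressed CSV files from Blob storage into a staging table.
COPY INTO dbo.StagingData
FROM 'https://myaccount.blob.core.windows.net/files/*.csv.gz'
WITH
(
    FILE_TYPE = 'CSV',
    COMPRESSION = 'Gzip',
    FIELDTERMINATOR = ','
);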

Question No : 9
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure subscription that contains an Azure Storage account.
You plan to implement changes to a data storage solution to meet regulatory and compliance standards.
Every day, Azure needs to identify and delete blobs that were NOT modified during the last 100 days.
Solution: You schedule an Azure Data Factory pipeline.
Does this meet the goal?

Answer: No
Explanation:
Instead apply an Azure Blob storage lifecycle policy.
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts?tabs=azure-portal

Question No : 10
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure SQL database named DB1 that contains a table named Table1. Table1 has a field named Customer_ID that is varchar(22).
You need to implement masking for the Customer_ID field to meet the following requirements:
- The first two prefix characters must be exposed.
- The last four suffix characters must be exposed.
- All other characters must be masked.
Solution: You implement data masking and use a custom string function mask.
Does this meet the goal?

Answer: Yes
Explanation:
Must use Custom Text data masking, which exposes the first and last characters and adds a custom padding string in the middle.
Reference: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-dynamic-data-masking-get-started
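
Concretely, a custom string (partial) mask that satisfies all three requirements can be declared as follows; the padding string "XXXX" is illustrative:

-- Expose the first 2 and last 4 characters of Customer_ID; mask the rest.
ALTER TABLE dbo.Table1
ALTER COLUMN Customer_ID ADD MASKED WITH (FUNCTION = 'partial(2,"XXXX",4)');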

Question No : 11
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure subscription that contains an Azure Storage account.
You plan to implement changes to a data storage solution to meet regulatory and compliance standards.
Every day, Azure needs to identify and delete blobs that were NOT modified during the last 100 days.
Solution: You apply an Azure Blob storage lifecycle policy.
Does this meet the goal?

Answer: Yes
Explanation:
Azure Blob storage lifecycle management offers a rich, rule-based policy for GPv2 and Blob storage accounts. Use the policy to transition your data to the appropriate access tiers or expire at the end of the data's lifecycle.
The lifecycle management policy lets you:
- Transition blobs to a cooler storage tier (hot to cool, hot to archive, or cool to archive) to optimize for performance and cost
- Delete blobs at the end of their lifecycles
- Define rules to be run once per day at the storage account level
- Apply rules to containers or a subset of blobs (using prefixes as filters)
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts?tabs=azure-portal
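
For reference, the rule described above can be written as a lifecycle management policy in the documented JSON format; the rule name is illustrative:

{
  "rules": [
    {
      "name": "delete-unmodified-100-days",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 100 } }
        }
      }
    }
  ]
}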

Question No : 12
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure subscription that contains an Azure Storage account. You plan to implement changes to a data storage solution to meet regulatory and compliance standards.
Every day, Azure needs to identify and delete blobs that were NOT modified during the last 100 days.
Solution: You apply an expired tag to the blobs in the storage account.
Does this meet the goal?

Answer: No
Explanation:
Instead apply an Azure Blob storage lifecycle policy.
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts?tabs=azure-portal

Question No : 13
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure subscription that contains an Azure Storage account. You plan to implement changes to a data storage solution to meet regulatory and compliance standards. Every day, Azure needs to identify and delete blobs that were NOT modified during the last 100 days.
Solution: You apply an Azure policy that tags the storage account.
Does this meet the goal?

Answer: No
Explanation:
Instead apply an Azure Blob storage lifecycle policy.
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-lifecycle-management-concepts?tabs=azure-portal

Question No : 14
HOTSPOT
You have an Azure SQL database that contains a table named Employee. Employee contains sensitive data in a decimal(10,2) column named Salary. You need to ensure that nonprivileged users can view the table data, but Salary must display a number from 0 to 100.
What should you configure? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.



Answer:


Explanation:
Box 1: SELECT
Users with SELECT permission on a table can view the table data. Columns that are defined as masked will display the masked data.
Incorrect:
Grant the UNMASK permission to a user to enable them to retrieve unmasked data from the columns for which masking is defined.
The CONTROL permission on the database includes both the ALTER ANY MASK and UNMASK permission.
Box 2: Random number
Random number: a masking method that generates a random number according to the selected boundaries and the column's actual data type. If the designated boundaries are equal, the masking function returns a constant number.
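
For illustration, the selections correspond to T-SQL such as the following; the user name is a placeholder:

-- Random number mask: Salary displays a random value from 0 to 100.
ALTER TABLE dbo.Employee
ALTER COLUMN Salary ADD MASKED WITH (FUNCTION = 'random(0, 100)');

-- Nonprivileged users need only SELECT; do not grant UNMASK.
GRANT SELECT ON dbo.Employee TO NonPrivilegedUser;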


Question No : 15
DRAG DROP
You have an Azure SQL database named DB1 in the East US 2 region. You need to build a secondary geo-replicated copy of DB1 in the West US region on a new server.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.



Answer:


Explanation:
Step 1: From the Geo-replication settings of DB1, select West US.
The following steps create a new secondary database in a geo-replication partnership.
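
For reference, the same partnership can be created with T-SQL once the new server exists in West US; the server name is a placeholder:

-- Run in the master database of the primary server in East US 2.
ALTER DATABASE DB1 ADD SECONDARY ON SERVER [westus-sqlserver1];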
