Exam Code : AZ-204
Exam Name : Developing Solutions for Microsoft Azure
Vendor Name : Microsoft
Question: 78
You need to investigate the Azure Function app error message in the development environment.
What should you do?
A. Connect Live Metrics Stream from Application Insights to the Azure Function app and filter the metrics.
B. Create a new Azure Log Analytics workspace and instrument the Azure Function app with Application Insights.
C. Update the Azure Function app with extension methods from Microsoft.Extensions.Logging to log events by using the log instance.
D. Add a new diagnostic setting to the Azure Function app to send logs to Log Analytics.
Answer: A
Explanation:
Azure Functions offers built-in integration with Azure Application Insights to monitor functions.
The following areas of Application Insights can be helpful when evaluating the behavior, performance, and errors in your functions:
Live Metrics: view metrics data as it is created, in near real time.
Failures
Performance metrics
Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-monitoring
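Live Metrics Stream only works once the Function app is connected to an Application Insights resource. As a hedged sketch of that wiring (the resource group, Function app, and component names here are assumptions, not from the scenario):

```shell
# Create an Application Insights component (all resource names are illustrative assumptions).
az monitor app-insights component create \
  --app shipping-insights \
  --location eastus \
  --resource-group rg-shipping

# Point the Function app at it; Live Metrics Stream is then available from the portal.
az functionapp config appsettings set \
  --name shipping-func \
  --resource-group rg-shipping \
  --settings APPLICATIONINSIGHTS_CONNECTION_STRING="$(az monitor app-insights component show \
      --app shipping-insights --resource-group rg-shipping \
      --query connectionString --output tsv)"
```

The `az monitor app-insights` commands come from the `application-insights` CLI extension; exact flag shapes can vary by extension version.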
Question: 79
HOTSPOT
You need to update the APIs to resolve the testing error.
How should you complete the Azure CLI command? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
Explanation:
Enable Cross-Origin Resource Sharing (CORS) on your Azure App Service Web App. Enter the full URL of the site you want to allow to access your Web API, or * to allow all domains.
Box 1: cors
Box 2: add
Box 3: allowed-origins
Box 4: http://test.wideworldimporters.com/
References: http://donovanbrown.com/post/How-to-clear-No-Access-Control-Allow-Origin-header-error-with-Azure-App-Service
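Assembled from the boxes above, the full command would look like the following sketch. The resource group and app name are assumptions (the scenario does not state them); the origin matches the Origin value reported in the case study's error message:

```shell
# Allow the test site's origin to call the Web API (resource names are assumptions).
az webapp cors add \
  --resource-group rg-shipping \
  --name test-shippingapi \
  --allowed-origins http://test.wideworldimporters.com
```

CORS allowed origins are normally listed without a trailing slash; `az webapp cors show` can be used afterwards to confirm the entry.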
Question: 80
You need to implement a solution to resolve the retail store location data issue.
Which three Azure Blob features should you enable? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Immutability
B. Snapshots
C. Versioning
D. Soft delete
E. Object replication
F. Change feed
Answer: C, D, F
Scenario: You must perform a point-in-time restoration of the retail store location data due to an unexpected and accidental deletion of data.
Before you enable and configure point-in-time restore, enable its prerequisites for the storage account: soft delete, change feed, and blob versioning.
Reference: https://docs.microsoft.com/en-us/azure/storage/blobs/point-in-time-restore-manage
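The three prerequisites, plus the restore policy itself, can be enabled in one call. A hedged sketch, assuming a hypothetical storage account `shippingstore` in resource group `rg-shipping` and the 7-day window mentioned in the scenario:

```shell
# Enable soft delete, change feed, versioning, and point-in-time restore
# (account and resource group names are assumptions).
az storage account blob-service-properties update \
  --account-name shippingstore \
  --resource-group rg-shipping \
  --enable-delete-retention true --delete-retention-days 7 \
  --enable-change-feed true \
  --enable-versioning true \
  --enable-restore-policy true --restore-days 6
```

Note that the restore window (`--restore-days`) must be shorter than the soft-delete retention period.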
Question: 81
Topic 1, Windows Server 2016 virtual machine Case study
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study
To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Current environment
Windows Server 2016 virtual machine
This virtual machine (VM) runs BizTalk Server 2016. The VM runs the following workflows:
Ocean Transport: This workflow gathers and validates container information including container contents and arrival notices at various shipping ports.
Inland Transport: This workflow gathers and validates trucking information including fuel usage, number of stops, and routes.
The VM supports the following REST API calls:
Container API: This API provides container information including weight, contents, and other attributes.
Location API: This API provides location information regarding shipping ports of call and trucking stops.
Shipping REST API: This API provides shipping information for use and display on the shipping website.
Shipping Data
The application uses a MongoDB JSON document storage database for all container and transport information.
Shipping Web Site
The site displays shipping container tracking information and container contents. The site is located at http://shipping.wideworldimporters.com/
Proposed solution
The on-premises shipping application must be moved to Azure. The VM has been migrated to a new Standard_D16s_v3 Azure VM by using Azure Site Recovery and must remain running in Azure to complete the BizTalk component migrations. You create a Standard_D16s_v3 Azure VM to host BizTalk Server.
The Azure architecture diagram for the proposed solution is shown below:
Requirements
Shipping Logic app
The Shipping Logic app must meet the following requirements:
Support the ocean transport and inland transport workflows by using a Logic App.
Support industry-standard protocol X12 message format for various messages including vessel content details and arrival notices.
Secure resources to the corporate VNet and use dedicated storage resources with a fixed costing model.
Maintain on-premises connectivity to support legacy applications and final BizTalk migrations.
Shipping Function app
Implement secure function endpoints by using app-level security and include Azure Active Directory (Azure AD).
REST APIs
The REST APIs that support the solution must meet the following requirements:
Secure resources to the corporate VNet.
Allow deployment to a testing location within Azure while not incurring additional costs.
Automatically scale to double capacity during peak shipping times while not causing application downtime.
Minimize costs when selecting an Azure payment model.
Shipping data
Data migration from on-premises to Azure must minimize costs and downtime.
Shipping website
Use Azure Content Delivery Network (CDN) and ensure maximum performance for dynamic content while minimizing latency and costs.
Issues
Windows Server 2016 VM
The VM shows high network latency, jitter, and high CPU utilization. The VM is critical and has not been backed up in the past. The VM must enable a quick restore from a 7-day snapshot to include in-place restore of disks in case of failure.
Shipping website and REST APIs
The following error message displays while you are testing the website:
Failed to load http://test-shippingapi.wideworldimporters.com/: No ‘Access-Control-Allow-Origin’ header is present on the requested resource. Origin ‘http://test.wideworldimporters.com/’ is therefore not allowed access.
DRAG DROP
You need to support the message processing for the ocean transport workflow.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Explanation:
Step 1: Create an integration account in the Azure portal
You can define custom metadata for artifacts in integration accounts and get that metadata during runtime for your logic app to use. For example, you can provide metadata for artifacts such as partners, agreements, schemas, and maps; these artifacts all store metadata using key-value pairs.
Step 2: Link the Logic App to the integration account
A logic app that’s linked to the integration account and artifact metadata you want to use.
Step 3: Add partners, schemas, certificates, maps, and agreements
Step 4: Create a custom connector for the Logic App.
References: https://docs.microsoft.com/bs-latn-ba/azure/logic-apps/logic-apps-enterprise-integration-metadata
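Steps 1 and 2 can also be scripted. A sketch using the `az logic` CLI extension; the resource names are assumptions, and flag shapes vary across extension versions, so treat this as illustrative rather than definitive:

```shell
# Step 1: create the integration account (Standard tier supports trading
# partners and X12 agreements; names are illustrative assumptions).
az logic integration-account create \
  --resource-group rg-shipping \
  --name shipping-integration \
  --location eastus \
  --sku Standard

# Step 2: linking the Logic App to the integration account is set on the
# workflow resource (or in the designer's workflow settings); partners,
# schemas, and agreements are then added to the integration account
# before the X12 encode/decode actions can run.
```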
Question: 82
HOTSPOT
You are developing an application that uses a premium block blob storage account. You are optimizing costs by automating Azure Blob Storage access tiers.
You apply the following policy rules to the storage account.
You must determine the implications of applying the rules to the data. (Line numbers are included for reference only.)
Explanation:
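The exhibit containing the actual policy rules is not reproduced in this dump. As a general illustration of the shape such rules take, here is a hedged sketch of creating a lifecycle management policy (the account name, rule name, and day thresholds are all assumptions; note that premium block blob accounts support only a subset of lifecycle actions):

```shell
# Illustrative lifecycle policy (values are assumptions, not the exhibit's rules).
cat > policy.json <<'EOF'
{
  "rules": [
    {
      "enabled": true,
      "name": "age-out-blobs",
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "delete": { "daysAfterModificationGreaterThan": 365 }
          }
        }
      }
    }
  ]
}
EOF

az storage account management-policy create \
  --account-name shippingstore \
  --resource-group rg-shipping \
  --policy @policy.json
```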
Question: 83
You need to resolve the log capacity issue.
What should you do?
A. Create an Application Insights Telemetry Filter
B. Change the minimum log level in the host.json file for the function
C. Implement Application Insights Sampling
D. Set a LogCategoryFilter during startup
Answer: C
Scenario, the log capacity issue: Developers report that the number of log messages in the trace output for the processor is too high, resulting in lost log messages.
Sampling is a feature in Azure Application Insights. It is the recommended way to reduce telemetry traffic and storage, while preserving a statistically correct analysis of application data. The filter selects items that are related, so that you can navigate between items when you are doing diagnostic investigations. When metric counts are presented to you in the portal, they are renormalized to take account of the sampling, to minimize any effect on the statistics.
Sampling reduces traffic and data costs, and helps you avoid throttling.
Reference: https://docs.microsoft.com/en-us/azure/azure-monitor/app/sampling
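In an Azure Functions app, sampling is configured in the host.json file. A minimal sketch; the per-second cap is an illustrative assumption, and `excludedTypes` is shown because exceptions are usually kept unsampled during diagnostics:

```json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "maxTelemetryItemsPerSecond": 20,
        "excludedTypes": "Exception"
      }
    }
  }
}
```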
Question: 84
HOTSPOT
You need to configure the Account Kind, Replication, and Storage tier options for the corporate website’s Azure Storage account.
How should you complete the configuration? To answer, select the appropriate options in the dialog box in the answer area. NOTE: Each correct selection is worth one point.
Explanation:
Account Kind: StorageV2 (general-purpose v2)
Scenario: Azure Storage blob will be used (refer to the exhibit). Data storage costs must be minimized.
General-purpose v2 accounts: Basic storage account type for blobs, files, queues, and tables. Recommended for most scenarios using Azure Storage.
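The Replication and Storage tier boxes are not detailed in this explanation. For illustration only, creating a general-purpose v2 account from the CLI would look like the following sketch; the `--sku` (replication) and `--access-tier` values here are assumptions, not the exam answer:

```shell
# Replication (--sku) and access tier are assumed values for illustration;
# account and resource group names are also assumptions.
az storage account create \
  --name shippingweb \
  --resource-group rg-shipping \
  --kind StorageV2 \
  --sku Standard_RAGRS \
  --access-tier Hot
```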