★ Pass on Your First TRY ★ 100% Money Back Guarantee ★ Realistic Practice Exam Questions

Free Instant Download NEW 70-534 Exam Dumps (PDF & VCE):
Available on: https://www.certleader.com/70-534-dumps.html


A few years ago, whenever Microsoft certification came up in conversation, people tended to assume that anyone who held the credential was highly capable, and certified professionals were often the center of attention. Today, however, if you ask a friend who works in the IT industry whether he holds a Microsoft certification, he may reply, "Why should I bother studying for it? There are so many IT certifications now that most companies barely look at them." Reactions like this reflect the current state of IT certification, and raise the question: is IT certification really that useless?

2021 Oct Azure certification 70-534 dumps:

Q11. - (Topic 6) 

You are designing an Azure web application that includes many static content files. 

The application is accessed from locations all over the world by using a custom domain name. 

You need to recommend an approach for providing access to the static content with the least amount of latency. 

Which two actions should you recommend? Each correct answer presents part of the solution. 

A. Place the static content in Azure Table storage. 

B. Configure a CNAME DNS record for the Azure Content Delivery Network (CDN) domain. 

C. Place the static content in Azure Blob storage. 

D. Configure a custom domain name that is an alias for the Azure Storage domain. 

Answer: B,C 

Explanation: B: There are two ways to map your custom domain to a CDN endpoint:

1. Create a CNAME record with your domain registrar and map your custom domain and subdomain to the CDN endpoint.

2. Add an intermediate registration step with Azure cdnverify.

C: The Azure Content Delivery Network (CDN) offers developers a global solution for delivering high-bandwidth content by caching blobs and static content of compute instances at physical nodes in the United States, Europe, Asia, Australia, and South America. The benefits of using CDN to cache Azure data include:

/ Better performance and user experience for end users who are far from a content source and are using applications where many 'internet trips' are required to load content.

/ Large distributed scale to better handle instantaneous high load, for example at the start of an event such as a product launch.
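To illustrate why a CDN reduces latency, here is a minimal, hypothetical sketch of edge selection: a CDN's DNS layer resolves each user to the nearest point of presence. The edge names and coordinates below are invented for illustration; real Azure CDN POP lists differ.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical CDN edge locations as (lat, lon); real POP lists differ.
EDGES = {
    "us-east": (38.9, -77.0),
    "europe-west": (52.4, 4.9),
    "asia-east": (22.3, 114.2),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def nearest_edge(user_location):
    """Return the edge node closest to the user, as a CDN's DNS layer would."""
    return min(EDGES, key=lambda name: haversine_km(EDGES[name], user_location))

# A user in Tokyo is served from the Asia edge rather than a distant origin.
print(nearest_edge((35.7, 139.7)))  # asia-east
```

Serving the static blobs from the closest edge is what removes most of the round-trip latency for users far from the origin.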

Reference: Using CDN for Azure https://azure.microsoft.com/en-gb/documentation/articles/cdn-how-to-use/ 

Reference: How to map Custom Domain to Content Delivery Network (CDN) endpoint 

https://github.com/Azure/azure-content/blob/master/articles/cdn-map-content-to-custom-domain.md 



Q12. - (Topic 4) 

You need to configure the deployment of the storage analysis application. 

What should you do? 

A. Create a new Mobile Service. 

B. Configure the deployment from source control. 

C. Add a new deployment slot. 

D. Turn on continuous integration. 

Answer: B 

Explanation: 

Scenario: Data analysis results: 

The solution must provide a web service that allows applications to access the results of 

analyses. 


Q13. - (Topic 1) 

You need to recommend a solution that allows partners to authenticate. 

Which solution should you recommend? 

A. Configure the federation provider to trust social identity providers. 

B. Configure the federation provider to use the Azure Access Control service. 

C. Create a new directory in Azure Active Directory and create a user account for the partner. 

D. Create an account on the VanArsdel domain for the partner and send an email message that contains the password to the partner. 

Answer: B 

Explanation: * Scenario: The partners all use Hotmail.com email addresses. 

* In Microsoft Azure Active Directory Access Control (also known as Access Control Service or ACS), an identity provider is a service that authenticates user or client identities and issues security tokens that ACS consumes. The ACS Management Portal provides built-in support for configuring Windows Live ID as an ACS Identity Provider. 

Incorrect: 

Not C, not D: Scenario: VanArsdel management does NOT want to create and manage 

user accounts for partners. 

Reference: Identity Providers 

https://msdn.microsoft.com/en-us/library/azure/gg185971.aspx 


Topic 2, Trey Research

Background

Overview

Trey Research conducts agricultural research and sells the results to the agriculture and food industries. The company uses a combination of on-premises and third-party server clusters to meet its storage needs. Trey Research has seasonal demands on its services, with up to 50 percent drops in data capacity and bandwidth demand during low-demand periods. They plan to host their websites in an agile, cloud environment where the company can deploy and remove its websites based on its business requirements rather than the requirements of the hosting company.

A recent fire near the datacenter that Trey Research uses raises the management team's awareness of the vulnerability of hosting all of the company's websites and data at any single location. The management team is concerned about protecting its data from loss as a result of a disaster.

Websites

Trey Research has a portfolio of 300 websites and associated background processes that are currently hosted in a third-party datacenter. All of the websites are written in ASP.NET, and the background processes use Windows Services. The hosting environment costs Trey Research approximately $25 million in hosting and maintenance fees.

Infrastructure

Trey Research also has on-premises servers that run VMs to support line-of-business applications. The company wants to migrate the line-of-business applications to the cloud, one application at a time. The company is migrating most of its production VMs from an aging VMware ESXi farm to a Hyper-V cluster that runs on Windows Server 2012.

Applications

DistributionTracking

Trey Research has a web application named DistributionTracking. This application constantly collects real-time data that tracks worldwide distribution points to customer retail sites. This data is available to customers at all times.

The company wants to ensure that the distribution tracking data is stored at a location that is geographically close to the customers who will be using the information. The system must continue running in the event of VM failures without corrupting data. The system is processor intensive and should be run in a multithreading environment.

HRApp

The company has a human resources (HR) application named HRApp that stores data in an on-premises SQL Server database. The database must have at least two copies, but data to support backups and business continuity must stay in Trey Research locations only.

The data must remain on-premises and cannot be stored in the cloud.

HRApp was written by a third party, and the code cannot be modified. The human resources data is used by all business offices, and each office requires access to the entire database. Users report that HRApp takes all night to generate the required payroll reports, and they would like to reduce this time.

MetricsTracking

Trey Research has an application named MetricsTracking that is used to track analytics for the DistributionTracking web application. The data MetricsTracking collects is not customer-facing. Data is stored on an on-premises SQL Server database, but this data should be moved to the cloud. Employees at other locations access this data by using a remote desktop connection to connect to the application, but latency issues degrade the functionality.

Trey Research wants a solution that allows remote employees to access metrics data without using a remote desktop connection. MetricsTracking was written in-house, and the development team is available to make modifications to the application if necessary.

However, the company wants to continue to use SQL Server for MetricsTracking.

Business Requirements

Business Continuity

You have the following requirements:

Move all customer-facing data to the cloud.

Web servers should be backed up to geographically separate locations.

If one website becomes unavailable, customers should automatically be routed to websites that are still operational.

Data must be available regardless of the operational status of any particular website.

The HRApp system must remain on-premises and must be backed up.

The MetricsTracking data must be replicated so that it is locally available to all Trey Research offices.

Auditing and Security

You have the following requirements:

Both internal and external consumers should be able to access research results.

Internal users should be able to access data by using their existing company credentials without requiring multiple logins.

Consumers should be able to access the service by using their Microsoft credentials.

Applications written to access the data must be authenticated.

Access and activity must be monitored and audited.

Ensure the security and integrity of the data collected from the worldwide distribution points for the distribution tracking application.

Storage and Processing

You have the following requirements:

Provide real-time analysis of distribution tracking data by geographic location.

Collect and store large datasets in real time for customer use.

Locate the distribution tracking data as close to the central office as possible to improve bandwidth.

Co-locate the distribution tracking data as close to the customer as possible based on the customer's location.

Distribution tracking data must be stored in the JSON format and indexed by metadata that is stored in a SQL Server database.

Data in the cloud must be stored in geographically separate locations, but kept within the same political boundaries.
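The requirement that tracking data be stored as JSON and indexed by metadata in a SQL Server database can be sketched as follows. This is a minimal illustration only: sqlite3 stands in for SQL Server, and the table and column names are invented assumptions.

```python
import json
import sqlite3

# Sketch: records persisted as JSON, with searchable metadata indexed in SQL.
# sqlite3 is a stand-in for SQL Server; schema names are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE doc_index (doc_id TEXT PRIMARY KEY, region TEXT, body TEXT)")

def store_tracking(doc_id, region, record):
    """Persist the record as a JSON document and index its metadata for SQL queries."""
    db.execute("INSERT INTO doc_index VALUES (?, ?, ?)",
               (doc_id, region, json.dumps(record)))

def find_by_region(region):
    """Use the SQL metadata index to locate and decode matching JSON documents."""
    rows = db.execute("SELECT body FROM doc_index WHERE region = ?", (region,))
    return [json.loads(body) for (body,) in rows]

store_tracking("d1", "eu", {"site": "Berlin", "units": 40})
store_tracking("d2", "us", {"site": "Dallas", "units": 15})
print(find_by_region("eu"))  # [{'site': 'Berlin', 'units': 40}]
```

The design point is the split: cheap point retrieval of opaque JSON bodies, with relational queries confined to a small metadata index.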

Technical Requirements

Migration

You have the following requirements:

Deploy all websites to Azure.

Replace on-premises and third-party physical server clusters with cloud-based solutions.

Optimize the speed for retrieving existing JSON objects that contain the distribution tracking data.

Recommend strategies for partitioning data for load balancing.

Auditing and Security

You have the following requirements:

Use Active Directory for internal and external authentication.

Use OAuth for application authentication.

Business Continuity

You have the following requirements:

Data must be backed up to separate geographic locations.

Web servers must run concurrent versions of all websites in distinct geographic locations.

Use Azure to back up the on-premises MetricsTracking data.

Use Azure virtual machines as a recovery platform for MetricsTracking and HRApp.

Ensure that there is at least one additional on-premises recovery environment for the HRApp.

9. DRAG DROP - (Topic 2) 

You need to ensure that customer data is secured both in transit and at rest. 

Which technologies should you recommend? To answer, drag the appropriate technology to the correct security requirement. Each technology may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content. 


Answer: 



Q14. DRAG DROP - (Topic 6) 

You have a web application on Azure. 

The web application does not employ Secure Sockets Layer (SSL). 

You need to enable SSL for your production deployment web application on Azure. 

Which four actions should you perform in sequence? To answer, move the appropriate 

actions from the list of actions to the answer area and arrange them in the correct order. 


Answer: 



Q15. - (Topic 6) 

An application currently resides on an on-premises virtual machine that has 2 CPU cores, 4 GB of RAM, 20 GB of hard disk space, and a 10 megabit/second network connection. 

You plan to migrate the application to Azure. You have the following requirements: 

You must not make changes to the application. 

You must minimize the costs for hosting the application. 

You need to recommend the appropriate virtual machine instance type. 

Which virtual machine tier should you recommend? 

A. Network Optimized (A Series) 

B. General Purpose Compute, Basic Tier (A Series) 

C. General Purpose Compute, Standard Tier (A Series) 

D. Optimized Compute (D Series) 

Answer: B 

Explanation: General Purpose Compute, Basic tier: an economical option for development workloads, test servers, and other applications that don't require load balancing, auto-scaling, or memory-intensive virtual machines.

CPU core range: 1-8
RAM range: 0.75-14 GB
Disk size: 20-240 GB
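As a rough sketch of the selection logic this question exercises, the following picks the cheapest tier that satisfies the workload's requirements. The catalog of sizes and hourly prices is hypothetical; real Azure sizes and regional prices differ.

```python
# Hypothetical tier catalog; real Azure sizes and prices vary by region.
TIERS = [
    {"name": "Basic A2",    "cores": 2, "ram_gb": 3.5, "disk_gb": 60,  "price": 0.08, "load_balancing": False},
    {"name": "Basic A3",    "cores": 4, "ram_gb": 7.0, "disk_gb": 120, "price": 0.15, "load_balancing": False},
    {"name": "Standard A2", "cores": 2, "ram_gb": 3.5, "disk_gb": 70,  "price": 0.12, "load_balancing": True},
    {"name": "Standard D2", "cores": 2, "ram_gb": 7.0, "disk_gb": 100, "price": 0.19, "load_balancing": True},
]

def cheapest_tier(cores, ram_gb, disk_gb, need_load_balancing=False):
    """Return the lowest-priced tier that satisfies every stated requirement."""
    candidates = [
        t for t in TIERS
        if t["cores"] >= cores and t["ram_gb"] >= ram_gb and t["disk_gb"] >= disk_gb
        and (t["load_balancing"] or not need_load_balancing)
    ]
    return min(candidates, key=lambda t: t["price"])["name"]

# The question's workload: 2 cores, 4 GB RAM, 20 GB disk, no load balancing.
print(cheapest_tier(2, 4, 20))  # Basic A3
```

With no need for load balancing or auto-scaling, a Basic-tier size wins on cost; requiring load balancing would force the choice up to a Standard tier.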

Reference: Virtual Machines Pricing. Launch Windows Server and Linux in minutes 

http://azure.microsoft.com/en-us/pricing/details/virtual-machines/ 


70-534 practice question

Refresh exam ref 70-534:

Q16. - (Topic 6) 

You are evaluating an Azure application. The application includes the following elements: 

* A web role that provides the ASP.NET user interface and business logic

* A single SQL database that contains all application data

Each webpage must receive data from the business logic layer before returning results to the client. Traffic has increased significantly. The business logic is causing high CPU usage. 

You need to recommend an approach for scaling the application. 

What should you recommend? 

A. Store the business logic results in Azure Table storage. 

B. Vertically partition the SQL database. 

C. Move the business logic to a worker role. 

D. Store the business logic results in Azure local storage. 

Answer: C 

Explanation: For Cloud Services applications in Azure, both web and worker roles are needed to scale well. Moving the CPU-intensive business logic to a worker role lets the web tier and the processing tier scale independently.
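A minimal sketch of the web-role/worker-role split, using an in-process queue and thread as stand-ins for Azure Storage queues and a worker role instance: the web tier only enqueues work, and the worker drains the queue, so the CPU-heavy logic no longer runs on the request-serving thread.

```python
import queue
import threading

# Stand-ins for the Azure Storage queue and result store between the roles.
work_queue = queue.Queue()
results = {}

def web_role_handle_request(request_id, payload):
    """Web role: accept the request and hand the heavy work to the queue."""
    work_queue.put((request_id, payload))

def worker_role():
    """Worker role: drain the queue and run the CPU-intensive business logic."""
    while True:
        request_id, payload = work_queue.get()
        if request_id is None:                       # sentinel: shut down
            break
        results[request_id] = sum(x * x for x in payload)  # stand-in for real logic
        work_queue.task_done()

worker = threading.Thread(target=worker_role)
worker.start()
web_role_handle_request("r1", [1, 2, 3])
work_queue.put((None, None))                         # stop the worker after r1
worker.join()
print(results["r1"])  # 14
```

In a real deployment the two roles run in separate instances, so the worker fleet can be scaled out when business-logic CPU usage climbs, without touching the web tier.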

Reference: Application Patterns and Development Strategies for SQL Server in Azure Virtual Machines 

https://msdn.microsoft.com/en-us/library/azure/dn574746.aspx 


Q17. - (Topic 3) 

You need to configure availability for the virtual machines that the company is migrating to Azure. 

What should you implement? 

A. Traffic Manager 

B. Express Route 

C. Update Domains 

D. Cloud Services 

Answer: B 

Explanation: ExpressRoute gives you a fast and reliable connection to Azure making it suitable for scenarios like periodic data migration, replication for business continuity, disaster recovery and other high availability strategies. It can also be a cost-effective option for transferring large amounts of data such as datasets for high performance computing applications or moving large VMs between your dev/test environment in Azure and on-premises production environment. 

Reference: ExpressRoute, Experience a faster, private connection to Azure 

http://azure.microsoft.com/en-us/services/expressroute/ 


Q18. HOTSPOT - (Topic 2) 

You need to design a data storage strategy for each application. 

In the table below, identify the strategy that you should use for each application. Make only one selection in each column. 


Answer: 



Q19. - (Topic 6) 

You are running a Linux guest in Azure Infrastructure-as-a-Service (IaaS). 

You must run a daily maintenance task. The maintenance task requires native BASH commands. 

You need to configure Azure Automation to perform this task. 

Which three actions should you perform? Each correct answer presents part of the solution. 

A. Create an automation account. 

B. Create an Orchestrator runbook. 

C. Create an asset credential. 

D. Run the Invoke-Workflow Azure PowerShell cmdlet. 

E. Import the SSH PowerShell Module. 

Answer: A,C,E 

Explanation: A: An Automation Account is a container for your Azure Automation resources: it provides a way to separate your environments or further organize your workflows. To create an Automation Account:

1. Log in to the Azure Management Portal.

2. In the Management Portal, click Create an Automation Account.

3. On the Add a New Automation Account page, enter a name and pick a region for the account.

Reference: Get started with Azure Automation http://azure.microsoft.com/en-gb/documentation/articles/automation-create-runbook-from-samples/ 

C: 

Asset credentials are either a username and password combination that can be used with Windows PowerShell commands or a certificate that is uploaded to Azure Automation. 

The Assets page in Automation displays the various resources (also called “settings”) that are globally available to be used in or associated with a runbook, plus commands to import an integration module, add a new asset, or delete an asset. Assets include variables, schedules, credentials, and connections. 

Reference: Getting Started with Azure Automation: Automation Assets 

http://azure.microsoft.com/blog/2014/07/29/getting-started-with-azure-automation-automation-assets-2/ 

E: 

Reference: Managing SSH enabled Linux hosts using Service Management Automation 

http://blogs.technet.com/b/orchestrator/archive/2014/05/01/managing-ssh-enabled-linux-hosts-using-service-management-automation.aspx 
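A hedged sketch of what the runbook ultimately does: run a native BASH command on the Linux guest over SSH. The host and user names below are placeholders, and the SSH keys or credentials (the asset credential described above) are assumed to be configured out of band.

```python
import subprocess

def run_remote_bash(host, user, command, dry_run=False):
    """Run a native BASH command on a Linux guest over SSH.

    host/user are placeholders; authentication is assumed to be set up
    separately (e.g. via an Automation asset credential or SSH keys).
    """
    argv = ["ssh", f"{user}@{host}", "bash", "-lc", command]
    if dry_run:
        return argv                      # inspect the invocation without connecting
    return subprocess.run(argv, capture_output=True, text=True, check=True).stdout

# Preview the invocation for a daily maintenance task (no connection made):
print(run_remote_bash("linux-vm.example.com", "azureuser",
                      "apt-get -y autoremove", dry_run=True))
```

The `dry_run` flag keeps the sketch self-contained; a scheduled runbook would call the function without it, on a daily schedule asset.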


Q20. - (Topic 6) 

You are designing an Azure application that stores data. 

You have the following requirements: 

The data storage system must support storing more than 500 GB of data.

Data retrieval must be possible from a large number of parallel threads.

Threads must not block each other.

You need to recommend an approach for storing data. 

What should you recommend? 

A. Azure Notification Hubs 

B. A single SQL database in Azure 

C. Azure Queue storage 

D. Azure Table storage 

Answer: D 

Explanation: * Azure Table Storage can be useful for applications that must store large amounts of nonrelational data, and need additional structure for that data. Tables offer key-based access to unschematized data at a low cost for applications with simplified data-access patterns. While Azure Table Storage stores structured data without schemas, it does not provide any way to represent relationships between the data. 

* As a solution architect/developer, consider using Azure Table Storage when: 

/ Your application stores and retrieves large data sets and does not have complex 

relationships that require server-side joins, secondary indexes, or complex server-side 

logic. 

/ You need to achieve a high level of scaling without having to manually shard your dataset. 
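A minimal in-memory sketch of Table-storage-style access, assuming nothing beyond the Python standard library: every entity is addressed by a (PartitionKey, RowKey) pair, so many parallel readers hit independent keys and do not block each other. The sample table and its values are invented for illustration.

```python
import concurrent.futures

# In-memory stand-in for a table: entities keyed by (PartitionKey, RowKey).
table = {
    ("sensor-eu", "001"): {"temp": 21.5},
    ("sensor-eu", "002"): {"temp": 19.0},
    ("sensor-us", "001"): {"temp": 25.1},
}

def get_entity(partition_key, row_key):
    """Point lookup by the composite key -- the cheap, scalable access path."""
    return table[(partition_key, row_key)]

# Many parallel readers, none blocking the others on a shared relational lock:
keys = list(table)
with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
    temps = list(pool.map(lambda k: get_entity(*k)["temp"], keys))
print(sorted(temps))  # [19.0, 21.5, 25.1]
```

The point of the sketch is the access pattern: key-based reads scale out across partitions without server-side joins or secondary indexes, which is exactly the trade-off the explanation describes.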

Reference: Azure Table Storage and Windows Azure SQL Database - Compared and Contrasted 

https://msdn.microsoft.com/en-us/library/azure/jj553018.aspx