Want to know about Actualtests 70-463 exam practice test features? Want to learn more about the Microsoft Implementing a Data Warehouse with Microsoft SQL Server 2012 certification experience? Study approved Microsoft 70-463 answers to updated 70-463 questions at Actualtests. Get success with an absolute guarantee to pass the Microsoft 70-463 (Implementing a Data Warehouse with Microsoft SQL Server 2012) test on your first attempt.

2021 Jun 70-463 course outline:

Q41. You are developing a data flow transformation to merge two data sources. One source contains product data and the other source contains data about the country in which the product was manufactured. Both data sources contain a two-character CountryCode column and both use SQL Server. Both data sources contain an ORDER BY clause to sort the data by the CountryCode column in ascending order. 

You use a Merge Join transformation to join the data. 


You need to ensure that the Merge Join transformation works correctly without additional transformations. 

What should you do? (Each correct answer presents part of the solution. Choose all that apply.) 

A. Change the ORDER BY clause on the product source to order by ProductName. 

B. Change the Merge Join transformation to a Merge transformation. 

C. Set the appropriate SortKeyPosition properties on the data sources. 

D. Set the IsSorted property on both data sources. 

Answer: C,D 
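
As an illustration of the pre-sorted inputs that the Merge Join transformation expects, a minimal sketch follows; the table name is hypothetical, and the IsSorted and SortKeyPosition settings are applied in the Advanced Editor of each source.

SELECT ProductID, ProductName, CountryCode
FROM   dbo.Product            -- hypothetical product source table
ORDER BY CountryCode;         -- both sources order by the two-character join key

-- In the Advanced Editor of each source: set IsSorted = True on the source output
-- and SortKeyPosition = 1 on the CountryCode output column, so the Merge Join
-- transformation accepts the inputs as already sorted.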


Q42. You are designing a data warehouse with two fact tables. The first table contains sales per month and the second table contains orders per day. 

Referential integrity must be enforced declaratively. 

You need to design a solution that can join a single time dimension to both fact tables. 

What should you do? 

A. Create a time mapping table. 

B. Change the level of granularity in both fact tables to be the same. 

C. Merge the fact tables. 

D. Create a view on the sales table. 

Answer: C 
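
For illustration, a minimal T-SQL sketch of declarative referential integrity between a single day-grain time dimension and a merged, day-grain fact table; all object names are hypothetical.

CREATE TABLE dbo.DimDate
(
    DateKey date NOT NULL PRIMARY KEY
);

CREATE TABLE dbo.FactSalesOrders
(
    DateKey     date  NOT NULL,
    OrderCount  int   NOT NULL,
    SalesAmount money NOT NULL,
    CONSTRAINT FK_FactSalesOrders_DimDate
        FOREIGN KEY (DateKey) REFERENCES dbo.DimDate (DateKey)  -- declarative RI
);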


Q43. To ease the debugging of packages, you standardize the SQL Server Integration Services (SSIS) package logging methodology. 

The methodology has the following requirements: 

Centralized logging in SQL Server 

Simple deployment 

Availability of log information through reports or T-SQL 

Automatic purge of older log entries 

Configurable log details 

You need to configure a logging methodology that meets the requirements while minimizing the amount of deployment and development effort. 

What should you do? 

A. Deploy the package by using an msi file. 

B. Use the gacutil command. 

C. Create an OnError event handler. 

D. Create a reusable custom logging component. 

E. Use the dtutil /copy command. 

F. Use the Project Deployment Wizard. 

G. Run the package by using the dtexec /rep /conn command. 

H. Add a data tap on the output of a component in the package data flow. 

I. Run the package by using the dtexec /dumperror /conn command. 

J. Run the package by using the dtexecui.exe utility and the SQL Log provider. 

K. Deploy the package to the Integration Services catalog by using dtutil and use SQL Server to store the configuration. 

Answer: I 

Explanation: References: http://msdn.microsoft.com/en-us/library/ms140246.aspx 

http://msdn.microsoft.com/en-us/library/hh231187.aspx 
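
For reference, when packages execute from the SSIS catalog, log detail is also available through T-SQL; a minimal query sketch against the SSISDB catalog views follows (the message_type filter values are the documented Warning and Error codes).

SELECT message_time,
       package_name,
       event_name,
       message
FROM   SSISDB.catalog.event_messages
WHERE  message_type IN (110, 120)   -- 110 = Warning, 120 = Error
ORDER BY message_time DESC;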


Q44. You manage a SQL Server Master Data Services (MDS) environment. 

A new application requires access to the product data that is available in the MDS repository. 

You need to design a solution that gives the application access to the product data with the least amount of development effort. 

What should you do? 

A. Use sp_addlinkedserver to add a linked server to access the MDS database tables directly. 

B. Create an OLE DB connection string that sets the Provider property to MDS. 

C. Use transactional replication for data synchronization. 

D. Create a Subscription View in MDS. 

Answer: D 
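
For illustration, a Subscription View created in MDS is exposed as an ordinary view in the MDS database, so the application can query it with plain T-SQL; the view and column names below are hypothetical.

SELECT Code, Name, Category, Subcategory
FROM   mdm.ProductLeafView;   -- hypothetical subscription view in the MDS database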


Q45. A SQL Server Integration Services (SSIS) package was deployed two weeks ago with the Project Deployment Model. 

Sometimes the package is started as part of a multistep SQL job. At other times, the package is started manually by a database administrator by using the Object Explorer in SQL Server Management Studio. 

You need to identify the authenticated user responsible for starting the package each time it executes. 

How can you find this information? 

A. In the SSISDB.[catalog], query the [executions] view. 

B. In the SSISDB.[catalog], query the [event_messages] view. 

C. In SQL Server Management Studio, view the SQL Agent Job History. 

D. In SQL Server Management Studio, view the SQL Agent Error Log. 

E. In SQL Server Management Studio, view the SQL Server Log. 

Answer: A 
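
For reference, a minimal query sketch against the catalog.executions view; its caller_name column records the authenticated user who started each execution, whether from a SQL Agent job step or manually from Object Explorer.

SELECT execution_id,
       package_name,
       caller_name,        -- authenticated user who started the execution
       executed_as_name,   -- identity the execution ran under
       start_time,
       status
FROM   SSISDB.catalog.executions
ORDER BY start_time DESC;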



Up to date 70-463 sample questions:

Q46. You are developing a SQL Server Integration Services (SSIS) project that copies a large number of rows from a SQL Azure database. The project uses the Package Deployment Model. This project is deployed to SQL Server on a test server. 

You need to ensure that the project is deployed to the SSIS catalog on the production server. 

What should you do? 

A. Open a command prompt and run the dtexec /dumperror /conn command. 

B. Create a reusable custom logging component and use it in the SSIS project. 

C. Open a command prompt and run the gacutil command. 

D. Add an OnError event handler to the SSIS project. 

E. Open a command prompt and execute the package by using the SQL Log provider and running the dtexecui.exe utility. 

F. Open a command prompt and run the dtexec /rep /conn command. 

G. Open a command prompt and run the dtutil /copy command. 

H. Use an msi file to deploy the package on the server. 

I. Configure the SSIS solution to use the Project Deployment Model. 

J. Configure the output of a component in the package data flow to use a data tap. 

K. Run the dtutil command to deploy the package to the SSIS catalog and store the configuration in SQL Server. 

Answer: I 

Explanation: References: http://msdn.microsoft.com/en-us/library/hh231102.aspx 

http://msdn.microsoft.com/en-us/library/hh213290.aspx 

http://msdn.microsoft.com/en-us/library/hh213373.aspx 
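
For illustration, after converting the solution to the Project Deployment Model, the resulting .ispac can also be deployed to the catalog from T-SQL with catalog.deploy_project; a minimal sketch, assuming a hypothetical folder name, project name, and file path, and that the catalog folder already exists.

DECLARE @ProjectBinary varbinary(max) =
    (SELECT BulkColumn
     FROM   OPENROWSET(BULK N'C:\Deploy\SalesETL.ispac', SINGLE_BLOB) AS ispac);

DECLARE @operation_id bigint;

EXEC SSISDB.catalog.deploy_project
     @folder_name    = N'Sales',       -- hypothetical catalog folder
     @project_name   = N'SalesETL',    -- hypothetical project name
     @project_stream = @ProjectBinary,
     @operation_id   = @operation_id OUTPUT;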


Q47. You are designing a partitioning strategy for a large fact table in a data warehouse. Tens of millions of new records are loaded into the data warehouse weekly, outside of business hours. 

Most queries are generated by reports and by cube processing. Data is frequently queried at the day level and occasionally at the month level. 

You need to partition the table to maximize the performance of queries. What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.) 

A. Partition the fact table by month, and compress each partition. 

B. Partition the fact table by week. 

C. Partition the fact table by year. 

D. Partition the fact table by day, and compress each partition. 

Answer: D 
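
For illustration, a minimal T-SQL sketch of daily partitioning with per-partition compression; object names and boundary values are hypothetical, and in practice one boundary value is added per day.

CREATE PARTITION FUNCTION pfSalesByDay (date)
    AS RANGE RIGHT FOR VALUES ('20120101', '20120102', '20120103');

CREATE PARTITION SCHEME psSalesByDay
    AS PARTITION pfSalesByDay ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales
(
    DateKey     date  NOT NULL,
    ProductKey  int   NOT NULL,
    SalesAmount money NOT NULL
) ON psSalesByDay (DateKey);

ALTER TABLE dbo.FactSales REBUILD PARTITION = ALL
    WITH (DATA_COMPRESSION = PAGE);   -- compress each partition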


Q48. You are designing a SQL Server Integration Services (SSIS) package that uses the Fuzzy Lookup transformation. 

The reference data to be used in the transformation does not change. 

You need to reuse the Fuzzy Lookup match index to increase performance and reduce maintenance. 

What should you do? 

A. Select the GenerateAndPersistNewIndex option in the Fuzzy Lookup Transformation Editor. 

B. Select the GenerateNewIndex option in the Fuzzy Lookup Transformation Editor. 

C. Select the DropExistingMatchIndex option in the Fuzzy Lookup Transformation Editor. 

D. Execute the sp_FuzzyLookupTableMaintenanceUninstall stored procedure. 

E. Execute the sp_FuzzyLookupTableMaintenanceInvoke stored procedure. 

Answer: A 

Reference: http://msdn.microsoft.com/en-us/library/ms137786.aspx 


Q49. You are performance tuning a SQL Server Integration Services (SSIS) package to load sales data from a source system into a data warehouse that is hosted on Windows Azure SQL Database. 

The package contains a data flow task that has seven source-to-destination execution trees. 

Only three of the source-to-destination execution trees are running in parallel. 

You need to ensure that all the execution trees run in parallel. 

What should you do? 

A. Set the EngineThreads property of the data flow task to 7. 

B. Set the MaxConcurrentExecutables property of the package to 7. 

C. Create seven data flow tasks that contain one source-to-destination execution tree each. 

D. Place the data flow task in a For Loop container that is configured to execute seven times. 

Answer: A 


Q50. You are reviewing the design of a customer dimension table in an existing data warehouse hosted on SQL Azure. 

The current dimension design does not allow the retention of historical changes to customer attributes such as Postcode. 

You need to redesign the dimension to enable the full historical reporting of changes to multiple customer attributes including Postcode. 

What should you do? 

A. Add StartDate and EndDate columns to the customer dimension. 

B. Add an IsCurrent column to the customer dimension. 

C. Enable Snapshot Isolation on the data warehouse. 

D. Add CurrentValue and PreviousValue columns to the customer dimension. 

Answer: A
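
For illustration, a minimal Type 2 slowly changing dimension sketch with StartDate and EndDate columns; object names are hypothetical. Each change to a tracked attribute such as Postcode closes the current row (sets its EndDate) and inserts a new row, preserving full history.

CREATE TABLE dbo.DimCustomer
(
    CustomerKey         int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- surrogate key
    CustomerBusinessKey int               NOT NULL,             -- source system key
    CustomerName        nvarchar(100)     NOT NULL,
    Postcode            nvarchar(10)      NOT NULL,
    StartDate           datetime2(0)      NOT NULL,
    EndDate             datetime2(0)      NULL                  -- NULL = current row
);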