70-470 Exam Questions - Online Test



Q1. - (Topic 9) 

You are designing a partitioning strategy for a large fact table in a data warehouse. Tens of millions of new records are loaded into the data warehouse weekly, outside of business hours. Most queries are generated by reports and by cube processing. Data is frequently queried at the day level and occasionally at the month level. 

You need to partition the table to maximize the performance of queries. 

What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.) 

A. Partition the fact table by month, and compress each partition. 

B. Partition the fact table by week. 

C. Partition the fact table by year. 

D. Partition the fact table by day, and compress each partition. 

Answer:

Q2. - (Topic 10) 

You are developing a SQL Server Analysis Services (SSAS) cube. The cube contains several dimensions, a local measure group, and a linked measure group. Both measure groups use MOLAP partitions. 

You need to write-enable one of the linked measure group partitions to support Microsoft Excel 2010 PivotTable What-If Analysis. 

What should you do before the partition can be write-enabled? 

A. Implement the linked measure group as a local measure group. 

B. Implement the local measure group as a linked measure group. 

C. Set the Type property of the partition's measure group to Forecast. 

D. Set the StorageMode property of the linked measure group to Rolap. 

Answer:

Q3. DRAG DROP - (Topic 10) 

You are planning the installation of PowerPivot for SharePoint. 

You install SharePoint Server 2010 Enterprise Edition with Service Pack 1. 

You need to install the PowerPivot for SharePoint instance. Then you need to configure the Default Account username used to provision shared services in the SharePoint farm. 

Which three actions should you perform in sequence? (To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.) 

Answer:  

Q4. - (Topic 5) 

You need to create the Package Activity report. 

What should you do? 

A. Create a log table and use SSIS event handlers to write to the log table. Then create an SSRS report that uses the log table. 

B. Use the SSIS log provider for SQL Server. Then create an SSRS report that uses the sysssislog table. 

C. Create a log table and build a custom log provider to write to the log table. Then create an SSRS report that uses the log table. 

D. Create an SSRS report that uses the catalog.executions and catalog.execution_data_statistics views. 

Answer:

Q5. - (Topic 2) 

You need to develop an SSRS report that retrieves currency exchange rate data. 

How should you configure the data source for the report? 

A. Use the Windows Azure SQL Database data source type and then set Windows authentication for the credentials. 

B. Use the Windows Azure SQL Database data source type and then set a username and password for the credentials. 

C. Use the SQL Server data source type and then set a username and password for the credentials. 

D. Use the SQL Server data source type and then set Windows authentication for the credentials. 

Answer:

Topic 3, Tailspin Toys Case A 

Background 

You are the business intelligence (BI) solutions architect for Tailspin Toys. 

You produce solutions by using SQL Server 2012 Business Intelligence edition and Microsoft SharePoint Server 2010 Service Pack 1 (SP1) Enterprise edition. 

Technical Background 

Data Warehouse 

The data warehouse is deployed on a SQL Server 2012 relational database. A subset of the data warehouse schema is shown in the exhibit. (Click the Exhibit button.) 

The schema shown does not include the table design for the product dimension. 

The schema includes the following tables: 

. The FactSalesPlan table stores data at month-level granularity. There are two scenarios: Forecast and Budget. 

. The DimDate table stores a record for each date from the beginning of the company's operations through to the end of the next year. 

. The DimRegion table stores a record for each sales region, classified by country. Sales regions do not relocate to different countries. 

. The DimCustomer table stores a record for each customer. 

. The DimSalesperson table stores a record for each salesperson. If a salesperson relocates to a different region, a new salesperson record is created to support historically accurate reporting. A new salesperson record is not created if a salesperson's name changes. 

. The DimScenario table stores one record for each of the two planning scenarios. 

All relationships between tables are enforced by foreign keys. The schema design is as denormalized as possible for simplicity and accessibility. One exception to this is the DimRegion table, which is referenced by two dimension tables. 

Each product is classified by a category and subcategory and is uniquely identified in the source database by using its stock-keeping unit (SKU). A new SKU is assigned to a product if its size changes. Products are never assigned to a different subcategory, and subcategories are never assigned to a different category. 

Extract, transform, load (ETL) processes populate the data warehouse every 24 hours. 

ETL Processes 

One SQL Server Integration Services (SSIS) package is designed and developed to populate each data warehouse table. The primary source of data is extracted from a SQL Azure database. Secondary data sources include a Microsoft Dynamics CRM 2011 on-premises database. ETL developers develop packages by using the SSIS project deployment model. The ETL developers are responsible for testing the packages and producing a deployment file. The deployment file is given to the ETL administrators. The ETL administrators belong to a Windows security group named SSISOwners that maps to a SQL Server login named SSISOwners. 

Data Models 

The IT department has developed and manages two SQL Server Analysis Services (SSAS) BI Semantic Model (BISM) projects: Sales Reporting and Sales Analysis. The Sales Reporting database has been developed as a tabular project. The Sales Analysis database has been developed as a multidimensional project. Business analysts use PowerPivot for Microsoft Excel to produce self-managed data models based directly on the data warehouse or the corporate data models, and publish the PowerPivot workbooks to a SharePoint site. 

The sole purpose of the Sales Reporting database is to support business user reporting and ad-hoc analysis by using Power View. The database is configured for DirectQuery mode and all model queries result in SSAS querying the data warehouse. The database is based on the entire data warehouse. 

The Sales Analysis database consists of a single SSAS cube named Sales. The Sales cube has been developed to support sales monitoring, analysis, and planning. The Sales cube metadata is shown in the following graphic. 

Details of specific Sales cube dimensions are described in the following table. 

The Sales cube dimension usage is shown in the following graphic. 

The Sales measure group is based on the FactSales table. The Sales Plan measure group is based on the FactSalesPlan table. The Sales Plan measure group has been configured with a multidimensional OLAP (MOLAP) writeback partition. Both measure groups use MOLAP partitions, and aggregation designs are assigned to all partitions. Because the volumes of data in the data warehouse are large, an incremental processing strategy has been implemented. 

The Sales Variance calculated member is computed by subtracting the Sales Plan forecast amount from Sales. The Sales Variance % calculated member is computed by dividing Sales Variance by Sales. The cube's Multidimensional Expressions (MDX) script does not set any color properties. 
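The two calculated members described above amount to simple arithmetic. The following Python sketch illustrates the computation with hypothetical figures; in the cube, the real values come from the Sales measure group and the Forecast scenario of the Sales Plan measure group.

```python
# Hypothetical figures standing in for cube measures.
sales = 120000.0
sales_plan_forecast = 100000.0

# Sales Variance = Sales - Sales Plan forecast amount
sales_variance = sales - sales_plan_forecast

# Sales Variance % = Sales Variance / Sales
sales_variance_pct = sales_variance / sales

print(sales_variance)      # 20000.0
print(sales_variance_pct)
```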

Analysis and Reporting 

SQL Server Reporting Services (SSRS) has been configured in SharePoint integrated mode. 

A business analyst has created a PowerPivot workbook named Manufacturing Performance that integrates data from the data warehouse and manufacturing data from an operational database hosted in SQL Azure. The workbook has been published in a PowerPivot Gallery library in SharePoint Server and does not contain any reports. The analyst has scheduled daily data refresh from the SQL Azure database. Several SSRS reports are based on the PowerPivot workbook, and all reports are configured with a report execution mode to run on demand. 

Recently, users have noticed that data in the PowerPivot workbooks published to SharePoint Server is not being refreshed. The SharePoint administrator has identified that the Secure Store Service target application used by the PowerPivot unattended data refresh account has been deleted. 

Business Requirements 

ETL Processes 

All ETL administrators must have full privileges to administer and monitor the SSIS catalog, and to import and manage projects. 

Data Models 

The budget and forecast values must never be accumulated when querying the Sales cube. Queries should return the forecast sales values by default. 

Business users have requested that a single field named SalespersonName be made available to report the full name of the salesperson in the Sales Reporting data model. 

Writeback is used to initialize the budget sales values for a future year and is based on a weighted allocation of the sales achieved in the previous year. 
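The weighted allocation described here is ordinary proportional arithmetic: each member receives a share of the new budget total in proportion to its share of the previous year's sales. A minimal Python sketch, using hypothetical region names and figures:

```python
# Hypothetical prior-year sales by region and a budget total to allocate.
prior_year_sales = {"North": 50000.0, "South": 30000.0, "West": 20000.0}
budget_total = 110000.0

total = sum(prior_year_sales.values())

# Each region's budget is proportional to its share of prior-year sales.
budget = {region: budget_total * amount / total
          for region, amount in prior_year_sales.items()}

print(budget)  # {'North': 55000.0, 'South': 33000.0, 'West': 22000.0}
```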

Analysis and Reporting 

Reports based on the Manufacturing Performance PowerPivot workbook must deliver data that is no more than one hour old. 

Management has requested a new report named Regional Sales. This report must be based on the Sales cube and must allow users to filter by a specific year and present a grid with every region on the columns and the Products hierarchy on the rows. The hierarchy must initially be collapsed and allow the user to drill down through the hierarchy to analyze sales. Additionally, sales values that are less than $5000 must be highlighted in red. 

Technical Requirements 

Data Warehouse 

Business logic in the form of calculations should be defined in the data warehouse to ensure consistency and availability to all data modeling experiences. 

The schema design should remain as denormalized as possible and should not include unnecessary columns. 

The schema design must be extended to include the product dimension data. 

ETL Processes 

Package executions must log only data flow component phases and errors. 

Data Models 

Processing time for all data models must be minimized. 

A key performance indicator (KPI) must be added to the Sales cube to monitor sales performance. The KPI trend must use the Standard Arrow indicator to display improving, static, or deteriorating Sales Variance % values compared to the previous time period. 

Analysis and Reporting 

IT developers must create a library of SSRS reports based on the Sales Reporting database. A shared SSRS data source named Sales Reporting must be created in a SharePoint data connections library. 

Q6. DRAG DROP - (Topic 10) 

You are developing a SQL Server Analysis Services (SSAS) tabular project. 

You need to add a calculated column to a table in the model. 

Which three actions should you perform in sequence? (To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.) 

Answer:  

Q7. - (Topic 10) 

You manage an environment that has SharePoint Server 2010 and a SQL Server Reporting Services (SSRS) instance in SharePoint integrated mode. Several report subscriptions are configured to deliver reports through a shared folder by using a shared schedule. The shared folder will be going offline. 

You need to temporarily suspend the shared schedule until the shared folder is brought back online. 

What should you do? 

A. In SharePoint Central Administration, pause the shared schedule. 

B. Open Report Manager and then delete the shared schedule. 

C. In SharePoint Central Administration, delete the shared schedule. 

D. Open Report Manager and then pause the shared schedule. 

Answer:

Q8. - (Topic 10) 

You are developing a SQL Server Analysis Services (SSAS) tabular project. The model has tables named Invoice Line Items and Products. 

The Invoice Line Items table has the following columns: 

. Product Id 

. Unit Sales Price 

The Unit Sales Price column stores the unit price of the product sold. The Products table has the following columns: 

. Product Id 

. Minimum Sales Price 

The Minimum Sales Price column is available only in the Products table. 

You add a column named Is Undersell to the Invoice Line Items table. The Is Undersell column must store a value of TRUE if the value of the Unit Sales Price is less than the value of the Minimum Sales Price. Otherwise, a value of FALSE must be stored. 

You need to define the Data Analysis Expressions (DAX) expression for the Is Undersell column. 

Which DAX formula should you use? (Each answer represents a complete solution. Choose all that apply.) 

A. =IF([Unit Sales Price] < RELATED(Products[Minimum Sales Price]), TRUE, FALSE) 

B. =IF(RELATED(Products[Unit Sales Price]) < [Minimum Sales Price], TRUE, FALSE) 

C. =IF([Unit Sales Price] < LOOKUPVALUE(Products[Minimum Sales Price], Products[Product Id], [Product Id]), TRUE, FALSE) 

D. =IF(LOOKUPVALUE(Products[Unit Sales Price], Products[Product Id], [Product Id]) < [Minimum Sales Price]), TRUE, FALSE) 

Answer: A,C 

Explanation: 

A: The RELATED function returns a related value from another table. 

* The RELATED function requires that a relationship exists between the current table and the table with related information. You specify the column that contains the data that you want, and the function follows an existing many-to-one relationship to fetch the value from the specified column in the related table. 

C: The LOOKUPVALUE function returns the value in result_columnName for the row that meets all criteria specified by search_columnName and search_value. 

Syntax: 

LOOKUPVALUE( <result_columnName>, <search_columnName>, <search_value>[, <search_columnName>, <search_value>]…) 

Note: The syntax of DAX formulas is very similar to that of Excel formulas, and uses a combination of functions, operators, and values. 
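Outside of DAX, the per-row semantics of the two correct answers can be illustrated in plain Python. This is only an analogy with hypothetical data, not DAX itself: RELATED follows the existing many-to-one relationship on Product Id, while LOOKUPVALUE searches Products[Product Id] for the current row's Product Id; either way, Unit Sales Price is then compared against the matching Minimum Sales Price.

```python
# Hypothetical rows standing in for the two tabular-model tables.
products = [
    {"Product Id": 1, "Minimum Sales Price": 10.0},
    {"Product Id": 2, "Minimum Sales Price": 25.0},
]
invoice_line_items = [
    {"Product Id": 1, "Unit Sales Price": 8.0},   # below minimum
    {"Product Id": 2, "Unit Sales Price": 30.0},  # at or above minimum
]

# LOOKUPVALUE analogue: find the Minimum Sales Price where
# Products[Product Id] equals the current row's Product Id.
def minimum_sales_price(product_id):
    return next(p["Minimum Sales Price"] for p in products
                if p["Product Id"] == product_id)

# Is Undersell: TRUE when Unit Sales Price < Minimum Sales Price.
for row in invoice_line_items:
    row["Is Undersell"] = (
        row["Unit Sales Price"] < minimum_sales_price(row["Product Id"])
    )

print([row["Is Undersell"] for row in invoice_line_items])  # [True, False]
```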

Q9. - (Topic 9) 

A company runs SQL Server Database Engine and SQL Server Reporting Services (SSRS) in native mode. Reports are based on data that is cached in multiple shared datasets. Source data is purged each day at midnight for regulatory compliance purposes. The shared datasets may continue to cache data that should not be used in reports. Shared report schedules are often paused during nightly server maintenance windows. 

Reports must not return purged data. 

You need to create a fully automated solution to ensure that reports do not deliver purged data. 

What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.) 

A. Create a shared schedule. Configure the datasets to expire on the shared schedule. 

B. Write a script that calls the flushcache method to clear individual items from the SSRS cache. Create a SQL Server Agent job that runs rs.exe with the script as an input file, and schedule the job to run every day after the purge process completes. 

C. Create a SQL Server Agent job that uses a Transact-SQL (T-SQL) step to delete the data from the dbo.ExecutionCache table in the ReportServerTempDB database. Schedule the job to run every day after the purge process completes. 

D. Republish the cached datasets by using SQL Server Data Tools. 

Answer:

Q10. - (Topic 10) 

You are troubleshooting query performance for a SQL Server Analysis Services (SSAS) cube. 

A user reports that a Multidimensional Expressions (MDX) query is very slow. 

You need to identify the MDX query statement in a trace by using SQL Server Profiler. 

Which event class should you use? 

A. Progress Report Begin 

B. Query Begin 

C. Execute MDX Script Begin 

D. Calculate Non Empty Begin 

E. Get Data From Aggregation 

F. Query Subcube 

Answer: