Exam Code : DA-100
Exam Name : Analyzing Data with Microsoft Power BI
Vendor Name : Microsoft
DA-100 Dumps
DA-100 Braindumps
DA-100 Real Questions
DA-100 Practice Test
DA-100 Actual Questions
killexams.com
Analyzing Data with Microsoft Power BI
https://killexams.com/pass4sure/exam-detail/DA-100
Question: 221
You have a prospective customer list that contains 1,500 rows of data. The list contains the following fields:
First name
Last name
Email address
State/Region
Phone number
You import the list into Power Query Editor.
You need to ensure that the list contains records for each State/Region to which you want to target a marketing campaign.
Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Open the Advanced Editor.
B. Select Column quality.
C. Enable Column profiling based on entire dataset.
D. Select Column distribution.
E. Select Column profile.
Data Profiling, Quality & Distribution in Power BI / Power Query features
To enable these features, go to the View tab, then the Data Preview group, and check the following:
Column quality
Column distribution
Column profile
Column profile: turn on the Column profiling feature.
Column distribution: you can use it to see visually that your query is missing some data, based on the distinct and unique value counts.
Reference:
https://www.poweredsolutions.co/2019/08/13/data-profiling-quality-distribution-in-power-bi-power-query/
https://www.altentertraining.com/microsoft/power-bi/column-profiling-is-good/
Question: 222
You open a query in Power Query Editor.
You need to identify the percentage of empty values in each column as quickly as possible. Which Data Preview option should you select?
A. Show whitespace
B. Column profile
C. Column distribution
D. Column quality
Column quality: in this section, we can easily see the Valid, Error, and Empty percentages of the data values in the selected table.
Note: In Power Query Editor, under the View tab in the Data Preview section, we can see the following data profiling functionalities:
Column quality
Column distribution
Column profile
Reference: https://community.powerbi.com/t5/Community-Blog/Data-Profiling-in-Power-BI-Power-BI-Update-April-2019/ba-p/674555
Question: 223
DRAG DROP
You are building a dataset from a JSON file that contains an array of documents.
You need to import attributes as columns from all the documents in the JSON file. The solution must ensure that date attributes can be used as date hierarchies in Microsoft Power BI reports.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Explanation:
Step 1: Convert the list to a table.
Open Power BI Desktop, go to Power Query Editor, and import the JSON file. The array of documents is loaded as a list, so convert the list to a table of records first.
Step 2: Expand the records.
Expand the record column so that each attribute in the documents becomes its own column.
Step 3: Add columns that use data type conversions.
Set the data type of the date attributes to Date so that they can be used as date hierarchies in Microsoft Power BI reports.
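For illustration only, these three steps could look like the following Power Query M sketch; the file path, the attribute names (OrderId, OrderDate, Amount), and the step names are assumptions and not part of the question.
let
    // Load the JSON file; the array of documents arrives as a list of records
    Source = Json.Document(File.Contents("C:\data\documents.json")),
    // Step 1: convert the list to a single-column table of records
    AsTable = Table.FromList(Source, Splitter.SplitByNothing(), {"Document"}),
    // Step 2: expand the records so that each attribute becomes its own column
    Expanded = Table.ExpandRecordColumn(AsTable, "Document", {"OrderId", "OrderDate", "Amount"}),
    // Step 3: set the data type of the date attribute so Power BI can build a date hierarchy on it
    Typed = Table.TransformColumnTypes(Expanded, {{"OrderDate", type date}, {"Amount", type number}})
in
    Typed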
Question: 224
You have the tables shown in the following table.
The Impressions table contains approximately 30 million records per month. You need to create an ad analytics system to meet the following requirements:
Present ad impression counts for the day, campaign, and Site_name. The analytics for the last year are required.
Minimize the data model size.
Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
A. Group the impressions by Ad_id, Site_name, and Impression_date. Aggregate by using the CountRows function.
B. Create one-to-many relationships between the tables.
C. Create a calculated measure that aggregates by using the COUNTROWS function.
D. Create a calculated table that contains Ad_id, Site_name, and Impression_date.
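As a rough sketch of the grouping described in option A, the following Power Query M assumes an existing query named Impressions with the listed columns; the output column name ImpressionCount is hypothetical.
let
    // Group roughly 30 million impression rows down to one row per ad, site, and date
    Grouped = Table.Group(
        Impressions,
        {"Ad_id", "Site_name", "Impression_date"},
        {{"ImpressionCount", each Table.RowCount(_), Int64.Type}}
    )
in
    Grouped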
Question: 225
Your company has training videos that are published to Microsoft Stream. You need to surface the videos directly in a Microsoft Power BI dashboard.
Which type of tile should you add?
A. video
B. custom streaming data
C. text box
D. web content
The only way to visualize a streaming dataset is to add a tile and use the streaming dataset as a custom streaming data source.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming
Question: 226
Testlet 1 Case Study
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study
To display the first question on this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Overview
Litware, Inc. is an online retailer that uses Microsoft Power BI dashboards and reports.
The company plans to leverage data from Microsoft SQL Server databases, Microsoft Excel files, text files, and several other data sources.
Litware uses Azure Active Directory (Azure AD) to authenticate users.
Existing Environment
Sales Data
Litware has online sales data that has the SQL schema shown in the following table.
In the Date table, the date_id column has a format of yyyymmdd and the month column has a format of yyyymm. The week column in the Date table and the week_id column in the Weekly_Returns table have a format of yyyyww. The region_id column can be managed by only one sales manager.
Data Concerns
You are concerned with the quality and completeness of the sales data. You plan to verify the sales data for negative sales amounts.
Reporting Requirements
Litware identifies the following technical requirements:
Executives require a visual that shows sales by region.
Region managers require a visual to analyze weekly sales and returns.
Sales managers must be able to see the sales data of their respective region only.
The sales managers require a visual to analyze sales performance versus sales targets.
The sales department requires reports that contain the number of sales transactions.
Users must be able to see the month in reports as shown in the following example: Feb 2020.
The customer service department requires a visual that can be filtered by both sales month and ship month independently.
Question: 227
You need to create a calculated column to display the month based on the reporting requirements.
Which DAX expression should you use?
A. FORMAT('Date'[date], "MMM YYYY")
B. FORMAT('Date'[date], "M YY")
C. FORMAT('Date'[date_id], "MMM") & "" & FORMAT('Date'[year], "#")
D. FORMAT('Date'[date_id], "MMM YYYY")
Scenario: In the Date table, the date_id column has a format of yyyymmdd. Users must be able to see the month in reports as shown in the following example: Feb 2020.
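The question asks for a DAX expression, but as background only, the following Power Query M sketch shows how the yyyymmdd integer date_id from the scenario breaks down into a real date value; the query reference and the output column name are assumptions.
let
    // Split the yyyymmdd integer into year, month, and day parts and build a proper date
    WithDate = Table.AddColumn(
        #"Date",
        "date",
        each #date(
            Number.IntegerDivide([date_id], 10000),
            Number.IntegerDivide(Number.Mod([date_id], 10000), 100),
            Number.Mod([date_id], 100)
        ),
        type date
    )
in
    WithDate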
Question: 228
Question Set 3
You have a large dataset that contains more than 1 million rows. The table has a datetime column named Date. You need to reduce the size of the data model.
What should you do?
A. Round the hour of the Date column to startOfHour.
B. Change the data type of the Date column to Text.
C. Trim the Date column.
D. Split the Date column into two columns, one that contains only the time and another that contains only the date.
Keep separate date and time tables, and do not put the time into the date table, because the same times repeat every day. Split the DateTime column into separate date and time columns in the fact table so that you can join the date to the date table and the time to the time table. The time needs to be rounded to the nearest minute or second so that every time value in your data corresponds to a row in the time table.
Reference: https://intellipaat.com/community/6461/how-to-include-time-in-date-hierarchy-in-power-bi
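A minimal Power Query M sketch of the split, assuming an existing fact query named Sales whose Date column is of type datetime; the new column names SaleDate and SaleTime are hypothetical.
let
    // Date part only, for the relationship to the date table
    AddDate = Table.AddColumn(Sales, "SaleDate", each DateTime.Date([Date]), type date),
    // Time part rounded down to the minute, for the relationship to a time table
    AddTime = Table.AddColumn(AddDate, "SaleTime", each #time(Time.Hour([Date]), Time.Minute([Date]), 0), type time),
    // Remove the high-cardinality datetime column to shrink the model
    Result = Table.RemoveColumns(AddTime, {"Date"})
in
    Result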
Question: 229
You have a data model that contains many complex DAX expressions. The expressions contain frequent references to the RELATED and RELATEDTABLE functions.
You need to recommend a solution to minimize the use of the RELATED and RELATEDTABLE functions. What should you recommend?
A. Split the model into multiple models.
B. Hide unused columns in the model.
C. Merge tables by using Power Query.
D. Transpose.
Combining data means connecting to two or more data sources, shaping them as needed, then consolidating them into a useful query. When you have one or more columns that you’d like to add to another query, you merge the queries.
Note: The RELATEDTABLE function is a shortcut for the CALCULATETABLE function with no logical expression. CALCULATETABLE evaluates a table expression in a modified filter context and returns a table of values.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data
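As a sketch of option C, the following Power Query M merge brings lookup columns into the fact table so the DAX expressions no longer need RELATED; the query names (Sales, Product), the key column, and the expanded columns are assumptions.
let
    // Left outer join from the fact query to the lookup query on a shared key
    Merged = Table.NestedJoin(Sales, {"ProductKey"}, Product, {"ProductKey"}, "Product", JoinKind.LeftOuter),
    // Expand only the columns that the DAX previously fetched with RELATED
    Expanded = Table.ExpandTableColumn(Merged, "Product", {"Category", "Subcategory"})
in
    Expanded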
Question: 230
You have the following three versions of an Azure SQL database:
Test
Production
Development
You have a dataset that uses the development database as a data source.
You need to configure the dataset so that you can easily change the data source between the development, test, and production database servers from powerbi.com.
What should you do?
A. Create a JSON file that contains the database server names. Import the JSON file to the dataset.
B. Create a parameter and update the queries to use the parameter.
C. Create a query for each database server and hide the development tables.
D. Set the data source privacy level to Organizational and use the ReplaceValue Power Query M function.
With privacy level settings, you can specify an isolation level that defines the degree that one data source must be isolated from other data sources.
An Organizational data source limits the visibility of a data source to a trusted group of people. An Organizational data source is isolated from all Public data sources, but is visible to other Organizational data sources.
Reference: https://docs.microsoft.com/en-us/power-bi/admin/desktop-privacy-levels
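For comparison, the parameter-based approach described in option B could look like the following Power Query M sketch; the parameter name ServerName, the server address, and the database name are assumptions. A parameter created through Manage Parameters can later be edited from the dataset settings in the Power BI service.
// ServerName parameter (created via Home > Manage Parameters in Power Query Editor)
"dev-sqlserver.database.windows.net" meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]

// A query that uses the parameter instead of a hard-coded server name
let
    Source = Sql.Database(ServerName, "SalesDb"),
    Transactions = Source{[Schema = "dbo", Item = "Transactions"]}[Data]
in
    Transactions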
Question: 231
You have an Azure SQL database that contains sales transactions. The database is updated frequently.
You need to generate reports from the data to detect fraudulent transactions. The data must be visible within five minutes of an update.
How should you configure the data connection?
A. Add a SQL statement.
B. Set Data Connectivity mode to DirectQuery.
C. Set the Command timeout in minutes setting.
D. Set Data Connectivity mode to Import.
With Power BI Desktop, when you connect to your data source, it’s always possible to import a copy of the data into the Power BI Desktop. For some data sources, an alternative approach is available: connect directly to the data source using DirectQuery.
DirectQuery: No data is imported or copied into Power BI Desktop. For relational sources, the selected tables and columns appear in the Fields list. For multi-dimensional sources like SAP Business Warehouse, the dimensions and measures of the selected cube appear in the Fields list. As you create or interact with a visualization, Power BI Desktop queries the underlying data source, so you’re always viewing current data.
Incorrect Answers:
D: Import: The selected tables and columns are imported into Power BI Desktop. As you create or interact with a visualization, Power BI Desktop uses the imported data. To see underlying data changes since the initial import or the most recent refresh, you must refresh the data, which imports the full dataset again.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-use-directquery
Question: 232
Testlet 2 Case Study
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However, there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to the next section of the exam. After you begin a new section, you cannot return to this section.
To start the case study
To display the first question on this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Overview
Existing Environment
Contoso, Ltd. is a manufacturing company that produces outdoor equipment. Contoso has quarterly board meetings for which financial analysts manually prepare Microsoft Excel reports, including profit and loss statements for each of the company’s four business units, a company balance sheet, and net income projections for the next quarter.
Data and Sources
Data for the reports comes from three sources. Detailed revenue, cost, and expense data comes from an Azure SQL database. Summary balance sheet data comes from Microsoft Dynamics 365 Business Central. The balance sheet data is not related to the profit and loss results, other than that they both relate to dates.
Monthly revenue and expense projections for the next quarter come from a Microsoft SharePoint Online list. Quarterly projections relate to the profit and loss results by using the following shared dimensions: date, business unit, department, and product category.
Net Income Projection Data
Net income projection data is stored in a SharePoint Online list named Projections in the format shown in the following table.
Revenue projections are set at the monthly level and summed to show projections for the quarter.
Balance Sheet Data
The balance sheet data is imported with final balances for each account per month in the format shown in the following table.
There is always a row for each account for each month in the balance sheet data.
Dynamics 365 Business Central Data
Business Central contains a product catalog that shows how products roll up to product categories, which roll up to business units.
Revenue data is provided at the date and product level. Expense data is provided at the date and department level.
Business Issues
Historically, it has taken two analysts a week to prepare the reports for the quarterly board meetings. Also, there is usually at least one issue each quarter where a value in a report is wrong because of a bad cell reference in an Excel formula. On occasion, there are conflicting results in the reports because the products and departments that roll up to each business unit are not defined consistently.
Requirements
Planned Changes
Contoso plans to automate and standardize the quarterly reporting process by using Microsoft Power BI. The company wants to reduce how long it takes to populate reports to less than two days. The company wants to create common logic for business units, products, and departments to be used across all reports, including, but not limited to, the quarterly reporting for the board.
Technical Requirements
Contoso wants the reports and datasets refreshed with minimal manual effort.
The company wants to provide a single package of reports to the board that contains custom navigation and links to supplementary information.
Maintenance, including manually updating data and access, must be minimized as much as possible.
Security Requirements
The reports must be made available to the board from powerbi.com. A mail-enabled security group will be used to share information with the board.
The analysts responsible for each business unit must see all the data the board sees, except the profit and loss data, which must be restricted to only their business unit’s data. The analysts must be able to build new reports from the dataset that contains the profit and loss data, but any reports that the analysts build must not be included in the quarterly reports for the board. The analysts must not be able to share the quarterly reports with anyone.
Report Requirements
You plan to relate the balance sheet to a standard date table in Power BI in a many-to-one relationship based on the last day of the month. At least one of the balance sheet reports in the quarterly reporting package must show the ending balances for the quarter, as well as for the previous quarter.
Projections must contain a column named RevenueProjection that contains the revenue projection amounts.
A relationship must be created from Projections to a table named Date that contains the columns shown in the following table.
The relationships between products and departments to business units must be consistent across all reports. The board must be able to get the following information from the quarterly reports:
Revenue trends over time
Ending balances for each account
A comparison of expenses versus projections by quarter
Changes in long-term liabilities from the previous quarter
A comparison of quarterly revenue versus the same quarter during the prior year
What is the minimum number of datasets and storage modes required to support the reports?
A. two imported datasets
B. a single DirectQuery dataset
C. two DirectQuery datasets
D. a single imported dataset
Scenario (Data and Sources): Data for the reports comes from three sources. Detailed revenue, cost, and expense data comes from an Azure SQL database. Summary balance sheet data comes from Microsoft Dynamics 365 Business Central. The balance sheet data is not related to the profit and loss results, other than that they both relate to dates.
Monthly revenue and expense projections for the next quarter come from a Microsoft SharePoint Online list. Quarterly projections relate to the profit and loss results by using the following shared dimensions: date, business unit, department, and product category.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/service-datasets-understand
Question: 233
You import two Microsoft Excel tables named Customer and Address into Power Query. Customer contains the following columns:
Customer ID
Customer Name
Phone
Email Address
Address ID
Address contains the following columns:
Address ID
Address Line 1
Address Line 2
City
State/Region
Country
Postal Code
The Customer ID and Address ID columns represent unique rows.
You need to create a query that has one row per customer. Each row must contain City, State/Region, and Country for each customer.
What should you do?
A. Merge the Customer and Address tables.
B. Transpose the Customer and Address tables.
C. Group the Customer and Address tables by the Address ID column.
D. Append the Customer and Address tables.
There are two primary ways of combining queries: merging and appending.
When you have one or more columns that you’d like to add to another query, you merge the queries.
When you have additional rows of data that you’d like to add to an existing query, you append the query.
Reference: https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-shape-and-combine-data
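A minimal Power Query M sketch of the merge described in option A, using the column names from the question; the step names are hypothetical. Because Address ID represents unique rows in Address, a left outer join keeps exactly one row per customer.
let
    // Merge Address into Customer on the shared Address ID key
    Merged = Table.NestedJoin(Customer, {"Address ID"}, Address, {"Address ID"}, "Address", JoinKind.LeftOuter),
    // Expand only the columns the query needs
    Expanded = Table.ExpandTableColumn(Merged, "Address", {"City", "State/Region", "Country"})
in
    Expanded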