Practice Exams | Microsoft DA-100: MS Data Analyst Associate

Price: $19.99

These tests are simulations of what the real exam will be like. If you ace these practice tests, you’ll be in good shape for the actual exam.

Each question has a detailed explanation and links to reference materials that support the answer, which helps ensure the accuracy of the solutions.

The questions will be shuffled each time you repeat the tests, so you will need to know why an answer is correct, not just that the correct answer was item “B” the last time you went through the test.

Data Analysts enable businesses to maximize the value of their data assets by using Microsoft Power BI. As subject matter experts, Data Analysts are responsible for designing and building scalable data models, cleaning and transforming data, and enabling advanced analytic capabilities that provide meaningful business value through easy-to-comprehend data visualizations. Data Analysts also collaborate with key stakeholders across verticals to deliver relevant insights based on identified business requirements.

The Data Analyst should have a fundamental understanding of data repositories and data processing both on-premises and in the cloud.

This exam measures your ability to accomplish the following technical tasks: prepare the data; model the data; visualize the data; analyze the data; and deploy and maintain deliverables.

Skills Measured on the Microsoft DA-100 Exam

Prepare the Data (20-25%)

Get data from different data sources

  • identify and connect to a data source

  • change data source settings

  • select a shared dataset or create a local dataset

  • select a storage mode

  • choose an appropriate query type

  • identify query performance issues

  • use Microsoft Dataverse

  • use parameters

  • use or create a PBIDS file

  • use or create a data flow

Profile the data

  • identify data anomalies

  • examine data structures

  • interrogate column properties

  • interrogate data statistics

Clean, transform, and load the data

  • resolve inconsistencies, unexpected or null values, and data quality issues

  • apply user-friendly value replacements

  • identify and create appropriate keys for joins

  • evaluate and transform column data types

  • apply data shape transformations to table structures

  • combine queries

  • apply user-friendly naming conventions to columns and queries

  • leverage Advanced Editor to modify Power Query M code

  • configure data loading

  • resolve data import errors

Model the Data (25-30%)

Design a data model

  • define the tables

  • configure table and column properties

  • define quick measures

  • flatten out a parent-child hierarchy

  • define role-playing dimensions

  • define a relationship’s cardinality and cross-filter direction

  • design the data model to meet performance requirements

  • resolve many-to-many relationships

  • create a common date table (see the sketch after this list)

  • define the appropriate level of data granularity
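
One of the items above, creating a common date table, can be sketched as a DAX calculated table. This is only an illustration: the Sales table and its OrderDate column are hypothetical names, and the result would still need to be marked as a date table and related to the fact table.

    // Calculated date table spanning the dates found in the (assumed) Sales fact table
    Date =
    ADDCOLUMNS (
        CALENDAR ( MIN ( Sales[OrderDate] ), MAX ( Sales[OrderDate] ) ),  // one row per day
        "Year", YEAR ( [Date] ),
        "Month Number", MONTH ( [Date] ),
        "Month", FORMAT ( [Date], "MMM yyyy" )
    )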

Develop a data model

  • apply cross-filter direction and security filtering

  • create calculated tables

  • create hierarchies

  • create calculated columns (see the sketch after this list)

  • implement row-level security roles

  • set up the Q&A feature
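
As a rough illustration of the calculated-column and row-level-security items above, the DAX snippets below assume hypothetical Sales and Customer tables. The first expression is a calculated column; the second is the kind of boolean filter you would attach to a security role on the Customer table, not a measure.

    // Calculated column on Sales: per-row margin (SalesAmount and TotalCost are assumed columns)
    Margin = Sales[SalesAmount] - Sales[TotalCost]

    // Row-level security filter for a role on Customer: each user sees only their own rows
    [Email] = USERPRINCIPALNAME ()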

Create measures by using DAX

  • use DAX to build complex measures (see the sketches after this list)

  • use CALCULATE to manipulate filters

  • implement Time Intelligence using DAX

  • replace numeric columns with measures

  • use basic statistical functions to enhance data

  • create semi-additive measures
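
To make the measure items above concrete, here is a minimal DAX sketch; the Sales, 'Date', and Inventory table and column names are assumptions rather than exam content, and the exact patterns depend on the model.

    // Base measure
    Total Sales = SUM ( Sales[SalesAmount] )

    // CALCULATE manipulates the filter context of an existing measure
    Sales West = CALCULATE ( [Total Sales], Sales[Region] = "West" )

    // Time intelligence: year-to-date total (requires a marked date table)
    Sales YTD = TOTALYTD ( [Total Sales], 'Date'[Date] )

    // Semi-additive measure: closing balance on the last date in context rather than a sum over time
    Closing Inventory = CALCULATE ( SUM ( Inventory[Quantity] ), LASTDATE ( 'Date'[Date] ) )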

Optimize model performance

  • remove unnecessary rows and columns

  • identify poorly performing measures, relationships, and visuals

  • improve cardinality levels by changing data types

  • improve cardinality levels through summarization

  • create and manage aggregations

Visualize the Data (20-25%)

Create reports

  • add visualization items to reports

  • choose an appropriate visualization type

  • format and configure visualizations

  • import a custom visual

  • configure conditional formatting

  • apply slicing and filtering

  • add an R or Python visual

  • configure the report page

  • design and configure for accessibility

  • configure automatic page refresh

Create dashboards

  • set mobile view

  • manage tiles on a dashboard

  • configure data alerts

  • use the Q&A feature

  • add a dashboard theme

  • pin a live report page to a dashboard

  • configure data classification

Enrich reports for usability

  • configure bookmarks

  • create custom tooltips

  • edit and configure interactions between visuals

  • configure navigation for a report

  • apply sorting

  • configure Sync Slicers

  • use the selection pane

  • use drillthrough and cross filter

  • drilldown into data using interactive visuals

  • export report data

  • design reports for mobile devices

Analyze the Data (10-15%)

Enhance reports to expose insights

  • apply conditional formatting

  • apply slicers and filters

  • perform top N analysis (see the sketch after this list)

  • explore statistical summary

  • use the Q&A visual

  • add a Quick Insights result to a report

  • create reference lines by using Analytics pane

  • use the Play Axis feature of a visualization

  • personalize visuals
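
For the top N analysis item, one common approach is a ranking or TOPN-based measure in DAX; the 'Product' table and the Total Sales measure below are hypothetical, and a visual-level Top N filter achieves the same result without code.

    // Rank each product by an assumed Total Sales measure
    Product Rank = RANKX ( ALL ( 'Product'[ProductName] ), [Total Sales] )

    // Keep only the top 5 products inside the measure itself
    Top 5 Product Sales =
    CALCULATE (
        [Total Sales],
        TOPN ( 5, ALL ( 'Product'[ProductName] ), [Total Sales] )
    )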

Perform advanced analysis

  • identify outliers (see the sketch after this list)

  • conduct Time Series analysis

  • use groupings and binnings

  • use the Key Influencers to explore dimensional variances

  • use the decomposition tree visual to break down a measure

  • apply AI Insights

Deploy and Maintain Deliverables (10-15%)

Manage datasets

  • configure a dataset scheduled refresh

  • configure row-level security group membership

  • provide access to datasets

  • configure incremental refresh settings

  • promote or certify Power BI content

  • identify downstream dataset dependencies

Create and manage workspaces

  • create and configure a workspace

  • recommend a development lifecycle strategy

  • assign workspace roles

  • configure and update a workspace app

  • publish, import, or update assets in a workspace

  • apply sensitivity labels to workspace content

  • use deployment pipelines

The exam is available in the following languages: English, Chinese (Simplified), Korean, and Japanese.
