CDW110 Actual Exam | Questions With 100% Correct Answers | Verified | Latest Update
1.(General Reporting Tips) If a query refers to more than one table, all columns should be prefixed by a descriptor (table name or alias) - Correct Answer-Using descriptors ensures you have unambiguous column references, preventing issues that can occur when two tables contain columns with the same name.
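For example, a minimal sketch of this habit (the table and column names are illustrative, not confirmed against your Caboodle version; check the Caboodle Dictionary for the real ones):

    -- Two dimension tables could each contain a column named Name;
    -- alias prefixes keep every column reference unambiguous.
    SELECT p.Name AS PatientName,
           d.DepartmentName
    FROM FullAccess.EncounterFact AS e
    INNER JOIN FullAccess.PatientDim AS p
            ON p.DurableKey = e.PatientDurableKey  -- durable-key joins may also need an IsCurrent filter
    INNER JOIN FullAccess.DepartmentDim AS d
            ON d.DepartmentKey = e.DepartmentKey;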
2.Chapter 1. (Study Checklist) Caboodle Console - Correct Answer-The Caboodle Console is a web application housed on the Caboodle server. It includes the following:
· Dictionary
· Dictionary Editor
· Executions
· Work Queue
· Configuration
3.Chapter 1. (Study Checklist) Data Warehouse - Correct Answer-In a data warehouse, multiple sources may load data pertaining to a single entity. This means that more than one package may populate a given row in a Caboodle table. As a result, there may be multiple business key values associated with a single entity in a Caboodle table.
4.Chapter 1. (Study Checklist) ETL - Correct Answer-Extract, Transform, Load
5.Chapter 1. (Study Checklist) SSIS Package - Correct Answer-The architecture of Caboodle includes a staging database and a reporting database. Data is extracted from source systems (like Clarity), transformed in the staging database, and presented for users in the reporting database. This movement of data is realized via a set of SQL Server Integration Services (SSIS) packages.
6.Chapter 1. (Study Checklist) Data Lineage - Correct Answer-Generally, data lineage refers to the process of identifying the source of a specific piece of information. In Caboodle, data lineage is defined at the package level.
7.Chapter 1. (Study Checklist) Star Schema - Correct Answer-The standard schema for a dimensional data model. The name refers to the image of a fact table surrounded by many linked dimension tables, which loosely resembles a star. The Caboodle data model structure is based on a star schema, where one central fact table joins to many associated lookup or dimension tables. This structure provides the foundation of the Caboodle data model.
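As a sketch of the star-schema pattern in a query (one central fact, many dimensions; table and column names are assumptions to confirm in the Caboodle Dictionary):

    SELECT dd.DepartmentName,
           COUNT(*) AS EncounterCount
    FROM FullAccess.EncounterFact AS ef
    INNER JOIN FullAccess.DepartmentDim AS dd ON dd.DepartmentKey = ef.DepartmentKey
    INNER JOIN FullAccess.DateDim AS dt ON dt.DateKey = ef.DateKey
    WHERE dt.Year = 2024        -- filter on a dimension attribute
    GROUP BY dd.DepartmentName; -- aggregate the fact by a dimension attribute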
8.Chapter 1. (Study Checklist) DMC - Correct Answer-DATA MODEL COMPONENT
No table in Caboodle "stands alone." Each is considered part of a Data Model Component, which refers to the collection of metadata tables that support the ETL process and reporting views stored in the FullAccess schema. Each DMC gets a type. Strict table naming conventions are followed in Caboodle, so that a table's suffix provides information about its structure and purpose (see the sketch after this list).
These suffixes are:
· Dim for dimensions (e.g. PatientDim)
· Fact for facts (e.g. EncounterFact)
· Bridge for bridges (e.g. DiagnosisBridge)
· DataMart for data marts (e.g. HospitalReadmissionDataMart)
· AttributeValueDim for EAV tables (e.g. PatientAttributeValueDim)
· X for custom tables (e.g. CustomFactX)
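Because the suffixes are consistent, you can browse tables by type with a standard metadata query. This sketch uses INFORMATION_SCHEMA, which is plain SQL Server rather than anything Caboodle-specific:

    SELECT TABLE_NAME
    FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_SCHEMA = 'dbo'
      AND TABLE_NAME LIKE '%Fact'  -- swap the suffix: '%Dim', '%Bridge', '%DataMart', '%X'
    ORDER BY TABLE_NAME;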
9.Chapter 1. (Study Checklist) Staging Database - Correct Answer-The Caboodle database into which records are loaded by SSIS packages and stored procedures.
10.Chapter 1. (Study Checklist) Reporting Database - Correct Answer-The architecture of Caboodle includes a staging database and a reporting database. Data is extracted from source systems (like Clarity), transformed in the staging database, and presented for users in the reporting database. This movement of data is realized via a set of SQL Server Integration Services (SSIS) packages.
11.Chapter 1. (Study Checklist) Dbo Schema - Correct Answer-STAGING DATABASE: Import tables and Mapping tables live here. This schema is primarily used by administrators for moving data into Caboodle.
REPORTING DATABASE: The dbo schema stores reporting data and acts as the data source for SlicerDicer. The Caboodle Dictionary reflects the contents of the dbo schema.
12.Chapter 1. (Study Checklist) FullAccess Schema - Correct Answer-STAGING DATABASE: The FullAccess schema does not exist on the staging database.
REPORTING DATABASE: The FullAccess schema houses views that simplify reporting. FullAccess should be your default schema when reporting (see the sketch below).
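A minimal sketch of that habit; DepartmentDim and its columns are assumed names to confirm in the Caboodle Dictionary:

    -- The FullAccess view exposes the same reporting content as the
    -- underlying dbo table, so qualify report queries with FullAccess.
    SELECT TOP 10 DepartmentKey, DepartmentName
    FROM FullAccess.DepartmentDim;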
13.(ETL Terms) Execution - Correct Answer-An execution is the process that extracts data from a source system using packages, transforms the data in the staging database, and loads it to Caboodle for reporting. You create and run executions in the Caboodle Console.
14.(ETL Terms) Extract - Correct Answer-Extracts to Caboodle from Clarity can be either backfill or incremental. Backfill extracts load or reload every row in a table from Clarity, whereas incremental extracts load only changed rows. Existing data is available while extracts are in progress.
15.(ETL Terms)package - Correct Answer-A package is a definition of an extract of data from one specific source to a specific import table. For example, a fact might have packages for Epic inpatient data, Epic outpatient data, and several non-Epic data sources. Packages are defined in SSIS .dtsx files.
16.Chapter 1. (Study Checklist) Identify key characteristics of the dimensional data model. - Correct Answer-· Made for report writers · Simpler and more intuitive · Easily extensible · More performant.
17.Chapter 1. (Study Checklist) Identify documentation resources for reporting out of Caboodle - Correct Answer-· Caboodle Dictionary · Reporting with Caboodle document · Caboodle ER diagram
18.Chapter 1. (Study Checklist) Identify reporting needs that best fit Caboodle - Correct Answer-Custom data packages can be written by Caboodle developers to accommodate your organization's reporting needs.
19.(General Reporting Tips) Add a filter to most queries to exclude Caboodle's special rows for unspecified, not applicable, and deleted records, which have surrogate keys of -1, -2, and -3 - Correct Answer-Include only rows where the key is greater than 0.
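For example, a sketch in which the fact and key names are illustrative:

    -- Keys of -1 (unspecified), -2 (not applicable), and -3 (deleted) point
    -- to Caboodle's special rows; one "> 0" condition excludes all of them.
    SELECT ef.DepartmentKey
    FROM FullAccess.EncounterFact AS ef
    WHERE ef.DepartmentKey > 0;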
20.(General Reporting Tips) Caboodle has a numbers table, NumbersDim, that you can use as needed in your reports - Correct Answer-NumbersDim contains the integers from 1 to 1,000,000, which you can reference to help manipulate strings and complete other processes. If you need more than 1,000,000 rows to accomplish a task, you can refer to NumbersDim multiple times in your query.
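A sketch of one common use, splitting a string into characters; the value column is assumed here to be named NumberValue, so confirm the real column name in the Caboodle Dictionary:

    -- One output row per character position in the input string.
    SELECT n.NumberValue AS Position,
           SUBSTRING('CABOODLE', n.NumberValue, 1) AS Letter
    FROM FullAccess.NumbersDim AS n
    WHERE n.NumberValue BETWEEN 1 AND LEN('CABOODLE');
    -- For more than 1,000,000 rows, CROSS JOIN NumbersDim to itself.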
21.Chapter 1. (Study Checklist) How does Epic data flow into Caboodle - Correct Answer-Epic data moves between several databases before it gets to Caboodle.
CHRONICLES flows into CLARITY via ETL. After transformation, the data is stored in a relational database on a separate server. Even though the structures of the Chronicles and Clarity databases differ significantly, the ETL process preserves the relationships mapped in Chronicles.
CLARITY flows into CABOODLE via ETL: data is extracted from Clarity, transformed in the staging database, and presented for users in the reporting database.