- SQL Server Reference Guide
- Introduction
- SQL Server Reference Guide Overview
- Table of Contents
- Microsoft SQL Server Defined
- SQL Server Editions
- SQL Server Access
- InformIT Articles and Sample Chapters
- Online Resources
- Microsoft SQL Server Features
- SQL Server Books Online
- Clustering Services
- Data Transformation Services (DTS) Overview
- Replication Services
- Database Mirroring
- Natural Language Processing (NLP)
- Analysis Services
- Microsoft SQL Server Reporting Services
- XML Overview
- Notification Services for the DBA
- Full-Text Search
- SQL Server 2005 - Service Broker
- Using SQL Server as a Web Service
- SQL Server Encryption Options Overview
- SQL Server 2008 Overview
- SQL Server 2008 R2 Overview
- SQL Azure
- The Utility Control Point and Data Application Component, Part 1
- The Utility Control Point and Data Application Component, Part 2
- Microsoft SQL Server Administration
- The DBA Survival Guide: The 10 Minute SQL Server Overview
- Preparing (or Tuning) a Windows System for SQL Server, Part 1
- Preparing (or Tuning) a Windows System for SQL Server, Part 2
- Installing SQL Server
- Upgrading SQL Server
- SQL Server 2000 Management Tools
- SQL Server 2005 Management Tools
- SQL Server 2008 Management Tools
- SQL Azure Tools
- Automating Tasks with SQL Server Agent
- Run Operating System Commands in SQL Agent using PowerShell
- Automating Tasks Without SQL Server Agent
- Storage – SQL Server I/O
- Service Packs, Hotfixes and Cumulative Upgrades
- Tracking SQL Server Information with Error and Event Logs
- Change Management
- SQL Server Metadata, Part One
- SQL Server Metadata, Part Two
- Monitoring - SQL Server 2005 Dynamic Views and Functions
- Monitoring - Performance Monitor
- Unattended Performance Monitoring for SQL Server
- Monitoring - User-Defined Performance Counters
- Monitoring: SQL Server Activity Monitor
- SQL Server Instances
- DBCC Commands
- SQL Server and Mail
- Database Maintenance Checklist
- The Maintenance Wizard: SQL Server 2000 and Earlier
- The Maintenance Wizard: SQL Server 2005 (SP2) and Later
- The Web Assistant Wizard
- Creating Web Pages from SQL Server
- SQL Server Security
- Securing the SQL Server Platform, Part 1
- Securing the SQL Server Platform, Part 2
- SQL Server Security: Users and other Principals
- SQL Server Security – Roles
- SQL Server Security: Objects (Securables)
- Security: Using the Command Line
- SQL Server Security - Encrypting Connections
- SQL Server Security: Encrypting Data
- SQL Server Security Audit
- High Availability - SQL Server Clustering
- SQL Server Configuration, Part 1
- SQL Server Configuration, Part 2
- Database Configuration Options
- 32- vs 64-bit Computing for SQL Server
- SQL Server and Memory
- Performance Tuning: Introduction to Indexes
- Statistical Indexes
- Backup and Recovery
- Backup and Recovery Examples, Part One
- Backup and Recovery Examples, Part Two: Transferring Databases to Another System (Even Without Backups)
- SQL Profiler - Reverse Engineering An Application
- SQL Trace
- SQL Server Alerts
- Files and Filegroups
- Partitioning
- Full-Text Indexes
- Read-Only Data
- SQL Server Locks
- Monitoring Locking and Deadlocking
- Controlling Locks in SQL Server
- SQL Server Policy-Based Management, Part One
- SQL Server Policy-Based Management, Part Two
- SQL Server Policy-Based Management, Part Three
- Microsoft SQL Server Programming
- An Outline for Development
- Database
- Database Services
- Database Objects: Databases
- Database Objects: Tables
- Database Objects: Table Relationships
- Database Objects: Keys
- Database Objects: Constraints
- Database Objects: Data Types
- Database Objects: Views
- Database Objects: Stored Procedures
- Database Objects: Indexes
- Database Objects: User Defined Functions
- Database Objects: Triggers
- Database Design: Requirements, Entities, and Attributes
- Business Process Model Notation (BPMN) and the Data Professional
- Business Questions for Database Design, Part One
- Business Questions for Database Design, Part Two
- Database Design: Finalizing Requirements and Defining Relationships
- Database Design: Creating an Entity Relationship Diagram
- Database Design: The Logical ERD
- Database Design: Adjusting The Model
- Database Design: Normalizing the Model
- Creating The Physical Model
- Database Design: Changing Attributes to Columns
- Database Design: Creating The Physical Database
- Database Design Example: Curriculum Vitae
- NULLs
- The SQL Server Sample Databases
- The SQL Server Sample Databases: pubs
- The SQL Server Sample Databases: NorthWind
- The SQL Server Sample Databases: AdventureWorks
- The SQL Server Sample Databases: AdventureWorks Derivatives
- UniversalDB: The Demo and Testing Database, Part 1
- UniversalDB: The Demo and Testing Database, Part 2
- UniversalDB: The Demo and Testing Database, Part 3
- UniversalDB: The Demo and Testing Database, Part 4
- Getting Started with Transact-SQL
- Transact-SQL: Data Definition Language (DDL) Basics
- Transact-SQL: Limiting Results
- Transact-SQL: More Operators
- Transact-SQL: Ordering and Aggregating Data
- Transact-SQL: Subqueries
- Transact-SQL: Joins
- Transact-SQL: Complex Joins - Building a View with Multiple JOINs
- Transact-SQL: Inserts, Updates, and Deletes
- An Introduction to the CLR in SQL Server 2005
- Design Elements Part 1: Programming Flow Overview, Code Format and Commenting your Code
- Design Elements Part 2: Controlling SQL's Scope
- Design Elements Part 3: Error Handling
- Design Elements Part 4: Variables
- Design Elements Part 5: Where Does The Code Live?
- Design Elements Part 6: Math Operators and Functions
- Design Elements Part 7: Statistical Functions
- Design Elements Part 8: Summarization Statistical Algorithms
- Design Elements Part 9: Representing Data with Statistical Algorithms
- Design Elements Part 10: Interpreting the Data—Regression
- Design Elements Part 11: String Manipulation
- Design Elements Part 12: Loops
- Design Elements Part 13: Recursion
- Design Elements Part 14: Arrays
- Design Elements Part 15: Event-Driven Programming Vs. Scheduled Processes
- Design Elements Part 16: Event-Driven Programming
- Design Elements Part 17: Program Flow
- Forming Queries Part 1: Design
- Forming Queries Part 2: Query Basics
- Forming Queries Part 3: Query Optimization
- Forming Queries Part 4: SET Options
- Forming Queries Part 5: Table Optimization Hints
- Using SQL Server Templates
- Transact-SQL Unit Testing
- Index Tuning Wizard
- Unicode and SQL Server
- SQL Server Development Tools
- The SQL Server Transact-SQL Debugger
- The Transact-SQL Debugger, Part 2
- Basic Troubleshooting for Transact-SQL Code
- An Introduction to Spatial Data in SQL Server 2008
- Performance Tuning
- Performance Tuning SQL Server: Tools and Processes
- Performance Tuning SQL Server: Tools Overview
- Creating a Performance Tuning Audit - Defining Components
- Creating a Performance Tuning Audit - Evaluation Part One
- Creating a Performance Tuning Audit - Evaluation Part Two
- Creating a Performance Tuning Audit - Interpretation
- Creating a Performance Tuning Audit - Developing an Action Plan
- Understanding SQL Server Query Plans
- Performance Tuning: Implementing Indexes
- Performance Monitoring Tools: Windows 2008 (and Higher) Server Utilities, Part 1
- Performance Monitoring Tools: Windows 2008 (and Higher) Server Utilities, Part 2
- Performance Monitoring Tools: Windows System Monitor
- Performance Monitoring Tools: Logging with System Monitor
- Performance Monitoring Tools: User Defined Counters
- General Transact-SQL (T-SQL) Performance Tuning, Part 1
- General Transact-SQL (T-SQL) Performance Tuning, Part 2
- General Transact-SQL (T-SQL) Performance Tuning, Part 3
- Performance Monitoring Tools: An Introduction to SQL Profiler
- Performance Tuning: Introduction to Indexes
- Performance Monitoring Tools: SQL Server 2000 Index Tuning Wizard
- Performance Monitoring Tools: SQL Server 2005 Database Tuning Advisor
- Performance Monitoring Tools: SQL Server Management Studio Reports
- Performance Monitoring Tools: SQL Server 2008 Activity Monitor
- The SQL Server 2008 Management Data Warehouse and Data Collector
- Performance Monitoring Tools: Evaluating Wait States with PowerShell and Excel
- Practical Applications
- Choosing the Back End
- The DBA's Toolbox, Part 1
- The DBA's Toolbox, Part 2
- Scripting Solutions for SQL Server
- Building a SQL Server Lab
- Using Graphics Files with SQL Server
- Enterprise Resource Planning
- Customer Relationship Management (CRM)
- Building a Reporting Data Server
- Building a Database Documenter, Part 1
- Building a Database Documenter, Part 2
- Data Management Objects
- Data Management Objects: The Server Object
- Data Management Objects: Server Object Methods
- Data Management Objects: Collections and the Database Object
- Data Management Objects: Database Information
- Data Management Objects: Database Control
- Data Management Objects: Database Maintenance
- Data Management Objects: Logging the Process
- Data Management Objects: Running SQL Statements
- Data Management Objects: Multiple Row Returns
- Data Management Objects: Other Database Objects
- Data Management Objects: Security
- Data Management Objects: Scripting
- PowerShell and SQL Server - Overview
- PowerShell and SQL Server - Objects and Providers
- PowerShell and SQL Server - A Script Framework
- PowerShell and SQL Server - Logging the Process
- PowerShell and SQL Server - Reading a Control File
- PowerShell and SQL Server - SQL Server Access
- PowerShell and SQL Server - Web Pages from a SQL Query
- PowerShell and SQL Server - Scrubbing the Event Logs
- SQL Server 2008 PowerShell Provider
- SQL Server I/O: Importing and Exporting Data
- SQL Server I/O: XML in Database Terms
- SQL Server I/O: Creating XML Output
- SQL Server I/O: Reading XML Documents
- SQL Server I/O: Using XML Control Mechanisms
- SQL Server I/O: Creating Hierarchies
- SQL Server I/O: Using HTTP with SQL Server XML
- SQL Server I/O: Using HTTP with SQL Server XML Templates
- SQL Server I/O: Remote Queries
- SQL Server I/O: Working with Text Files
- Using Microsoft SQL Server on Handheld Devices
- Front-Ends 101: Microsoft Access
- Comparing Two SQL Server Databases
- English Query - Part 1
- English Query - Part 2
- English Query - Part 3
- English Query - Part 4
- English Query - Part 5
- RSS Feeds from SQL Server
- Using SQL Server Agent to Monitor Backups
- Reporting Services - Creating a Maintenance Report
- SQL Server Chargeback Strategies, Part 1
- SQL Server Chargeback Strategies, Part 2
- SQL Server Replication Example
- Creating a Master Agent and Alert Server
- The SQL Server Central Management System: Definition
- The SQL Server Central Management System: Base Tables
- The SQL Server Central Management System: Execution of Server Information (Part 1)
- The SQL Server Central Management System: Execution of Server Information (Part 2)
- The SQL Server Central Management System: Collecting Performance Metrics
- The SQL Server Central Management System: Centralizing Agent Jobs, Events and Scripts
- The SQL Server Central Management System: Reporting the Data and Project Summary
- Time Tracking for SQL Server Operations
- Migrating Departmental Data Stores to SQL Server
- Migrating Departmental Data Stores to SQL Server: Model the System
- Migrating Departmental Data Stores to SQL Server: Model the System, Continued
- Migrating Departmental Data Stores to SQL Server: Decide on the Destination
- Migrating Departmental Data Stores to SQL Server: Design the ETL
- Migrating Departmental Data Stores to SQL Server: Design the ETL, Continued
- Migrating Departmental Data Stores to SQL Server: Attach the Front End, Test, and Monitor
- Tracking SQL Server Timed Events, Part 1
- Tracking SQL Server Timed Events, Part 2
- Patterns and Practices for the Data Professional
- Managing Vendor Databases
- Consolidation Options
- Connecting to a SQL Azure Database from Microsoft Access
- SharePoint 2007 and SQL Server, Part One
- SharePoint 2007 and SQL Server, Part Two
- SharePoint 2007 and SQL Server, Part Three
- Querying Multiple Data Sources from a Single Location (Distributed Queries)
- Importing and Exporting Data for SQL Azure
- Working on Distributed Teams
- Professional Development
- Becoming a DBA
- Certification
- DBA Levels
- Becoming a Data Professional
- SQL Server Professional Development Plan, Part 1
- SQL Server Professional Development Plan, Part 2
- SQL Server Professional Development Plan, Part 3
- Evaluating Technical Options
- System Sizing
- Creating a Disaster Recovery Plan
- Anatomy of a Disaster (Response Plan)
- Database Troubleshooting
- Conducting an Effective Code Review
- Developing an Exit Strategy
- Data Retention Strategy
- Keeping Your DBA/Developer Job in Troubled Times
- The SQL Server Runbook
- Creating and Maintaining a SQL Server Configuration History, Part 1
- Creating and Maintaining a SQL Server Configuration History, Part 2
- Creating an Application Profile, Part 1
- Creating an Application Profile, Part 2
- How to Attend a Technical Conference
- Tips for Maximizing Your IT Budget This Year
- The Importance of Blue-Sky Planning
- Application Architecture Assessments
- Transact-SQL Code Reviews, Part One
- Transact-SQL Code Reviews, Part Two
- Cloud Computing (Distributed Computing) Paradigms
- NoSQL for the SQL Server Professional, Part One
- NoSQL for the SQL Server Professional, Part Two
- Object-Role Modeling (ORM) for the Database Professional
- Business Intelligence
- BI Explained
- Developing a Data Dictionary
- BI Security
- Gathering BI Requirements
- Source System Extracts and Transforms
- ETL Mechanisms
- Business Intelligence Landscapes
- Business Intelligence Layouts and the Build or Buy Decision
- A Single Version of the Truth
- The Operational Data Store (ODS)
- Data Marts – Combining and Transforming Data
- Designing Data Elements
- The Enterprise Data Warehouse — Aggregations and the Star Schema
- On-Line Analytical Processing (OLAP)
- Data Mining
- Key Performance Indicators
- BI Presentation - Client Tools
- BI Presentation - Portals
- Implementing ETL - Introduction to SQL Server 2005 Integration Services
- Building a Business Intelligence Solution, Part 1
- Building a Business Intelligence Solution, Part 2
- Building a Business Intelligence Solution, Part 3
- Tips and Troubleshooting
- SQL Server and Microsoft Excel Integration
- Tips for the SQL Server Tools: SQL Server 2000
- Tips for the SQL Server Tools: SQL Server 2005
- Transaction Log Troubles
- SQL Server Connection Problems
- Orphaned Database Users
- Additional Resources
- Tools and Downloads
- Utilities (Free)
- Tool Review (Free): DBDesignerFork
- Aqua Data Studio
- Microsoft SQL Server Best Practices Analyzer
- Utilities (Cost)
- Quest Software's TOAD for SQL Server
- Quest Software's Spotlight on SQL Server
- SQL Server on Microsoft's Virtual PC
- Red Gate SQL Bundle
- Microsoft's Visio for Database Folks
- Quest Capacity Manager
- SQL Server Help
- Visual Studio Team Edition for Database Professionals
- Microsoft Assessment and Planning Solution Accelerator
- Aggregating Server Data from the MAPS Tool
In a previous tutorial, I explained the general process I follow to implement a Business Intelligence project. Starting with this series of tutorials, I’ll implement that process using a need I now have for some intelligence data. If you’re just joining me, start with that article first, since you’ll need the background for this tutorial. You’ve got a front-row seat to my experiment as I work through it.
I’m at the stage where I’ve identified the general answers I want to get from the system and located the source for the data (which, happily, is only one or two tables from a single database for this example), and last week I spelunked around a bit to find out what kinds of answers I can find in my data.
With the scripts I ran, I found that the Management Data Warehouse (MDW) system tracks several data points for the I/O subsystem as it relates to performance. I’m going to use the data points I found last week to begin laying out my source for the analysis I’ll do in the final steps of this project.
This brings me to what I view as the most difficult part of designing a Business Intelligence system — designing and creating the Dimension and Fact table layouts. Let me stop a moment and define those terms a little more completely.
A “fact” is something that you want to measure and analyze. Most of the time, these are the numbers in your source tables. You may not want all of them, mind you, but the items called “facts” are the numbers themselves. By the way, some texts or systems refer to these numbers as “measures,” so “facts” and “measures” are the same thing. I’ll mostly stick to the term “facts” for this series. If you look back at the kinds of things the MDW tracks in the last tutorial, you’ll find numbers such as “reads,” “writes” and so on. These numbers represent the activity of the monitored system; the counts of read and write operations are the facts I’m after.
A “dimension” is something that describes or is related to a fact. In the MDW, I have multiple pieces of information that describe the read or write operations, such as the drive letter the data was written on, or the Instance of SQL Server that owned the database where the I/O operation happened. This information is called a dimension because you look at the facts (numbers) based on it. For instance, you want to know how many writes and reads are happening, but you also want to know which drive, which database, and which server they are happening on.
Having this information spread out into dimensions is very useful. When I’m done with this design, I’ll be able to see all write operations, then trim it down to specific drives, files or databases, or all of those things at once. Imagine taking all of the data I am talking about and placing the “dimensions” on the squares of a Rubik’s cube. Now imagine placing the “facts” (numbers) in the middle of the cube. If you look at one “face” or side of the cube, you could see all of the write operations on a Server. Then you could spin one of the sides of the cube to also see the write operations of the databases on that server, and you could spin it again to add the drive letters the databases use to store the data. You could come up with all kinds of combinations, and that’s exactly what I’m building the foundation for in this tutorial.
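Once a star schema like the one described below is in place, that “spin the cube” operation is just a join and a GROUP BY. Here’s a hypothetical slicing query; the table and column names match the ones created later in this tutorial, but treat it as a sketch, not part of the build script:

```sql
/* A sketch of "slicing" facts by dimensions: total writes broken
   out by server instance and drive letter. Table and column names
   match the star schema built later in this tutorial. */
SELECT  i.InstanceName,
        d.LogicalDiskName,
        SUM(f.Writes) AS TotalWrites
FROM    dbo.FactIOPerformanceMetrics AS f
JOIN    dbo.dimInstance    AS i ON i.InstancePK    = f.InstancePK
JOIN    dbo.dimLogicalDisk AS d ON d.LogicalDiskPK = f.LogicalDiskPK
GROUP BY i.InstanceName,
         d.LogicalDiskName;
```

Adding or removing a JOIN and a GROUP BY column is the query-level equivalent of turning one face of the cube.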
There’s another kind of very special dimension, which involves the time that a fact happened. While that is a kind of number, it’s a dimension, since you will probably want to know the information sliced on the time it happened. It answers questions like “when is the busiest time for my I/O subsystem?” and “when does each server do the most I/O?” It’s a special kind of dimension because you store its data based on the kinds of questions you will ask — and I’ll explain that further when I create it.
Everything is in place — I know where the data is, which tables and columns hold the data, and what kinds of reports I want from it. I just need to put the data into a format that the Analysis Services engine can use to do all this work. I need a place to put all this data, so I’ll create a database called mdwDW (for MDW Data Warehouse) on my system:
CREATE DATABASE mdwDW;
GO
Easy enough. Since I’ll want to show you a database diagram a little later, I need to assign an owner to the database. Yes, I’m already the owner, but it’s using my domain account, and I’m sitting here in the Public Library typing this article up, so I need to set an owner that the database designer can see right now. I’ll do that with a simple stored procedure, setting the owner to “sa,” an account the system can always get to:
USE mdwDW;
GO
EXEC dbo.sp_changedbowner @loginame = N'sa', @map = false;
GO
The table structure that Analysis Services uses to do its work is called a “star schema,” and getting the data from the source servers, databases and tables into this format is one of the most difficult parts of creating a BI solution, at least in my opinion. It’s the step that takes the most time, needs the most planning, and has the most places to go wrong. If you take it one step at a time, however, you can get there.
The first thing to do is to identify the facts and dimensions from your source data. In my case, that was pretty simple, since I have only a single server, a single database, and a single table to work with. Even when it is more complicated, however, it’s the same process.
I looked at my source (you might have to look through several of them), and identified the numbers that hold the data I want to report on. That turned out to be these columns from the snapshots.io_virtual_file_stats table in the MDW database:
- num_of_reads
- num_of_writes
- num_of_bytes_read
- num_of_bytes_written
- io_stall_read_ms
- io_stall_write_ms
Although these numbers are self-descriptive, you might have to document what the numbers actually mean in a more complex example.
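In Transact-SQL terms, pulling those fact candidates out of the source is a simple SELECT. This sketch assumes the Management Data Warehouse database was named “MDW” on your system; substitute your own database name if it differs:

```sql
/* Sketch: the fact candidates, read straight from the MDW source
   table. "MDW" is an assumed database name. */
SELECT  num_of_reads,
        num_of_writes,
        num_of_bytes_read,
        num_of_bytes_written,
        io_stall_read_ms,
        io_stall_write_ms
FROM    MDW.snapshots.io_virtual_file_stats;
```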
The next step is to identify the data that describes those bits of data — the dimensions. Remember, this is the data that you want to “slice” the numbers on. In my case, that comes from a couple of tables. The first is the one I just used, snapshots.io_virtual_file_stats. The second table is the one I joined it on in the last tutorial, core.source_info_internal. You’ll recall that there is a join with a third table to get those two things together, but I’ll come to that later. Here are the columns I found that I wanted to “slice” my view of the numbers on:
- instance_name
- database_name
- logical_disk
- logical_file_name
- type_desc
- collection_time
The next step is to build the tables that will hold these dimensions and facts. The fact table (the table that holds all of the numbers) will be joined to all the other tables using a set of Primary and Foreign Keys, and when you lay that out it looks like a star — hence the name “star schema.”
I’ll begin with the dimension tables. They will have a Primary Key, and then the text that defines the dimension. Here’s the one I created for the Instance name, which tells me which server is involved:
/* Create Dimension Tables */
CREATE TABLE [dbo].[dimInstance] (
    [InstancePK] [int] IDENTITY(1,1) NOT NULL,
    [InstanceName] [nchar](10) NULL,
    CONSTRAINT [PK_dimInstance] PRIMARY KEY CLUSTERED ([InstancePK] ASC)
);
GO
Pretty simple — and pretty small. There will be only a few items in here, since I’m only monitoring a few servers. I could also add things like the location of the server, the admin, the primary use of the server and so on, but I’m not collecting that right now. Since I’m not collecting it, I can’t analyze it.
The fact table will have a relationship back to this table, so that I’ll be able to enter a single number there and Analysis Services will be able to get to the text I need.
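As a hypothetical illustration of that relationship, the instance name goes into the dimension once, and the fact table stores only the integer key that comes back:

```sql
/* Hypothetical example: "SQLPROD01" is a made-up instance name.
   The text lives once in the dimension row. */
INSERT INTO dbo.dimInstance (InstanceName)
VALUES (N'SQLPROD01');

SELECT  InstancePK      -- this integer is all the fact table stores
FROM    dbo.dimInstance
WHERE   InstanceName = N'SQLPROD01';
```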
I’ll follow the same process for the rest of the dimensions:
CREATE TABLE [dbo].[dimLogicalDisk] (
    [LogicalDiskPK] [int] IDENTITY(1,1) NOT NULL,
    [LogicalDiskName] [nchar](10) NULL,
    CONSTRAINT [PK_dimLogicalDisk] PRIMARY KEY CLUSTERED ([LogicalDiskPK] ASC)
);
GO

CREATE TABLE [dbo].[dimLogicalFileName] (
    [LogicalFileNamePK] [int] IDENTITY(1,1) NOT NULL,
    [LogicalFileName] [nchar](10) NULL,
    CONSTRAINT [PK_dimLogicalFileName] PRIMARY KEY CLUSTERED ([LogicalFileNamePK] ASC)
);
GO

CREATE TABLE [dbo].[dimIOType] (
    [IOTypePK] [int] IDENTITY(1,1) NOT NULL,
    [IOType] [nchar](10) NULL,
    CONSTRAINT [PK_dimIOType] PRIMARY KEY CLUSTERED ([IOTypePK] ASC)
);
GO
I’ll stop here a moment, however, and talk a little about the time dimension table. I’m going to build the whole table out of a single field, but I’ll break the field down into the new SQL Server 2008 DATE and TIME types to have a column for year, month, day, hour, minute and so on. This will allow me to get a very fine “grain” on the times I care about.
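As a sketch of that breakdown, the CAST function and the SQL Server 2008 date and time types can split one collection time into the grains the table will store. The sample value here is invented:

```sql
/* Sketch: splitting a single collection time into date and time
   grains with the SQL Server 2008 date and time types. The sample
   value is invented. */
DECLARE @collection_time datetime2 = '2010-06-15 14:32:07';

SELECT  CAST(@collection_time AS date)     AS CollectionDate,
        CAST(@collection_time AS time(0))  AS CollectionTime,
        DATEPART(hour,   @collection_time) AS CollectionHour,
        DATEPART(minute, @collection_time) AS CollectionMinute,
        DATEPART(second, @collection_time) AS CollectionSecond;
```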
I’ll also create a very special Primary Key here. I’ll explain the reason for that in the next tutorial:
CREATE TABLE [dbo].[dimIOCollectionTime] (
    [IOCollectionTimePK] [bigint] NOT NULL,
    [IOCollectionTimeYear] [date] NULL,
    [IOCollectionTimeMonth] [date] NULL,
    [IOCollectionTimeDay] [date] NULL,
    [IOCollectionTimeHour] [time](0) NULL,
    [IOCollectionTimeMinute] [time](0) NULL,
    [IOCollectionTimeSecond] [time](0) NULL,
    CONSTRAINT [PK_dimIOCollectionTime] PRIMARY KEY CLUSTERED ([IOCollectionTimePK] ASC)
);
GO
Now the interesting part: the fact table. It holds all the numbers, as I mentioned earlier, but it has two other interesting features. The first is that its Primary Key is made up of the Primary Keys of the other tables:
/* Fact Table */
CREATE TABLE [dbo].[FactIOPerformanceMetrics] (
    InstancePK int NOT NULL,
    LogicalDiskPK int NOT NULL,
    LogicalFileNamePK int NOT NULL,
    IOTypePK int NOT NULL,
    IOCollectionTimePK bigint NOT NULL,
    Reads int NULL,
    Writes int NULL,
    bytesRead int NULL,
    bytesWritten int NULL,
    readsStalled int NULL,
    writesStalled int NULL,
    CONSTRAINT [PK_FactIOPerformanceMetrics] PRIMARY KEY CLUSTERED (
        [InstancePK] ASC,
        [LogicalDiskPK] ASC,
        [LogicalFileNamePK] ASC,
        [IOTypePK] ASC,
        [IOCollectionTimePK] ASC
    )
);
GO
The second interesting thing is that the fact table has Foreign Keys to all of the dimension tables. Here’s that code:
/* Set up Foreign Keys */
ALTER TABLE dbo.FactIOPerformanceMetrics
    ADD CONSTRAINT [FK_FactIOPerformanceMetrics_dimInstance]
    FOREIGN KEY (InstancePK) REFERENCES dbo.dimInstance (InstancePK)
    ON UPDATE NO ACTION ON DELETE NO ACTION;
GO

ALTER TABLE dbo.FactIOPerformanceMetrics
    ADD CONSTRAINT [FK_FactIOPerformanceMetrics_dimIOCollectionTime]
    FOREIGN KEY (IOCollectionTimePK) REFERENCES dbo.dimIOCollectionTime (IOCollectionTimePK)
    ON UPDATE NO ACTION ON DELETE NO ACTION;
GO

ALTER TABLE dbo.FactIOPerformanceMetrics
    ADD CONSTRAINT [FK_FactIOPerformanceMetrics_dimLogicalDisk]
    FOREIGN KEY (LogicalDiskPK) REFERENCES dbo.dimLogicalDisk (LogicalDiskPK)
    ON UPDATE NO ACTION ON DELETE NO ACTION;
GO

ALTER TABLE dbo.FactIOPerformanceMetrics
    ADD CONSTRAINT [FK_FactIOPerformanceMetrics_dimLogicalFileName]
    FOREIGN KEY (LogicalFileNamePK) REFERENCES dbo.dimLogicalFileName (LogicalFileNamePK)
    ON UPDATE NO ACTION ON DELETE NO ACTION;
GO

ALTER TABLE dbo.FactIOPerformanceMetrics
    ADD CONSTRAINT [FK_FactIOPerformanceMetrics_dimIOType]
    FOREIGN KEY (IOTypePK) REFERENCES dbo.dimIOType (IOTypePK)
    ON UPDATE NO ACTION ON DELETE NO ACTION;
GO
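With those constraints in place, the keys are enforced. A quick hypothetical test is to insert a fact row that points at dimension keys which don’t exist:

```sql
/* Hypothetical test: this INSERT should fail with a FOREIGN KEY
   constraint violation unless matching dimension rows exist. */
INSERT INTO dbo.FactIOPerformanceMetrics
    (InstancePK, LogicalDiskPK, LogicalFileNamePK,
     IOTypePK, IOCollectionTimePK, Reads, Writes)
VALUES (999, 999, 999, 999, 999, 0, 0);
```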
And the source for my analysis is complete. Here’s the database diagram of the structure:
Now I’m ready for the next step: figuring out how the data gets from the source to this new database and the star schema. I’ll cover that process next week.
InformIT Articles and Sample Chapters
There’s a lot more here on the MDW feature in The SQL Server 2008 Management Data Warehouse and Data Collector.
Books and eBooks
Before we’re through with this project, you’ll need a good background on SQL Server Integration Services. I’ve got the reference for that right here: Microsoft SQL Server 2008 Integration Services Unleashed.
Online Resources
To make sure I give credit where it is due, here is the reference from Books Online that I used in this tutorial.