SQL Server

PowerShell at #SQLSatOregon with @DBAduck

Mastery Session – Leveling Up with PowerShell for the DBA

I know that most of you have used or heard about PowerShell, but I am sure there are still things you could learn about using it in your DBA career. This mastery session is meant to get you closer to mastery of PowerShell for the DBA. I will cover 3 modules: the dbatools module, the SqlServer module and the ImportExcel module. In these modules you have a wealth of commands and tools that will help you get more out of your day, using fewer clicks. The trick is not just knowing that these things exist, but understanding the possibilities and patterns that exist in each module. You will learn about the commonalities in the dbatools module that make it very simple to get the most out of it. You will learn about some hidden gems in the SqlServer module and why you should care to learn everything you can about it. The ImportExcel module does not have a lot of commands, but the ones available are amazing and will let you do things very simply without requiring Excel. There will also be a mix of deployment options using PowerShell, and SSIS project deployment will be covered as well. Join me at SQL Saturday Oregon and learn what your next steps are with PowerShell. You will absolutely love it.
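As a taste of the ImportExcel piece, here is a minimal sketch; the instance name and output path are placeholders, and I am assuming the modules are installed from the PowerShell Gallery.

```powershell
# Pull database information with dbatools, then write a real .xlsx file
# with ImportExcel -- no Excel installation required.
Import-Module dbatools, ImportExcel

Get-DbaDatabase -SqlInstance 'SQL01' |
    Select-Object SqlInstance, Name, Status, RecoveryModel |
    Export-Excel -Path 'C:\Reports\Databases.xlsx' -AutoSize -TableName Databases
```

One pipeline, no clicking, and the spreadsheet is ready to hand to whoever asked for it.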

Tips and Tricks for the PowerShell DBA

With environments growing and hiring shrinking, you have to get some tools in your toolbelt to stay ahead of the game. These tips cover both general PowerShell and SqlServer-module tricks. Some are directly related to performance; others are about using features of SMO and the SqlServer module to get things done faster and more efficiently. Join me for a great session on getting things done.

Register Now for SQL Saturday Oregon


PowerShell for the DBA at SQLIntersection with @DBAduck

If you are wondering why you would want to use PowerShell as a DBA, read this first – Why Become a PowerShell DBA. Then go to SQLIntersection Registration to register and I will see you there. And if you register, please use the code MILLERB and you will get $50 off registration.

Click the links below to see the list of sessions, workshops and speakers
Sessions: http://bit.ly/SQL_Fall_2019_Sessions  
Workshops: http://bit.ly/SQL_Fall_2019_Workshops  
Speakers: http://bit.ly/SQL_Fall_2019_Speakers

Workshop

Leveling Up With PowerShell for the DBA

This workshop is a great one for anyone who wants to level up their PowerShell skills for DBA tasks. We are going to cover multiple things in this workshop. The tools we will go over are the dbatools module, the SqlServer module and the ImportExcel module. Each one of these has great power and can take a lot of clicking away. They also let you export data to Excel or CSV and store it for other purposes, or even take data from a query and put it directly into a table without having to write SQL or SSIS packages to import it another way. Join me for a fun-filled day that will have you coming back for more. You do not need to know PowerShell inside and out to attend; you will learn something new no matter your level of PowerShell expertise or years working with SQL Server and PowerShell. Join me in Vegas at the MGM for SQLintersection, you won't regret it.
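For example, taking query results straight into a table without T-SQL scripts or SSIS can look like this sketch; the instance, database and table names are hypothetical.

```powershell
# Run a query on one instance and land the results in a table on another.
# -AutoCreateTable builds the destination table if it does not exist.
$data = Invoke-DbaQuery -SqlInstance 'SQL01' -Database master `
            -Query 'SELECT name, create_date FROM sys.databases'

$data | Write-DbaDbTableData -SqlInstance 'SQL02' -Database DBAInventory `
            -Table dbo.DatabaseList -AutoCreateTable
```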

An In-Depth Look at the SQL Server Provider in PowerShell

You probably already know that there is a SqlServer module on the PowerShell Gallery. This module contains many cmdlets that will help you with your DBA work. But something you may not know about is the SqlServer Provider that comes with it. This provider is the subject of this presentation. Have you ever opened File Explorer and started browsing through your filesystem? What if you could have SSMS's Object Explorer as a navigable structure in PowerShell? Well, you guessed it: you can. This is the SqlServer Provider. I will take you through the provider and show you how to navigate it, but that is not all. By the end of the session you will understand how you can use it in your PowerShell life to get more out of SQL Server by using paths. Yes, you heard right: a path-like structure that you can navigate and use to make your life simpler when retrieving objects and manipulating them in PowerShell. Join me for another chapter in the quest to become a PowerShell DBA. You won't want to miss this one.

Synopsis: The SqlServer module is a powerful tool in PowerShell. It continues to receive updates and new cmdlets, which is fantastic. We will talk about the functions that let you do more with commands instead of T-SQL. But the real value of this session is the SqlServer PowerShell Provider. This provider allows you to navigate SQL Server like a file system, and that is not all: we will talk about (and mostly show) what you can do with it and how you can leverage it to do almost anything you would like with objects and commands rather than clicking your way through. Join me for an in-depth look at the provider and you will be converted to this tech, I am sure of it.
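To make the idea concrete, here is roughly what navigating the provider looks like; the machine name ('localhost') and instance name ('DEFAULT') are placeholders for your own.

```powershell
Import-Module SqlServer

# Databases show up like folders under the instance path...
Get-ChildItem SQLSERVER:\SQL\localhost\DEFAULT\Databases

# ...and you can drill into a database's tables and filter like files.
Get-ChildItem SQLSERVER:\SQL\localhost\DEFAULT\Databases\master\Tables |
    Where-Object Name -like 'spt*'

# You can even cd into a database, just like a directory.
Set-Location SQLSERVER:\SQL\localhost\DEFAULT\Databases\master
```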

Getting Started with the DBAtools PowerShell Module

There has been a movement towards PowerShell in the SQL Server community lately. With so much out there, how do you get ahead? I will take you through this massive module, which has grown very quickly to over 300 functions for use in your daily DBA job. This module can single-handedly migrate an entire server with one command, or you can choose just the parts you want to do. Join me to find out how you can quickly leverage the power of this module. This is one tool you want in your PowerShell toolbox.
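The one-command migration mentioned above refers to Start-DbaMigration; here is a sketch with placeholder server names and share path (run with -WhatIf first to preview).

```powershell
# Migrate logins, jobs, databases and more from one instance to another.
# -WhatIf shows what would happen without changing anything.
Start-DbaMigration -Source 'OLDSQL' -Destination 'NEWSQL' `
    -BackupRestore -SharedPath '\\fileserver\migration' -WhatIf
```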

Synopsis: dbatools has been around for 5 years and has finally reached release status. It is now at version 1.x, which means the command names and parameter names have stabilized and are consistent across all the commands. That consistency matters because it lets you automate with splatting and reusable parameter objects, getting the most out of the module without typing a whole lot. Join me for a great look into some of the productivity tools in this module and you will be hungry for more.
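The splatting pattern the synopsis refers to looks like this; the instance names are placeholders.

```powershell
# Build the common parameters once in a hashtable...
$copyParams = @{
    Source      = 'OLDSQL'
    Destination = 'NEWSQL'
}

# ...then splat the same hashtable into command after command, which works
# because dbatools 1.x keeps parameter names consistent across commands.
Copy-DbaLogin      @copyParams
Copy-DbaAgentJob   @copyParams
Copy-DbaCredential @copyParams
```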

Temporal Tables In-Depth

Have you ever wanted to know what the data looked like yesterday before a change was made? Have you cringed at the thought of creating triggers to get data changes logged for later analysis? Temporal Tables, a feature introduced in SQL Server 2016, give you the ability to create a table that automatically keeps track of your data in time. Magic was never meant to be part of IT, but SQL Server has done a great job with this feature. We will cover how to set it up, how it works, querying the tables (including the history table) and, last but not least, how you can incorporate this feature into existing tables. Join me in a time-travelling adventure to find out how you can leverage Temporal Tables in your world.

Synopsis: Temporal tables are not new, but there are still many people who have not used them yet. There are many things you can use them for; recently we came across an UPDATE statement gone awry, and had a temporal table been in place, it would have been much simpler to just put the data back rather than restoring databases to do so. I will take you through the scenarios and show you how to put temporal tables in place: on a new table, on an existing table, and on a table where you customize the history table's indexes to give you more ability to query the data and see it at a given point in time. We will cover In-Memory OLTP and new features in SQL Server 2017 and 2019. Join me for a fun-filled hour that may just make your head hurt.
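Since this is a PowerShell-flavored blog, here is the basic shape of a temporal table driven through Invoke-Sqlcmd; the instance, database and table names are made up for the sketch.

```powershell
# Create a system-versioned (temporal) table; SQL Server maintains the
# history table automatically on every UPDATE and DELETE.
$tsql = @'
CREATE TABLE dbo.Price
(
    PriceId   int IDENTITY PRIMARY KEY,
    Amount    money NOT NULL,
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo   datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.PriceHistory));
'@
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'TestDB' -Query $tsql

# What did the data look like yesterday? Ask with FOR SYSTEM_TIME.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'TestDB' -Query `
    "SELECT * FROM dbo.Price FOR SYSTEM_TIME AS OF '2019-01-01T00:00:00';"
```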


dbatools 1.0 is here and why you should care!

I have been using dbatools for a long time now. I have even contributed a few commands and fixed some bugs. I present on this topic and help many people with PowerShell as DBAs. I am so excited for the launch of dbatools 1.0 that I cannot stop pinching myself to make sure it is true.

Chrissy Lemaire ( b | t ) and company have done a great job keeping this project going, as well as supporting the sqlcommunity.slack.com Slack environment. Many people get help in there, and a great sense of community is being fostered. I take my hat off to Chrissy for spearheading this project and for inspiring all of us to contribute and use the module in our daily lives.

Experience

I also wanted to share that in my presentations I show the things dbatools can do, help people get started and hope they get something out of it. I recently gave a presentation on Complete Database Migration with dbatools and it went over so well that I was inspired again. That same week I was helping a client get things set up on a new server. It was late at night and I needed to recreate a credential, an Operator and a Proxy for a job that would run as the proxy. I scoured emails and looked and looked. I could not find the password and assumed it had been sent some way other than email. Then the light bulb came on: "I can use dbatools," I thought. So I downloaded the latest version to make sure I was using the best, and proceeded to use Copy-DbaCredential, Copy-DbaAgentOperator and Copy-DbaAgentProxy, as well as Copy-DbaDbMail. They all worked and I could configure everything. Then I used Copy-DbaAgentJob to cap it all off. I cannot tell you how nice it was to use the tools I had shown an audience not two days earlier.
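For the curious, the late-night fix was essentially this shape; the server and job names are placeholders, not the client's real ones.

```powershell
# Same Source/Destination pair on every command -- that consistency is
# exactly what made this quick at midnight.
Copy-DbaCredential    -Source OLDSQL -Destination NEWSQL
Copy-DbaAgentOperator -Source OLDSQL -Destination NEWSQL
Copy-DbaAgentProxy    -Source OLDSQL -Destination NEWSQL
Copy-DbaDbMail        -Source OLDSQL -Destination NEWSQL
Copy-DbaAgentJob      -Source OLDSQL -Destination NEWSQL -Job 'NightlyLoad'
```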

I hope that everyone is able to get something out of this module and that your days are better than before because you have found dbatools. I again tip my hat to all those who work behind the scenes building, testing and publishing this module for all of us to use.

Happy Launch Day!!!


SQLintersection Workshop attendees change to URL

SQL Server

Hello all. I am making sure that you have access to the downloads. The link in the slide deck pointed to the file, but it is missing a path segment: you need to add /wp-content/ to the URL after dbaduck.com and before uploads.

That means that the uploads are at https://dbaduck.com/wp-content/uploads/nameoffile.zip

I apologize for having the link in the wrong place. Please feel free to email me at the address in the slides if you cannot get the file.


PowerShell month in May with DBAduck

I am excited for this month.

SQLSkills Training

Event 1 for this PowerShell month is May 8-10 at SQLskills.com training in Chicago, where I am teaching the first class of the Immersion Event on PowerShell for the DBA. You can see the details in the link for the class.

I am super humbled and excited to teach this class for SQLskills and at the time of this writing there are 3 seats left after we opened more seats for the class. Please join me for a great event.

We will be going over how to get started with PowerShell and then dive right in and learn why a DBA should learn PowerShell and why you would ever want to become a PowerShell DBA. There are so many reasons and so much value, I cannot contain it all in this type of post. But we will be having a grand time drinking from a firehose and learning while we do. 🙂

I hope you will join me for this iteration, or plan on another iteration another time. I love PowerShell and look forward to meeting all of you who are going to be there.

PASS Virtual PowerShell Chapter Meeting

Event 2 for this month will be on Wednesday, May 17th at 12:00 PM Eastern Time, when I will present to the group on Gathering Data for Trending using PowerShell. You can RSVP here. I will be covering a great topic that I spend a lot of time thinking about. There is a lot of data in SQL Server, and a lot of metadata that can tell some great stories about your servers and what their behavior looks like. Ever since working at Microsoft with the Community Research team on newsgroup behaviors, I have been hooked on looking at SQL Server in terms of behaviors, not just features. So I have created some data-gathering scripts that capture snapshots in time into tables, letting me look at the information in a different light and see how things are working in SQL Server. Join me for a great time and register for the webinar here.

SQL Intersection

Event 3 will be a precon at SQLintersection on May 20th in Orlando at the Disney Swan Hotel. If you are going to be there, I hope you will join me, and if you are thinking about going, please register and join me there. It is my well-liked precon that takes you from 0-60 in using PowerShell as a DBA or as a professional who wants to manage SQL Server from a command line instead of clicking through everything. I will teach how to leverage the shell, as well as how to balance SSMS and PowerShell, to gain another tool in your toolbelt.

This will be a great month focused on PowerShell.

 


#SQLIntersection with DBADuck

SQL Server

SQLintersection
In May, I will be at SQLintersection at the Disney World Swan in Orlando, Florida.

You can visit the site to see the full lineup of sessions and workshops. I'm going to be presenting three sessions on SQL Server topics and one workshop on PowerShell in SQL Server:

IN DEPTH TEMPORAL TABLES IN SQL 2016
Have you ever wanted to know what the data looked like yesterday before a change was made? Have you cringed at the thought of creating triggers to get data changes logged for later analysis? Temporal Tables, a feature introduced in SQL Server 2016, give you the ability to create a table that automatically keeps track of your data in time. Magic was never meant to be part of IT, but SQL Server has done a great job with this feature. We will cover how to set it up, how it works, querying the tables (including the history table) and, last but not least, how you can incorporate this feature into existing tables. Join me in a time-travelling adventure to find out how you can leverage Temporal Tables in your world.

PRACTICAL POWERSHELL FOR SQL SERVER
This session will cover real-world functions used for real interaction with SQL Server. There will be a few slides, but mostly demos with actual scripts that interact with SQL Server for management and for getting data in and out. I will also introduce some outside modules created to assist. We will read and change things like permissions, database owners and database space. In a SQL Server world of ever-changing roles and "I have to have it now" requests, Management Studio can get in the way of getting things done quickly. Join me for a fun PowerShell hour and never reuse a click again.
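As one example of those standard tasks, database owner maintenance can be reduced to a pair of commands; this sketch assumes the dbatools module, and the instance name is a placeholder.

```powershell
# Report databases whose owner is not the expected login...
Test-DbaDbOwner -SqlInstance 'SQL01' -TargetLogin sa

# ...and set them all back in one shot.
Set-DbaDbOwner -SqlInstance 'SQL01' -TargetLogin sa
```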

SMO: A STUDY OF THE SERVER MANAGEMENT OBJECTS
SMO, short for SQL Server Management Objects (shipped in the Shared Management Objects package), is a powerful set of programming interfaces for getting at and manipulating SQL Server in code. These objects are used in custom programs and in scripting with PowerShell, and can be used to fully manage a SQL Server instance. But what do you really do with them? How do you use them? This session will cover core objects used in everyday interaction with SQL Server. I will cover how to get access to them and how to use them, and best of all you will get to know these mysterious objects.
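A small sketch of what "getting access to them" looks like from PowerShell; 'localhost' is a placeholder for your own instance.

```powershell
Import-Module SqlServer   # loads the SMO assemblies

# The Server object is the root of the SMO object tree.
$srv = New-Object Microsoft.SqlServer.Management.Smo.Server 'localhost'

$srv.Version                       # engine version
$srv.Databases['master'].Tables |  # drill down into objects
    Select-Object Schema, Name, RowCount

# Any SMO object can script itself out as T-SQL.
$srv.Databases['master'].Tables['spt_values'].Script()
```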

WORKSHOP ON SATURDAY, MAY 20, 2017

PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


PASS Board Elections, Why am I running?

SQL Server

Hey, all you SQL PASS people in the #SQLFamily. I am a little late to the game, but while voting is active I wanted to make sure you knew why I am running.

In this vast connected world of technology there are so many SQL people that I would not even venture to guess how many there are. My experience has been that PASS is an organization that has done a lot of good for its members. The problem I want to address is that many members don't really know what membership means or don't understand what they are a part of. I really want to help people connect and take advantage of all the benefits and connection opportunities they have inside an organization of this size. PASS is not just a SQL Saturday and PASS Summit organization. Funding is important to keep it running, but the communities should be fueled by the organization, and the sponsors should fuel the organization. As members we should be concerned about supporting the organization by actively participating in all aspects of PASS, not just taking advantage of free events like SQL Saturdays.

Now before I get strung up, I want to clarify what taking advantage of what PASS offers means. When you as a member start out, there is a period of time when you get fed: everything you want, everything you can handle. But after a while, there comes a time to give back, and that is the ecosystem in which we "should" live with PASS. I have been a member of PASS for a long time and a chapter leader for a long time, but I realized during the interviews of board candidates that I had not given back enough. Sure, I speak at a lot of SQL Saturdays and at events around the country and in my home state. But I have not volunteered in the PASS organization itself, and I really need to give back.

Giving back and connecting our #SQLFamily is why I am running after all. The stuff above is the path that took me to submitting my application. I support all the other candidates and think that the slate this year is amazing and you cannot go wrong. PLEASE VOTE, it is part of volunteering and supporting the organization that we belong to. Get out there (online) and cast your vote and make this election the one with the most participation.

Happy Voting.
PASS Candidates 2016

Voting is in your myPASS profile.
Vote Now in your PASS Profile


Join me at SQL Intersections in October 2016

Powershell, SQL Server

SQLintersection
In October, I will be at SQLintersection at the MGM Grand in Las Vegas, Nevada.

You can visit the site to see the full lineup of sessions and workshops. I'm going to be presenting four, count them, 4 sessions on SQL Server topics and 2 workshops on PowerShell in SQL Server:

Be Friendly to SQL Server with TSQL Best Practices
T-SQL is a necessity when interacting with SQL Server, so knowing it is half the battle. Performance matters more and more as the database grows, and building in resilience from the start is a greater advantage than refactoring later. I will go over 5 key things to know when you write T-SQL, including how data types and variables behave in comparisons, and you will also learn about the procedure cache and how to avoid pitfalls there. This is a beginner's session, but the concepts in it are a great foundation. If you are looking for a solid foundation to build on and need the basics to start, this session is definitely for you.

SQL Server Filestream Implementation and Management
FILESTREAM was introduced in SQL Server 2008 and has been part of the engine since then. The reasons to use FILESTREAM over BLOB storage have been discussed in many different ways and for many different reasons. I will take you through how to get SQL Server configured to use FILESTREAM, what that means, and how to implement a column of FILESTREAM type. I will also go over where the files live and how to stream them outside the SQL engine instead of pulling them through it. Join me for an in-depth discussion of this technology. We will also touch on FileTables, which were new in SQL Server 2012.

Managing Availability Groups with PowerShell
Today Availability Groups are all the rage. Being a PowerShell guy, I prefer to manage things with PowerShell. The SQL team has given us a great toolset in the SQLPS/SqlServer module that lets you fully manage Availability Groups simply. Join me for a fun-filled hour of PowerShell and Availability Groups; you won't look at them the same way after we are finished.
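One taste of that toolset: a manual failover addressed by provider path rather than T-SQL; the server, instance and AG names here are placeholders.

```powershell
Import-Module SqlServer

# Run this against the replica you are failing over TO; the availability
# group is identified by its SqlServer provider path.
Switch-SqlAvailabilityGroup -Path 'SQLSERVER:\SQL\SQLNODE2\DEFAULT\AvailabilityGroups\AG1'
```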

SQL Server on Server Core with PowerShell
Ever wondered what the Server Core craze is all about? There is a lot of power in Core, and with it comes the admin challenge of no GUI. But there are settings you will want to change in the OS, so how do you do that? This session will take you through some of the challenges that stop being challenges once you know how. We will use the built-in tools to make some changes, and also use PowerShell and command-line tools to get you where you want to be with Core. I built a 6-node cluster on Server Core and created it with PowerShell (cluster and all), and it runs very nicely with a small RAM footprint for the effort.
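For reference, the cluster build itself is essentially a one-liner with the FailoverClusters module; the node names and IP address below are placeholders, not my actual build.

```powershell
Import-Module FailoverClusters

# Creates the Windows failover cluster across the Core nodes in one command.
New-Cluster -Name 'SQLCLUSTER' -Node 'CORE1','CORE2','CORE3' -StaticAddress '10.0.0.50'
```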

WORKSHOP ON TUESDAY, OCTOBER 25, 2016

PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.

WORKSHOP ON SATURDAY, OCTOBER 29, 2016

PowerShell for the DBA from 60-120 in a Day
This is the next installment of becoming a PowerShell DBA. Now that you have been introduced to PowerShell and have an idea of what you can do, we take it to the next level and start you thinking about tool building and using PowerShell to actually manage instances. We take a look at modules you can use every day, and then we talk about building your own. PowerShell is becoming more prevalent in the world of DBAs, but it still has not reached the level I would like to see in our careers of working smarter, not harder. We will look at practical items where PowerShell can help, because DBA work can involve a massive amount of clicking around if you only use Management Studio. We will take on the challenge of removing as many clicks as possible from your daily management of databases and database servers. We cover database maintenance, monitoring and data gathering. We talk about managing SQL Server instances with PowerShell and, last but not least, we will see how to use it to manage jobs and SQL Server Agent. Join me in the quest to become an effective PowerShell DBA.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


SQL Server TDE on Mirror and Log Shipping

Today in our Virtual Chapter meeting for VCDBA, I presented on SQL Server encryption, including TDE. The question was posed whether TDE would take effect if the mirror was established first and you then enabled TDE on the database. I was not sure at the time, only because I had not done any tests, so I did not want to say yes or no unless I could substantiate it.

I set off to do some tests.  First I created a Mirrored database using the demo code from the VC presentation.

use master
GO

CREATE DATABASE [TESTTDE] ON  PRIMARY 
( 
	NAME = N'TESTTDE', 
	FILENAME = N'C:\sqldata\S1\TESTTDE.mdf' , 
	SIZE = 10240KB , 
	FILEGROWTH = 10240KB 
)
 LOG ON 
( 
	NAME = N'TESTTDE_log', 
	FILENAME = N'C:\sqldata\S1\TESTTDE_log.ldf' , 
	SIZE = 10240KB , 
	FILEGROWTH = 10240KB 
)
GO

Once the database was created on instance 1, I took a full backup and a log backup, then restored both to the second instance WITH NORECOVERY. Once it was primed and ready, I established the mirror and saw the database become synchronized. Now the trick is to create the components that establish the encryption hierarchy.

USE master;
GO
CREATE MASTER KEY 
ENCRYPTION BY PASSWORD = 'INSTANCE1_Use1Strong2Password3Here!';
go

CREATE CERTIFICATE MyServerCert 
WITH SUBJECT = 'My DEK Certificate'
GO


Now that we have the MASTER KEY and the certificate in place, you need to back up the certificate to a set of files and restore it to the second instance.

BACKUP CERTIFICATE MyServerCert
TO FILE='C:\SQLBACKUP\MyserverCert.cer'
WITH PRIVATE KEY (FILE='C:\SQLBACKUP\MyServerCert.pvk',
	ENCRYPTION BY PASSWORD='My1Secure2Password!')
GO

NOW CHANGE TO THE SECOND INSTANCE

-- Make sure that you change to the second instance

USE master;
GO
CREATE MASTER KEY 
ENCRYPTION BY PASSWORD = 'INSTANCE2_Use1Strong2Password3Here!';
GO
CREATE CERTIFICATE MyServerCert
    FROM FILE = 'C:\SQLBACKUP\MyServerCert.cer' 
    WITH PRIVATE KEY (FILE = 'C:\SQLBACKUP\MyServerCert.pvk', 
    DECRYPTION BY PASSWORD = 'My1Secure2Password!');
GO

Now that you have the Encryption Hierarchy established for both instances, let’s encrypt the database.

-- Make sure that you are in the Principal instance 
-- or this will not work because you won't be able 
-- to USE TESTTDE on the Mirror

USE TESTTDE
GO

CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE MyServerCert;
GO

ALTER DATABASE TESTTDE
SET ENCRYPTION ON;
GO

What happens on the mirror is that it encrypts the database too, even though you won't be able to see it. When it is fully encrypted and you fail the database over to the mirror, it will show the database as encrypted. On the principal you can see it with the query below.

SELECT 
    db.name,
    db.is_encrypted,
    dm.encryption_state,
    dm.percent_complete,
    dm.key_algorithm,
    dm.key_length
FROM 
    sys.databases db
    LEFT OUTER JOIN sys.dm_database_encryption_keys dm
        ON db.database_id = dm.database_id;

Now for the Log Shipping copy

To set up the log shipping test, do the same as you did for the mirror, but restore the full backup and then the log WITH STANDBY so you can see that the database is flagged as encrypted. First establish log shipping without encryption set up; once you are in standby on the shipped side, you can use the existing certificate to test enabling encryption afterwards.

Steps:

  1. CREATE DATABASE
  2. SET Recovery Model to Full
  3. Backup Database
  4. Backup Log
  5. Change to second instance
  6. RESTORE DATABASE with NORECOVERY
  7. RESTORE LOG WITH STANDBY
  8. Change to first instance
  9. USE TESTTDELOGSHIPPING
  10. CREATE DATABASE ENCRYPTION KEY
  11. ALTER DATABASE TESTTDELOGSHIPPING SET ENCRYPTION ON
  12. BACKUP LOG
  13. Change to second instance
  14. RESTORE LOG TESTTDELOGSHIPPING WITH STANDBY
  15. Run the above query to see if the database is encrypted. You will see is_encrypted = 1
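If you prefer to drive steps 12-14 from PowerShell, a dbatools sketch might look like this; the instance names and paths are placeholders, and -Continue applies further log backups to the standby database.

```powershell
# Step 12: take a log backup on the primary.
Backup-DbaDatabase -SqlInstance 'SQL01' -Database TESTTDELOGSHIPPING `
    -Type Log -Path '\\fileserver\logship'

# Step 14: apply it on the secondary, leaving the database in STANDBY.
Restore-DbaDatabase -SqlInstance 'SQL02' -Path '\\fileserver\logship' `
    -DatabaseName TESTTDELOGSHIPPING -Continue -StandbyDirectory 'C:\sqldata\standby'
```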

This shows that TDE carries through to both mirrored and log-shipped copies of the database.

Happy Encrypting….!


sys.dm_db_index_physical_stats Easter Egg

I was reading through the documentation on the dynamic management function sys.dm_db_index_physical_stats and found a couple of things you need to be aware of when using it. First, we know that the function takes parameters.

Parameters

  • Database ID
  • Object ID
  • Index ID
  • Partition number
  • Mode, the level of interrogation (DEFAULT, NULL, LIMITED, SAMPLED, DETAILED; NULL and DEFAULT mean LIMITED)

The interesting thing is that when any of the first 4 parameters is NULL, the default is to evaluate ALL of the items for that parameter. For example, if you pass NULL for the database ID, it will evaluate ALL databases on the server; if the object ID is NULL, it will evaluate all objects. I knew all that, but here is what I did not realize, quoting from the documentation:

"Passing values that are not valid to these functions may cause unintended results. For example, if the database or object name cannot be found because they do not exist or are spelled incorrectly, both functions will return NULL. The sys.dm_db_index_physical_stats function interprets NULL as a wildcard value specifying all databases or all objects."

"Additionally, the OBJECT_ID function is processed before the sys.dm_db_index_physical_stats function is called and is therefore evaluated in the context of the current database, not the database specified in database_id. This behavior may cause the OBJECT_ID function to return a NULL value; or, if the object name exists in both the current database context and the specified database, an error message may be returned. The following examples demonstrate these unintended results."

That is nice, but here is what it means. If you specify a database ID that is not the current database, and you use the OBJECT_ID() function, it is evaluated in the current database, NOT the database represented by the database ID in the statement. If the object does not exist in the current database, OBJECT_ID() returns NULL, and the function then proceeds to evaluate all objects in the database represented by the database ID.

The other caveat: if the current database contains an object by that name, an object ID is returned, but when the function runs it will not find that object ID in the target database and will actually raise an error. Take a look at the examples below, and note that you will want to be in the database you intend to query if you specify anything beyond the database ID. Don't get caught with NULL in the statement without knowing it.

 

use master
GO
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED')

-- Results were for 190 rows 
/*
** What happens here is that it evaluates OBJECT_ID('Person.Person') to NULL 
** because it does not exist in master as shown below.
*/

use master
GO
SELECT OBJECT_ID('Person.Person')
-- Result is NULL
GO
use AdventureWorks2012
GO
SELECT OBJECT_ID('Person.Person')
-- Result is 1765581328

/*
** Let's create a table in master of the same name
** as in the AdventureWorks2012 database.
*/

use master
GO
CREATE SCHEMA Person AUTHORIZATION dbo
GO
CREATE TABLE Person.Person (id int)
GO
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED')
GO
-- You get the error below
/*
Msg 2501, Level 16, State 40, Line 3
Cannot find a table or object with the name '.StateProvinceCountryRegion'. Check the system catalog.
*/
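The safe pattern is to make sure OBJECT_ID() is evaluated in the context of the database you are targeting. From PowerShell that is easy to guarantee with Invoke-Sqlcmd's -Database parameter; the instance name is a placeholder.

```powershell
# Running in the AdventureWorks2012 context means DB_ID() and OBJECT_ID()
# both resolve against the same database -- no accidental NULL wildcard.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'AdventureWorks2012' -Query @'
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED');
'@
```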