Why Become a PowerShell DBA

If you find this article valuable and decide you want to get your learn on with PowerShell and SQL Server, I am teaching a SQLskills class this year. Please join me to get a great start on growing your toolset by adding PowerShell to your toolbelt. The Early Bird discount is still available until March 10, 2017.

I have heard many times, from many people, that it is not worth learning PowerShell as a DBA. I have been using PowerShell since v1.0 and have found it amazingly rewarding. In this entry I will illustrate why, and hopefully some of it will resonate with you and you will find it worth learning at least a little PowerShell for your DBA job.

First, PowerShell was built to manage Windows servers, and it will soon manage some part of Linux servers as well, as .NET Core comes into play. Back in the day, if you were a DBA and wanted to manage SQL Servers, you used a lot of SMO (Shared Management Objects), and that meant programming in C#, or pseudo-C# in PowerShell: creating objects, managing properties, and so on. That was not for everyone, I know. I have a developer background and an IT background, so managing servers and Active Directory and building programs to do things was part of my world a long time ago. But what I found was that there were not a lot of tools in the marketplace then for doing things against SQL Server with PowerShell, so it was not widely adopted by DBAs.

In the SQL Server 2008 days, Microsoft wrote some PowerShell snap-ins that gave us access to cmdlets (commands) in PowerShell to manage some things, and introduced a PowerShell Provider for SQL Server that turned your SQL Server into a series of paths (SQLSERVER:\sql\servername\instancename). That was pretty powerful, at least to a guy like me, looking to automate many repetitive tasks. With those cmdlets and the provider there was now a way to use PowerShell to get at SQL Server without writing .NET code. It certainly did not cover the breadth of the product like Exchange had done with their cmdlets, but it was the start of some love. This is where the dawn of becoming a PowerShell DBA began.
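As a quick taste of the provider, here is a minimal sketch; the server name MyServer is illustrative, and it assumes the SqlServer module (SQLPS in the 2008-era) is installed:

```powershell
# Load the module that ships the SQLSERVER: provider
Import-Module SqlServer

# Navigate the instance like a file system (DEFAULT = default instance)
Set-Location SQLSERVER:\SQL\MyServer\DEFAULT

# List the databases just like listing a folder
Get-ChildItem Databases
```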

Now Why Investigate

For a DBA there are many times you want to know information about your server, and there is plenty of TSQL to get that information. You need to know the DMVs or the system tables to get at it, and for a fairly new DBA that can be a real challenge. Here is where the paradigm begins. In SMO, classes (an object-oriented term) encapsulate the objects in SQL Server, with the appropriate methods and properties to act on those objects. Let me illustrate.

We all know that there is a Database object in SQL Server and there are many TSQL commands that we can use to find out information such as…

SELECT * FROM sys.databases

SELECT * FROM sys.database_files

and many more….

It is a common practice to create a set of TSQL scripts and take them with you as you go. One of the values of longevity in the industry is what you have created and built up. There is no argument from me against this mentality, and it parallels the PowerShell mentality: you can reuse the scripts later on and not have to remember or rebuild them on the fly. But is there a better way? Maybe, but at least there is “another” way.

Enter PowerShell

PowerShell in and of itself is a mechanism to do many things. I advocate that DBAs learn it because of the way you can change your job with automation, scripting, and information gathering, which is much different than running TSQL scripts and copy-pasting the results into documents or tables (or even Excel). I will take you through a couple of scenarios, with examples, to give you a flavor of an additional tool. I am not advocating replacing SSMS or any other tool with PowerShell, only making you more efficient.

Say I want to have a list of databases with their name and recovery model.


SELECT name, recovery_model FROM sys.databases;


Get-SqlDatabase -ServerInstance servername | Select-Object Name, RecoveryModel | Format-Table -AutoSize

We won’t really go into how you get to the point of being able to run these commands; the goal is just to compare them. They are not much different from each other: one is run in SSMS or sqlcmd, and one is run in the PowerShell console, the ISE, or another tool that executes PowerShell. The results are pretty similar as well.



Now that doesn’t look too scary. You get the same results.  Here is how you would get it with the provider.
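A sketch of the provider version might look like this; servername and instancename are placeholders for your own instance, and it assumes the provider is loaded:

```powershell
# Walk the provider path to the Databases collection;
# -Force includes the system databases
Get-ChildItem SQLSERVER:\SQL\servername\instancename\Databases -Force |
    Select-Object Name, RecoveryModel |
    Format-Table -AutoSize
```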


Notice that I used a path-like structure to get to the Databases in the provider. I used the -Force parameter to ensure that I got the system databases. Now that seems the same, other than the tool. So you ask the question: why do I do it? Why do I learn PowerShell to be able to get the same information? Here is the answer.

Objects are the answer

In TSQL you get results in a grid or as text; you copy and paste, or you just look at it and read it. You just get some text back, whether in PowerShell or TSQL, or so it seems. But in PowerShell everything is an object, and objects all (potentially) have methods and properties. Let me illustrate an interesting part of PowerShell and objects.
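One way to see those methods and properties for yourself, assuming the SqlServer module is loaded (servername is a placeholder), is to pipe a database object to Get-Member:

```powershell
# Get-Member lists every property and method on the SMO Database object
Get-SqlDatabase -ServerInstance servername -Name DBA | Get-Member
```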

Say I want to change the recovery model to Full on my DBA database. In TSQL I could do it in a couple of ways and it would look something like this.
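One standard T-SQL version, assuming the database is named DBA:

```sql
ALTER DATABASE DBA SET RECOVERY FULL;
```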


Let’s take a look at how you do it with PowerShell and why it would be valuable to get this tool in your toolbelt.
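A sketch of the provider approach (path pieces are placeholders, and it assumes the provider is loaded):

```powershell
# Grab the SMO Database object through the provider
$db = Get-Item SQLSERVER:\SQL\servername\instancename\Databases\DBA

$db.RecoveryModel          # check the current setting
$db.RecoveryModel = 'Full' # change the property locally
$db.Alter()                # push the change to the server
```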


What you see is the retrieval of the database as an SMO Database object (the provider always returns an SMO object), which has a property called RecoveryModel that tells me which recovery model it is in. The property is read/write, so I can set it to “FULL”, “SIMPLE”, or “BULKLOGGED” and then use the Alter() method as above. Methods have the parentheses () after the method name, and I could have inserted ($true) in the parentheses to do a WITH ROLLBACK IMMEDIATE, like I would if I were changing a property that required full access. Now that may not look very advantageous, but imagine the next scenario.


I create a function to change my recovery model, with parameters for my SQL Server instance, the database, and the RecoveryModel. Now I have a tool that does not require SSMS to be installed and allows me to call a function to change my recovery model. Once this is written and saved (and even put in a module for easy loading) and loaded into my environment, I can use it pretty easily for any database on any server.
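A minimal sketch of such a function; the function name and parameter names are mine, and it assumes the SQL provider is loaded:

```powershell
function Set-DbRecoveryModel {
    param (
        [Parameter(Mandatory)][string]$ServerInstance,
        [Parameter(Mandatory)][string]$Database,
        [Parameter(Mandatory)]
        [ValidateSet('Full','Simple','BulkLogged')]
        [string]$RecoveryModel
    )
    # Default instances appear as servername\DEFAULT in the provider path
    $db = Get-Item "SQLSERVER:\SQL\$ServerInstance\Databases\$Database"
    $db.RecoveryModel = $RecoveryModel
    $db.Alter()
}

# Usage
Set-DbRecoveryModel -ServerInstance 'MyServer\DEFAULT' -Database 'DBA' -RecoveryModel Full
```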

Like I said before, this is contrived and works because I have SMO loaded, which is not hard to do, but this entry is all about why, not how to get things loaded. I have a SQL PowerShell Stairway at SQLServerCentral.com that you can read to help you get that part worked out. Not hard, but not the reason for this post.

Join me in the quest to become a PowerShell DBA. Stay tuned for more on this site and join me in a class soon.


SQLintersection Workshop Attendees: Change to Download URL

Hello all. I want to ensure that you have access to the downloads. The slide deck contained a link that pointed to the file, but you need to add /wp-content/ to the URL, after dbaduck.com and before uploads.

That means that the uploads are at https://dbaduck.com/wp-content/uploads/nameoffile.zip

I apologize for having the link in the wrong place.  Please feel free to email me at the email in the slides if you cannot get the file.


PowerShell month in May with DBAduck

I am excited for this month.

SQLSkills Training

Event 1 for this month revolving around PowerShell is May 8-10 at SQLskills.com training in Chicago. I am teaching the first class of the Immersion event on PowerShell for the DBA. You can see the details in the link on the class.

I am super humbled and excited to teach this class for SQLskills and at the time of this writing there are 3 seats left after we opened more seats for the class. Please join me for a great event.

We will be going over how to get started with PowerShell and then dive right in and learn why a DBA should learn PowerShell and why you would ever want to become a PowerShell DBA. There are so many reasons and so much value, I cannot contain it all in this type of post. But we will be having a grand time drinking from a firehose and learning while we do. 🙂

I hope you will join me for this iteration, or plan on another iteration another time. I love PowerShell and look forward to meeting all of you who are going to be there.

PASS Virtual PowerShell Chapter Meeting

Event 2 for this month will be on Wednesday, May 17th at 12:00 PM Eastern Time, when I will be presenting to the group on Gathering Data for Trending using PowerShell. You can RSVP here. I will be covering a topic that I spend a lot of time thinking about. There is a lot of data in SQL Server, and a lot of metadata that can tell some great stories about your servers and what their behavior looks like. Ever since working at Microsoft with the Community Research team on newsgroup behaviors, I have been hooked on looking at SQL Server in terms of behaviors, not just features. So I have created some data-gathering scripts that capture snapshots in time into tables, letting me look at the information in a different light to see if I can derive how things are working in SQL Server. Join me for a great time and register for the webinar here.

SQL Intersection

Event 3 will be a PreCon at SQLintersection on May 20th in Orlando at the Disney Swan Hotel. If you are going to be there, I hope you will join me, and if you are thinking about going, please register and join me there. It is my well-liked PreCon that takes you from 0-60 in using PowerShell as a DBA and as a professional who wants to manage SQL Server from a command line instead of clicking your way through everything. I will teach you how to leverage the shell, as well as how to balance the use of SSMS and PowerShell, to gain another tool in your toolbelt.

This will be a great month focused on PowerShell.



#SQLIntersection with DBADuck

In May, I will be at SQLintersection at the Disney World Swan in Orlando, Florida.

You can visit the site to see the full lineup of sessions and workshops. I’m going to be presenting three sessions on SQL Server topics and one workshop on PowerShell in SQL:

Have you ever wanted to know what the data looked like yesterday, before a change was made? Have you cringed at the thought of creating triggers to get data changes logged for later analysis? A new feature in SQL 2016 called Temporal Tables gives you the ability to create a table that automatically keeps track of your data in time. Magic was never meant to be part of IT, but SQL Server has done a great job with this new feature. We will cover how to set it up, how it works, and how to query the tables, including the history table, and last but not least, how you can incorporate this feature into existing tables. Join me on a time-travelling adventure to find out how you can leverage Temporal Tables in your world.

This session will cover real-world functions that are used for real interaction with SQL Server. There will be a few slides, but mostly demos with actual scripts that interact with SQL Server for management and for getting data in and out. I will also introduce some outside modules created to assist, covering things like reading and changing permissions, DB owner maintenance, and database space management. We could all use a few standard tools in the SQL Server world of ever-changing roles and “I have to have it now” requests, where Management Studio can get in the way of getting things done faster. Join me for a fun PowerShell hour and never try to reuse a click again.

SMO, short for Shared Management Objects (or SQL Management Objects to some), is a set of powerful programming interfaces for getting at and manipulating SQL Server in code. These objects are used in custom programs and in scripting with PowerShell, and can be used to fully manage a SQL Server. But what do you really do with them? How do you use them? This session will cover some core objects that are used in everyday interaction with SQL Server. I will cover how to get access to them and how to use them, and best of all, you will get to know these mysterious objects.

WORKSHOP ON Saturday, May 20, 2017

PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


PASS Board Elections, Why am I running?

Hey all you SQL PASS people in the #SQLFamily. I am a little late to the game, but while the voting is active I wanted to make sure you know why I am running.

In this vast connected world of technology there are so many SQL people that I could not even venture to guess how many there are. My experience has been that PASS is an organization that has done a lot of good for its members. The problem I want to address is that many members don’t really know what membership means, or don’t understand what they are a part of. I really want to help people connect and take advantage of all the benefits and connection opportunities they have inside an organization of this size. PASS is not just a SQL Saturday and PASS Summit organization. Funding is important to keep it running, but the communities should be fueled by the organization, and the sponsors should fuel the organization. As members we should be concerned about supporting the organization by actively participating in all aspects of PASS, not just taking advantage of the free events like SQL Saturdays.

Now before I get strung up, I want to clarify what taking advantage of what PASS offers means. When you as a member start out, there is a period of time when you get fed: everything you want, everything you can handle. But after a while there is a time to give back, and that is the ecosystem in which we “should” live with PASS. I have been a member of PASS for a long time, and a chapter leader for a long time, but I realized during the interview of board candidates that I had not given back enough. Sure, I speak at a lot of SQL Saturdays and at events around the country and in my home state. But I have not volunteered in the PASS organization at all, and I really need to give back.

Giving back and connecting our #SQLFamily is why I am running after all. The stuff above is the path that took me to submitting my application. I support all the other candidates and think that the slate this year is amazing and you cannot go wrong. PLEASE VOTE, it is part of volunteering and supporting the organization that we belong to. Get out there (online) and cast your vote and make this election the one with the most participation.

Happy Voting.
PASS Candidates 2016

Voting is in your myPASS profile.
Vote Now in your PASS Profile


Join me at SQL Intersections in October 2016

In October, I will be at SQLintersection at the MGM Grand in Las Vegas, Nevada.

You can visit the site to see the full lineup of sessions and workshops. I’m going to be presenting four, count them, 4 sessions on SQL Server topics and 2 workshops on PowerShell in SQL:

Be Friendly to SQL Server with TSQL Best Practices
TSQL is a necessity when interacting with SQL Server, so knowing it well is half the battle. Performance may look fine as the database starts to grow, but building in resilience from the beginning is a greater advantage than refactoring later. I will go over 5 key things to know when you write TSQL and use data types and/or variables in comparisons, and you will also learn about the procedure cache and how to avoid pitfalls there. This is a beginner session, but the concepts in this session are a great foundation to begin with. If you are looking for a solid foundation to build on and need the basics to start, this session is definitely for you.

SQL Server Filestream Implementation and Management
Filestream was introduced in SQL 2008 and has been a part of the engine since then. The reasons to use Filestream over BLOBs have been talked about in many different ways and for many different reasons. I will take you through how to get SQL Server configured to use Filestream, what that means, and how to implement a column of the Filestream type. I will also go over where the files live and how to stream them outside of the SQL engine instead of pulling them through the engine. Join me for an in-depth discussion of this technology. We will also touch on FileTables, which were new in SQL 2012.

Managing Availability Groups with PowerShell
Today Availability Groups are all the rage. Being a PowerShell guy, I prefer to manage things with PowerShell. The SQL team has granted us a great toolset in the SQLPS/SqlServer module that allows you to fully manage Availability Groups simply. Join me for a fun-filled hour of PowerShell and Availability Groups; you won’t look at them the same way after we are finished.

SQL Server on Server Core with PowerShell
Ever wondered what the craze over Server Core is all about? There is a lot of power in Core, and with it comes the admin challenge of NO GUI. But there are settings that you want to change in the OS, so how do you do that? This session will take you through some of the challenges, which are not really challenges once you know how. We will use the built-in methods of making some changes, and also use PowerShell and command-line tools to get you where you want to be with Core. I built a 6-node cluster on Server Core and created it with PowerShell (cluster and all), and it runs very nicely with a small RAM footprint for the effort.


PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.


PowerShell for the DBA from 60-120 in a Day
This is the next installment of becoming a PowerShell DBA. Now that you have been introduced to PowerShell and have an idea of what you can do, we take it to the next level and start you thinking about tool building and using PowerShell to actually manage instances. We take a look at modules you can use every day, and then we talk about building your own modules. PowerShell is becoming more prevalent in the world of DBAs, but it still has not reached the level I would like to see in our careers of working smarter, not harder. We will look at practical items that you can enhance with PowerShell, because DBA work can involve a massive amount of clicking around if you only use Management Studio. We will take the challenge to remove as many clicks as possible from your daily management of databases and database servers. We cover database maintenance, monitoring, and data gathering. We talk about managing SQL Server instances with PowerShell, and last but not least, we will see how to use it to manage Jobs and SQL Server Agent. Join me in the quest to become an effective PowerShell DBA.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


Join me at SQL Intersections for a great lineup this year!

In April, I will be at SQLintersection at the Walt Disney World Swan in Lake Buena Vista, Florida.

This year, you can save $50 on attending by using my last name (MILLER) as the promo code, AND if you register by January 31st, you may be eligible for free hardware as well – a Microsoft Band 2, a Surface 3, or an XBox One.

You can visit the site to see the full lineup of sessions and workshops. I’m going to be presenting two 200-level sessions on PowerShell and Encryption and a Workshop on PowerShell:

SQL Server Encryption
Have you ever wanted to know how Transparent Database Encryption (TDE) works or how you set it up? What about encrypting your backups? This session will go over all the steps and caveats that go with this technology. TDE allows you to have your database encrypted on disk and the same Encryption Hierarchy allows you to back up your database and have it encrypt the contents in the backup file. We will discuss the Encryption Hierarchy which is used for encryption in SQL Server and take you through keeping your secrets safe. Master the concepts of SQL Server Encryption when you are done with this session.

SMO Internals for High Performance Automation using PowerShell
In today’s fast-paced world, automation is becoming a necessity and not a luxury. PowerShell is a powerful tool that allows you to leverage Windows objects, and coupled with SMO you have the power to connect to SQL Server objects. This is all great, but without understanding how SMO works, you could find that your automations are not as fast as you would hope when accessing SQL Server. This session will take you beneath the surface and demonstrate the secret sauce that will help you get the most out of your SQL Server automation scripts and make them perform. Understanding is the key here, and we will get you there in this session. Let’s take your automation to infinity and beyond.


PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


FreeCon in Seattle, October 27th before PASS Summit 2015

SQL Solutions Group is getting ready to host a day of training. This will be a FreeCon, which means that it is free and held before a conference. Seattle is the place for PASS Summit 2015, and that is where the FreeCon will be held. Registration is here.
More information is here. This training is not a part of the Summit. Below is a little detail of what I will be speaking on.

Practical PowerShell for the DBA
Think of all the tools you use in managing your SQL Servers. All those SQL Servers being managed by tools, and man, that is a lot of clicks. We will show practical scripts and techniques to help you get a handle on all those clicks, whether you are gathering data or statistics from your SQL Servers or deploying an object to all of them. Configuration items are not excluded from the need for good tools. PowerShell is the tool that will let you get away from all those clicks: reusable scripts that let you manage all those instances with ease. This session will give you a great start on how to think about admin tasks using PowerShell scripts or modules. Many items are already out there to help you, and we will take a good look.

I hope you will join us for a great FreeCon before the Summit starts.

Happy PowerShelling!


SQL Server TDE on Mirror and Log Shipping

Today in our Virtual Chapter meeting for VCDBA, I presented on SQL Server encryption, including TDE. The question was posed as to whether TDE would take effect if the mirror was established first and you then enabled TDE on the database. I was not sure at the time, only because I had not done any tests, so I did not want to say yes or no unless I could substantiate it.

I set off to do some tests.  First I created a Mirrored database using the demo code from the VC presentation.

use master
GO
CREATE DATABASE TESTTDE
ON PRIMARY (
	NAME = N'TESTTDE',
	FILENAME = N'C:\sqldata\S1\TESTTDE.mdf' ,
	SIZE = 10240KB )
LOG ON (
	NAME = N'TESTTDE_log',
	FILENAME = N'C:\sqldata\S1\TESTTDE_log.ldf' ,
	SIZE = 10240KB );
GO
Once the database was created on instance 1, I took a full backup of the database and a log backup, then restored both to the second instance with its log, with no recovery. Once it was primed and ready, I established a mirror and saw that the database was synchronized. Now the trick is to create the components to establish the Encryption Hierarchy.
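Sketched in T-SQL, the priming looks something like this; the backup paths are illustrative, and if both instances are on one machine you would also need WITH MOVE on the restore:

```sql
-- On instance 1
BACKUP DATABASE TESTTDE TO DISK = 'C:\SQLBACKUP\TESTTDE.bak';
BACKUP LOG TESTTDE TO DISK = 'C:\SQLBACKUP\TESTTDE.trn';

-- On instance 2, leave the database restoring so it can mirror
RESTORE DATABASE TESTTDE FROM DISK = 'C:\SQLBACKUP\TESTTDE.bak' WITH NORECOVERY;
RESTORE LOG TESTTDE FROM DISK = 'C:\SQLBACKUP\TESTTDE.trn' WITH NORECOVERY;
```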

USE master;
GO
-- Use your own strong password here
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'My1Secure2Password!';
GO
CREATE CERTIFICATE MyServerCert
WITH SUBJECT = 'My DEK Certificate';
GO

Now that we have the MASTER KEY and the certificate in place, you need to back up the certificate to a set of files and restore it to the second instance.

BACKUP CERTIFICATE MyServerCert
TO FILE='C:\SQLBACKUP\MyServerCert.cer'
WITH PRIVATE KEY (FILE='C:\SQLBACKUP\MyServerCert.pvk',
	ENCRYPTION BY PASSWORD='My1Secure2Password!');

-- Make sure that you change to the second instance
-- (it needs its own master key before the certificate is restored)

USE master;
CREATE CERTIFICATE MyServerCert
    FROM FILE = 'C:\SQLBACKUP\MyServerCert.cer' 
    WITH PRIVATE KEY (FILE = 'C:\SQLBACKUP\MyServerCert.pvk', 
    DECRYPTION BY PASSWORD = 'My1Secure2Password!');

Now that you have the Encryption Hierarchy established for both instances, let’s encrypt the database.

-- Make sure that you are in the Principal instance 
-- or this will not work because you won't be able 
-- to USE TESTTDE on the Mirror

USE TESTTDE;
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE MyServerCert;
GO
ALTER DATABASE TESTTDE SET ENCRYPTION ON;
GO
What will happen on the Mirror is that it will encrypt the database, even though you won’t be able to see it. When it is fully encrypted and you fail the database over to the Mirror, it will indicate that the database is encrypted. On the Principal you can see it with the query below.

SELECT db.name, db.is_encrypted, dm.encryption_state
FROM sys.databases db
    LEFT OUTER JOIN sys.dm_database_encryption_keys dm
        ON db.database_id = dm.database_id;

Now for the Log Shipping copy

To set up the test for Log Shipping, do the same as you did for the mirror, but restore the Full backup and then the Log with STANDBY so that you can see that it flags the database as encrypted. First establish the Log Shipping without encryption set up; once you are in Standby on the shipped side, you can use the existing certificate to test enabling encryption afterwards.


  1. Create the database
  2. SET Recovery Model to Full
  3. Backup Database
  4. Backup Log
  5. Change to second instance
  6. Restore Database WITH NORECOVERY
  7. Restore Log WITH STANDBY
  8. Change to first instance
  9. USE the database
  10. Create the Database Encryption Key using the existing certificate
  11. Set ENCRYPTION ON for the database
  12. Backup Log
  13. Change to second instance
  14. Restore Log WITH STANDBY
  15. Run the above query to see if the database is encrypted. You will see is_encrypted = 1

This shows that the database encryption process carries through the Mirroring and Log Shipping processes to the secondary databases.

Happy Encrypting….!


sys.dm_db_index_physical_stats Easter Egg

I was reading through the documentation on the dynamic management function sys.dm_db_index_physical_stats and found a couple of things you need to be aware of when using it. First, we know that there are parameters for the function.


  • Database ID
  • Object ID
  • Index ID
  • Partition Number
  • Mode

The interesting thing is that when any of the first 4 parameters is NULL, the default is to evaluate ALL of the items for that parameter. For example, if you specify NULL for the Database ID, it will evaluate ALL databases on the server; if Object ID is NULL, it will evaluate all objects. I knew all that, but here is what I did not realize, quoting from the documentation:
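For example, this call is all-NULLs except the mode, so it touches every index of every table in every database on the instance (which can be expensive on a busy server):

```sql
SELECT *
FROM sys.dm_db_index_physical_stats(NULL, NULL, NULL, NULL, 'LIMITED');
```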

“…passing values that are not valid to these functions may cause unintended results. For example, if the database or object name cannot be found because they do not exist or are spelled incorrectly, both functions will return NULL. The sys.dm_db_index_physical_stats function interprets NULL as a wildcard value specifying all databases or all objects.”

“Additionally, the OBJECT_ID function is processed before the sys.dm_db_index_physical_stats function is called and is therefore evaluated in the context of the current database, not the database specified in database_id. This behavior may cause the OBJECT_ID function to return a NULL value; or, if the object name exists in both the current database context and the specified database, an error message may be returned. The following examples demonstrate these unintended results.”

Now that is nice, but here is what it means. If you specify a database ID for use in this function and that database is not the “current” database, then the OBJECT_ID() function is evaluated in the “current” database, NOT the database represented by the database ID in the statement. If the object does not exist in the current database, OBJECT_ID() returns NULL, and the function will then proceed to evaluate all objects in the database represented by the database ID.

The other caveat is that if the current database does have an object by that name, so an object ID is returned, then when the function runs it will not be able to find that object ID in the target database and will actually error. Take a look at the examples below, and notice that you will want to be in the database that you intend to use if you are specifying anything beyond the database ID. Don’t get caught with NULL in the statement without knowing about it.


use master
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED')

-- Results were for 190 rows 
-- What happens here is that it evaluates OBJECT_ID('Person.Person') to NULL 
-- because it does not exist in master, as shown below.

use master
SELECT OBJECT_ID('Person.Person')
-- Result is NULL

use AdventureWorks2012
SELECT OBJECT_ID('Person.Person')
-- Result is 1765581328

-- Let's create a table in master of the same name
-- as in the AdventureWorks2012 database.

use master
GO
CREATE SCHEMA Person;
GO
CREATE TABLE Person.Person (id int);
GO
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED')
-- You get the error below
Msg 2501, Level 16, State 40, Line 3
Cannot find a table or object with the name '.StateProvinceCountryRegion'. Check the system catalog.