SQL Server

T-SQL Tuesday #39: More Efficient SMO with Server.SetDefaultInitFields

This T-SQL Tuesday is about PowerShell and how I use it with SQL Server. There are many more posts to come about PowerShell and SQL Server, but I thought I would cut my teeth on this T-SQL Tuesday since it was on PowerShell.

If you have read Rob Farley’s (b | t) post on the SMO topic, I am going to extend it a little and illustrate how you can get slightly better T-SQL out of SMO by using some things built into the SMO Server object.

The Server object contains a lot of properties and methods, but this post will focus on one called SetDefaultInitFields. This method takes an object type and a list of properties that should be retrieved up front when the object properties are first accessed. This makes things much more efficient when dealing with collections of items, such as Databases, Tables, and Columns.

The code below retrieves a list of databases with their PageVerify setting, and the screenshots show what the Profiler output looks like with and without the Server.SetDefaultInitFields method. Nothing rocket science, but it illustrates that you can help SMO out in the efficiency department.

# This will get the Server object from the SQL Provider
# either by using Import-Module SQLPS with SQL 2012 installed
# or by using Add-PSSnapin SqlServerProviderSnapin100 with 2008/R2 installed
# or by using SQLPS.exe with 2008/R2 installed
# If you load the Module SQLPS with 2012, you don't have to load the 
# SMO Assembly for Microsoft.SqlServer.Smo
# Execute each set in a new PowerShell window to see the same information

# This set will get the list of databases, but will execute the query
# to get the PageVerify for each database (plus much more information)

$server = Get-Item SQLSERVER:\SQL\localhost\default
$server.Databases | Select Name, PageVerify

# This set will get the list of databases and when it does, it will get them with 1 query
# along with the PageVerify at the same time

$server = Get-Item SQLSERVER:\SQL\localhost\default
$server.SetDefaultInitFields([Microsoft.SqlServer.Management.Smo.Database], "PageVerify")
$server.Databases | Select Name, PageVerify
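
The same trick works for other collections too. Here is a minimal sketch, assuming the default instance on localhost, that pre-loads a couple of Table properties in one round trip; the property choices are just examples:

# A sketch of the same technique for tables; property names are examples
$server = Get-Item SQLSERVER:\SQL\localhost\default
$server.SetDefaultInitFields([Microsoft.SqlServer.Management.Smo.Table], "CreateDate", "RowCount")
$server.Databases["master"].Tables | Select Schema, Name, CreateDate, RowCount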

Here are the screenshots that show the Profiler output for each query. Notice that the first set of PowerShell actually gets the data for each database one at a time, while with the second set you see one query. The reason is the SetDefaultInitFields call on the Server object.

The first screenshot shows the retrieval of the Name for one database; you get the CompatibilityLevel for free, as SMO interrogates that automatically for its own internal purposes. The second screenshot shows a lot more information being retrieved, just because you asked for one property in a group that SMO has defined; it retrieves the whole group. Notice that each of these queries is specifically for the database named ‘Bob1’ and not for any other. Each database will have these 2 queries run for it, so without any help it could be a long process to get attributes for 1000 databases.

[Screenshot: Profiler_1_NoHelp]

[Screenshot: Profiler_2_NoHelp]

Now in the third screenshot you will see that the second piece of PowerShell issues 1 query and gets all the information in one round trip to the server. Look Mom, no WHERE clause. Pretty efficient.

[Screenshot: Profiler_4_WithHelp]

Hopefully this gave you a taste of what you can do to help SMO be a little more efficient and make fewer round trips to the server. Do not underestimate the power of SetDefaultInitFields.

Have a great T-SQL Tuesday.

Ben Miller


Log Shipping encounters corrupt log backup

I ran into an issue the other day and wanted to make sure I shared it out there. In a Log Shipping scenario, you can occasionally run into a corrupt transaction log backup. This leaves the database stuck in restoring mode (mine was previously in a standby/read-only state for reporting), because the next restore requires the log backup with the next LSN, which you don’t have a good copy of.

In this case, it went overnight and was out of sync a ways, but at night I take a differential backup of the database. Restoring the differential backup with NORECOVERY to the log shipped database allowed me to continue restoring log backups from that point on.
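
In case it helps, here is a minimal sketch of that restore; the server, database, and path names are placeholders, and it assumes the SQLPS cmdlets are loaded. Run it against the log shipping secondary, with a differential taken from the same backup chain:

# Placeholder names and paths; run against the log shipping secondary
Invoke-Sqlcmd -ServerInstance "SecondaryServer" -Query @"
RESTORE DATABASE [ShippedDb]
FROM DISK = N'\\backupserver\backups\ShippedDb_diff.bak'
WITH NORECOVERY;
"@
# The log shipping restore job can then pick up with the next log backup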

Hope that helps anyone out there in the www realm; it is a nice save, since databases can be large and re-initializing log shipping could be very impactful.


Get SQL Server Service Accounts with WMI

Today I was tasked with retrieving the SQL Server service accounts to validate them in Active Directory. I wanted a simple way to get them and show the account name for each SQL Server service. I knew that you could get the services in PowerShell with Get-Service. I also knew that PowerShell could get them with Get-WmiObject -Class Win32_Service. But neither one had the ServiceAccount property.

So I turned to WMI under SMO with PowerShell. The script below, which I named GetSqlServices_WMI.ps1, takes a single parameter, $sqlserver.

param (
	[string]$sqlserver
)

# Call this with .\filename.ps1 "servername"

[void][reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement")
$wmi = New-Object ("Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer") $sqlserver
$wmi.Services | Select DisplayName, Type, StartMode, ServiceState, ServiceAccount | ft -auto

The call syntax is shown in the comment in the code. The results will have DisplayName, Type, StartMode, ServiceState and ServiceAccount listed in a table with column headers.

A great way to interrogate the SQL Server services.

The properties available to you are as follows:

  • Parent
  • AcceptsPause
  • AcceptsStop
  • Description
  • DisplayName
  • ErrorControl
  • ExitCode
  • PathName
  • ProcessId
  • ServiceAccount
  • StartMode
  • Type
  • IsHadrEnabled (because I am using 11.0 SMO)
  • StartupParameters
  • Dependencies
  • AdvancedProperties
  • Urn
  • Name
  • Properties
  • UserData
  • State

With Methods

  • Alter
  • ChangeHadrServiceSetting (because I am using 11.0 SMO)
  • ChangePassword
  • Equals
  • GetHashCode
  • GetType
  • Initialize
  • Pause
  • Refresh
  • Resume
  • SetServiceAccount (handy little guy; see the sketch after this list)
  • Start
  • Stop
  • ToString
  • Validate
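
Since SetServiceAccount is the handy one, here is a minimal sketch of using it; the machine, service, account, and password values are placeholders, and changing a service account will generally require a service restart, so test somewhere safe first:

# Placeholder machine/service/account values; test somewhere safe first
[void][reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement")
$wmi = New-Object ("Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer") "localhost"
$svc = $wmi.Services["MSSQLSERVER"]
$svc.SetServiceAccount("DOMAIN\sqlservice", "StrongPasswordHere")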

The link to the documentation for this object is here (Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer).

Enjoy and happy scripting.

@DBAduck


Powershell, SMO and Database Files

In response to some questions about renaming a physical file for a database, I have 2 versions of PowerShell scripts that do this for you, including taking the database offline and then back online so that the physical change matches the metadata.

First, there is an article about this at http://msdn.microsoft.com/en-us/library/ms345483.aspx. It explains that you start by setting the database offline, then alter the database to modify the filename, then set it back online. That article does not show the physical renaming of the file, but renaming is necessary, since the ALTER DATABASE MODIFY FILE statement only changes the metadata for the database in master.

There is a procedure to do this with SQLCLR and T-SQL, but I chose to illustrate it in PowerShell and SMO (using the SQL Provider as well as straight SMO). The SQLCLR version is by SQL Server MVP Ted Krueger (Blog | Twitter) and can be found at this link.

My version is based on PowerShell and SMO, with a smattering of the SQL Provider that comes with SQL 2008+.

Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100

$servername = "localhost"
$instance = "default"
$dbname = "N2CMS"
$logicalName = "N2CMS"
$NewPath = "c:\program files\microsoft sql server\mssql10_50.mssqlserver\mssql\data"
$NewFilename = "N2CMS_4.mdf"

if(Test-Path "sqlserver:\sql\$servername\$instance\databases\$dbname")
{
    $database = Get-Item "sqlserver:\sql\$servername\$instance\databases\$dbname"

    $fileToRename = $database.FileGroups["PRIMARY"].Files[$logicalName]
    $InitialFilePath = $fileToRename.FileName

    $fileToRename.FileName = "$NewPath\$NewFilename"

    $database.Alter()
    $database.SetOffline()
    Rename-Item -Path $InitialFilePath -NewName "$NewFilename"
    $database.SetOnline()
    Write-Host "File Renamed"
}
else
{
    Write-Host "Database does not exist"
}

If you notice, I first add the snap-ins for the SQL Provider. If you already have them loaded in your PowerShell profile, you can omit those lines. If you do not have the SQL 2008 objects installed on the machine, you will notice that the snap-ins do not load either; in that case, just use the version below.

In the SQL Provider version you see a PowerShell-ism in Test-Path: you are still using a path, it is just a SQL Provider path that points to the database and makes sure it exists. Compare that to the other version, where you look at Server.Databases[$dbname] and then drill into the filegroups and then the files.

This version of the script is purely SMO and PowerShell. First you load the SMO objects from SQL Server 2005/2008.

# Always
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null

# Only if you don't have SQL 2008 SMO objects installed
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

# Only if you have SQL 2008 SMO objects installed
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlEnum") | Out-Null

$servername = "localhost"
$instance = "default"
$dbname = "N2CMS"
$logicalName = "N2CMS"
$NewPath = "c:\program files\microsoft sql server\mssql10_50.mssqlserver\mssql\data"
$NewFilename = "N2CMS_4.mdf"

$server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $servername
if($server.Databases[$dbname] -ne $null)
{
    $database = $server.Databases[$dbname]
    $fileToRename = $database.FileGroups["PRIMARY"].Files[$logicalName]
    $InitialFilePath = $fileToRename.FileName
    $fileToRename.FileName = "$NewPath\$NewFilename"
    $database.Alter()
    $database.SetOffline()
    Rename-Item -Path $InitialFilePath -NewName "$NewFilename"
    $database.SetOnline()
    Write-Host "File Renamed"
}
else
{
    Write-Host "Database does not exist"
}

This version uses the Server object to get the database you are looking for so that you can get at its files. You look the file up by $logicalName so that you can get its physical FileName. Then, using the built-in cmdlets for file operations, Rename-Item renames the file from the initial path to the new one.

Operation order:

  • ALTER DATABASE dbname MODIFY FILE (the Alter() call)
  • ALTER DATABASE dbname SET OFFLINE
  • Rename-Item from the old name to the new
  • ALTER DATABASE dbname SET ONLINE

I did not put these in a function, but they could easily be put together with parameters to make them versatile for other situations; that part is for you to do so that it works for your scenario.
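
For what it is worth, here is a minimal sketch of such a wrapper around the pure SMO version; the function name and parameter names are mine:

function Rename-DatabaseFile
{
    param (
        [string]$ServerName,
        [string]$DbName,
        [string]$LogicalName,
        [string]$NewPath,
        [string]$NewFilename
    )

    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
    $server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $ServerName
    $database = $server.Databases[$DbName]
    if($database -eq $null)
    {
        Write-Host "Database does not exist"
        return
    }

    $fileToRename = $database.FileGroups["PRIMARY"].Files[$LogicalName]
    $InitialFilePath = $fileToRename.FileName
    $fileToRename.FileName = "$NewPath\$NewFilename"
    $database.Alter()
    $database.SetOffline()
    Rename-Item -Path $InitialFilePath -NewName $NewFilename
    $database.SetOnline()
    Write-Host "File Renamed"
}

# Example call (names are placeholders):
# Rename-DatabaseFile "localhost" "N2CMS" "N2CMS" "c:\data" "N2CMS_4.mdf"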

Love PowerShell!


The Best Thing I Learned at PASS

Since I have never yet missed a PASS Summit (in the US), I thought I would share my experience of the last PASS and previous PASS conferences.

I attended a Pre-Con with Kalen Delaney on Query Tuning and Plan Guides. I also remember sessions from last PASS that were very relevant to my job. Each year that I attend PASS, I find there is more to know and more to learn from the many who are there to present.

I was pleased that each time I got home, I could apply what I learned directly to my job. I remembered the Plan Guides day, and while troubleshooting some performance issues with a database, I found that I could leverage plan guides for a specific query, and it worked just as Kalen illustrated. It actually kept many queries from bloating the plan cache, which was pretty helpful. I had not done anything with plan guides before that, so the topic and the time spent were very timely.

The other topics included PowerShell, Wait Stats, and Wait Types. The more practical sessions are very helpful and very applicable to my job as a DBA and a manager mentoring DBAs. I can only say that I still plan on attending all the SQL PASS Summits in the US. One day I hope to attend one outside the US, but for now the value of these Summits in networking, vendor introductions, and all-out knowledge transfer is worth every minute I spend.


SQL PASS 2007 has concluded

I went to the SQL PASS 2007 Summit this past week, and I have to say that this was one of the best ones I have been to. I went to a pre-conference session with Kalen Delaney on Managing and the Reuse of Query Plans. Everything that I learned, I could use in my current work. That was very important for me as a new DBA in a project that needed this information.

Thanks to all speakers who took the time to put together presentations for us to learn from.


Error in SQL (Error: 17806, Error: 18452) SSPI Problem

I recently came across this while managing a clustered SQL Server. There were various explanations listed on forums, etc., but for me the error turned out to originate from DNS issues: the Domain Controller (DC) was unreachable due to a network switch configuration problem.

So if you see these errors, you may also see a NetLogon error in the Event Log about not being able to reach a domain controller to log on, and you will get the errors below, because the DC is the one that hands the Kerberos token to the process for use in SSPI-related activities.

Logon,Unknown,Error: 17806 Severity: 20 State: 2.
Logon,Unknown,Login failed for user ”. The user is not associated with a trusted SQL Server connection. [CLIENT: IPAddress]
Logon,Unknown,Error: 18452 Severity: 14 State: 1.
Logon,Unknown,SSPI handshake failed with error code 0x80090311 while establishing a connection with integrated security; the connection has been closed.
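
If you suspect this scenario, a couple of quick checks from the SQL Server host can confirm whether a DC is reachable; DOMAIN below is a placeholder for your domain name:

# Verifies the secure channel to a domain controller
nltest /sc_query:DOMAIN
# Verifies that DNS can locate domain controllers
nslookup -type=SRV _ldap._tcp.dc._msdcs.DOMAIN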

Good luck and have a great SQL day.


Working with SSIS in SQL Server 2005

I thought I would document my exploits converting DTS packages to SQL Server 2005 SSIS and scheduling them.

I had a DTS package that used the ProtectionLevel “Encrypt Sensitive Data with User Key” that I did not design or create. When I migrated the packages from SQL 2000 to SQL 2005, I used BI Studio to bring them to the file system, then modified them in SSIS and put them back into SQL 2005 in an SSIS store.

What I did not realize is that with Encrypt with User Key, the only option in 2005 for scheduling an SSIS package is to run it under SQL Server Agent, which in our case runs as Local System. That account's user key would not be the same as the one that created the package, nor would it have admin rights (per se) or be part of the Administrators group to run this package.

I went looking for a solution and found a KB article (918760) indicating that I could use the SQL Server Storage protection level, which uses SQL Server roles to get the data out of encryption.

I promptly went into BI Studio, opened the package, and tried to change the protection level to SQL Server Storage, but it would not let me save the package because it could not verify that it was secure; from BI Studio there is no way to put the package into SQL Server storage to use that protection level. So I had to change it to Encrypt Sensitive Data with Password.

Here is the fun part.

I went into BI Studio and added a package from the SSIS store (because the packages were already in there), modified the option, and saved it. What I did not know was that it saved the package to the file system and did not save it back to the SSIS store, so the package still failed to run under SQL Agent.

I figured out that I still needed to import it into the SSIS store, and I did so, but it still would not run. I could not figure out why, but after digging through all the options, I saw the dialog below and decided to click on the button circled.

[Screenshot: SSIS Import Package dialog]

I put the password into the package and saved it out to the file system, then imported it into the SSIS store. I needed to schedule this package, so I went into SQL Agent jobs and created a job with a step that ran an SSIS package. I knew that I had to specify a password, so I looked up the dtexec utility page and found the /DECRYPT option. Now here is what threw me: I assumed that BI Studio, or the import of the package, had saved the Encrypt with Password option. So I went into the Command Lines tab, chose the Edit Manually option, added /DECRYPT with the password, and clicked OK. I ran the package and it appeared to work. I found out later that it had not, and another weird part is that when I edited the job step, the Command Lines option was set back to Restore original options and I did not see the /DECRYPT option in the window below.
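
For reference, a dtexec call of the kind that job step builds looks roughly like this; the package path, server, and password are placeholders, and you would use /SQL for msdb, /DTS for the package store, or /FILE for a .dtsx file:

dtexec /DTS "\File System\MyPackage" /SERVER localhost /DECRYPT "packagePassword"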

So I battled with this for a while, then tried the import again, clicked the dialog button to change the protection level, chose the password option, set the password, and then it imported. After doing this I changed the job, and an interesting thing happened: it prompted me for the package password this time. I went into the Command Lines tab and it did not show any difference, but I went ahead and left it.

I found out that the /DECRYPT option was in the Command Lines window, but there was no password specified, which means it was hiding the password from the UI. That is OK I guess, but I had never seen it saved in there before when I manually edited it.

I also found out that when I right-clicked the job and scripted it as CREATE to a new query window, the command showed the /DECRYPT option with the password I specified when I created the job step.

So, wild stuff, but I have survived, and the job actually runs now under SQL Agent, so I am happy. Just a bit wiser too, as I spent a great amount of time working this one out.

Sorry if this does not make complete sense. I wanted to get this out now so that I did not forget it.
Have fun and I will have more SSIS stuff to share after the next couple of days.
