How is the pandemic going for you? (T-SQL Tuesday #132)

At the first of this year, things were going well. Just before the pandemic was in full swing, the company I worked for merged with another company. This added a lot of extra work and stress to the situation, and I found it a welcome blessing to work from home all the time. This post is about what I learned, and about one of the things I will be working on during this time to keep myself in a better place.

What I realized at the onset of the pandemic

I thought it would be the best thing ever to work from home and never go into the office. In the beginning it was great; I loved working from home anyway because I got more done and drive-bys were not prevalent. What I came to realize is that the interaction with the team was really nice. Keeping up with things and talking with people in person was more important than I thought. I also found out that I was perfectly OK without interaction while working, and that I was more of an introvert than I thought. But now I did not get to travel to SQL Saturdays and speak, and I did not go to the user group in person and catch up with the members. So while I loved being in my home office and was content in one way, I was not perfectly content. A side of me wanted to interact, see the whites of people's eyes, and present in front of a crowd. In the end I wanted to be with the #SQLFamily more than I thought I would.

Health Ramifications

I don't take care of myself as much as I did when I was in the office. I eat worse, I sit all day, and I find myself having sat for hours on end without drinking water or eating anything. I am not in better health than before the pandemic: I weigh more, and I have not developed the good habits I should have. So I have to start thinking long term, not short term, about my health. I am grateful the pandemic showed me this; I know I cannot keep doing what I have been doing and be OK. I am putting together a plan to get healthier and develop good habits around exercise and good office practices (taking breaks, getting up from the chair, standing for a bit) so that I can be healthier and happier, with less stress and a good outlet from the daily grind.

My Advice to others

Really it is about finding out who you are and what you want to be. To be of help to your colleagues you need to keep up with yourself and have something to offer. I have given all my time, effort, and really everything to everyone else, and I love doing it, because growing is very important to keeping your mental health. If you do not learn and grow, you stagnate, which means you start to feel like you don't have anything to offer; if that is the pattern every day, it will be really detrimental a month from now. The frog doesn't get boiled overnight — no, put plainly: it takes a slow rise in temperature, and the frog stays in the water until the heat is so much that it has no strength left to get out, and then it boils. Don't let the temperature of your water slowly cook you. Recognize right now that you are in that pot of water, and make sure you get out onto dry land to make a difference. Give a little and then give a little more, but remember to give to yourself at the same time, mostly so that you can thrive, grow, and learn the things that make you, you. Take care of yourself through this pandemic, and remember that you are a great part of our #SQLFamily and we need you to be complete. Stay safe and see you out there.

Read More

UpdateHadronTruncationLsn messages in SQL 2019 Errorlog

If you have SQL Server 2019 installed and are using Availability Groups, you may see messages like the one below in your SQL Server Errorlog.

UpdateHadronTruncationLsn(6) (force=0): Primary: 0:0:0 Secondary: 892:4720:2 Partial Quorum : 0:0:0 Aggregation: 892:4720:2 Persistent: 892:382:4

According to Microsoft (unofficially, as I did not hear this personally), there will be a fix in CU6 for SQL Server 2019 that removes these messages by default; if you enable Trace Flag 3605, they will be turned back on and logged, as below.

DBCC TRACEON(3605, -1)

If this changes, I will come and update this post.

Thanks for listening. Have a great day and stay safe and happy!

Read More

Insights from your Database using PowerShell and TSQL

Hello All.

I have been looking for ways for quite some time to understand my databases better. There are a lot of DMVs, and even some DMFs, that pull information from the SQL Server engine that can be very insightful. Many shops use monitoring tools, and they are great because they watch things all the time. I used to rely solely on these tools for information about my SQL Server environment. That has shifted since I started managing a SaaS platform built on top of SQL Server. With hundreds of TB of data and many identical databases, it becomes pretty daunting to know which databases to care about and how much to care.

This article is about the first things you can do to get ready for a better maintenance strategy and to understand how your indexes are used.

Let’s begin.


You have many tables in your databases and you want to know how they are used. There are DMVs for index usage stats, like sys.dm_db_index_usage_stats, that will tell you how your indexes are used, and querying them is insightful. But how do the stats change over time? These stats are reset when the instance is restarted, and it is good to know that you have 2,000 seeks and 500 scans of an index, but when did they happen? Was it on a common day? A common hour?


First, the elements of the solution should be defined:

  • PowerShell at least 4.0 (preferably 5.1)
  • SQL Server at least 2008 R2 (it could work on earlier versions, but this is the earliest I have run it against)
  • A way to run a job/task
  • The dbatools PowerShell module
  • The content listed below – it will also be in my GitHub repository.

I say PowerShell because this is what I love to use for automation tasks. I will illustrate how to do this with SQL Server Agent, but it will still work with Task Scheduler. Let’s define the solution so that we can put the pieces together for use in your own environment. This will give you a good framework to gather any type of information, whether it be with TSQL, which this solution is using, or straight PowerShell and SMO or other items (like WMI).
First, we need to create a table to hold what I call Iterations, to put lines in the sand and allow you to use timeframes in your statistics gathering. We will also need a schema called "stat" to hold the Index Usage data.


Now the table…

/****** Object: Table [dbo].[Iterations] Script Date: 2/15/2020 3:40:49 PM ******/
CREATE TABLE [dbo].[Iterations] (
    [IterationId] [int] IDENTITY(1,1) NOT NULL,
    [GatherDate] [datetime2](3) NOT NULL,
    PRIMARY KEY CLUSTERED (
        [IterationId] ASC
    )
);
GO
ALTER TABLE [dbo].[Iterations] ADD DEFAULT (getdate()) FOR [GatherDate];

Notice that we have an IDENTITY column, which could easily be replaced by a SEQUENCE feeding a DEFAULT on the IterationId column. Also notice that the only other column is GatherDate, a datetime2 column. This allows related tables to join on an INT (you could use a BIGINT) column instead of joining on dates.
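If you prefer the SEQUENCE route mentioned above, a variant of the table might look like this (my sketch, not from the original post; the sequence, table, and constraint names are assumptions):

```sql
-- Alternative to IDENTITY: a SEQUENCE feeding IterationId through a DEFAULT,
-- mirroring the pattern used for stat.IndexUsage later in this post.
CREATE SEQUENCE dbo.Iterations_Seq AS int START WITH 1 INCREMENT BY 1;
GO
CREATE TABLE dbo.Iterations2 (
    IterationId int NOT NULL
        CONSTRAINT DF_Iterations2_IterationId DEFAULT (NEXT VALUE FOR dbo.Iterations_Seq)
        CONSTRAINT PK_Iterations2 PRIMARY KEY CLUSTERED,
    GatherDate datetime2(3) NOT NULL
        CONSTRAINT DF_Iterations2_GatherDate DEFAULT (GETDATE())
);
```

Either way, inserting a row with only defaults hands you the next IterationId.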
Next up is the table to hold the statistics.

CREATE SCHEMA [stat] AUTHORIZATION [dbo];
GO
CREATE SEQUENCE stat.IndexUsage_Seq AS bigint START WITH 1 INCREMENT BY 1;
GO
/****** Object:  Table [stat].[IndexUsage]    Script Date: 4/20/2020 10:59:19 PM ******/
CREATE TABLE [stat].[IndexUsage](
	[server_name] [nvarchar](128) NULL,
	[database_name] [nvarchar](128) NULL,
	[database_id] [smallint] NULL,
	[object_id] [int] NOT NULL,
	[schema_name] [sysname] NOT NULL,
	[table_name] [sysname] NOT NULL,
	[index_name] [sysname] NULL,
	[index_id] [int] NOT NULL,
	[user_seeks] [bigint] NULL,
	[user_scans] [bigint] NULL,
	[user_lookups] [bigint] NULL,
	[user_updates] [bigint] NULL,
	[last_user_seek] [datetime] NULL,
	[last_user_scan] [datetime] NULL,
	[iterationid] [int] NOT NULL,
	[IndexUsageId] [bigint] NOT NULL,
	CONSTRAINT [PK_stat_IndexUsage__IndexUsageId] PRIMARY KEY CLUSTERED (
		[IndexUsageId] ASC
	)
);
GO
ALTER TABLE [stat].[IndexUsage] ADD CONSTRAINT [DF_IndexUsage_IndexUsageSeq] DEFAULT (NEXT VALUE FOR [stat].[IndexUsage_Seq]) FOR [IndexUsageId];

Next up is the query to get the information, encapsulated in a PowerShell function called Get-BmaIndexUsage (the Bma prefix keeps things separate from other functions; Bma is Ben Miller & Associates). There are parameters for the SQL instance and credential, plus the reporting instance and database to put the data in. This example will give you an idea of how easy it is to get the data and put it in the table.

Index Usage Stats

SELECT
    @@servername as [server_name],
    db_name() as [database_name],
    db_id() as [database_id],
    t.object_id,
    s.name as [schema_name],
    t.name as [table_name],
    ix.name as [index_name],
    ix.index_id,
    ius.user_seeks,
    ius.user_scans,
    ius.user_lookups,
    ius.user_updates,
    ius.last_user_seek,
    ius.last_user_scan,
    $IterationId AS [iterationid]
from sys.tables t
inner join sys.indexes ix on t.object_id = ix.object_id
inner join sys.schemas s on t.schema_id = s.schema_id
left join sys.dm_db_index_usage_stats ius on ius.object_id = ix.object_id and ius.index_id = ix.index_id 
        and ius.database_id = DB_ID()
WHERE t.is_ms_shipped = 0;

Let’s tie this together before we get into why we would even do this activity.

  • Inside DBA database
    • CREATE SEQUENCE stat.IndexUsage_Seq
    • CREATE TABLE dbo.Iterations
    • CREATE TABLE stat.IndexUsage
  • Command File with code to call PowerShell
    • GetIndexUsage.cmd
  • PowerShell file to run all the things to get the data with TSQL and put it in the table above
    • GetIndexUsage.ps1
  • Job in SQL Agent to get the stats and schedule for every hour
When all of this is put into place, the next step is to understand how to use the information that has been gathered. With the IterationId in place, the data can be sliced by iteration, or more broadly across ranges of iterations.

REM This would go in the GetIndexUsage.cmd that would be called from the Agent Job in a CmdExec step.
@powershell c:\bin\GetIndexUsage.ps1

Import-Module dbatools

function Get-BmaIndexUsage {
	param (
		[string]$ServerInstance = "localhost",
		[PSCredential]$SqlCredential,
		[int]$IterationId,
		[string]$ReportSqlInstance = "localhost",
		[string]$ReportSqlDb = "DBA"
	)
	# Scheduled Per Day
	# Run per Database
	$query = @"
SELECT
	@@servername as [server_name],
	db_name() as [database_name],
	db_id() as [database_id],
	t.object_id,
	s.name as [schema_name],
	t.name as [table_name],
	ix.name as [index_name],
	ix.index_id,
	ius.user_seeks,
	ius.user_scans,
	ius.user_lookups,
	ius.user_updates,
	ius.last_user_seek,
	ius.last_user_scan,
	$IterationId AS [iterationid]
--INTO stat.IndexUsage
from sys.tables t
inner join sys.indexes ix on t.object_id = ix.object_id
inner join sys.schemas s on t.schema_id = s.schema_id
left join sys.dm_db_index_usage_stats ius on ius.object_id = ix.object_id and ius.index_id = ix.index_id and ius.database_id = DB_ID()
WHERE t.is_ms_shipped = 0
ORDER BY [schema_name], [table_name], [index_id];
"@

	if ($SqlCredential) {
		$s = Connect-DbaInstance -SqlInstance $ServerInstance -SqlCredential $SqlCredential
	}
	else {
		$s = Connect-DbaInstance -SqlInstance $ServerInstance
	}
	$dblist = $s.Databases.Name | Where-Object { $_ -notin @("master","model","msdb","tempdb") }
	foreach ($db in $dblist) {
		try {
			if ($SqlCredential) {
				$dt = Invoke-DbaQuery -SqlInstance $ServerInstance -Database $db -Query $query -As DataTable -SqlCredential $SqlCredential
			}
			else {
				$dt = Invoke-DbaQuery -SqlInstance $ServerInstance -Database $db -Query $query -As DataTable
			}
			$dt | Write-DbaDbTableData -SqlInstance $ReportSqlInstance -Database $ReportSqlDb -Schema stat -Table IndexUsage
		}
		catch {
			Write-Host "Error occurred in $($s.Name) - $($db) - IndexUsage"
		}
	}
}

Get-BmaIndexUsage -ServerInstance "localhost" -IterationId 1

The above PowerShell code would be run from the CMD file; it connects to the server specified at the bottom of the script, in this case "localhost" (change it to the real one). Specifying an IterationId of 1 gets the data and puts it in the table with an IterationId of 1; the next run would be 2, and so forth. If you execute the following TSQL you can get a new IterationId from the dbo.Iterations table.

INSERT INTO dbo.Iterations (GatherDate)  OUTPUT inserted.IterationId VALUES (GetDate())
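In the script, that insert and the function call can be glued together. Here is a sketch of how I might do it with dbatools (the variable names here are my own, not from the original script):

```powershell
Import-Module dbatools

# Sketch: create a new iteration row, capture the generated IterationId
# from the OUTPUT clause, then pass it to the gathering function above.
$iteration = Invoke-DbaQuery -SqlInstance "localhost" -Database "DBA" -Query `
    "INSERT INTO dbo.Iterations (GatherDate) OUTPUT inserted.IterationId VALUES (GETDATE());"

Get-BmaIndexUsage -ServerInstance "localhost" -IterationId $iteration.IterationId
```

This way every scheduled run stamps its rows with a fresh IterationId automatically.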

A simple query against the sys.dm_db_index_usage_stats DMV will show what the statistics look like for an index. Key elements:

  • DB_ID() in the WHERE clause to look only at the database you are in. This DMV covers the entire instance, so narrow it down with database_id = DB_ID()
  • The object_id of the table; you can even use OBJECT_ID('tablename')
  • Optionally you can narrow it down by index_id as well
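Putting those key elements together, such a query might look like this (a sketch; the table name dbo.MyTable is a placeholder, while the DMV and its columns are real):

```sql
-- Point-in-time look at index usage for one table in the current database.
SELECT ix.index_id,
       ix.name AS index_name,
       ius.user_seeks,
       ius.user_scans,
       ius.user_lookups,
       ius.user_updates,
       ius.last_user_seek,
       ius.last_user_scan
FROM sys.indexes ix
LEFT JOIN sys.dm_db_index_usage_stats ius
       ON ius.object_id  = ix.object_id
      AND ius.index_id   = ix.index_id
      AND ius.database_id = DB_ID()           -- only the database you are in
WHERE ix.object_id = OBJECT_ID('dbo.MyTable'); -- object_id of the table
-- optionally: AND ix.index_id = 1             -- narrow to one index
```

A NULL row from the DMV side means the index has not been touched since the last restart.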

Suppose at 12:00 AM my indexes show 2 seeks, 1 scan, and 4 updates, as shown in the graphic; at 1:00 AM there are 10 seeks, 1 scan, and 40 updates; at 2:00 AM there are 10 seeks, 1 scan, and 100 updates; and the following hour at 3:00 AM there are 20 seeks, 4 scans, and 100 updates total. At 8:00 AM you get on the scene and find the final stats to be 100 seeks, 8 scans, and 200 updates.
Let's digest this: we did not watch every hour, and when we did get a chance to look at 8:00 AM, we saw only the final stats of 100 seeks, 8 scans, and 200 updates. How do we know what happened in between? Because we captured these statistics every hour by IterationId and datetime, I can know how many seeks, scans, and updates took place between 4:00 AM and 5:00 AM. Most monitoring software packages do not capture these stats at all, let alone capture them by hour, to give you the insights you need to understand how your index is being used — and even, potentially, when you may want to maintain it.


Here is an example of a scenario you can consider. I created 14 indexes on a table and I want to know a few things:

  1. Are they being used at all?
  2. Are they seeked or scanned?
  3. How often are they seeked or scanned?

SELECT
    I2.index_id,
    I2.index_name,
    ISNULL(I1.user_seeks,0) as user_seeks_1,
    ISNULL(I2.user_seeks,0) as user_seeks_2,
    ISNULL(I2.user_seeks,0) - ISNULL(I1.user_seeks,0) as user_seeks_diff,
    ISNULL(I1.user_scans,0) as user_scans_1,
    ISNULL(I2.user_scans,0) as user_scans_2,
    ISNULL(I2.user_scans,0) - ISNULL(I1.user_scans,0) as user_scans_diff,
    ISNULL(I1.user_lookups,0) as user_lookups_1,
    ISNULL(I2.user_lookups,0) as user_lookups_2,
    ISNULL(I2.user_lookups,0) - ISNULL(I1.user_lookups,0) as user_lookups_diff,
    ISNULL(I1.user_updates,0) as user_updates_1,
    ISNULL(I2.user_updates,0) as user_updates_2,
    ISNULL(I2.user_updates,0) - ISNULL(I1.user_updates,0) as user_updates_diff
FROM stat.IndexUsage I1
INNER JOIN stat.IndexUsage I2 ON I1.server_name = I2.server_name
        and I1.database_name = I2.database_name
        and I1.object_id = I2.object_id
        and I1.index_id = I2.index_id
WHERE
    I1.IterationId = 4
    AND I2.IterationId = 5
    AND I1.table_name = 'MyTable'
    AND I2.table_name = 'MyTable';

The output from the query is below. It gives you insight into the indexes on a table and how they have behaved over the last hour, since the iterations are sequential. If your interval is longer than 1 hour, each row still represents 1 interval of time, so you can see how the index behaved per interval.

Index Usage trend difference

As you can see, the data comes out with the _1 and _2 values and a diff computed by the query. If it is 1 hour of data, then you know the diff for index 6 is 101 seeks in the last hour. Over time you can trend this data and find correlations with queries, or you may see lots of 0's like above, and that will help you know which indexes are used — and perhaps which ones need maintenance more, if the updates are high.
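The self-join above compares two specific iterations. As a variation (my sketch, not from the original post; it requires SQL Server 2012 or later for LAG), a windowed query computes the delta for every consecutive pair of iterations at once, using the tables created earlier:

```sql
-- Hedged sketch: per-iteration deltas for every index on MyTable using LAG,
-- so you do not have to pick two IterationIds by hand.
SELECT
    iu.table_name,
    iu.index_name,
    iu.iterationid,
    it.GatherDate,
    ISNULL(iu.user_seeks, 0)
      - LAG(ISNULL(iu.user_seeks, 0)) OVER (PARTITION BY iu.server_name,
            iu.database_name, iu.object_id, iu.index_id
            ORDER BY iu.iterationid) AS user_seeks_diff,
    ISNULL(iu.user_scans, 0)
      - LAG(ISNULL(iu.user_scans, 0)) OVER (PARTITION BY iu.server_name,
            iu.database_name, iu.object_id, iu.index_id
            ORDER BY iu.iterationid) AS user_scans_diff,
    ISNULL(iu.user_updates, 0)
      - LAG(ISNULL(iu.user_updates, 0)) OVER (PARTITION BY iu.server_name,
            iu.database_name, iu.object_id, iu.index_id
            ORDER BY iu.iterationid) AS user_updates_diff
FROM stat.IndexUsage iu
INNER JOIN dbo.Iterations it ON it.IterationId = iu.iterationid
WHERE iu.table_name = 'MyTable'
ORDER BY iu.index_id, iu.iterationid;
```

Each row then shows the activity in the interval ending at that iteration's GatherDate; the first row per index is NULL because there is no prior iteration to diff against.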

Hopefully this was a good start to understanding how you can get some insights into your database with a little automation. Stay safe and Happy.

Read More

Microsoft has been confirmed as the Premium Sponsor for PASS Virtual Summit

The formal announcement is here (Announcement).

What does this mean to you? It means you will have 1-on-1 time with Microsoft engineers, plus highly sought-after content from Microsoft at the Summit.

If you have not had a chance to work through your or your company's issues with the engineers at Microsoft, you are missing out and should consider registering for the Summit this year in November. With registration you get this opportunity, as well as sessions from MVPs, MCMs, and highly sought-after speakers and knowledge experts from the SQL community.

I am excited for this year's Summit, and as you have heard out on the web, this is your opportunity to contribute to the betterment of PASS from a budget and community perspective. I will be there, and I hope to see you there. Have a great day!

Read More

Using Clipmate for great lookback ability #tsql2sday

#TSQL2sday is today. The theme is Tips and Tricks NOT related to SQL Server or RDBMS systems.

My tip today is about a utility I have been using for a long time thanks to Lars Rasmussen (b | t), a long-time, very good friend. The utility is called Clipmate, and it is a clipboard manager. It does cost money after a trial, but it is so useful that I could easily justify it.

Here is how it works. Whenever you copy something (right-click Copy, or Ctrl+C), it ends up in the clipboard, and Clipmate keeps track of it. Imagine going through your day, whether it is programming, or TSQLing, or just being on the web and marking everything as favorites in your browser. Imagine being able to just copy the URLs and move on, and when you get done, going into the Clipmate interface, grabbing them from the list, and dragging them into a folder in Clipmate. This utility has its own store of where it puts things, and you can create your own folder structure in there too. You then have a utility whose store you can back up and restore on another machine, or do other creative things with.

The benefit is that everything you copy goes into this utility and is retrievable, so if you copied some code or something from a website and didn't paste it then, you still have access to it in your Clipmate store.

You will see an Inbox; everything lands in the Inbox, and you can either let clips pile up or create a folder structure on the left, inside the Inbox or outside it. The explorer indicates where each clip came from and shows the first line of the clip so that you can recognize it. The trick is that if you click on one of the items in the explorer, it becomes the current clipboard content, so you can go into an application and Paste, and that is what will be pasted. You will also see a Graphic item in the list: I copied an image (like a Print Screen) into the clipboard, and I can click on that and then go paste it into Paint.

Hopefully this information can help you get more out of your clipboard. Lars has much better discipline than I do; he has a wealth of scripts saved into folders that do X or Y, and he uses them wherever he can. They are literally saved for posterity. He has also shared pieces of them with me in the past, which is super cool and helpful when you are trying to help others with things you have accumulated.

Enjoy your #TSQL2SDAY.



Read More

PASS Summit has gone Virtual in 2020

SQL Server

Hey all, this is a historic moment in the SQL community: PASS Summit will be virtual this year. As a PASS Blogger, I wanted you to know the news. All the information is in the link, and the registration, if you just want to go there now, is here (Register for PASS 2020 Now).

If you have attended the Summit in previous years, you will know that networking and meeting people face to face is a great part of the Summit. This year that will be a challenge, but the committees for the Summit are aware of it. Don't let that stop you; we are going to be a community in this one and keep interaction at the heart of our minds. The education will be more accessible this year, so don't miss the opportunity to pay less (until June 19th, when prices go up) and get access to all the great speakers and content in this year's program.

I have only missed one PASS Summit and look forward to seeing you (virtually) there. Don’t miss the great pricing, register before June 19th and we will see you there.


Read More

PowerShell at Michiana PASS User Group Meeting

Issues during Presentation

I had an issue during the demo in this presentation on Gathering Data for Trending using PowerShell, and I indicated that I would blog the solution. I found it. First, the problem.

Problem Code in v3.0

$dbobj = New-Object [PSObject] -Property ([ordered]@{ 
           ID = $db.ID 
           Name = $db.Name 
           Size = $db.Size 
           DataSpaceUsage = $db.DataSpaceUsage 
           IndexSpaceUsage = $db.IndexSpaceUsage 
           LastBackupDate = $db.LastBackupDate 
           RecoveryModel = $db.RecoveryModel 
           PageVerify = $db.PageVerify 
           CompatibilityLevel = $db.CompatibilityLevel 
           IsReadCommittedSnapshotOn = $db.IsReadCommittedSnapshotOn 
           IterationId = 2 
} )


In PowerShell 3.0 and above the syntax changed, and in the latest versions the code above stopped working, so it has to be written differently. Now the solution.

Code that works in v5.1

$dbobj = [PSCustomObject][ordered]@{
         ID = $db.ID
         Name = $db.Name
         Size = $db.Size
         DataSpaceUsage = $db.DataSpaceUsage
         IndexSpaceUsage = $db.IndexSpaceUsage
         LastBackupDate = $db.LastBackupDate
         RecoveryModel = $db.RecoveryModel
         PageVerify = $db.PageVerify
         CompatibilityLevel = $db.CompatibilityLevel
         IsReadCommittedSnapshotOn = $db.IsReadCommittedSnapshotOn
         IterationId = 2
}


From here things just worked in the PowerShell window. Azure Data Studio still seemed to have a problem; it threw an error once after restarting the shell, but then it executed fine.

The presentation files are here on my Presentations on Github.

Continue the quest to become a PowerShell DBA.

Read More

PowerShell at #SQLSatOregon with @DBAduck

Mastery Session – Leveling Up with PowerShell for the DBA

I know that most of you have used or heard about PowerShell, but I am sure there are still things you could learn about using PowerShell in your DBA career. This mastery session is meant to get you closer to mastery of PowerShell for the DBA. I will cover 3 modules: the DBAtools module, the SQLServer module, and the ImportExcel module. In these modules you have a wealth of commands and tools that will help you get more out of your day, using fewer clicks. The trick is not just knowing that these things exist, but understanding the possibilities and patterns that exist in each module. You will learn about the commonalities in the DBAtools module that make it very simple to get the most out of it. You will learn about some hidden gems in the SQLServer module and why you should care to learn everything you can about it. The ImportExcel module does not have a lot of commands, but the commands available are amazing and will let you do things very simply without requiring Excel. There will also be a mix of deployment options using PowerShell; SSIS project deployment will be covered as well. Join me at SQL Saturday Oregon and learn what your next steps are with PowerShell. You will absolutely love it.

Tips and Tricks for the PowerShell DBA

With environments growing and hiring lessening, you have to get some tools in your tool belt to stay ahead of the game. These tips will cover both PowerShell and SQLServer PowerShell tricks. Some of the tips are directly related to performance, and others are about using features of SMO/SQLServer to get things done faster and more efficiently. Join me for a great session on getting things done.

Register Now for SQL Saturday Oregon

Read More

PowerShell for the DBA at SQLIntersection with @DBAduck

If you are wondering why you would want to use PowerShell as a DBA, read this first – Why Become a PowerShell DBA. Then go to SQLIntersection Registration to register and I will see you there. And if you register, please use the code MILLERB and you will get $50 off registration.

Click the links below to see the list of sessions, workshops and speakers


Leveling Up With PowerShell for the DBA

This workshop is a great one for anyone that wants to Level Up their PowerShell skills relating to DBA tasks. We are going to cover multiple things in this workshop. The tools we will be going over are: DBAtools module, SQLServer module, ImportExcel module. Each one of these has great power and can take a lot of clicking away. They also have the ability to allow you to export data to Excel or CSV and store for other purposes, or even take data from a query and put it directly in a table without having to write SQL or SSIS packages to put or import the data another way. Join me for a fun filled day that will have you coming back for more. It is not required to know PowerShell inside and out to attend. You will learn something new no matter your level of PowerShell expertise or years working with SQL and PowerShell. Join me in Vegas at the MGM for SQL Intersection, you won’t regret it.

An In-Depth Look at the SQL Server Provider in PowerShell

You probably already know that there is a SqlServer module on the PowerShell Gallery. This module contains many cmdlets that will help you with your DBA work. But something you may not know about is the SqlServer Provider that is in there too. This provider will be the subject of this presentation. Have you ever opened file explorer and started browsing through your filesystem? How about if you could have Object Explorer in SSMS as a navigable structure in PowerShell. Well you guessed it, it is. This is the SqlServer Provider. I will take you through the provider and show you how to navigate it, but that is not all. By the end of the session you will understand how you can use it in your PowerShell life to get more out of SqlServer by using Paths. Yes, you heard right, a path like structure that you can navigate and use to make your life simpler in retrieving objects and manipulating them in PowerShell. Join me for another chapter in the quest to become a PowerShell DBA. You won’t want to miss this one.

Synopsis: The SqlServer module is a powerful tool in PowerShell. It continues to receive updates and new cmdlets, which is fantastic. We will talk about these functions that allow you to do more with commands instead of TSQL. But the real value of this session is all about the SQLServer PowerShell Provider. This provider allows you to navigate SQL Server like a file system and that is not all, we will talk (and mostly show) what you can do with it and how you can leverage this provider to do almost anything that you would like to with objects and commands rather than clicking your way through. Join me for an in depth look at the provider and you will be converted to this tech, I am sure of it.

Getting Started with the DBAtools PowerShell Module

There has been a movement toward PowerShell in the SQL Server community lately. With so much out there, how do you get ahead? I will take you through this massive module that has grown very quickly to over 300 functions for use in your daily DBA job. This module can single-handedly migrate an entire server with one command, or you can choose the parts you want to do and just do those. Join me to find out how you can quickly leverage the power of this module. This is one tool you want in your PowerShell toolbox.

Synopsis: DBAtools has been around for 5 years and has finally reached release status. It is now in version 1.x which means it has stabilized the command names and parameter names to be consistent through all the commands. This remains very important because a lot of commands can be used to automate a lot of things. This gives you the ability to use splatting and parameter objects to allow you to reuse and get the most out of the module without typing a whole lot. Join me for a great look into some of the productivity tools in this module and you will be hungry for more.

Temporal Tables In-Depth

Have you ever wanted to know what the data looked like yesterday before a change was made? Have you cringed at the thought of creating triggers to get data changes logged for later analysis? Looking at a new feature in SQL 2016 called Temporal Tables gives you the ability to create a table that automatically keeps track of your data in time. Magic was never meant to be part of IT, but SQL Server has done a great job with this new feature. We will cover how to set it up, how it works, querying the tables including the history table and last but not least, how you can incorporate this feature into existing tables. Join me in a time travelling adventure to find out how you can leverage Temporal Tables in your world.

Synopsis: Temporal tables are not new, but there are still many who have not used them yet. There are many things you can use them for; recently we came across an UPDATE statement gone awry, and had we had a temporal table in place, it would have been much simpler to just put the data back rather than restoring databases to do so. I will take you through the scenarios and show you how to put them in place: on a new table, on an existing table, and on a table where you want to customize the temporal table's indexes to give you more ability to query the data and see it at a given point in time. We will cover In-Memory OLTP and new features in 2017 and 2019. Join me for a fun-filled hour that may just make your head hurt.

Read More

Connecting iSCSI LUNs with PowerShell

When clustering SQL Server on Windows Server Core, I needed to configure the iSCSI LUNs on the server without a UI. I know there is a UI component called iscsicpl.exe that you can run, and it will pop up the simple Windows UI to let you click through all the configuration, but I wanted to go purist on this one and learn the nuances of configuring it via PowerShell the whole way through.

First you would have a SAN LUN configured for iSCSI, which would have an IQN (iSCSI Qualified Name). This is a unique identifier for the iSCSI resource, and it is what you would connect to for the drive. To make this connection you would use the iSCSI module, which has the commands you will see. Below is an example of the connection.

# On Client Computer
$TargetPortal1 = ""
$TargetPortal2 = ""
$ISCSIIP1 = ""
$ISCSIIP2 = ""

New-IscsiTargetPortal -TargetPortalAddress $TargetPortal1 -InitiatorPortalAddress $ISCSIIP1 
New-IscsiTargetPortal -TargetPortalAddress $TargetPortal2 -InitiatorPortalAddress $ISCSIIP2

Figure 1.1 – The same as New Discovery Portal in iscsicpl


Start-Sleep -seconds 2

Connect-IscsiTarget -NodeAddress "" `
	-IsPersistent $true -IsMultipathEnabled $true -TargetPortalAddress $TargetPortal1 `
	-InitiatorPortalAddress $ISCSIIP1

Connect-IscsiTarget -NodeAddress "" `
	-IsPersistent $true -IsMultipathEnabled $true -TargetPortalAddress $TargetPortal2 `
	-InitiatorPortalAddress $ISCSIIP2


Figure 1.2 – Connecting with MultiPath IO using 2 different Targets and Initiators

Now you have a drive attached, at least in iSCSI. This drive will be a raw drive, and running Get-Disk will show it. This cmdlet is in the Storage module in PowerShell.


Figure 1.3 – Get-Disk in action. Showing the newly attached drive
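If the server already has several disks, a filter helps pick out the new LUN. A small sketch of my own (not from the original post):

```powershell
# List only uninitialized (RAW) disks - the newly attached iSCSI LUN
# will show up here until it is initialized.
Get-Disk |
    Where-Object PartitionStyle -eq 'RAW' |
    Select-Object Number, FriendlyName, Size, OperationalStatus
```

Note the disk Number it reports; the initialize/partition commands below use that number.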

The next task is to bring the new drive online and configure it so that you can use it. You need to initialize the disk, then create a partition of type GPT, and finally format the volume with 64KB allocation units for use with SQL Server.

Initialize-Disk -Number 1 -PartitionStyle GPT # Data

New-Partition -DiskNumber 1 -UseMaximumSize -DriveLetter E 

Figure 1.4 – Initialize-Disk and New-Partition to make the drive accessible

Get-Volume -DriveLetter E | 
         Format-Volume -FileSystemLabel DATA -AllocationUnitSize (64KB) -FileSystem NTFS `
                -ShortFileNameSupport $false

Figure 1.5 – Get-Volume and Format-Volume to finish setting up the drive

This concludes this part of creating the drive from the iSCSI side of the house. The entire piece of code is available here.

Happy PowerShelling. Onward in the quest to become a SQL PowerShell DBA.

Read More