UpdateHadronTruncationLsn messages in SQL 2019 Errorlog

If you have SQL Server 2019 installed and are using Availability Groups, you may encounter messages like the following in your SQL Server Errorlog.

UpdateHadronTruncationLsn(6) (force=0): Primary: 0:0:0 Secondary: 892:4720:2 Partial Quorum : 0:0:0 Aggregation: 892:4720:2 Persistent: 892:382:4

According to Microsoft (unofficially, as I did not hear this firsthand), CU6 for SQL Server 2019 will include a fix that suppresses these messages by default; if you enable Trace Flag 3605, the logging will be turned back on, as below.

DBCC TRACEON(3605, -1)
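If you want to confirm whether the flag is currently enabled on your instance, DBCC TRACESTATUS (standard syntax, shown here for the global scope) will tell you:

DBCC TRACESTATUS(3605, -1)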

If this changes, I will come and update this post.

Thanks for listening. Have a great day and stay safe and happy!

Read More

Insights from your Database using PowerShell and TSQL

Hello All.

I have been looking for ways for quite some time to understand my databases better. There are a lot of DMVs, and even some DMFs, that surface information from the SQL Server engine that can be very insightful. Many shops use monitoring tools, and they are great because they watch things all the time. I used to rely solely on these tools for information about my SQL Server environment. That has shifted since I have been managing a SaaS platform built on top of SQL Server. With hundreds of TB of data and many databases that are identical in structure, it becomes pretty daunting to know which databases to care about and how much to care.

This article covers the first steps you can take toward a better maintenance strategy and an understanding of how your indexes are used.

Let’s begin.

Problem

You have many tables in your databases and you want to know how they are used. There are DMVs, such as sys.dm_db_index_usage_stats, that report index usage, and querying them is insightful, but how do the stats change over time? These stats are reset when the instance is restarted, and it is good to know that you have 2000 seeks and 500 scans of an index, but when did they happen? Was it on a common day? A common hour?

Solution

First the elements of the solution should be defined:

  • PowerShell, at least 4.0 (preferably 5.1)
  • SQL Server, at least 2008 R2 (it may work on earlier versions, but this is the earliest I have run it against)
  • A way to run a job/task
  • dbatools.io (get the dbatools PowerShell module)
  • The content listed below, which will also be in my Github Repository

I say PowerShell because this is what I love to use for automation tasks. I will illustrate how to do this with SQL Server Agent, but it will still work with Task Scheduler. Let’s define the solution so that we can put the pieces together for use in your own environment. This will give you a good framework to gather any type of information, whether it be with TSQL, which this solution is using, or straight PowerShell and SMO or other items (like WMI).
First, we need to create a table to hold what I call Iterations to put lines in the sand and allow you to use timeframes in your statistics gathering. We will also need a SCHEMA called “stat” for holding the Index Usage data.

CREATE SCHEMA stat AUTHORIZATION dbo

Now the table…

/****** Object: Table [dbo].[Iterations] Script Date: 2/15/2020 3:40:49 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Iterations] (
    [IterationId] [int] IDENTITY(1,1) NOT NULL,
    [GatherDate] [datetime2](3) NOT NULL,
    CONSTRAINT [PK_Iterations_ID] PRIMARY KEY CLUSTERED (
        [IterationId] ASC
    )  WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 100) ON [PRIMARY]) 
ON [PRIMARY]
GO
ALTER TABLE [dbo].[Iterations] ADD CONSTRAINT [DF_Iterations_GatherDate] DEFAULT (getdate()) FOR [GatherDate]
GO

Notice that we have an IDENTITY column, which could just as easily be a SEQUENCE feeding a DEFAULT on the IterationId column. Also notice that the only other column is GatherDate, a datetime2(3) column. This gives us an INT (you could use a BIGINT) column for related tables to join on instead of joining on dates.
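For illustration, here is a sketch of that SEQUENCE alternative; the sequence and constraint names are mine, not part of the solution above.

-- Alternative: a SEQUENCE feeding a DEFAULT instead of IDENTITY (hypothetical names)
CREATE SEQUENCE dbo.Iterations_Seq AS int START WITH 1 INCREMENT BY 1;
GO
CREATE TABLE dbo.Iterations (
    IterationId int NOT NULL
        CONSTRAINT DF_Iterations_IterationId DEFAULT (NEXT VALUE FOR dbo.Iterations_Seq),
    GatherDate datetime2(3) NOT NULL
        CONSTRAINT DF_Iterations_GatherDate DEFAULT (GETDATE()),
    CONSTRAINT PK_Iterations_ID PRIMARY KEY CLUSTERED (IterationId ASC)
);
GO

This is the same pattern the stat.IndexUsage table uses below.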
Next up is the table to hold the statistics.

CREATE SEQUENCE stat.IndexUsage_Seq as bigint
START WITH 1 INCREMENT BY 1
GO
/****** Object:  Table [stat].[IndexUsage]    Script Date: 4/20/2020 10:59:19 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [stat].[IndexUsage](
	[server_name] [nvarchar](128) NULL,
	[database_name] [nvarchar](128) NULL,
	[database_id] [smallint] NULL,
	[object_id] [int] NOT NULL,
	[schema_name] [sysname] NOT NULL,
	[table_name] [sysname] NOT NULL,
	[index_name] [sysname] NULL,
	[index_id] [int] NOT NULL,
	[user_seeks] [bigint] NULL,
	[user_scans] [bigint] NULL,
	[user_lookups] [bigint] NULL,
	[user_updates] [bigint] NULL,
	[last_user_seek] [datetime] NULL,
	[last_user_scan] [datetime] NULL,
	[iterationid] [int] NOT NULL,
	[IndexUsageId] [bigint] NOT NULL,
        CONSTRAINT [PK_stat_IndexUsage__IndexUsageId] PRIMARY KEY CLUSTERED 
        (
	    [IndexUsageId] ASC
        ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [stat].[IndexUsage] ADD  CONSTRAINT [DF_IndexUsage_IndexUsageSeq]  DEFAULT (NEXT VALUE FOR [stat].[IndexUsage_Seq]) FOR [IndexUsageId]
GO

Next up is the query to get the information, encapsulated in a PowerShell function called Get-BmaIndexUsage (the Bma prefix keeps things separate from other functions; Bma is Ben Miller & Associates). There are parameters for the source SQL instance and credential, and for the reporting instance and database where the data will land. Note that $IterationId inside the query is a PowerShell variable that gets expanded in the double-quoted here-string. This example will give you an idea of how easy it is to get the data and put it in the table.

Index Usage Stats

SELECT
    @@servername as [server_name],
    db_name() as [database_name],
    db_id() as [database_id],
    t.object_id,
    s.name as [schema_name],
    t.name as [table_name],
    ix.name as [index_name],
    ix.index_id,
    ius.user_seeks,
    ius.user_scans,
    ius.user_lookups,
    ius.user_updates,
    ius.last_user_seek,
    ius.last_user_scan,
    $IterationId AS [iterationid]
from sys.tables t
inner join sys.indexes ix on t.object_id = ix.object_id
inner join sys.schemas s on t.schema_id = s.schema_id
left join sys.dm_db_index_usage_stats ius on ius.object_id = ix.object_id and ius.index_id = ix.index_id 
        and ius.database_id = DB_ID()
WHERE t.is_ms_shipped = 0;

Let’s tie this together before we get into why we would even do this activity.

  • CREATE DATABASE DBA
  • Inside DBA database
    • CREATE SCHEMA stat AUTHORIZATION dbo
    • CREATE SEQUENCE stat.IndexUsage_Seq
    • CREATE TABLE dbo.Iterations
    • CREATE TABLE stat.IndexUsage
  • Command File with code to call PowerShell
    • GetIndexUsage.cmd
  • PowerShell file to run all the things to get the data with TSQL and put it in the table above
    • GetIndexUsage.ps1
  • Job in SQL Agent to get the stats and schedule for every hour
When all of this is put into place, the next step is to understand how to use the information that has been gathered. With the IterationId in place, the data can be sliced by iteration, or by time of day using the GatherDate in dbo.Iterations.
REM This goes in GetIndexUsage.cmd, which the Agent Job calls in a CmdExec step.
@powershell.exe -File c:\bin\GetIndexUsage.ps1

GetIndexUsage.ps1 contains everything below: the module import, the function, and the call at the bottom.
Import-Module dbatools

function Get-BmaIndexUsage
{
	param (
		[string]$ServerInstance,
		$SqlCredential,
		[int]$IterationId,
		[string]$ReportSqlInstance = "localhost",
		[string]$ReportSqlDb = "DBA"
	)
	
	# Scheduled Per Day
	# Run per Database
	
	$query = @"
select 
	@@servername as [server_name],
	db_name() as [database_name],
	db_id() as [database_id],
	t.object_id,
 	s.name as [schema_name],
	t.name as [table_name],
	ix.name as [index_name],
	ix.index_id,
	ius.user_seeks,
	ius.user_scans,
	ius.user_lookups,
	ius.user_updates,
	ius.last_user_seek,
	ius.last_user_scan,
	$IterationId AS [iterationid]
--INTO stat.IndexUsage
from sys.tables t 
inner join sys.indexes ix on t.object_id = ix.object_id
inner join sys.schemas s on t.schema_id = s.schema_id
left join sys.dm_db_index_usage_stats ius on ius.object_id = ix.object_id and ius.index_id = ix.index_id and ius.database_id = DB_ID()
WHERE t.is_ms_shipped = 0
ORDER BY [schema_name], [table_name], [index_id];

"@
	if($SqlCredential) {
		$s = Connect-DbaInstance -SqlInstance $ServerInstance -SqlCredential $SqlCredential
	}
	else {
		$s = Connect-DbaInstance -SqlInstance $ServerInstance 
	}	
	$dblist = $s.Databases.Name | Where-Object { $_ -notin @("master","model","msdb","tempdb") } # skip system databases
	
	foreach ($db in $dblist)
	{
		try
		{
			if($SqlCredential) {
				$dt = Invoke-DbaQuery -SqlInstance $ServerInstance -Database $db -Query $query -As DataTable -SqlCredential $SqlCredential
			}
			else {
				$dt = Invoke-DbaQuery -SqlInstance $ServerInstance -Database $db -Query $query -As DataTable
			}
			$dt | Write-DbaDbTableData -SqlInstance $ReportSqlInstance -Database $ReportSqlDb -Schema stat -Table IndexUsage 
			
		}
		catch
		{
			Write-Host "Error occurred in $($s.Name) - $($db) - IndexUsage: $($_.Exception.Message)"
		}
		
	}
}

Get-BmaIndexUsage -ServerInstance "localhost" -IterationId 1 

The PowerShell above is run from the CMD file. It connects to the server specified at the bottom of the script, in this case "localhost" (change it to the real one). Passing -IterationId 1 stamps the gathered rows with an IterationId of 1; the next run would be 2, and so forth. Executing the following TSQL inserts a row into dbo.Iterations and returns the IterationId to use.

INSERT INTO dbo.Iterations (GatherDate)  OUTPUT inserted.IterationId VALUES (GetDate())
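To wire the two together, here is a minimal sketch (assuming the DBA reporting database built above) that inserts the iteration row and passes the returned id into the function:

# Insert an iteration row and capture the new IterationId
$iteration = Invoke-DbaQuery -SqlInstance "localhost" -Database "DBA" -Query "INSERT INTO dbo.Iterations (GatherDate) OUTPUT inserted.IterationId VALUES (GETDATE());"
# Stamp this run's gathered rows with that id
Get-BmaIndexUsage -ServerInstance "localhost" -IterationId $iteration.IterationId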

Below is what a simple query against the sys.dm_db_index_usage_stats DMV looks like for an index (a sketch follows the list). Key elements:

  • DB_ID() in the WHERE clause to look only at the database you are in. This DMV covers the entire instance, so narrow it down with database_id = DB_ID()
  • object_id of the table; you can even use OBJECT_ID('tablename')
  • Optionally, you can narrow it down with the index_id as well
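Here is a minimal sketch putting those elements together; the table name is hypothetical.

SELECT ius.index_id, ius.user_seeks, ius.user_scans, ius.user_lookups, ius.user_updates
FROM sys.dm_db_index_usage_stats ius
WHERE ius.database_id = DB_ID()                  -- only the current database
  AND ius.object_id = OBJECT_ID('dbo.MyTable')   -- hypothetical table name
  AND ius.index_id = 1;                          -- optional: a single index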

Suppose at 12:00 AM an index shows 2 seeks, 1 scan, and 4 updates. At 1:00 AM it shows 10 seeks, 1 scan, and 40 updates; at 2:00 AM, 10 seeks, 1 scan, and 100 updates; and at 3:00 AM, 20 seeks, 4 scans, and 100 updates total. At 8:00 AM you get on the scene and find the final stats to be 100 seeks, 8 scans, and 200 updates.
Let's digest this: we did not watch every hour, and when we finally looked at 8:00 AM, all we saw were the final stats of 100 seeks, 8 scans, and 200 updates. How do we know what happened in between? Because these statistics are captured every hour by IterationId and datetime, I can answer questions like how many seeks, scans, and updates took place between 4:00 AM and 5:00 AM. Most monitoring packages do not capture these stats at all, let alone by the hour, so this gives you insights you need to understand how your index is being used, and even when you may want to maintain it.

Scenario

Here is an example of a scenario that you can consider. I created 14 indexes on a table and I want to know a few things:

  1. Are they being used at all?
  2. Are they seeked or scanned?
  3. How often are they being seeked or scanned?

The following query compares two adjacent iterations to answer these questions:
SELECT
    I1.server_name,
    I1.database_name,
    I1.schema_name,
    I1.table_name,
    I1.index_name,
    I1.index_id,
    ISNULL(I1.user_seeks,0) as user_seeks_1,
    ISNULL(I2.user_seeks,0) as user_seeks_2,
    ISNULL(I2.user_seeks,0) - ISNULL(I1.user_seeks,0) as user_seeks_diff,
    ISNULL(I1.user_scans,0) as user_scans_1,
    ISNULL(I2.user_scans,0) as user_scans_2,
    ISNULL(I2.user_scans,0) - ISNULL(I1.user_scans,0) as user_scans_diff,
    ISNULL(I1.user_lookups,0) as user_lookups_1,
    ISNULL(I2.user_lookups,0) as user_lookups_2,
    ISNULL(I2.user_lookups,0) - ISNULL(I1.user_lookups,0) as user_lookups_diff,
    ISNULL(I1.user_updates,0) as user_updates_1,
    ISNULL(I2.user_updates,0) as user_updates_2,
    ISNULL(I2.user_updates,0) - ISNULL(I1.user_updates,0) as user_updates_diff
FROM stat.IndexUsage I1
INNER JOIN stat.IndexUsage I2 ON I1.server_name = I2.server_name 
        and I1.database_name = I2.database_name
        and I1.object_id = I2.object_id
        and I1.index_id = I2.index_id
WHERE 
    I1.IterationId = 4 
    AND I2.IterationId = 5
    AND I1.table_name = 'MyTable'
    AND I2.table_name = 'MyTable'

The output from the query is below. It gives you insight into the indexes on a table and how they behaved over the last hour, since the iterations are sequential. If your gathering interval is longer than an hour, the diff simply represents one interval of time.

Index Usage trend difference

As you can see, the query outputs the _1 and _2 values along with a diff for each counter. If the interval is one hour, you know that the diff for index 6 means 101 seeks in the last hour. Over time you can trend this data and correlate it with specific queries; or, if you see lots of 0's like above, you will know which indexes are actually used. The ones with high updates and little read activity may be the ones needing maintenance, or reconsideration, the most.

Hopefully this was a good start to understanding how you can get some insights into your database with a little automation. Stay safe and happy.

Read More

Microsoft has been confirmed as the Premium Sponsor for PASS Virtual Summit

The formal announcement is here (Announcement).

What does this mean to you? It means you will have one-on-one time with Microsoft engineers, plus highly sought-after content from Microsoft at the Summit.

If you have not had a chance to bring your or your company's issues to the engineers at Microsoft, you are missing out and should consider registering for the Summit this year in November. With registration you get this opportunity, as well as sessions from MVPs and MCMs and highly sought-after content from the SQL community of speakers and knowledge experts.

I am excited for this year's Summit, and as you have heard out on the web, this is your opportunity to contribute to the betterment of PASS from a budget and community perspective. I will be there, and I hope to see you there. Have a great day!

Read More

Using Clipmate for great lookback ability #tsql2sday

#TSQL2sday is today. The theme is Tips and Tricks NOT related to SQL Server or RDBMS systems.

My tip today is about a utility I have been using for a long time thanks to Lars Rasmussen (b | t), a longtime very good friend. The utility is called Clipmate, and it is a clipboard manager. It does cost money after the trial, but it is so useful that I could easily justify the price.

Here is how it works. Whenever you copy something (right-click Copy, or Ctrl+C) it ends up in the clipboard, and Clipmate keeps track of it. Imagine going through your day, whether it is programming, or TSQLing, or just being on the web and marking everything as favorites in your browser. Imagine being able to just copy the URLs and move on, and after you get done, going into the Clipmate interface, grabbing them from the list, and dragging them into a folder in Clipmate. This utility has its own store of where it puts things, and you can create your own folder structure in there too; then you have a utility whose store you can back up and restore on another machine, or do other creative things with.

The benefit is that everything you copy goes into this utility and is retrievable, so if you had copied some code or something from a website and didn't paste it at the time, you still have access to it in your Clipmate store.

You will see an Inbox; everything lands in the Inbox, and you can either let clips pile up or create a folder structure on the left, inside or outside the Inbox. The explorer indicates where each clip came from, along with the first line of the clip, so that you can recognize it. The trick here is that if you click on one of the items in the explorer, it becomes the current clipboard contents, so you can go into an application and paste, and that is the thing that gets pasted. You will also see a graphic item in the list: I copied an image (such as a Print Screen) into the clipboard, and I can click on that and then go paste it into Paint.

Hopefully this information helps you get more out of your clipboard. Lars has much better discipline than I do; he has a wealth of scripts saved into folders that do X or Y, and he uses them wherever he can. They are literally saved for posterity. He has also shared pieces of them with me in the past, which is super cool and helpful when you are trying to help others with things you have accumulated.

Enjoy your #TSQL2SDAY!

 

 

Read More

PASS Summit has gone Virtual in 2020

Hey all, this is a historic moment in the SQL community: PASS Summit will be virtual this year. As a PASS blogger I wanted you to know the news. All the information is in the link, and if you just want to go to registration now, it is here (Register for PASS 2020 Now).

If you have attended the Summit in previous years, you will know that networking and meeting people face to face is a great part of the Summit. That will be a challenge this year, but the committees for the Summit are aware of it. Don't let that stop you; we are going to be a community in this one and keep interaction at the heart of our minds. The education will be more accessible this year, so don't miss the opportunity to pay less (until June 19th, because prices go up) and get access to all the great speakers and content in this year's program.

I have only missed one PASS Summit and look forward to seeing you (virtually) there. Don't miss the great pricing: register before June 19th and we will see you there.

 

Read More

PowerShell at Michiana PASS User Group Meeting

Issues during Presentation

I had an issue during the demo in this presentation on Gathering Data for Trending using PowerShell, and I said I would blog the solution. I found it. First, the problem.

Problem Code in v3.0

$dbobj = New-Object [PSObject] -Property ([ordered]@{ 
           ID = $db.ID 
           Name = $db.Name 
           Size = $db.Size 
           DataSpaceUsage = $db.DataSpaceUsage 
           IndexSpaceUsage = $db.IndexSpaceUsage 
           LastBackupDate = $db.LastBackupDate 
           RecoveryModel = $db.RecoveryModel 
           PageVerify = $db.PageVerify 
           CompatibilityLevel = $db.CompatibilityLevel 
           IsReadCommittedSnapshotOn = $db.IsReadCommittedSnapshotOn 
           IterationId = 2 
} )

 

In PowerShell 3.0 and above the syntax changed, and in the latest versions the old code stopped working and now has to be different. Now the solution.

Code that works in v5.1

$dbobj = [PSCustomObject][ordered]@{ 
         ID = $db.ID 
         Name = $db.Name 
         Size = $db.Size  
         DataSpaceUsage = $db.DataSpaceUsage 
         IndexSpaceUsage = $db.IndexSpaceUsage 
         LastBackupDate = $db.LastBackupDate 
         RecoveryModel = $db.RecoveryModel 
         PageVerify = $db.PageVerify 
         CompatibilityLevel = $db.CompatibilityLevel 
         IsReadCommittedSnapshotOn = $db.IsReadCommittedSnapshotOn 
         IterationId = 2 
} 
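For context, here is a hedged sketch of how such an object might be built per database, assuming an SMO server object from dbatools (the instance name is a placeholder):

Import-Module dbatools
$s = Connect-DbaInstance -SqlInstance "localhost"   # placeholder instance name
$dbstats = foreach ($db in $s.Databases) {
    # [PSCustomObject] preserves key order on its own in v3.0 and later
    [PSCustomObject]@{
        ID = $db.ID
        Name = $db.Name
        Size = $db.Size
        RecoveryModel = $db.RecoveryModel
        IterationId = 2
    }
}
$dbstats | Format-Table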

 

From here things just worked in the PowerShell window, but Azure Data Studio still seemed to have a problem. I thought restarting the shell might resolve it, and while it still threw an error right after the restart, it then executed fine.

The presentation files are here on my Presentations on Github.

Continue the quest to become a PowerShell DBA.

Read More

PowerShell at #SQLSatOregon with @DBAduck

Mastery Session – Leveling Up with PowerShell for the DBA

I know that most of you have used or heard about using PowerShell, but I am sure there are things you could still learn about using PowerShell in your DBA career. This mastery session is meant to get you closer to mastery of PowerShell for the DBA. I will cover 3 modules: the DBAtools module, the SQLServer module, and the ImportExcel module. In these modules you have a wealth of commands and tools that will help you get more out of your day, using fewer clicks. The trick is not just knowing that these things exist, but understanding the possibilities and patterns that exist in each module. You will learn about the commonalities in the DBAtools module that make it very simple to get the most out of it. You will learn about some hidden gems in the SQLServer module and why you should care to learn everything you can about it. The ImportExcel module does not have a lot of commands, but the commands available are amazing and will allow you to do things very simply without requiring Excel. There will also be a mix of deployment options using PowerShell, and SSIS project deployment will be covered as well. Join me at SQL Saturday Oregon and learn what your next steps are with PowerShell. You will absolutely love it.

Tips and Tricks for the PowerShell DBA

With environments growing and hiring slowing, you have to get some tools in your toolbelt to stay ahead of the game. These tips cover both PowerShell and SQLServer PowerShell tricks. Some tips relate directly to performance, and others to using features of SMO/SQLServer to get things done faster and more efficiently. Join me for a great session on getting things done.

Register Now for SQL Saturday Oregon

Read More

PowerShell for the DBA at SQLIntersection with @DBAduck

If you are wondering why you would want to use PowerShell as a DBA, read this first – Why Become a PowerShell DBA. Then go to SQLIntersection Registration to register and I will see you there. And if you register, please use the code MILLERB and you will get $50 off registration.

Click the links below to see the list of sessions, workshops and speakers
Sessions: http://bit.ly/SQL_Fall_2019_Sessions  
Workshops: http://bit.ly/SQL_Fall_2019_Workshops  
Speakers: http://bit.ly/SQL_Fall_2019_Speakers

Workshop

Leveling Up With PowerShell for the DBA

This workshop is a great one for anyone who wants to level up their PowerShell skills for DBA tasks. We are going to cover multiple things in this workshop. The tools we will be going over are the DBAtools module, the SQLServer module, and the ImportExcel module. Each one has great power and can take a lot of clicking away. They also give you the ability to export data to Excel or CSV and store it for other purposes, or even take data from a query and put it directly into a table without having to write SQL or SSIS packages to import the data another way. Join me for a fun-filled day that will have you coming back for more. You are not required to know PowerShell inside and out to attend; you will learn something new no matter your level of PowerShell expertise or years working with SQL and PowerShell. Join me in Vegas at the MGM for SQLIntersection, you won't regret it.

An In-Depth Look at the SQL Server Provider in PowerShell

You probably already know that there is a SqlServer module in the PowerShell Gallery. This module contains many cmdlets that will help you with your DBA work. But something you may not know about is the SqlServer provider that is in there too; this provider is the subject of this presentation. Have you ever opened File Explorer and started browsing through your filesystem? What if you could have Object Explorer from SSMS as a navigable structure in PowerShell? Well, you guessed it: you can. This is the SqlServer provider. I will take you through the provider and show you how to navigate it, but that is not all. By the end of the session you will understand how you can use it in your PowerShell life to get more out of SQL Server by using paths. Yes, you heard right: a path-like structure that you can navigate and use to make your life simpler in retrieving objects and manipulating them in PowerShell. Join me for another chapter in the quest to become a PowerShell DBA. You won't want to miss this one.

Synopsis: The SqlServer module is a powerful tool in PowerShell. It continues to receive updates and new cmdlets, which is fantastic. We will talk about these functions that allow you to do more with commands instead of TSQL. But the real value of this session is all about the SQLServer PowerShell Provider. This provider allows you to navigate SQL Server like a file system and that is not all, we will talk (and mostly show) what you can do with it and how you can leverage this provider to do almost anything that you would like to with objects and commands rather than clicking your way through. Join me for an in depth look at the provider and you will be converted to this tech, I am sure of it.
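As a taste of what the provider feels like, here is a minimal sketch; the path assumes a local default instance:

# The provider surfaces the instance as a path under the SQLSERVER: drive
Import-Module SqlServer
Set-Location SQLSERVER:\SQL\localhost\DEFAULT\Databases
# A directory-style listing of databases, just like Object Explorer
Get-ChildItem | Select-Object Name, RecoveryModel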

Getting Started with the DBAtools PowerShell Module

There has been a movement toward PowerShell in the SQL Server community lately. With so much out there, how do you get ahead? I will take you through this massive module that has grown very quickly to over 300 functions for use in your daily DBA job. This module can single-handedly migrate an entire server with one command, or you can choose just the parts you want to do. Join me to find out how you can quickly leverage the power of this module. This is one tool you want in your PowerShell toolbox.

Synopsis: DBAtools has been around for 5 years and has finally reached release status. It is now in version 1.x which means it has stabilized the command names and parameter names to be consistent through all the commands. This remains very important because a lot of commands can be used to automate a lot of things. This gives you the ability to use splatting and parameter objects to allow you to reuse and get the most out of the module without typing a whole lot. Join me for a great look into some of the productivity tools in this module and you will be hungry for more.

Temporal Tables In-Depth

Have you ever wanted to know what the data looked like yesterday before a change was made? Have you cringed at the thought of creating triggers to get data changes logged for later analysis? Looking at a new feature in SQL 2016 called Temporal Tables gives you the ability to create a table that automatically keeps track of your data in time. Magic was never meant to be part of IT, but SQL Server has done a great job with this new feature. We will cover how to set it up, how it works, querying the tables including the history table and last but not least, how you can incorporate this feature into existing tables. Join me in a time travelling adventure to find out how you can leverage Temporal Tables in your world.

Synopsis: Temporal tables are not new, but there are still very many people who have not used them. There are many things you can use them for; recently we came across an UPDATE statement gone awry, and had a temporal table been in place, it would have been much simpler to just put the data back rather than restoring databases to do so. I will take you through the scenarios and show you how to put temporal tables in place: on a new table, on an existing table, and on a table where you customize the history table's indexes to give you more ability to query the data and see it at a given point in time. We will cover In-Memory OLTP and new features in 2017 and 2019. Join me for a fun-filled hour that may just make your head hurt.
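To ground the idea, here is a minimal sketch of a system-versioned table; the table, columns, and values are hypothetical:

-- Hypothetical table with system-versioning turned on
CREATE TABLE dbo.Product (
    ProductId int NOT NULL PRIMARY KEY,
    Price money NOT NULL,
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
) WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.ProductHistory));
GO
-- What did the row look like yesterday, before the change?
DECLARE @Yesterday datetime2 = DATEADD(DAY, -1, SYSUTCDATETIME());
SELECT ProductId, Price
FROM dbo.Product
FOR SYSTEM_TIME AS OF @Yesterday
WHERE ProductId = 1;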

Read More

Connecting iSCSI LUNs with PowerShell

When clustering SQL Server on Windows Server Core, I needed to configure the iSCSI LUNs on the server without a UI. I know there is a UI component called iscsicpl.exe that you can run, and it will pop up the simple Windows UI that lets you click through all the configuration, but I wanted to go purist on this one and learn the nuances of configuration via PowerShell the whole way through.

First you would have a SAN LUN configured for iSCSI, which would have an IQN like iqn.1991-05.com.microsoft:winsan-winsql1-target. This is a unique identifier for the iSCSI resource, and it is what you connect to for the drive. To make the connection you use the iSCSI module, which contains the commands you will see. Below is an example of the connection.

# On Client Computer
$TargetPortal1 = "10.1.1.1"
$TargetPortal2 = "10.1.2.1"
$ISCSIIP1 = "10.1.1.2"
$ISCSIIP2 = "10.1.2.2"

New-IscsiTargetPortal -TargetPortalAddress $TargetPortal1 -InitiatorPortalAddress $ISCSIIP1 
New-IscsiTargetPortal -TargetPortalAddress $TargetPortal2 -InitiatorPortalAddress $ISCSIIP2

Figure 1.1 – The same as New Discovery Portal in iscsicpl

 

Start-Sleep -seconds 2

Connect-IscsiTarget -NodeAddress "iqn.1991-05.com.microsoft:winsan-winsql1-target" `
	-IsPersistent $true -IsMultipathEnabled $true -TargetPortalAddress $TargetPortal1 `
	-InitiatorPortalAddress $ISCSIIP1

Connect-IscsiTarget -NodeAddress "iqn.1991-05.com.microsoft:winsan-winsql1-target" `
	-IsPersistent $true -IsMultipathEnabled $true -TargetPortalAddress $TargetPortal2 `
	-InitiatorPortalAddress $ISCSIIP2

 

Figure 1.2 – Connecting with MultiPath IO using 2 different Targets and Initiators
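One hedged note: -IsMultipathEnabled assumes the Multipath I/O feature is installed and claiming iSCSI disks, and that the initiator service is running. On a bare Server Core install, the prerequisites look something like this (feature and cmdlet names from the ServerManager and MPIO modules):

# Prerequisites for the connections above (run once per node)
Start-Service msiscsi                        # iSCSI Initiator service
Set-Service msiscsi -StartupType Automatic   # keep it running across reboots
Install-WindowsFeature Multipath-IO          # MPIO feature (may require a reboot)
Enable-MSDSMAutomaticClaim -BusType iSCSI    # let MPIO claim iSCSI disks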

Now you have a drive attached, at least in iSCSI. It will be a raw drive, and Get-Disk will show it. This cmdlet is in the Storage module in PowerShell.


Figure 1.3 – Get-Disk in action. Showing the newly attached drive

The next task is to online and configure the new drive so that you can use it. You need to initialize the disk, then create a partition of type GPT and finally format the volume with 64KB blocks for use with SQL Server.

Initialize-Disk -Number 1 -PartitionStyle GPT # Data

New-Partition -DiskNumber 1 -UseMaximumSize -DriveLetter E 

Figure 1.4 – Initialize-Disk and New-Partition to make the drive accessible

Get-Volume -DriveLetter E | 
         Format-Volume -FileSystemLabel DATA -AllocationUnitSize (64KB) -FileSystem NTFS `
                -ShortFileNameSupport $false

Figure 1.5 – Get-Volume and Format-Volume to finish setting up the drive

This concludes this part of the creation of a drive from the iSCSI side of the house.  The entire piece of code is available here.

Happy PowerShelling. Onward in the quest to become a SQL PowerShell DBA.

Read More

dbatools 1.0 is here and why you should care!

I have been using dbatools for a long time now. I have even contributed a few commands and fixed some bugs. I present on this topic and help many people with PowerShell and being a DBA. I am so excited for the launch of dbatools 1.0 that I cannot stop pinching myself to make sure it is true.

Chrissy Lemaire ( b | t ) and company have done a great job keeping this project going, as well as supporting the sqlcommunity.slack.com Slack environment. Many people get help in there, and a great sense of community is being fostered. I take my hat off to Chrissy for spearheading this project and for inspiring all of us to contribute and use the module in our daily lives.

Experience

I also wanted to share that in my presentations I show the things dbatools can do, and I help people get started in hopes that they get something out of it. I recently gave a presentation on Complete Database Migration with dbatools, and it went over so well that I was inspired again. That same week I was helping a client get things set up on a new server, late at night, and I needed to recreate a credential, an Operator, and a Proxy for a job that would run as the proxy. I scoured emails and looked and looked. I could not find the password and assumed it had been sent some way other than email. Then the light bulb came on: "I can use dbatools," I thought. So I downloaded the latest version to make sure I was using the best, and proceeded to use Copy-DbaCredential, Copy-DbaAgentOperator, and Copy-DbaAgentProxy, as well as Copy-DbaDbMail. They all worked, and I could configure everything. Then I used Copy-DbaAgentJob to cap it all off. I cannot tell you how nice it was to use the tools I had shown an audience not two days earlier.
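For reference, the shape of those calls is simple; the server names here are hypothetical:

# Copy the pieces from the old server to the new one
Copy-DbaCredential -Source OldServer -Destination NewServer
Copy-DbaAgentOperator -Source OldServer -Destination NewServer
Copy-DbaAgentProxy -Source OldServer -Destination NewServer
Copy-DbaDbMail -Source OldServer -Destination NewServer
Copy-DbaAgentJob -Source OldServer -Destination NewServer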

I hope that everyone is able to get something out of this module and that your day is much better than days before because you have found dbatools.  I again tip my hat to all those that work behind the scenes on building, testing and publishing this module for all of us to use.

Happy Launch Day!!!

Read More