
dbatools 1.0 is here and why you should care!

I have been using dbatools for a long time now. I have even contributed a few commands and fixed some bugs.  I present on this topic and help many people with PowerShell and with being a DBA.  I am so excited about the launch of dbatools 1.0 that I cannot stop pinching myself to make sure it is true.

Chrissy Lemaire ( b | t ) and company have done a great job keeping this project going, as well as supporting the sqlcommunity.slack.com Slack environment. Many people get help there, and a great sense of community is being fostered.  I take my hat off to Chrissy for spearheading this project and for inspiring all of us to contribute and use the module in our daily lives.

Experience

I also wanted to share how this plays out in practice. In my presentations, I show the things that dbatools can do, help people get started, and hope that they get something out of it.  I recently gave a presentation on Complete Database Migration with dbatools, and it went over so well that I was inspired again.  Then that same week I was helping a client get things set up on a new server, late at night, and I needed to recreate a credential, an Operator, and a Proxy for a job that would run as the proxy.  I scoured emails and looked and looked.  I could not find the password and assumed that it had been sent some way other than email.  Then the light bulb came on.  “I can use dbatools,” I thought.  So I downloaded the latest version to ensure I was using the best, and proceeded to use Copy-DbaCredential, Copy-DbaAgentOperator, and Copy-DbaAgentProxy, as well as Copy-DbaDbMail.  They all worked and I was able to configure everything. Then I used Copy-DbaAgentJob to cap it all off.  I cannot tell you how nice it was to use the tools I had just shown an audience not two days earlier.
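The whole late-night recovery boils down to a handful of commands. A minimal sketch, with hypothetical source and destination instance names:

```powershell
# Hypothetical instance names; substitute your own source and destination
$splat = @{ Source = 'OldServer\SQL1'; Destination = 'NewServer\SQL1' }

Copy-DbaCredential    @splat   # recreate the credential
Copy-DbaAgentOperator @splat   # recreate the Operator
Copy-DbaAgentProxy    @splat   # recreate the Proxy
Copy-DbaDbMail        @splat   # bring over Database Mail configuration
Copy-DbaAgentJob      @splat   # finally, the job that runs as the proxy
```

Each command reads the objects from the source instance and recreates them on the destination, which is why there was no need to hunt down a password that was never emailed.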

I hope that everyone is able to get something out of this module, and that your days are better than before because you have found dbatools.  I again tip my hat to all those who work behind the scenes building, testing, and publishing this module for all of us to use.

Happy Launch Day!!!


SQLintersection Workshop attendees: change to the download URL

Hello all.  I want to make sure you have access to the downloads. In the slide deck there was a link that pointed to the file, but the URL is missing a segment: you need to add /wp-content/ after dbaduck.com and before uploads.

That means that the uploads are at https://dbaduck.com/wp-content/uploads/nameoffile.zip

I apologize for having the link in the wrong place.  Please feel free to email me at the email in the slides if you cannot get the file.


PowerShell month in May with DBAduck

I am excited for this month.

SQLSkills Training

Event 1 for this month is May 8-10 at SQLskills.com training in Chicago, where I am teaching the first running of the Immersion Event on PowerShell for the DBA. You can see the details at the link for the class.

I am super humbled and excited to teach this class for SQLskills and at the time of this writing there are 3 seats left after we opened more seats for the class. Please join me for a great event.

We will go over how to get started with PowerShell, then dive right in and learn why a DBA should learn PowerShell and why you would ever want to become a PowerShell DBA. There are so many reasons and so much value that I cannot contain it all in a post like this. But we will have a grand time drinking from a firehose and learning while we do. 🙂

I hope you will join me for this iteration, or plan on another iteration another time. I love PowerShell and look forward to meeting all of you who are going to be there.

PASS Virtual PowerShell Chapter Meeting

Event 2 for this month is on Wednesday, May 17th at 12:00 PM Eastern Time, when I will be presenting to the group on Gathering Data for Trending using PowerShell.  You can RSVP here. I will be covering a topic that I spend a lot of time thinking about. There is a lot of data in SQL Server, and a lot of metadata, that can tell some great stories about your servers and what their behavior looks like.  Ever since working at Microsoft with the Community Research team on newsgroup behaviors, I have been hooked on looking at SQL Server in terms of behaviors, not just features.  So I have created some data-gathering scripts that collect data into tables as snapshots in time, letting me look at the information in a different light to see if I can derive how things are working in SQL Server.  Join me for a great time and register for the webinar here.

SQL Intersection

Event 3 will be a PreCon at SQLintersection on May 20th in Orlando at the Disney Swan Hotel.  If you are going to be there, I hope you will join me, and if you are thinking about going, please register and join me there.  It is my well-liked PreCon that takes you from 0-60 in using PowerShell as a DBA and as a professional who wants to manage SQL Server from a command line instead of clicking your way through everything.  I will teach how to leverage the shell, as well as how to balance the use of SSMS and PowerShell, to gain another tool in your toolbelt.

This will be a great month focused on PowerShell.

 


#SQLIntersection with DBADuck

In May, I will be at SQLintersection at the Disney World Swan in Orlando, Florida.

You can visit the site to see the full lineup of sessions and workshops. I’m going to be presenting three sessions on SQL Server topics and one workshop on PowerShell in SQL:

IN DEPTH TEMPORAL TABLES IN SQL 2016
Have you ever wanted to know what the data looked like yesterday, before a change was made? Have you cringed at the thought of creating triggers to log data changes for later analysis? A new feature in SQL 2016 called Temporal Tables gives you the ability to create a table that automatically keeps track of your data in time. Magic was never meant to be part of IT, but SQL Server has done a great job with this new feature. We will cover how to set it up, how it works, how to query the tables (including the history table), and, last but not least, how you can incorporate this feature into existing tables. Join me in a time-travelling adventure to find out how you can leverage Temporal Tables in your world.

PRACTICAL POWERSHELL FOR SQL SERVER
This session will cover real-world functions that are used for real interaction with SQL Server. There will be a few slides, but mostly demos with actual scripts that interact with SQL Server for management and for getting data in and out. I will also introduce some outside modules created to assist. We will look at reading and changing things like permissions, DB owner maintenance, and database space management. In a SQL Server world of ever-changing roles and “I have to have it now” requests, Management Studio can get in the way of getting things done quickly. Join me for a fun PowerShell hour and never reuse a click again.

SMO: A STUDY OF THE SERVER MANAGEMENT OBJECTS
SMO, short for Server Management Objects (or SQL Server Management Objects to some), is a powerful set of programming interfaces for getting at and manipulating SQL Server in code. These objects are used in custom programs and in scripting with PowerShell, and can be used to fully manage a SQL Server instance. But what do you really do with them? How do you use them? This session will cover some core objects that are used in everyday interaction with SQL Server. I will cover how to get access to them and how to use them, and best of all, you will get to know these mysterious objects.

WORKSHOP ON SATURDAY, MAY 20, 2017

PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


PASS Board Elections, Why am I running?

Hey, all you SQL PASS people in the #SQLFamily. I am a little late to the game, but while voting is active I wanted to make sure you know why I am running.

In this vast connected world of technology there are so many SQL people that I would not venture to guess how many there are. My experience has been that PASS is an organization that has done a lot of good for its members. The problem I want to address is that many members don’t really know what membership means or don’t understand what they are a part of. I want to help people connect and take advantage of all the benefits and connection opportunities they have inside an organization of this size. PASS is not just a SQL Saturday and PASS Summit organization. Funding is important to run, but the communities should be fueled by the organization, and the sponsors should fuel the organization. As members we should be concerned with supporting the organization by actively participating in all aspects of PASS, not just taking advantage of free events like SQL Saturdays.

Now, before I get strung up, I want to clarify what taking advantage of what PASS offers means. When you as a member start out, there is a period of time when you get fed everything you want, everything you can handle. But after a while there is a time to give back, and that is the ecosystem in which we “should” live with PASS. I have been a member of PASS for a long time and I have been a Chapter leader for a long time, but I realized during the interviews of Board candidates that I had not given back enough. Sure, I speak at a lot of SQL Saturdays, and at events around the country and in my home state. But I have not volunteered in the PASS organization at all, and I really need to give back.

Giving back and connecting our #SQLFamily is why I am running, after all. The story above is the path that took me to submitting my application. I support all the other candidates and think the slate this year is amazing; you cannot go wrong. PLEASE VOTE; it is part of volunteering and supporting the organization we belong to. Get out there (online) and cast your vote, and make this election the one with the most participation.

Happy Voting.
PASS Candidates 2016

Voting is in your myPASS profile.
Vote Now in your PASS Profile


Join me at SQL Intersections in October 2016

In October, I will be at SQLintersection at the MGM Grand in Las Vegas, Nevada.

You can visit the site to see the full lineup of sessions and workshops. I’m going to be presenting four, count them, 4 sessions on SQL Server topics and two workshops on PowerShell in SQL:

Be Friendly to SQL Server with TSQL Best Practices
TSQL is a necessity when interacting with SQL Server, so knowing it well can be half the battle. Performance matters more as the database starts to grow, and building in resilience when you begin is a greater advantage than refactoring later. I will go over 5 key things to know when you write TSQL and use data types and/or variables in comparisons, and you will also learn about the procedure cache and how to avoid pitfalls there. This is a beginner session, but the concepts in it are a great foundation to begin with. If you are looking for a solid foundation to build on and need the basics to start, this session is definitely for you.

SQL Server Filestream Implementation and Management
Filestream was introduced in SQL 2008 and has been part of the engine since then. The reasons to use Filestream over BLOBs have been discussed in many different ways and for many different reasons. I will take you through how to get SQL Server configured to use Filestream, what that means, and how to implement a column of the Filestream type. I will also go over where the files live and how to stream them outside of the SQL engine instead of pulling them through the engine. Join me for an in-depth discussion of this technology. We will also touch on FileTables, which were new in SQL 2012.

Managing Availability Groups with PowerShell
Today Availability Groups are all the rage. Being a PowerShell guy, I prefer to manage things with PowerShell. The SQL team has given us a great toolset in the SQLPS/SqlServer module that allows you to fully manage Availability Groups simply. Join me for a fun-filled hour of PowerShell and Availability Groups; you won’t look at them the same way after we are finished.

SQL Server on Server Core with PowerShell
Ever wondered what the craze over Server Core is all about? There is a lot of power in Core, and with it comes the admin challenge of NO GUI. But there are settings you will want to change in the OS; how do you do that? This session will take you through some challenges that are not really challenges once you know how. We will use the built-in methods of making some changes, and also use PowerShell and command-line tools to get you where you want to be with Core. I built a 6-node cluster on Server Core and created it with PowerShell (cluster and all), and it runs very nicely with a small RAM footprint for the effort.

WORKSHOP ON TUESDAY, OCTOBER 25, 2016

PowerShell for the DBA from 0-60 in a Day
Think of how many servers or instances you maintain. Putting tools in your toolbox becomes a critical part of your life as a DBA. How many clicks can be reused? We will be taking you from 0–60 and everywhere in between with how PowerShell fits into a DBA’s day in the life. We will go from getting PowerShell connected to SQL Server to walking away with a toolbox starter kit that will help you get automated and allow you to focus on harder problems than modifying database options or getting information from SQL Server and storing it away for later analysis. The toolbox alone will make it well worth your attendance.

WORKSHOP ON SATURDAY, OCTOBER 29, 2016

PowerShell for the DBA from 60-120 in a Day
This is the next installment of becoming a PowerShell DBA. Now that you have been introduced to PowerShell and have an idea of what you can do, we take it to the next level and start you thinking about tool building and using PowerShell to actually manage instances. We take a look at modules you can use every day and then we talk about building your own modules. PowerShell is becoming more prevalent in the world of DBAs but still has not come to a level that I would like to see in our careers of working smarter not harder. We will be looking at practical items that you can use PowerShell to enhance because DBA work can contain a mass amount of clicking around if you only use Management Studio. We will take the challenge to remove as many clicks as possible from your daily management of databases and database servers. We cover Database Maintenance, monitoring and data gathering. We talk about managing SQL Server instances with PowerShell and last but not least, we will see how to use it to manage Jobs and SQL Server Agent. Join me in the quest to become an effective PowerShell DBA.

Join me in the quest to become a PowerShell DBA.

Hope to see you there!


SQL Server TDE on Mirror and Log Shipping

Today in our Virtual Chapter meeting for VCDBA, I presented on SQL Server encryption, including TDE.  The question was posed as to whether TDE would take effect if the mirror was established first and then you enabled TDE on the database.  I was not sure at the time, only because I had not done any tests, so I did not want to say yes or no unless I could substantiate it.

I set off to do some tests.  First I created a Mirrored database using the demo code from the VC presentation.

use master
GO

CREATE DATABASE [TESTTDE] ON  PRIMARY 
( 
	NAME = N'TESTTDE', 
	FILENAME = N'C:\sqldata\S1\TESTTDE.mdf' , 
	SIZE = 10240KB , 
	FILEGROWTH = 10240KB 
)
 LOG ON 
( 
	NAME = N'TESTTDE_log', 
	FILENAME = N'C:\sqldata\S1\TESTTDE_log.ldf' , 
	SIZE = 10240KB , 
	FILEGROWTH = 10240KB 
)
GO

Once the database was created on instance 1, I took a full backup and a log backup. Then I restored both to the second instance with NORECOVERY. Once it was primed and ready, I established the mirror and saw that the database was synchronized.  Now the trick is to create the components that establish the encryption hierarchy.
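The backup, restore, and partner setup can be sketched as follows; the file paths and endpoint addresses are hypothetical, and this assumes mirroring endpoints already exist on both instances:

```sql
-- On instance 1 (the future principal): full and log backups
BACKUP DATABASE TESTTDE TO DISK = 'C:\SQLBACKUP\TESTTDE.bak';
BACKUP LOG TESTTDE TO DISK = 'C:\SQLBACKUP\TESTTDE.trn';

-- On instance 2 (the future mirror): restore both WITH NORECOVERY
RESTORE DATABASE TESTTDE FROM DISK = 'C:\SQLBACKUP\TESTTDE.bak' WITH NORECOVERY;
RESTORE LOG TESTTDE FROM DISK = 'C:\SQLBACKUP\TESTTDE.trn' WITH NORECOVERY;

-- On instance 2 first, point at the principal's mirroring endpoint...
ALTER DATABASE TESTTDE SET PARTNER = 'TCP://principalhost:5022';
-- ...then on instance 1, point at the mirror's endpoint
ALTER DATABASE TESTTDE SET PARTNER = 'TCP://mirrorhost:5023';
```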

USE master;
GO
CREATE MASTER KEY 
ENCRYPTION BY PASSWORD = 'INSTANCE1_Use1Strong2Password3Here!';
GO

CREATE CERTIFICATE MyServerCert 
WITH SUBJECT = 'My DEK Certificate'
GO


Now that we have the MASTER KEY and the certificate in place, you need to back up the certificate to a set of files and restore it to the second instance.

BACKUP CERTIFICATE MyServerCert
TO FILE='C:\SQLBACKUP\MyserverCert.cer'
WITH PRIVATE KEY (FILE='C:\SQLBACKUP\MyServerCert.pvk',
	ENCRYPTION BY PASSWORD='My1Secure2Password!')
GO

NOW CHANGE TO THE SECOND INSTANCE

-- Make sure that you change to the second instance

USE master;
GO
CREATE MASTER KEY 
ENCRYPTION BY PASSWORD = 'INSTANCE2_Use1Strong2Password3Here!';
GO
CREATE CERTIFICATE MyServerCert
    FROM FILE = 'C:\SQLBACKUP\MyServerCert.cer' 
    WITH PRIVATE KEY (FILE = 'C:\SQLBACKUP\MyServerCert.pvk', 
    DECRYPTION BY PASSWORD = 'My1Secure2Password!');
GO

Now that you have the Encryption Hierarchy established for both instances, let’s encrypt the database.

-- Make sure that you are in the Principal instance 
-- or this will not work because you won't be able 
-- to USE TESTTDE on the Mirror

USE TESTTDE
GO

CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE MyServerCert;
GO

ALTER DATABASE TESTTDE
SET ENCRYPTION ON;
GO

What happens on the mirror is that it encrypts the database too, even though you won’t be able to see it happening.  When it is fully encrypted and you fail the database over to the mirror, it will indicate that the database is encrypted. On the principal you can see it with the query below.

SELECT 
    db.name,
    db.is_encrypted,
    dm.encryption_state,
    dm.percent_complete,
    dm.key_algorithm,
    dm.key_length
FROM 
    sys.databases db
    LEFT OUTER JOIN sys.dm_database_encryption_keys dm
        ON db.database_id = dm.database_id;

Now for the Log Shipping copy

To set up the test for Log Shipping, do the same as you did for the mirror, but restore the full backup and then the log WITH STANDBY, so that you can see that the copy is flagged as encrypted.  First establish the Log Shipping without encryption set up; once you are in standby on the shipped side, you can use the existing certificate to test enabling encryption afterwards.

Steps:

  1. CREATE DATABASE
  2. SET Recovery Model to Full
  3. Backup Database
  4. Backup Log
  5. Change to second instance
  6. RESTORE DATABASE with NORECOVERY
  7. RESTORE LOG WITH STANDBY
  8. Change to first instance
  9. USE TESTTDELOGSHIPPING
  10. CREATE DATABASE ENCRYPTION KEY
  11. ALTER DATABASE TESTTDELOGSHIPPING SET ENCRYPTION ON
  12. BACKUP LOG
  13. Change to second instance
  14. RESTORE LOG TESTTDELOGSHIPPING WITH STANDBY
  15. Run the above query to see if the database is encrypted. You will see is_encrypted = 1
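The steps above can be sketched in T-SQL; the backup paths are hypothetical, and the certificate is assumed to have already been restored on the second instance as shown earlier:

```sql
-- Instance 1: create, switch to full recovery, take full and log backups
CREATE DATABASE TESTTDELOGSHIPPING;
ALTER DATABASE TESTTDELOGSHIPPING SET RECOVERY FULL;
BACKUP DATABASE TESTTDELOGSHIPPING TO DISK = 'C:\SQLBACKUP\TESTTDELS.bak';
BACKUP LOG TESTTDELOGSHIPPING TO DISK = 'C:\SQLBACKUP\TESTTDELS_1.trn';

-- Instance 2: restore the full WITH NORECOVERY, then the log WITH STANDBY
RESTORE DATABASE TESTTDELOGSHIPPING
    FROM DISK = 'C:\SQLBACKUP\TESTTDELS.bak' WITH NORECOVERY;
RESTORE LOG TESTTDELOGSHIPPING
    FROM DISK = 'C:\SQLBACKUP\TESTTDELS_1.trn'
    WITH STANDBY = 'C:\SQLBACKUP\TESTTDELS_undo.dat';

-- Instance 1: enable TDE, then ship another log backup
USE TESTTDELOGSHIPPING;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_128
    ENCRYPTION BY SERVER CERTIFICATE MyServerCert;
ALTER DATABASE TESTTDELOGSHIPPING SET ENCRYPTION ON;
BACKUP LOG TESTTDELOGSHIPPING TO DISK = 'C:\SQLBACKUP\TESTTDELS_2.trn';

-- Instance 2: restore the new log WITH STANDBY, then check is_encrypted
RESTORE LOG TESTTDELOGSHIPPING
    FROM DISK = 'C:\SQLBACKUP\TESTTDELS_2.trn'
    WITH STANDBY = 'C:\SQLBACKUP\TESTTDELS_undo.dat';
```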

This shows that enabling TDE on the primary database carries through to both the mirrored and log shipped copies.

Happy Encrypting….!

Read More

sys.dm_db_index_physical_stats Easter Egg

I was reading through documentation on the DMV sys.dm_db_index_physical_stats and found that there are a couple of things you need to be aware of when using this DMV.  First, we know that there are parameters for the DMV.

Parameters

  • Database id
  • Object id
  • Index id
  • Partition number
  • Mode, the level of interrogation (DEFAULT, LIMITED, SAMPLED, DETAILED; NULL = LIMITED)

The interesting thing is that when any of the first four parameters is NULL, the default is to evaluate ALL of the items for that parameter.  For example, if you pass NULL for the database id, it will evaluate ALL databases on the server; if the object id is NULL, it will evaluate all objects.  I knew all that, but here is what I did not realize, quoting from the documentation:

“Passing values that are not valid to these functions may cause unintended results. For example, if the database or object name cannot be found because they do not exist or are spelled incorrectly, both functions will return NULL. The sys.dm_db_index_physical_stats function interprets NULL as a wildcard value specifying all databases or all objects.”

“Additionally, the OBJECT_ID function is processed before the sys.dm_db_index_physical_stats function is called and is therefore evaluated in the context of the current database, not the database specified in database_id. This behavior may cause the OBJECT_ID function to return a NULL value; or, if the object name exists in both the current database context and the specified database, an error message may be returned. The following examples demonstrate these unintended results.”

That is nice, but here is what it means: if you specify a database id that is not the current database and you use the OBJECT_ID() function, it will be evaluated in the current database, NOT in the database represented by that database id. If the object does not exist in the current database, OBJECT_ID() returns NULL, and the function will then evaluate all objects in the database represented by the database id.

The other caveat is that if an object with that name does exist in the current database, OBJECT_ID() returns that object’s id; when the function runs, it cannot find that object id in the specified database and will actually error. Take a look at the examples below, and notice that you will want to be in the database you intend to interrogate if you are specifying anything beyond the database id. Don’t get caught with a NULL in the statement without knowing it.

 

use master
GO
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED')

-- Results were for 190 rows 
/*
** What happens here is that it evaluates OBJECT_ID('Person.Person') to NULL 
** because it does not exist in master as shown below.
*/

use master
GO
SELECT OBJECT_ID('Person.Person')
-- Result is NULL
GO
use AdventureWorks2012
GO
SELECT OBJECT_ID('Person.Person')
-- Result is 1765581328

/*
** Let's create a table in master of the same name
** as in the AdventureWorks2012 database.
*/

use master
GO
CREATE SCHEMA Person AUTHORIZATION dbo
GO
CREATE TABLE Person.Person (id int)
GO
SELECT *
FROM sys.dm_db_index_physical_stats(DB_ID('AdventureWorks2012'), OBJECT_ID('Person.Person'), NULL, NULL, 'LIMITED')
GO
-- You get the error below
/*
Msg 2501, Level 16, State 40, Line 3
Cannot find a table or object with the name '.StateProvinceCountryRegion'. Check the system catalog.
*/
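The safe pattern, then, is to switch to the database you intend to interrogate before calling the function, so that OBJECT_ID() resolves in the right context:

```sql
-- Resolve OBJECT_ID in the same database the DMV will interrogate
USE AdventureWorks2012;
GO
SELECT *
FROM sys.dm_db_index_physical_stats(
    DB_ID('AdventureWorks2012'),   -- matches the current database
    OBJECT_ID('Person.Person'),    -- resolved here, in AdventureWorks2012
    NULL, NULL, 'LIMITED');
```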

PASS Summit 2014 – What to expect

When thinking about attending PASS Summit each year, I don’t spend much time deciding. I have not missed a Summit yet and hope I won’t miss one in the future. This post is meant to help those who may be deciding whether or not to go, or those who have decided to go but don’t really know what to expect.

At PASS Summit there are many things to do. The list below resonates in my mind every year:

  • Networking
  • Learning from others
  • Finding those you have not seen and catch up with them
  • Meet new people, your favorite MVP or MCM or just a great figure to finally put a face to the name
  • Learn new things

Some of my favorite Summit experiences include meeting people like (not a definitive list) Kalen Delaney, Paul Randal, Kimberly Tripp, and Brent Ozar; the list could go on. Meeting these people was a treat for me, and my PASS trips have been great for reconnecting and chatting with them and so many others. The PASS attendee parties are also one of the highlights, because they are a chance to talk with peers and some of the famous folks in a setting that is not about databases. You learn things like the fact that Kalen loves air hockey and that some DBAs are really funny; but more than that, you can chat with the leaders of the PASS organization and board members.  These are times I will remember for a very long time.

First Timers

For you first timers, you are in for a treat. This is your first Summit, and hopefully it won’t be your last. There is finding and meeting people and speakers you may have followed for years, and now they are there to shake hands and chat with you. They are not untouchable; they are real, friendly SQL Server professionals who will actually be willing to chat with you.  There is the Expo, where you can see products you have heard about for a while and actually see how they work.  It is NOT all sales and hype; there are actually some very good demos there.  You will always get some sales, but this is your chance to get to know the vendors better.  Here is a list of things I would do as a first timer, having been through many Summits (and I could actually be a great first timer now 🙂):

  • Find out who is presenting and mark the ones that you MUST see and then those that you would like to if the opportunity presents itself
  • BUY THE RECORDINGS. No matter how good a time manager you are, you will inevitably get overwhelmed by the amount of content and the number of people there; the recordings give you the opportunity to watch the sessions you missed because you were networking.
  • Did I already mention buying the recordings before the show?
  • Make time for the Community Lounges, SQLCAT Support Center and anything that you cannot get online or in a presentation. Get in front of people and BRING BUSINESS CARDS to give out so that people get to know who you are and they may give you their contact information. BUILD A NETWORK.
  • Go to the MUST see presentations since education is still a huge part of the Summit
  • Attend the Keynotes or watch them, sometimes they surprise you with announcements you won’t get anywhere else.
  • Get to know the Board of Directors of PASS. Hold them accountable as they lead the organization that you are a member of.
  • Investigate opportunities to volunteer for PASS, or one day run for the Board.  Passionate people are what make PASS great; remember that passion begins with PASS. 🙂
  • Attend the online First Timers presentations that are out there.  I believe there are many, and they are great events; I believe there is even one onsite now at the PASS conference. Get to know what is there for that year.
  • Find the SQL MVPs, they are out there.  These volunteers in the SQL Community have passion towards SQL Server and are always willing to share.
  • Find out who the Microsoft Players are and their role and network with them.
  • There are many more things you can do to enjoy the Summit and make you want to come back for another.

Preparing for your next experience today

If you begin with the next year in mind, set up relationships that you will renew each year at the Summit, and you will find yourself refreshed and renewed each Summit you attend. Do things to keep in touch with the people you meet, like getting their contact information and giving out yours, to set up collaboration during the year you are away. I keep all the cards I receive in one place in my backpack, and it is amazing how many I have by the end of Summit. Obviously this is not an opportunity for free consulting, but for a quick opinion or a quick answer to a question, your network is your best friend. The next thing you should do to complete your experience: when you leave Summit, look for the email announcing that registration is open for the next Summit; registering before the year ends is most often a very significant discount compared to waiting until the next year. Save some money.  Finally, get to know the players and you will be playing for the rest of your career.

Valuable Lessons Learned

The most valuable lesson I have learned at the PASS Summits I have attended is that the people there are real people. They have lives outside their careers and they have failings, just like me. I don’t know everything and they don’t know everything; that is why we get together and chat, because they share what they have learned and experienced, I do the same, and we come away better for it. The people you meet at PASS and the leaders who lead PASS are just like you and me; they are not better than us and we are not better than them. We have just had different experiences, and that is what makes a PASS Summit FUN: swapping stories, learning new things, and renewing the PASSion you have gained for SQL Server and the technologies that surround it.  Chatting with the Microsoft folks and the MCM and MVP crowd can enlighten you in a way that cannot be replicated at work; take advantage of this opportunity, you won’t regret the time you spend.  But most of all, unwind, HAVE FUN, and take it all in. You will be exhausted when you are done, but it will be an experience second to none.

I hope to see you there and hope your experience will be well worth the time. Here’s to all the SQL Community or #SQLFamily out there that will be attending, can’t wait to see you there.


MCM SQL Server 2008 Achieved

Another step in my career as I have passed the Lab Exam and have become an MCM in SQL Server 2008.  I believed I could achieve it after passing my knowledge exam the first time.

The first attempt at the lab was back in October, and it was disappointing to find out that I did not pass.  I thought about what I could have done better and determined that I had let time get away from me.  I had the experience I thought I needed to do well on the lab exam, but I could not finish all the objectives, and then there were those I had not had experience with.  I found myself rehashing my experience to find the important parts to prepare.  I did not think that study alone would help enough, so I started with my own lab environment (a VM with SQL Server on it), made sure I knew how to work with most of the engine features in BOL, and thought out my strategy for the second attempt.

As I reflect on my attempt and the time before that day, I am grateful for the time I took to teach others the concepts I was learning in my lab environment. I also love the user groups in Utah for allowing me to solidify some concepts I felt were important to know well by presenting at the group meetings.  I might have taken it sooner had I had a study buddy to bounce things off of, but from a post by Rob Farley I got a new lease on life for the test.  I was confident in what I knew, but I am not as good at managing my time in stressful scenarios.  It helped to have the test administered via Lync at my house, which was much better than where I took it last time.

The day of the test, the proctors were awesome and took good care of me and explained the process.  They were very understanding and that set my day off to a good start. During the test, I employed some techniques that I learned in preparing, that were all about time management to allow me to get through the Lab and demonstrate what I knew how to do. I think that my time management really helped me the most as I already had the experience I believed I needed to pass the test.

Thanks to all the other MCMs that shared their expertise in presentations, emails and all those that were totally encouraging me to attempt my MCM experience.  But most of all to a supportive manager and employer (www.healthequity.com) that stood by me in the expense and time to take this exam. I am honored to be among the MCMs in SQL Server and looking forward to the road ahead.
