Quick and Dirty UPDATE STATS Job with T-SQL and PowerShell

When performance issues rear their pretty little heads, I tend to look at the simplest things last. I am the person who looks at the code first – because in many cases, if your code was better we wouldn’t be having this discussion. Whether it’s a series of nested views or a single user-defined function that doesn’t play well with the optimizer, there is usually something in the code itself contributing to the performance issue.

But what about the times when it isn’t the code, or you can’t touch the code (I’m looking at you, NHibernate…and all of your red-headed step-child offshoots and fellow ORMs)? What about the cases where the poor performance only shows up periodically – or once a week, on the same day, every week – and while you can see the slowdowns in your queries, and all of your waits and plans, the source of the problem isn’t any particular smoking gun? It’s lots of guns, firing off at random times, and the casualties are the interfaces that depend on the data from those queries.

The one thing I don’t immediately go to is UPDATE STATS – and I should be looking at this. It’s easy to run, and if you still have issues afterward, you have at least ruled it out.

Overestimation on rows = bad times for the optimizer.

In this case, I ran UPDATE STATS on one table in particular and immediately saw a marked improvement in my test query.

Ahhhh….much better.

This morning I ran the same query and things were back to where they were with performance. This was somewhat disheartening, but the troubleshooting must continue.

But what about auto-update stats? Is this on? Yes, but auto update statistics only updates based on a sampled percentage of the records in the table. If your table contains millions of records, a sampled update may not be enough to see an improvement. I am updating statistics with a FULLSCAN of the table. For larger tables this may take some time, but you may also see a difference in your execution plan versus only updating statistics based on a sample.
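If you want to try the single-table version yourself, it looks something like this – a minimal sketch using Invoke-Sqlcmd from the SqlServer module, with a made-up server, database, and table:

# Minimal sketch - server, database, and table names are made up for illustration.
# Requires the SqlServer module for Invoke-Sqlcmd.
Import-Module SqlServer

# FULLSCAN reads every row instead of the sampled update that auto-update stats performs.
Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'SalesDB' `
    -Query 'UPDATE STATISTICS dbo.BigProblemTable WITH FULLSCAN;'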

In an effort to rule out whether or not statistics are definitely a factor, I want to UPDATE STATS on all the tables in my query, and at a specific time – sometime the day before we expect our slowdown to occur. I also want to be able to easily see how long the process ran and the duration of the update for each table. I could write all this to a table, and maybe I will do this later, but viewing this from job history is what I want right now – it is easy to look at and easy for everyone else to find and see as well.

Creating a job with multiple steps can be a bit painful and tedious if the job is going to have A LOT of steps. With PowerShell, a server name, and a database, I can dynamically create this job.
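Here is a rough sketch of the approach (not my exact script) – it assumes the SqlServer module for SMO, plus made-up server, database, and table names. One T-SQL job step is created per table, which is what lets you see each table’s duration in the job history:

# Rough sketch - instance, database, and table names are made up for illustration.
Import-Module SqlServer

$serverName = 'SQL01'
$dbName     = 'SalesDB'
$tables     = 'dbo.Orders', 'dbo.OrderDetails', 'dbo.Customers'   # made-up table list

$server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $serverName

# Create the job shell on the SQL Agent for that instance.
$job = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Agent.Job -ArgumentList $server.JobServer, "Update Stats - $dbName"
$job.Create()
$job.ApplyToTargetServer($serverName)

# Add one T-SQL step per table, each doing a FULLSCAN update.
foreach ($table in $tables) {
    $step = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Agent.JobStep -ArgumentList $job, "Update stats - $table"
    $step.SubSystem       = 'TransactSql'
    $step.DatabaseName    = $dbName
    $step.Command         = "UPDATE STATISTICS $table WITH FULLSCAN;"
    # Keep going even if one table fails, so a single bad table does not stop the whole run.
    $step.OnSuccessAction = 'GoToNextStep'
    $step.OnFailAction    = 'GoToNextStep'
    $step.Create()
}

# The last step created is still in $step - it should end the job instead of looking for another step.
$step.OnSuccessAction = 'QuitWithSuccess'
$step.OnFailAction    = 'QuitWithFailure'
$step.Alter()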

The result from this is a job that will update statistics on your list of tables, one at a time. All you have to do from there is give it a schedule.

All those steps…courtesy of the magic that is POWERSHELL!

For this script, I was getting my list of tables based on a specific query that was problematic. If you are doing this for a stored procedure, you can query for the list of dependent tables instead, and instead of hard coding the tables in this code, pass in the name of your stored procedure and get the list that way.
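For the stored procedure case, something like this would do it – a hedged sketch that leans on sys.dm_sql_referenced_entities, with made-up server, database, and procedure names:

# Hedged sketch - procedure, server, and database names are made up.
# sys.dm_sql_referenced_entities lists everything the procedure references;
# join to sys.objects if you want to keep only user tables (type = 'U').
$query = @'
SELECT DISTINCT referenced_schema_name + '.' + referenced_entity_name AS TableName
FROM sys.dm_sql_referenced_entities('dbo.MyProblemProc', 'OBJECT')
WHERE referenced_minor_id = 0
  AND referenced_schema_name IS NOT NULL;
'@

$tables = (Invoke-Sqlcmd -ServerInstance 'SQL01' -Database 'SalesDB' -Query $query).TableName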

I hope if you have a situation where some one-off stats updates are needed you find this script to be useful. As always, comments and feedback are appreciated. 🙂

 

Last known good CHECKDB…for ALL your databases!

Wednesday morning I walked into the office and immediately heard that CHECKDB was the source of issues on one of the servers and was the reason behind some errors that have been happening. While I don’t think this is the case (it might look like it on the surface, but something else is the actual cause), I also wanted to find out which database CHECKDB was running against at the time the errors occurred.

I needed information on when CHECKDB last ran for each database. When you go looking for something you can run to find when CHECKDB was last run, you find this blog post and also this blog post on grabbing that info. While these were very informative, they were for one database at a time. I needed this for all the databases so I could not only find out when each one ran, but also use the time stamps to figure out the duration.

I took the code from Jason’s post and made a few changes, running this for all the databases with sp_MSForEachDB.
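Here is roughly what that looks like – not Jason’s exact code, just a reconstruction of the idea: run DBCC DBINFO for every database via sp_MSForEachDB and keep the dbi_dbccLastKnownGood value from each. The server name is made up, and Invoke-Sqlcmd stands in for wherever you would normally run the T-SQL:

$query = @'
CREATE TABLE #dbinfo   (ParentObject varchar(255), Object varchar(255), Field varchar(255), Value varchar(255));
CREATE TABLE #lastgood (DatabaseName sysname, LastGoodCheckDb varchar(255));

-- Run DBCC DBINFO per database and keep only the last known good CHECKDB value.
EXEC sp_MSForEachDB N'
    INSERT INTO #dbinfo EXEC (''DBCC DBINFO([?]) WITH TABLERESULTS'');
    INSERT INTO #lastgood (DatabaseName, LastGoodCheckDb)
    SELECT ''?'', MAX(Value) FROM #dbinfo WHERE Field = ''dbi_dbccLastKnownGood'';
    TRUNCATE TABLE #dbinfo;';

SELECT DatabaseName, LastGoodCheckDb FROM #lastgood ORDER BY LastGoodCheckDb;

DROP TABLE #dbinfo;
DROP TABLE #lastgood;
'@

Invoke-Sqlcmd -ServerInstance 'SQL01' -Query $query | Format-Table -AutoSize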

The results gave me some good information to go on – the database where the errors occurred was not the same database where CHECKDB was running during that time frame. There were also some FlushCache messages in the log – some suspect that this is also tied to the errors that are happening. Since all this ties back to NHibernate code, I suspect something else is going on and further digging is needed.

Situation developing. 🙂

Blocking – I Saw What YOU Did and I Know Who YOU Are!

When I was growing up I remember my mom talking about an old, scary movie that she saw when she was young. In the movie some teenagers were making prank phone calls saying “I saw what you did and I know who you are.” One of the calls happened to be made to a guy who had just killed his wife. Joan Crawford played a woman who was romantically inclined toward said murderer. She eventually met her demise when he stabbed her because she knew about the first murder.

How every DBA feels when there is blocking

While blocking in SQL Server might not be a felony offense (it isn’t…but it should be – WHO’S WITH ME?), as the DBA you not only want to know what is being blocked, but also who is doing the blocking and what in the H-E-Double-Hockey-Sticks they are doing.

At SQL Saturday Orlando I talked about this very thing and the query I defer to for the information.

IMA GONNA HUNT U DOWN!

It might look complicated, but it is actually very simple – query sys.sysprocesses with a CROSS APPLY using the sql_handle to get the text of the blocked query, and then an OUTER APPLY with the same query again, this time joining on the blocking spid so you can get the text of the query that is doing the blocking. Beyond that, you can filter on various columns and refine your output.

WHY ARE YOU BLOCKING YOURSELF? WHY ARE YOU BLOCKING YOURSELF? WHY ARE YOU BLOCKING YOURSELF?

Of course, I know you can’t run this without the code (and I know that’s why you’re here…because I SAW WHAT YOU DID!)
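Here is a sketch of that shape (not the exact query from the session, and the server name is made up) – the blocked session on the left, with the blocker’s statement text pulled in through the OUTER APPLY:

$blockingQuery = @'
SELECT  blocked.spid,
        blocked.blocked        AS blocking_spid,
        blocked.loginame,
        blocked.hostname,
        blocked.program_name,
        blocked.waittime,
        blocked.lastwaittype,
        blockedSql.text        AS blocked_sql,
        blockerSql.text        AS blocking_sql
FROM sys.sysprocesses AS blocked
CROSS APPLY sys.dm_exec_sql_text(blocked.sql_handle) AS blockedSql
OUTER APPLY (SELECT st.text
             FROM sys.sysprocesses AS blocker
             CROSS APPLY sys.dm_exec_sql_text(blocker.sql_handle) AS st
             WHERE blocker.spid = blocked.blocked) AS blockerSql
WHERE blocked.blocked <> 0;   -- only sessions that are actually being blocked
'@

Invoke-Sqlcmd -ServerInstance 'SQL01' -Query $blockingQuery | Format-Table -AutoSize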

If you are looking for a good way to troubleshoot blocking I hope this helps. If you have some folks running queries that are making you stabby, run this, find out what is going on, and then remove their access…er, try to help them so they aren’t making you stabby any more. Then tell them they need to buy you a beverage because they are still alive.

UPDATED: PowerShell – WE HAZ YUR ACTIVE DIRECTORY RIGHT HERE!

I discovered something last week – I had not blogged about little things that I thought I had blogged about. What the heck does that mean? It means that I tried to reference my blog for something because I thought “I totally blogged about that”…and found out that was not the case.

Starting now, I am fixing this situation. There was something that popped up today that called for a PowerShell script and the Get-ADGroupMember cmdlet – get a list of users from a list of groups. Some users are in there more than once so this needs to be a distinct list, unless you are into manually cleaning up things like this, and then I will be sad for you. Because that is kinda sad.

I originally wrote a script with two arrays (one for the initial list and one for the de-duped list of users), but even though this is quick and dirty, that was a little too dirty. Enter the Group-Object cmdlet – it takes this list of names and groups them. No black magic this time. Just a cmdlet that comes baked into PowerShell, giving me what I need.

What? You wanted the code too? Oh, OK.
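Something along these lines – a quick sketch that assumes the ActiveDirectory module and uses made-up group names:

Import-Module ActiveDirectory

$groups = 'SQL_Admins', 'SQL_Developers', 'SQL_ReadOnly'   # made-up group names

# Pull the members of every group into one big (duplicate-laden) list.
$members = foreach ($group in $groups) {
    Get-ADGroupMember -Identity $group |
        Where-Object { $_.objectClass -eq 'user' }
}

# Group-Object collapses the duplicates - one entry per SamAccountName,
# no matter how many of the groups a user belongs to.
$members |
    Group-Object -Property SamAccountName |
    ForEach-Object { $_.Group | Select-Object -First 1 } |
    Select-Object Name, SamAccountName |
    Sort-Object Name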

There you have it – quick, dirty and to the point. Enjoy. 🙂

UPDATE: Mathias Jessen tweeted a one-liner for this…so no need for the one array! Woohoo!

I was trying to do this but was also just trying to get it done, and when in doubt, I slap things in arrays. Thanks Mathias!

 

It’s About Time for…#SQLSatOrlando!

Next week I am returning to Orlando not only to visit the mouse at his house, but also to speak at SQL Saturday Orlando! Last year was my first time there and I had a great time. This also happened after the event was pushed back to November 2016 because of Hurricane Matthew.

This year I will be talking about deadlocking and blocking – something that is an issue for so many DBAs. Whether the issue is indexes or bad code…or you have been trying to use indexes to cover up bad code, blocking and deadlocks can happen in even the best environments under the right circumstances. If you are going to be at SQL Saturday Orlando come to my session where we will talk about detecting and preventing these arch enemies of the DBA!

AzureRM, PowerShell and Deleting Your VM the Easy Way

Before I worked on server migrations to Azure, I had only worked with Azure for a SQL Saturday session, and I was only using Azure as a host for the virtual machines (VMs) that I used to demonstrate PowerShell scripts running on a Windows server that also accessed Active Directory. My exposure to this technology was limited and focused on what I could do with the VM – not with how the VM was created or how I could use PowerShell to deploy a VM.

That professional experience with Azure finally came, and I got to work on PowerShell scripts that could be used to create VMs in Azure. With a script you have something that is consistent and less prone to human error, but even then, sometimes something is set up wrong that can’t be changed once the VM has been provisioned. Alternatively, maybe you are only using Azure to test an application or project; once you are done, the VM is no longer needed. You could simply decommission the VM, but if there is no need for the VM to still exist, you probably want to delete it and all the parts associated with it.

The first one of these happened to me – there were three VMs where the configuration wasn’t right, and it wasn’t something that could be changed from the portal. The VMs would have to be deleted and re-provisioned. I modified the creation script to have the right configuration for these VMs and got those created so work could continue. At this point I started to work on a script that would remove the VMs based on the name and resource group, including removal of all the parts that were tied to the VMs, such as the virtual network and the storage account.

Fast forward to today – I am working on my session for PASS Summit 2017. Instead of creating every part separately with the PowerShell cmdlets, PowerShell is used to apply a template that contains all the parameters and variables for a VM. If you need to make a change, it is made to the JSON template, not to the PowerShell. Since I am testing, I am creating VMs and then I want to blow them away right after. I attempted to use the PowerShell script I had used before to remove VMs, but that failed – enough time had passed that the cmdlets had been updated and what I ran before was no longer valid. I tried to revise the existing script and then ran across the OMG-I-can’t-believe-I-didn’t-do-this-before answer – just delete the resource group.
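To give an idea of what the template approach looks like, here is a hedged sketch using the AzureRM cmdlets that were current at the time – the resource group name, location, and template file names are all made up:

# Sign in and create a resource group to hold everything for this VM.
Login-AzureRmAccount
New-AzureRmResourceGroup -Name 'Demo-RG' -Location 'South Central US'

# All the VM settings live in the JSON template and its parameter file;
# PowerShell just hands them to Azure.
New-AzureRmResourceGroupDeployment -ResourceGroupName 'Demo-RG' `
    -TemplateFile '.\vm-template.json' `
    -TemplateParameterFile '.\vm-template.parameters.json'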

If you have contained all parts of the VM in the same resource group, and there is nothing else in the resource group other than the VM, cleanup is this simple. From what I recall about the server migration, there were multiple VMs associated with the resource group. If the resource group had only contained the VMs I wanted to blow away, I could have potentially run this for that clean-up operation. In this case, I was also removing disks from each of the VMs ahead of removing the VMs and had to run a script for that first. Looking back, if everything had been in one resource group, and had there been nothing else in there, I could have run this one line and been done.
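For reference, the one line in question looks roughly like this (same made-up resource group as above; -Force just skips the confirmation prompt):

# Removes the resource group and everything in it - VM, NIC, disks, vnet, storage, all of it.
Remove-AzureRmResourceGroup -Name 'Demo-RG' -Force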

For now, this solution works for me, but I know a more specific cleanup script may be more useful – one that will remove a VM and all of its parts based on the VM and the resource group. This will have to happen when I am done with my other testing, but in the meantime, this one line of code allows me to create and blow away VMs while using my personal Azure account…and not run up a ton of charges. Let me know if you find this useful or if you have any suggestions here – would love to have feedback!

PASS Board Elections – Voting Deadline is TODAY!

In case you have been under a rock, or buried in work, or otherwise occupied, the PASS Board elections are going on and end TODAY! There are some amazing candidates (thanks to the selection committee for all of their work in bringing us this fine list of folks) running for a handful of seats. Big question – have you voted yet?

When you vote you are exercising your voice and opinion on who from the SQL community best represents your views on the direction and future of the PASS organization. That said, the total number of people voting during each election cycle has steadily declined, with fewer than 1,000 people voting last year.

With this being one of the easiest ways to be involved with PASS, you would think that these numbers would be increasing. If you have not voted yet, go check out the candidates. If you look at all of them, and just can’t decide who to vote for, give that Chris Hyde a second look – while it might seem like personal bias, I honestly feel he is well equipped for this position and will represent PASS and the SQL community with the same enthusiasm and vigor he has brought to the Albuquerque user group and all the other locations where he has spoken for user groups or SQL Saturday events (including the North Texas SQL Server User Group).

The voting deadline is 12 noon PST today (2pm CST). If you haven’t voted yet then put that query down and get to it!

Wanna Get the Kids Interested in Coding? Make it a Game. Literally!

Former co-worker and friend Daniel Janik is trying to do something that I will eventually have to do – get a teenager interested in something that can lead to a career.

As Daniel puts it: “I’ve been teaching my 17 yr old SQL Server and other various topics. I have quite a bit of experience training technology and speaking but it never fails; after about 5 minutes his eyes glaze over and he nods and ‘umhums’ and then starts checking his phone.”

I have one nephew that is already intrigued by the PowerShell that I am doing and seems to have a genuine desire to write code. This makes me SOOOOOOOO HAPPY! My hope is that he is able to continue what he is learning now and migrate into a skill set that will afford him a good career. And if he happens to maybe “hack” into something that he shouldn’t have, I will help with his bail, smother him with hugs and kisses and tell him “OMG…I AM SO PROUD OF YOU!!!”

Check out Daniel’s blog post, download the game and give it a whirl. I plan on showing my nephews this to get the opinions of the younger, not-a-teenager set. I doubt my niece will be interested in this right now…unless there was a Disney princess in there with the sea monsters. I will leave it to Daniel to incorporate that in the 2.0 version. 🙂

 

Holy Blocking Chains, Batman – I’m Speaking at PASS Summit!

For the first time ever I will be doing a general session at PASS Summit this year.

Waiting for an email with news one way or the other last week was angst-ridden, to say the least. After receiving no notification with the first wave on Tuesday, I got even more anxious and tried to occupy myself with work and other things. I didn’t want to get excited over something that wasn’t even a thing yet.

Then late Friday – acceptance.

Yeah, that’s right. ME!

It didn’t sink in right away. I had myself so convinced I wasn’t going to get selected. Maybe my head and heart were still bracing for the disappointment that was not to happen. Why would I be thinking this way? This is the third time I have submitted and the first time I have had a general session accepted. I did have the opportunity of doing a lightning talk last year. I also got to compete in Speaker Idol, which was an amazing and valuable experience that is only available to those that have not done a general session before. More on this later.

The first year I submitted I had been writing abstracts the same way I normally did for SQL Saturday events, not knowing there was more involved for a PASS Summit abstract. I logged in and was about to fill out everything for each of my sessions when I saw all these fields to fill out that I didn’t know about. I quickly got things written for them, along with my main abstracts, and submitted.

But I didn’t get selected.

The next year PASS offered a program for feedback on Summit abstracts. I thought “Yes, this is great! I will write my abstracts, get feedback, and I will be speaking at Summit this year!”

But I didn’t get selected for a general session.

Both times I was disappointed. Both times I did a fair amount of sulking and grumbling, but I am only human. In these cases, my best was beaten out by others who were better. Kind of a hard pill to swallow, but I think these challenges are what make us who we are – IT professionals who are constantly striving to improve our own skills while creating content that will convey knowledge and better the SQL Server community.

Last year however, I was selected for a lightning talk and I finally got to participate in Speaker Idol. Doing my lightning talk on How to Keep Your DBA from KILLING YOU was fun and a great experience.

Speaker Idol? Wow…how do I put this? If you are a new speaker, and have not spoken at PASS Summit before (general session or better), throw your name in the hat for this! If selected, you will have the opportunity to get up and present something in five minutes, and get immediate feedback from a panel of judges, who also happen to be well known and experienced speakers in the SQL community. When this was first done in 2014, I heard about the brutality of the first round, so I made sure I was there for the subsequent rounds. Your audience for this? Mostly speakers or aspiring speakers. Don’t be surprised if the final round is standing room only. Or if there are acrobatics (lookin’ at you, Rob).

Moving on to 2017…

Earlier this year it was announced that the Summit selection process was changing yet again. There would be no feedback to speakers and you were also limited to submitting three sessions. There was also a list of topics, with some marked as “hot topics” or topics that PASS really wanted content on. It was also stated that they wanted new content. That session you had been peddling to every SQL Saturday that would have you? Not new content.

I took all this info and combined it with a ramped-up abstract writing and review process. What is ramped-up? Reading the sessions that were selected last year, and having a reviewer that is a seasoned PASS Summit speaker. It also helped that my reviewer was super critical. Once it was announced that submissions were being accepted I logged in and grabbed all the sections that I would need to fill out and got started on my abstracts, outlining details before writing anything. There was even one session I outlined while on a plane, that I looked at later to write up and decided it was COMPLETE GARBAGE…AND WHAT WAS I THINKING???

What did I learn from the PASS Summit submission and selection process this year?

  • Review, review, review. Write them early, review, come back later and review again. Have other experienced PASS Summit speakers review your abstracts. Don’t like their feedback or think they are being mean? Ummmm….they are trying to help you get your abstract to a level that is worthy of PASS Summit. LISTEN TO THEM!!! And buy them a beer.
  • Nothing is guaranteed. In a previous blog post I told y’all how to pull all the submitted sessions into a table and even gave you some PowerShell code to do it. I was able to compare that to the list of speakers selected, and there were some amazing people who have spoken at PASS Summit before that didn’t make the cut this year. Sessions are reviewed and selected by human beings. This is a gamble, and regardless of how good your abstract is, you are at the mercy of the session review committee and what they feel is good enough for Summit. This is a monumental task, but they hunker down and get it done so that all of us can have an amazing selection of sessions to pick from.

I hope I get to see everyone at PASS Summit this year, whether or not they are speaking. Out of the conferences that I have been to over the years, I have found PASS Summit to not only be the best of them, but it was where I discovered that the #SQLFamily thing is real. If you are open to it, you can get hooked into groups that will welcome you and include you all because you are there for the same reason they are – to expand your knowledge and career in the area of SQL Server.

See y’all in October!

Speaking – SQL Saturday Baton Rouge!

This Saturday will mark the fourth year that I will be speaking at SQL Saturday Baton Rouge at LSU. Out of all the SQL Saturday events I have participated in since 2014, Baton Rouge is one of the few that I have been to every year. Houston is another one…and actually, these may be the only ones, if only because they have had an event every year.

Not only am I speaking on Automation with PowerShell and Deadlocks and Blocking, but I am also participating in a panel discussion on Careers in IT. I’m excited to be invited along with some of the other speakers to be a part of this. Looking back on my own school days, I knew I would have a career in IT, but little did I know I would detour from a path in development to the world of SQL Server and become a DBA.

How do they make this happen? Work. Lots of hard work. After helping with SQL Saturday Dallas 2015, joining the NTSSUG board in 2016, and then having an organizing role in 2016, I found out how much goes into the planning of these events. If you have attended a SQL Saturday or you are going to in the future, be sure to say “THANK YOU!” to all the organizers and sponsors. If you want to get more involved in the SQL Server community, SQL Saturday is a great way to do that – just show up at the event, find an organizer and tell them that the SQL Kitten sent you to be their humble servant volunteer for the day…or you could just say you want to volunteer and leave out the other stuff because it might make it weird.