Category Archives: DBA

Blocking – I Saw What YOU Did and I Know Who YOU Are!

When I was growing up I remember my mom talking about an old, scary movie that she saw when she was young. In the movie some teenagers were making prank phone calls saying “I saw what you did and I know who you are.” One of the calls happened to be made to a guy who had just killed his wife. Joan Crawford played a woman who was romantically inclined toward said murderer. She eventually meets her demise when he stabs her because she knew about the first murder.

How every DBA feels when there is blocking

While blocking in SQL Server might not be a felony offense (it isn’t…but it should be – WHO’S WITH ME?), as the DBA you not only want to know what is being blocked, but also who is doing the blocking and what in the H-E-Double-Hockey-Sticks they are doing.

At SQL Saturday Orlando I talked about this very thing and the query I turn to for this information.

IMA GONNA HUNT U DOWN!

It might look complicated but it is actually very simple – query sys.sysprocesses with a CROSS APPLY using the sql_handle to get the text of the query, and then an OUTER APPLY against the same view, joined on the blocking spid, so you can also get the text of the query that is doing the blocking. Beyond that, you can filter on various columns and refine your output.

WHY ARE YOU BLOCKING YOURSELF? WHY ARE YOU BLOCKING YOURSELF? WHY ARE YOU BLOCKING YOURSELF?

Of course, I know you can’t run this without the code (and I know that’s why you’re here…because I SAW WHAT YOU DID!)
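Here is a sketch of that shape of query – the column list is trimmed down and the instance name is a placeholder, so treat it as a starting point rather than the exact query from my session. It assumes the SqlServer PowerShell module (for Invoke-Sqlcmd), though you can just as easily paste the T-SQL straight into SSMS:

```powershell
# Sketch: find blocked sessions, plus the text of both the blocked and blocking queries.
# 'localhost' is a placeholder - point it at your own instance.
$blockingQuery = @'
SELECT  s.spid,
        s.blocked            AS blocking_spid,
        s.waittime,
        s.lastwaittype,
        DB_NAME(s.dbid)      AS database_name,
        s.loginame,
        s.hostname,
        blocked_sql.[text]   AS blocked_query,
        blocking_sql.[text]  AS blocking_query
FROM sys.sysprocesses AS s
CROSS APPLY sys.dm_exec_sql_text(s.sql_handle) AS blocked_sql
OUTER APPLY (
    SELECT t.[text]
    FROM sys.sysprocesses AS b
    CROSS APPLY sys.dm_exec_sql_text(b.sql_handle) AS t
    WHERE b.spid = s.blocked
) AS blocking_sql
WHERE s.blocked <> 0;   -- only sessions that are currently being blocked
'@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $blockingQuery | Format-Table -AutoSize
```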

If you are looking for a good way to troubleshoot blocking I hope this helps. If you have some folks running queries that are making you stabby, run this, find out what is going on, and then remove their access (I mean, try to help them) so they aren’t making you stabby anymore. Then tell them they need to buy you a beverage because they are still alive.

UPDATED: PowerShell – WE HAZ YUR ACTIVE DIRECTORY RIGHT HERE!

I discovered something last week – I had not blogged about little things that I thought I had blogged about. What the heck does that mean? It means that I tried to reference my blog for something because I thought “I totally blogged about that”…and found out that was not the case.

Starting now, I am fixing this situation. There was something that popped up today that called for a PowerShell script and the Get-ADGroupMember cmdlet – get a list of users from a list of groups. Some users are in more than one group, so this needs to be a distinct list, unless you are into manually cleaning up things like this, and then I will be sad for you. Because that is kinda sad.

I originally wrote a script with two arrays (one for the initial list and one for the de-duped list of users), but even though this is quick and dirty, that was a little too dirty. Enter the Group-Object cmdlet – it takes the list of names and groups them. No black magic this time. Just a cmdlet that comes baked into PowerShell, giving me what I need.

What? You wanted the code too? Oh, OK.
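Something along these lines will do it – a sketch only, with placeholder group names, and it assumes the ActiveDirectory (RSAT) module is installed:

```powershell
# Sketch: pull members from a list of AD groups and return one row per distinct user.
# The group names below are placeholders - swap in your own.
Import-Module ActiveDirectory

$groups = 'SQL_Admins', 'SQL_Developers', 'SQL_ReadOnly'

$members = foreach ($group in $groups) {
    # -Recursive also picks up members of nested groups; drop it for direct members only
    Get-ADGroupMember -Identity $group -Recursive |
        Where-Object { $_.objectClass -eq 'user' }
}

# Group-Object collapses the duplicates, so no second "de-duped" array is needed
$members |
    Group-Object -Property SamAccountName |
    ForEach-Object { $_.Group | Select-Object -First 1 } |
    Select-Object Name, SamAccountName |
    Sort-Object Name
```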

There you have it – quick, dirty and to the point. Enjoy. 🙂

UPDATE: Mathias Jessen tweeted a one-liner for this…so no need for the one array! Woohoo!
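The same idea as a single pipeline (a sketch in that spirit, not his exact tweet, reusing $groups from above) looks roughly like this:

```powershell
# One pipeline, no intermediate array - Sort-Object -Unique handles the de-duping
$groups | ForEach-Object { Get-ADGroupMember -Identity $_ -Recursive } |
    Sort-Object -Property SamAccountName -Unique
```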

I was trying to do this but was also just trying to get it done, and if in doubt, I slap things in arrays. Thanks Mathias!

 

It’s About Time for…#SQLSatOrlando!

Next week I am returning to Orlando not only to visit the mouse at his house, but also to speak at SQL Saturday Orlando! Last year was my first time there and I had a great time, even though the event had been pushed back to November 2016 because of Hurricane Matthew.

This year I will be talking about deadlocking and blocking – something that is an issue for so many DBAs. Whether the issue is indexes or bad code…or you have been trying to use indexes to cover up bad code, blocks and deadlocks can happen in even the best environments under the right circumstances. If you are going to be at SQL Saturday Orlando come to my session where we will talk about detecting and preventing these arch enemies of the DBA!

Speaking – SQL Saturday Baton Rouge!

This Saturday will mark the fourth year that I will be speaking at SQL Saturday Baton Rouge at LSU. Out of all the SQL Saturday events I have participated in since 2014, Baton Rouge is one of the few that I have been to every year since. Houston is another one…and actually, these may be the only ones, if only because they have had an event every year.

Not only am I speaking on Automation with PowerShell and Deadlocks and Blocking, but I am also participating in a panel discussion on Careers in IT. I’m excited to be invited, along with some of the other speakers, to be a part of this. Looking back on my own school days, I knew I would have a career in IT, but little did I know I would detour from a path in development to the world of SQL Server and become a DBA.

How do they make this happen? Work. Lots of hard work. After helping with SQL Saturday Dallas 2015, joining the NTSSUG board in 2016, and then taking on an organizing role for the 2016 event, I found out how much goes into the planning of these events. If you have attended a SQL Saturday or you are going to in the future, be sure to say “THANK YOU!” to all the organizers and sponsors. If you want to get more involved in the SQL Server community, SQL Saturday is a great way to do that – just show up at the event, find an organizer and tell them that the SQL Kitten sent you to be their humble servant (er, volunteer) for the day…or you could just say you want to volunteer and leave out the other stuff because it might make it weird.

(UPDATED) PASS Summit 2017 Session Data (You know you want it…so go get it!)

UPDATE #1 –
The following changes were made to the PowerShell script for the PASS Summit 2017 session scrape:

  • Abstract field is now varchar(max). ‘Cause precons get all the chars.
  • Added SubmittedSession column.
  • Upper bound increased to include precons.

UPDATE #2 –
Before I could get update #1 posted, it appeared all the general sessions and lightning talks were removed from the submissions page and replaced with submitted precons. Don’t fret – the PowerShell script will still grab all the submitted sessions (for now). There were also some more precons added so the upper bound has been changed yet again.

 

The deadline for PASS Summit submissions came and went. Either you got yours in or you didn’t. What does this mean? Time to gather the data.

Go get it! You know you want to!

I had never done a web scrape before, so this was the perfect subject for my first time – grabbing all the sessions submitted to PASS Summit 2017…and doing it with PowerShell! Here is the script I used for this. I have accounted for the following:

  • Apostrophes (aka single quote). They will break your insert unless you have two of them, and for some reason, people seem to use them all over the place.
  • Formatting the string data for insert. No, your data will not magically come out right in your insert with single quotes so you need to add them.
  • Additional ID and deleted fields.
  • Speaker URL and ID. Will be using this to scrape speaker details later.
  • Accurate lower and upper bounds. These were arrived at by trial and error (you’re welcome), as well as the clean-up of the data I scraped. More on this later.

Now for the code –
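The version below is only a sketch of the loop – the URL pattern, regexes, bounds, and table/column names are placeholders rather than the real ones (and the speaker URL/ID columns are left out for brevity). It assumes the SqlServer module for Invoke-Sqlcmd:

```powershell
# Sketch of the scrape loop - everything marked "placeholder" needs to be
# swapped for the real values (URL pattern, bounds, table name, regexes).
$lowerBound = 1        # placeholder - the real bounds came from trial and error
$upperBound = 1000     # placeholder - bumped again once precons showed up

for ($webId = $lowerBound; $webId -le $upperBound; $webId++) {
    try {
        # Placeholder URL - substitute the real submission-detail page pattern
        $page = Invoke-WebRequest -Uri "https://example.org/summit2017/sessions/$webId" -ErrorAction Stop

        # Pull the fields out of the HTML (these patterns are illustrative only)
        $title    = [regex]::Match($page.Content, '<h1[^>]*>(.*?)</h1>').Groups[1].Value
        $abstract = [regex]::Match($page.Content, '(?s)class="abstract"[^>]*>(.*?)</div>').Groups[1].Value

        # Double up apostrophes so they don't break the INSERT, then wrap in single quotes
        $titleSql    = "'" + ($title    -replace "'", "''") + "'"
        $abstractSql = "'" + ($abstract -replace "'", "''") + "'"

        $insert = "INSERT INTO dbo.SummitSessions (WebId, Title, Abstract, Deleted)
                   VALUES ($webId, $titleSql, $abstractSql, 0);"

        Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'SummitScrape' -Query $insert
    }
    catch {
        # The session doesn't exist for this webid - note it and keep going
        Write-Output "webid $webId errored: $($_.Exception.Message)"
    }
}
```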

 

There will also be some webids that error out – this means the session doesn’t exist for that specified webid. You will see an error in the output to the PowerShell window, but the script will keep running.

Once you run the script and it completes, you will have your table of session data. You will notice that there are more sessions there than have been submitted, according to the website. If you do a count of the sessions by speaker you will also find that there are more than three sessions (the maximum allowed) for quite a few speakers. Why is this? You are pulling from a data source that contains sessions other than the ones that were submitted to Summit (apparently). This means you will have to clean up the data – I included the deleted field for this purpose. Look at the duration, filter for the 75-minute and 10-minute sessions, and update your deleted field accordingly.

After this, you will have sessions that meet the duration criteria, but for some of them you will find the speaker is not listed on the submission site when doing an advanced search. Clean the data some more.

Are you done cleansing your data yet? You might think so, but you aren’t. You see sessions that look like they were submitted for Summit but are not on the site. These appear to be sessions that were created and saved, but not officially submitted. You can rule out quite a few of these if you look for speakers that have more than three sessions. Get those counts, check the website for those speakers, and clean your data.

“Good lord, am I done yet?”

No. No you’re not.

There will be speakers that have three sessions in your table but only submitted one or two. This sounds worse than it is, but it is a little painful. How do you do this? You’re going to search the speakers with the advanced search on the website. Do these in groups based on the number of sessions a speaker has, ordering by speaker name. You can exclude the speakers you have already reviewed in the previous clean-up steps, deleting their names from your list as you go. When you find a speaker with three sessions in your list and fewer sessions submitted, mark the session that was not submitted as deleted. Next, do the speakers with two sessions as they may have only submitted one. Finally, look at the speakers with one session as there may be ones that didn’t officially submit anything.

“But what about the code for the clean-up?”

No. You know how to do this, and may write it differently (and better) than me. My code was ugly and I don’t feel like cleaning it up and posting it here. I gave you the script to grab it all. Stop complaining or I will send you to time-out.

The speaker info grab will be a separate blog post. With the script in this post you might be able to use the syntax and craft your own PowerShell script for this. Or you could wait for my blog on it.

Yeah…I haz the datas!

Blogging reboot and PowerShell – Get System Information

After whining about not blogging enough, I am going to do something about this. Whether it is PowerShell or SQL, simple or complex, I know that others can benefit from my knowledge and expand their skill sets.

And you are like “That’s great Amy…where’s the PowerShell we are here for?” Ok, Ok….keep your drawers on! 😉

Recently I was tasked with gathering the system information from all of the servers at a client. Another opportunity for some PowerShell dominance.

With this script you can generate system information files and save them to a specified location. It makes sure a connection can be made to the server first, and then outputs the file. The files are created one at a time, so if you pass in a longer list of servers, you shouldn’t crash your machine. From my testing, this will take some time to run as these files don’t output quickly. Despite that, the output is worth it. This can be modified to pull your list of servers from a file or from a Central Management Server (CMS) instance.
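Here is a sketch of that approach (not the exact client script) – the server names and output folder are placeholders, and it assumes systeminfo.exe can reach each server with your current credentials:

```powershell
# Sketch: generate a system information file for each reachable server.
# $servers and $outputPath are placeholders - swap in your own values,
# or load the list from a file or a CMS query instead.
$servers    = 'SERVER01', 'SERVER02', 'SERVER03'
$outputPath = 'C:\Temp\SystemInfo'

if (-not (Test-Path $outputPath)) {
    New-Item -Path $outputPath -ItemType Directory | Out-Null
}

foreach ($server in $servers) {
    # Verify we can reach the server before trying to pull anything from it
    if (Test-Connection -ComputerName $server -Count 1 -Quiet) {
        $file = Join-Path $outputPath "${server}_SystemInfo.txt"

        # systeminfo is slow, and the files are generated one at a time on purpose
        systeminfo.exe /S $server | Out-File -FilePath $file

        Write-Output "System information for $server written to $file"
    }
    else {
        Write-Output "Could not connect to $server - skipping"
    }
}
```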

Hope you find this useful. Happy PowerShellin’ 🙂