(UPDATED) PASS Summit 2017 Session Data (You know you want it…so go get it!)

UPDATE #1 –
The following changes were made to the PowerShell script for the PASS Summit 2017 session scrape:

  • Abstract field is now varchar(max). ‘Cause precons get all the chars.
  • Added SubmittedSession column.
  • Upper bound increased to include precons.

UPDATE #2 –
Before I could get update #1 posted, it appeared all the general sessions and lightning talks were removed from the submissions page and replaced with submitted precons. Don’t fret – the PowerShell script will still grab all the submitted sessions (for now). There were also some more precons added so the upper bound has been changed yet again.


The deadline for PASS Summit submissions came and went. Either you got yours in or you didn’t. What does this mean? Time to gather the data.

Go get it! You know you want to!

I had never done a web scrape before, so this seemed like the perfect subject for my first time – grabbing all the sessions submitted to PASS Summit 2017…and doing it with PowerShell! Here is the script I used for this. I have accounted for the following:

  • Apostrophes (aka single quotes). They will break your insert unless you double them up, and for some reason, people seem to use them all over the place.
  • Formatting the string data for insert. No, your string data will not magically come out wrapped in single quotes in your insert, so you need to add them.
  • Additional ID and deleted fields.
  • Speaker URL and ID. Will be using this to scrape speaker details later.
  • Accurate lower and upper bounds. These were arrived at by trial and error (you’re welcome), as well as the cleanup of the data I scraped. More on this later.

Now for the code –

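A stripped-down sketch of the core loop looks something like this – the URL pattern, element ids, webid bounds, and table names below are placeholders, so adjust them to match the actual submissions page:

```powershell
# Requires the SqlServer (or SQLPS) module for Invoke-Sqlcmd.
$lowerBound  = 60000          # hypothetical webid bounds -- find the real ones by trial and error
$upperBound  = 61000
$sqlInstance = 'localhost'    # where your Sessions table lives
$database    = 'SummitSessions'

for ($webid = $lowerBound; $webid -le $upperBound; $webid++) {
    try {
        # Hypothetical URL pattern for a session details page
        $page = Invoke-WebRequest -Uri "http://www.pass.org/summit/2017/Sessions/Details.aspx?sid=$webid"

        # Hypothetical element ids -- inspect the real page source for the actual ones
        $title    = $page.ParsedHtml.getElementById('sessionTitle').innerText
        $abstract = $page.ParsedHtml.getElementById('sessionAbstract').innerText
        $speaker  = $page.ParsedHtml.getElementById('speakerName').innerText

        # Double up apostrophes so they don't break the INSERT
        $title    = $title    -replace "'", "''"
        $abstract = $abstract -replace "'", "''"
        $speaker  = $speaker  -replace "'", "''"

        $query = "INSERT INTO dbo.Sessions (WebId, Title, Abstract, Speaker, Deleted)
                  VALUES ($webid, '$title', '$abstract', '$speaker', 0);"
        Invoke-Sqlcmd -ServerInstance $sqlInstance -Database $database -Query $query
    }
    catch {
        # No session exists for this webid (or the page failed) -- note it and keep going
        Write-Output "Error on webid $webid : $($_.Exception.Message)"
    }
}
```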

There will also be some webids that error out – this means the session doesn’t exist for that specified webid. You will see an error in the output to the PowerShell window, but the script will keep running.

Once you run the script and it completes, you will have your table of session data. You will notice that there are more sessions there than have been submitted, according to the website. If you do a count of the sessions by speaker you will also find that there are more than three sessions (the maximum allowed) for quite a few speakers. Why is this? You are pulling from a data source that contains sessions other than the ones that were submitted to Summit (apparently). This means you will have to clean up the data – I included the deleted field for this purpose. Look at the duration and filter for the 75 minute and 10 minute sessions and update your deleted field accordingly.
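If you want a head start on that first pass, here is a minimal sketch – the Duration and Deleted columns, their formats, and the instance/database names are assumptions carried over from the scrape sketch above:

```powershell
# Assumes Duration is stored in minutes as an integer -- adjust if yours differs.
# 75 = general session, 10 = lightning talk; add the precon duration if you kept those.
$query = @'
UPDATE dbo.Sessions
SET    Deleted = 1
WHERE  Duration NOT IN (75, 10);
'@
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'SummitSessions' -Query $query
```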

After this, you will have sessions that meet the duration, but you will find the speaker is not listed on the submission site when doing an advanced search. Clean the data some more.

Are you done cleansing your data yet? You might think so, but you aren’t. You see sessions that look like they were submitted for Summit but are not on the site. These appear to be sessions that were created and saved, but not officially submitted. You can rule out quite a few of these if you look for speakers that have more than three sessions. Get those counts, check the website for those speakers, and clean your data.
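A quick sketch for getting those counts (same table and column assumptions as above):

```powershell
# Speakers with more than three sessions still marked as not deleted
$query = @'
SELECT   Speaker, COUNT(*) AS SessionCount
FROM     dbo.Sessions
WHERE    Deleted = 0
GROUP BY Speaker
HAVING   COUNT(*) > 3
ORDER BY Speaker;
'@
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'SummitSessions' -Query $query
```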

“Good lord, am I done yet?”

No. No you’re not.

There will be speakers that have three sessions in your table but only submitted one or two. This sounds worse than it is, but it is a little painful. How do you do this? You’re going to search the speakers with the advanced search on the website. Do these in groups based on the number of sessions a speaker has, ordering by speaker name. You can exclude the speakers you have already reviewed in the previous clean-up steps, deleting their names from your list as you go. When you find a speaker with three sessions in your list and fewer sessions submitted, mark the session that was not submitted as deleted. Next, do the speakers with two sessions, as they may have only submitted one. Finally, look at the speakers with one session, as there may be some that didn’t officially submit anything.

“But what about the code for the rest of the clean-up?”

No. You know how to do this, and may write it differently (and better) than me. My code was ugly and I don’t feel like cleaning it up and posting it here. I gave you the script to grab it all. Stop complaining or I will send you to time-out.

The speaker info grab will be a separate blog post. With the script in this post you might be able to use the syntax and craft your own PowerShell script for this. Or you could wait for my blog on it.

Yeah…I haz the datas!

SQL Saturday NEW YORK CITY? Get a rope!

This weekend I will be speaking at SQL Saturday New York City. My last time in the city was two years ago for the previous SQL Saturday, and I had a great time. If you are in the area, come on out for some awesome speakers and a great day of learning!

Anyone remember this? No? Just me, huh? Ok.

Blogging reboot and PowerShell – Get System Information

After whining about not blogging enough, I am going to do something about this. Whether it is PowerShell or SQL, simple or complex, I know that others can benefit from my knowledge and expand their skill sets.

And you are like “That’s great Amy…where’s the PowerShell we are here for?” Ok, Ok….keep your drawers on! 😉

Recently I was tasked with gathering the system information from all of the servers at a client. Another opportunity for some PowerShell dominance.

With this script you can generate system information files and save them to a specified location. It makes sure a connection can be made to the server first, and then outputs the file. The files are created one at a time, so if you pass in a longer list of servers, you shouldn’t crash your machine. From my testing, this will take some time to run as these files don’t output quickly. Despite that, the output is worth it. This can be modified to pull your list of servers from a file or from a Central Management Server (CMS) instance.
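Here is a minimal sketch of the idea – the server list, output folder, and the use of systeminfo.exe are assumptions, so adjust them for your environment:

```powershell
$servers    = @('SERVER01', 'SERVER02')   # or pull these from a file / CMS instead
$outputPath = 'C:\Temp\SystemInfo'        # assumed to exist already

foreach ($server in $servers) {
    # Verify we can reach the server before trying to pull anything from it
    if (Test-Connection -ComputerName $server -Count 1 -Quiet) {
        # systeminfo is slow, so the files get generated one at a time
        systeminfo.exe /S $server | Out-File -FilePath (Join-Path $outputPath "$server.txt")
    }
    else {
        Write-Output "Could not connect to $server -- skipping."
    }
}
```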

Hope you find this useful. Happy PowerShellin’ 🙂

Podcast with SQL Data Partners

Recently I had the pleasure of meeting Carlos Chacon with SQL Data Partners and found out he had a podcast (I know…I live under a rock…with cats). He asked if I was interested in being on said podcast – ummmm…yes, please! Click here (Episode 33) if you wanna go listen to the podcast or here if they get all up in your grill about streaming stuff at the office and you wanna just read the transcript. Thanks to Carlos for the opportunity! Enjoy! 🙂

Cannot drop the database because it is being used for replication – the Right Fix and the Long Way

One of the things I am still trying to get past when it comes to writing about tech is the fact that I can’t always be completely original. I can write about different experiences and scripts, but in many cases someone will have already written about the same thing. Part of tech blogging is realizing that even if someone else has already written about something you want to write about, WRITE ABOUT IT ANYWAY. Maybe you had a task that was slightly different from what others have written about, or maybe you can elaborate on something they left out. Bottom line, you can be creative and reach people.

Another part of tech blogging is the fact that, sadly, there are people blogging the wrong ways to resolve certain issues. Your parents told you not to believe everything you read on the internet – are they liars, or is that “expert” telling you the best way to do backups is with maintenance plans the one serving up the bull?

I had been asked to detach a database. The person doing it couldn’t and tossed it to me. Cool. No big deal. Then I ran the script to detach and got the error message 3724 – Cannot drop the database because it is being used for replication.

Well, isn’t that special.

Was there replication? Yes. Were there any publications associated with this database? There used to be, but not any more (at least not visible at the publisher server in SSMS). There were some other issues that needed resolving, but those had nothing to do with this situation. I popped over to my other screen and started looking for the right scripted solution (I figured I would have to do this again, so better to save it off to a script with some notes).

First, I want to show that the database is enabled for replication – sp_helpreplicationdb is going to give me this information.
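Wrapped in a quick Invoke-Sqlcmd call to keep with the PowerShell theme (the server and database names are placeholders – in SSMS you can just run the EXEC by itself):

```powershell
# Check whether the database is flagged for replication
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'master' `
    -Query "EXEC sys.sp_helpreplicationdb @dbname = N'MyDatabase';"
```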

[Screenshot: sp_helpreplicationdb output showing the database flagged for replication]

Ok. Now what?

Based on the results from sp_helpreplicationdb, I now have confirmation that my database is at least enabled for replication. The next thing I need to do is turn off replication for this database with sp_removedbreplication.
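Again as a sketch with placeholder names (the EXEC is the one line that matters):

```powershell
# Remove all replication objects from the database
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'master' `
    -Query "EXEC sys.sp_removedbreplication @dbname = N'MyDatabase';"
```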

Fixed – WITH ONE LINE OF CODE!!!

After I ran this I was able to detach the database. And all was right with the world. On Saturday night. Did I mention this was on Saturday night? I’m a party animal.

Anyways, back to the tale of the long way to do this. After my search I had opened a few of the results in new tabs in my browser. One of them was here. After a little more searching based on this link, I had what I needed with the right syntax. I resolved the issue, but when I was done I still had to go back and close the new tabs I had just opened. It was then that I saw a completely different solution.

While this other solution may have allowed you to drop the database, it required far more steps than the one I presented above. Also, in my case, I did not want to drop the database, just detach it. Restoring over it from a backup of an empty database wipes out my data, and that was not part of the requirement. So essentially, to be able to detach the database I could run sp_helpreplicationdb and sp_removedbreplication, OR I could take a backup, validate that all the data that was needed was in the backup, create another database, back that database up, copy the backup file, restore it WITH REPLACE over the database I am trying to detach, and then detach or drop the database. In this case, the desired end result was to migrate the database files to a new server and attach them, which would have to be changed to restoring from the backup of the original database. If you have to do this for multiple databases, all of that effort is multiplied.

Frustrated after finding this, I took to twitter.


I decided I would do the right thing and give the internet one more page with the RIGHT SOLUTION. While we can’t stop people from posting the longer, less practical solutions to things, we can post the right ones and keep posting them.

They might have a few pieces of candy left.

Now that we have established the internet lies to you, go call your mother.

T-SQL Tuesday #73 – Those Darn Elves

Since I have not blogged in a while, and I saw it was T-SQL Tuesday, I thought I would participate in this FOR THE FIRST TIME EVER! This month SQLBalls asks us if our SQL Servers are on the naughty or nice list. Since I have recently transitioned to a new role, I am still getting familiar with the servers and the environments I am working with. The servers of my past are but a memory, but a fresh one. I cannot even begin to think of all the different “naughty” things that were done on those servers, and how some days it felt like a losing battle. How can you fight a security or code change when you have been overridden by management, only to have that change come back and bite you weeks or months later? The “I told you so” you might feel like popping off will fall on deaf ears and you will just be stuck fixing the problem…until it gets to be too much and you decide it is time for action.

I have done this before. The day had been a long one and I thought I was finally going home when I got drug into a call on a data issue – something was changed that shouldn’t have been. When and by whom? I didn’t know or have any way of finding this information (days later, when I tried to get a backup that was several months old to validate this data from when it was deployed, there was no backup – at all – but that is a completely different issue for another time). I had my suspicions but no proof. Could have been a naughty developer elf logging in with an elevated SQL account they knew the password for. Changing the password? “Out of the question” they say. It is everywhere.

If only this had been the only event that happened THAT DAY. It wasn’t. This one was production (hence the conference call). The others (yes, more than one) were pre-production. Messes that had to be cleaned up because the developer elves thought they knew better than the DBA and that they could do it on their own. This production issue was the last straw. Elves were running amok and had to be reined in, and they weren’t going to like it.

All the elevated permissions in the pre-production servers – gone. I didn’t care if it was a DEV server. Am I the meanest DBA in the world? So says some. Scrooge? Well, if you are into name calling and want to go there, then ok, but I get to call you names too. In this case I did not care – I was fixing things that were only broken because someone abused a privilege. It should also be said that there was some relation in the names of all the tables involved with all the issues that occurred on this day.

If I had to wag my finger at any naughty part of the SQL Server instances it would have to be at security…and I am partially to blame. It can be difficult to keep up with all the changes that happen across a large environment when it comes to assigning permissions, and if you have more than one DBA, the situation is compounded by the fact that you might not always know what the other is doing and vice versa. They might grant something that you would otherwise veto for cause. You might take care of a permissions issue one way when they would handle it differently.

While I worked on some PowerShell code to pull back users from specific AD groups and incorporate alerts for some of those groups, sadly the bandwidth was not there to fully roll this out. I did, however, create some triggers that would send email alerts when a change was made at the server and database levels, and I made them nameless and encrypted.

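As a rough sketch of the server-level version (the Database Mail profile, recipient address, and the exact whitespace “name” are all placeholders; a database-level twin just swaps ON ALL SERVER for ON DATABASE and FOR DDL_DATABASE_LEVEL_EVENTS):

```powershell
# Sketch only -- use your own mail profile/recipient, and WRITE THE NAME DOWN.
$triggerSql = @'
CREATE TRIGGER [   	 ]        -- the "name" is literally spaces and tabs
ON ALL SERVER
WITH ENCRYPTION
FOR DDL_SERVER_LEVEL_EVENTS
AS
BEGIN
    -- EVENTDATA() holds the who/what/when of the change that fired the trigger
    DECLARE @body NVARCHAR(MAX) = CONVERT(NVARCHAR(MAX), EVENTDATA());

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'DBA Mail',            -- assumed Database Mail profile
        @recipients   = N'dba@yourcompany.com', -- assumed recipient
        @subject      = N'Server-level change detected',
        @body         = @body;
END;
'@
Invoke-Sqlcmd -ServerInstance 'MyServer' -Database 'master' -Query $triggerSql
```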

The lack of a name for each of these is intentional, as is the encryption. The last thing I wanted was someone seeing these and what they were doing, and, if they had permissions to do so, disabling or dropping them to avoid having their nefarious behavior tracked. Even better would have been to put additional triggers in place to prevent the dropping of these no matter what, but I decided not to go there.

Note that there is nothing visible in the name for each of these – this is courtesy of the devious mind of Rob Volk. He might have too much time on his hands, but this is pretty darn crafty. What you name these triggers is up to you, but you have to MAKE NOTE OF WHAT THE NAMES ARE!!! When I did these they were a combination of a few tabs and spaces – like “space space space tab tab” but with those actual characters. The result looks like this:


[Screenshot: the triggers showing up in SSMS with blank names]

It should go without saying: use this code at your own risk, and always thoroughly vet and test anything before applying it to a production environment.

If this helps further cement my meanest DBA creds then I guess I am doing it right. Sometimes the elves developers can get out of hand and it is up to Santa the DBA to make sure they know they are being watched.


My First Time Submitting to SQL PASS Summit – Part 1

I have now officially submitted to SQL PASS Summit 2015. Five sessions. All me (well, and a few others on the panel I submitted). The hard part is over…but how did this all happen?

My story starts back in the year 2013. My first time to attend a SQL PASS Summit. It was in Charlotte, NC. I didn’t know these were normally in Seattle. It didn’t really seem to matter too much either. I was a first timer. And they labeled me as such.

[Photo: my 2013 First Timer badge]

I didn’t mind. I knew I was a newb. I embraced it. I signed up for and watched the webinar for first timers put on by Denny Cherry. I also got a first timer buddy (aka someone I can follow around like a puppy until I am comfortable enough and have found other suitable, like-minded people that will welcome me and allow me to accompany them to places that sell beer).

I met so many new people while at Summit that year, and out of all the conferences I had been to in the past, this was by far the best experience I had ever had. One thing that happened numerous times when talking to people was hearing the words “Have you thought about presenting?” I’m all “Wow…these people are nice…and encouraging…wait…presenting? Say what?”

Did they need fresh meat? A new crop of willing yet unsuspecting folks to throw their hat in the ring, to hopefully be selected, and dive head first off that cliff into something that might become an obsession that would challenge them in ways they had never thought of before? Maybe. Maybe they did. Maybe it was a little cultish. Well, pass the kool-aid.

I came home and gave it some thought. I really wanted to do this but did not know where to start. I felt like I had stuff to say. Stories to tell. Experiences to share. But where to start? I decided I would do a panel. I rounded up some experienced speakers for a particular SQL Saturday, created my session and submitted it. That was the easy part. I then decided we would have weekly internet video chats so that everyone could get acquainted, and I could get their take on how they saw the panel going. They were also able to provide me much needed guidance with my slide deck (something else I had never done before) and how the session needed structure. I took all this in and did my slides accordingly.

The time finally came for the SQL Saturday event and my panel session. I made the trip out there and met up with everyone. I attended my very first speaker dinner. I could not believe I was there and in the same room with some of the brightest and well known minds in the SQL community. This was it – my first taste of what would consume that entire year.

The panel went well. My fellow panelists and others patted me on the back for a job well done. Feedback was good. I was elated. I was hooked. I thought I was ready to handle what was up next – my first solo session. Just me and my slides. Another SQL Saturday. Another city.

To be continued…

Code Upload – Pragmatic Works webinar

For those that attended my webinar last week with Pragmatic Works I have uploaded the code…finally. For what it is worth, the weekend was busy and I was working on my SQL PASS Summit submissions. Then, just as I was getting this together on Monday night, I was not feeling right and realized I was making mistakes and that is no good. Tried to go to work on Tuesday but even they knew something was wrong with me. Went home and found out later my nephew was also sick. No need for any further details. Just trying to get better. Thanks for your patience. 🙂


The lovely people at Pragmatic Works sent me some of your questions from the webinar. Without further ado…

Do you have a suggestion for BEGINNERS learning books to get a basic understanding of PS??

Great question! While I have a TON of books on all kinds of things (SQL Server, C#, web dev stuff, etc.) I do not have a single book on PowerShell. How did I accomplish all this then? Simple – using this pesky thing we call the internet. I have also attended some great PowerShell sessions at SQL Saturday events.

Did they leave any beer bottles?

The roofers? Thankfully no.

Do not criticize the king of soda pop, Dr. Pepper!

Preach it!

When will VBScript stop functioning for SQL Server or Windows Server. As in will there be a total switch?

VBScript? Hmmmm…great question. No clue. This might be a Microsoft cockroach. Like replication.

Why do we need a PS-Drive – why can’t we just use an array directly?

You mean passing in the server name from the array to the Invoke-SqlCmd? Yeah, you could do that. That’s just not how I wrote this. 🙂

what is her email again?

Leave a comment here if you would like to email me and I will get back to you.

What does the ($servers | Measure-Object).Count syntax do?

That counts the number of items in the array.
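For example (with a made-up list):

```powershell
$servers = @('SERVER01', 'SERVER02', 'SERVER03')
($servers | Measure-Object).Count   # returns 3
```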

What is the purpose of the New-PSDrive?

New-PSDrive is a method of connecting to a SQL Server instance. Another method would be to declare a connection with SMO (Server Management Objects).
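A quick sketch (the instance path is a placeholder, and this assumes the SqlServer module is installed):

```powershell
# Map a PSDrive to a SQL Server instance via the SqlServer provider
Import-Module SqlServer
New-PSDrive -Name SQLDrive -PSProvider SqlServer -Root 'SQLSERVER:\SQL\MyServer\DEFAULT'
Get-ChildItem SQLDrive:\Databases   # browse the instance like a file system
```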

Does it work on windows 2003 as well?

I have not tested this against Windows Server 2003.

so, ps-drive is the equivalent of object explorer in SSMS?

Well, kinda. Maybe. More the equivalent of browsing a drive in a cmd.exe window.

what city in Austin?

Ummmm….Austin is a city.

What permissions do you require on AD for this exercise?

To make the AD changes, I have no clue. Not my area, and I also intentionally created the Azure VM so I could make all the changes I needed to without worrying about a silly thing like permissions. As far as reading from AD (which is what the main PowerShell script for this part does), I am not sure about this either, but I am assuming you need some level of read permissions (which is all that script is doing).

There are some things where I may tend to run something, and if anyone says anything, ask forgiveness later. Security is pretty tight where I am – I have gotten dinged on xp_cmdshell but so far nothing on reading from AD. If for some reason you can’t run anything against AD, I would find the person over that and find out why, and let them know what you are trying to do.

Thanks again to everyone who attended. If you have any other questions leave a comment. 🙂

3/12 – Webinar: Beginning Automation with PowerShell

Anything I can do to reduce the time I have to spend dealing with otherwise time-sucking tasks gives me a happy. Currently, where I am, I have reduced our daily deployment process to a single PowerShell cmdlet with a log file review. All the steps that normally went into each deployment (applying SQL scripts, data updates, updating the TFS task, sending an email that the deployment is done) are handled by PowerShell, and the process accepts multiple deployments at the same time. As long as there are no errors in the log file for a deployment, there is nothing more to do. There’s an error? Run the same cmdlet with the rollback option.
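As a rough sketch of the concept (not the actual cmdlet – the folder layout, file naming, and rollback convention here are all made up):

```powershell
function Invoke-Deployment {
    param (
        [string]$DeploymentPath,   # folder of numbered .sql scripts for one deployment
        [string]$ServerInstance,
        [string]$LogFile,
        [switch]$Rollback          # run the *_rollback.sql scripts instead
    )

    $scripts = Get-ChildItem -Path $DeploymentPath -Filter '*.sql' | Sort-Object Name
    $scripts = if ($Rollback) { $scripts | Where-Object Name -like '*_rollback.sql' }
               else           { $scripts | Where-Object Name -notlike '*_rollback.sql' }

    foreach ($script in $scripts) {
        try {
            Invoke-Sqlcmd -ServerInstance $ServerInstance -InputFile $script.FullName -ErrorAction Stop
            Add-Content -Path $LogFile -Value "SUCCESS: $($script.Name)"
        }
        catch {
            Add-Content -Path $LogFile -Value "ERROR: $($script.Name) - $($_.Exception.Message)"
            break   # stop here; review the log, then rerun with -Rollback if needed
        }
    }
}
```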

I am still working on improving this process, from trying to get the error handling to work the way I want it to, to incorporating the automation of check-in to source control of the changes that were just applied. It is a work-in-progress that will continue to reduce the amount of time spent on what would otherwise be manual tasks.

This work led me to the idea for a session on PowerShell and automation of processes. Join me and Pragmatic Works on Thursday, March 12th from the comfort of your own desk at 11am EST/10am CST for Beginning Automation with PowerShell. This demo-heavy session will go over some different PowerShell cmdlets and automation concepts that you can take and build upon in your own environment. PowerShell is a tool that no DBA should work without. A development background is helpful but not necessary – what DBA would not want to spend a few hours writing code for a process that will save them from having to do a repetitive manual task over and over again? Automation frees up time for us to do other projects that we want to spend time on and grow our skill sets.

Have questions about this? Leave a comment below. See y’all there. 🙂

#SQLSatPhoenix This Weekend!

Both the SQLSat Austin and Albuquerque events were great – now on to Phoenix. I will be giving two sessions there – Deadlock, Block & Two Smoking Barrels: Breaking Down Blocking and Deadlocks and Making the Leap from Developer to DBA.

The first of the two is a new session for me. When it comes to blocking and deadlocks, I like to be able to identify the cause (and sometimes the culprit) and have the information needed to resolve the issue. Some methods are easier, while with others, you also have to think about the footprint of the troubleshooting itself.

With Making the Leap from Developer to DBA, this session has evolved with time and my own experiences. Anyone who is looking to transition to being a DBA full time should attend to get a real-life perspective on migrating into this role and some of the things you can look forward to.

On top of all this, I have also been invited to be part of the WIT Panel. Come join me and some other amazing women during lunch as we share and discuss confidence and our experiences in IT.

I’m looking forward to another great SQL Saturday as well as meeting everyone in Phoenix. See y’all there! 🙂