dbatools to the backup/restore rescue


Today’s blog post is part of T-SQL Tuesday. T-SQL Tuesday is the brainchild of Adam Machanic. It is a monthly blog party on the second Tuesday of each month. Everyone is welcome to participate.


dbatools has been around since 2014. It started with one contributor and was solely dedicated to migrating SQL Server instances, but it has since grown into an awesome open source DBA project with over 30 contributors.

We have nearly 100 commands for you to enjoy, and about 10 of them revolve around Backup and Restore. Even more are planned, as you can read in Stuart Moore’s blog post titled Easier SQL Server Restores using DBATools. Stuart is a long-established SQL/PowerShell powerhouse when it comes to backup and restore, and we’re honored to have him on board.

our backup/restore commands (present)

Want to know more about our snapshot commands? Check out team member Constantine Kokkinos’ blog post “T-SQL Tuesday 85: Managing database snapshots with dbatools”. A snapshot may not exactly be a backup/restore but its spirit is, fa sho.


Restore-SqlBackupFromDirectory is super useful in a pinch, too, but it’s not quite fleshed out to our standards, so it doesn’t have a corresponding webpage. We expect this will be renamed by the next release.

our backup/restore commands (soon)

  • Restore-DbaBackup
  • Restore-DbaSnapshot
  • New-DbaSnapshot
  • Test-DbaBackup
  • ???

Coming up, we’re going to have a fully featured Restore-DbaBackup command that can easily restore a single file, to a point in time, from an Ola Hallengren styled directory structure or a formatted hashtable (like the one I’ll pass in to power my Test-DbaLastBackup command).

We’re eager to keep going. Have any ideas? Let us know by filing an issue or joining our Slack Channel on the SQL Community Slack and telling us what you need.

my favorite backup/restore command

I love this entire toolset but damn, I’ve been needing a command like Test-DbaLastBackup for a long time. Testing backups can be such a pain, but PowerShell can make it easy for your entire estate. Not just one SQL instance where you’ve got some specific stored procedure installed, but all of your SQL servers — from one workstation!

I created Test-DbaLastBackup because I wanted to test all of my backups and using PowerShell was the easiest way to do it. It parses Get-DbaBackupHistory for the last full backup, restores it, runs a DBCC CHECKTABLE and then drops the test database.

Here’s a test I did on my SQL Server 2016 instance. I ran Ola Hallengren’s Full backups for both system and user, then executed Test-DbaLastBackup -SqlServer sql2016. Simple AF!

The command’s webpage is at dbatools.io/Test-DbaLastBackup. Pretty easy, eh? That goes for every command we have, too. Just dbatools.io + /The-CommandName.

Back to the command, Test-DbaLastBackup. It:

  • Restores the test databases (named dbatools-testrestore-$dbname by default) to the specified -SqlServer unless you specify a -Destination.
  • Then it’ll restore to the Destination (if remote, so long as the backups are on a shared directory — which mine always are).
  • You can even specify a MaxMB if you don’t have space for super large database restores (in that case, the -VerifyOnly switch could come in handy).
  • Want your Data and Log files to go somewhere other than default? Use the -DataDirectory and -LogDirectory params.
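
Putting those parameters together, a typical call might look something like this (the server and path names here are mine, swap in your own):

```powershell
# test the last full backups taken on sql2016, restoring them to a dedicated
# test instance, with data/log redirected and a 10GB size cap
Test-DbaLastBackup -SqlServer sql2016 -Destination sqltest `
    -DataDirectory "D:\testrestores\data" -LogDirectory "D:\testrestores\log" `
    -MaxMB 10240
```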

This command is only in its infancy, too! Next up, team member Christian Solje will be adding additional features like Point in Time restore.


Need to know more about our commands? In every release, we require some basic docs, so just run Get-Help Test-DbaLastBackup -Detailed and you’ll be greeted with examples, docs for each parameter, a synopsis, a description and other useful stuff.


Intrigued and haven’t installed dbatools yet? Hit up our Downloads page to see how to install it on your system.

youtube and twitter

We have a YouTube channel if you want to watch some videos. That’s the YouTube channel for the SQL Community Collaborative where you can watch videos about open source, (mostly) SQL PowerShell projects like dbareports.io and, of course, dbatools.io.

We are also on Twitter as @psdbatools.

one more thing

If you think we’ve done something that sucks, let us know, either on GitHub or Slack. We welcome and even enjoy alternative approaches and civilized debate about processes. Or hey, if you love something we’ve done, feel free to drop by the Slack channel too.

We like to “release early, release often”, which means that we’re nimble and responsive to changes and requests. (It also means that, while we are at v0.8.69, version 1.0, which is due in Summer 2017, will likely introduce breaking – but beautiful – changes. This is thanks to our Style Shepherd, Klaas Vandenberghe.)

in conclusion

We’re a bunch of DBAs who, like you, want to have an even easier time managing SQL Server. dbatools uses PowerShell to help do exactly that. Want to join the fun? We’re an inclusive group — even if you don’t know PowerShell, we’ll help you. Even if you don’t know Git, we’ve got you covered. Come with your experience, your scholarly background or your enthusiasm and passion. We’d love to hear from you.

Thanks for reading. I’m looking forward to the next T-SQL Tuesday!

Posted in PowerShell, SQL Server

This is How I Work

Inspired by Adam Bertram being inspired by Lifehacker’s How I Work series, I’m also doing a post about how I work ;) I really liked Adam’s balanced assessment so I decided to follow in his footsteps there, too.

Where are you located?

Belgium, land of the free, home of the beer. Also home of the Belgian PowerShell User Group, which I run with Luc Dekens.


I moved here back in 2012 from Washington D.C. Prior to my short stint in DC, I lived in Southern California, Northern California and Southern Louisiana.

What is/are your current gig(s)?

I’m a Systems Engineer/DBA for General Dynamics Information Technology. I’ve always heard about General Dynamics growing up because they’ve got a bunch of rocket scientists so it’s pretty cool to get to work for them.

I also have a couple other side gigs. I’ve always preferred it that way; get my primary source of income and health insurance from a corporation then do other stuff on the side. So I’ve had netnerds.net since 1997. The work I do through netnerds is generally web development or systems engineering. But between finishing my masters at Regis University and doing community work, I haven’t had much time for contracting lately.

My other paying side gig is RealCajunRecipes.com. I run this site with my mom and best friend. My mom does some pretty amazing work on Facebook. We’ve got about 90,000 likes as of today. The money we get from Google Ads helps pay for my bombass homelab.

What’s one word to describe your work?


I frikken love what I do. All of it is fun, even when it’s exhausting and even when it sucks.

What apps, software, or tools can’t you live without?

The obvious things: PowerShell, SQL Server Management Studio, RSAT, Remote Desktop Client, putty, GitHub Desktop

Not as obvious:
Snagit 13
Ethervane Echo
PowerShell Studio

Snagit 13 is amazing! I used to use Camtasia and iMovie (well, I still do sometimes) but mostly use Snagit these days. I also use Handbrake to remove black bars on the side of videos and compress the hell out of videos.

I also love leather-bound notebooks and glitter pens. Oh! And I can’t live without my lil Address bar. It’s the first thing I turn on when I log into any desktop.


What does your workspace look like?

Here’s my home office.


That flag on the right is the Acadiana (Cajun) flag. My wife is kind and keeps my space clean – I’m messy by default. Also, I love Apple products, lighthouses and Aveda candles.

The monitor looks all lit up because it is — I bought some Luminoodle bias lighting to help prolong the time I can spend at a computer ;)

What’s a typical work week look like?

My work week starts on Sunday when I spend the first half of the day procrastinating and the second half doing school work. On weekdays, I get up and get my energy from vitamins and energy drinks or tea, depending on whatever phase I’m in.

At work, I do mostly DBA stuff and try to avoid doing SharePoint stuff. Then I go home and either work or hang out. When I work, I usually do dbatools and/or host a webinar for the PASS PowerShell Virtual Chapter.

My weekly output varies depending on my level of burnout. Sometimes I have to force myself to go to bed at 4am because everything is awesome and I’ve got the Flow or I crash around 8pm, exhausted from staying up till 4am for weeks straight ;)

Reading Steve McConnell’s Code Complete helped me with accepting this oscillation between Flow and burnout, which he pointed out is normal for programmers. I’ve tried to moderate but that always just ends up in a premature onset of burnout.

Speaking of, I once had burnout for 3 years. 2008-2011 was pretty much just spent partying. So if you ever get prolonged burnout, don’t worry. Recovery is possible and things won’t have changed so much that you can’t pick it up. The basics will always remain the basics, unless you work with JavaScript.

What do you like the best about your role?

I really love that I get to work with PowerShell all the time. I get to choose the tools I want to use and I’m encouraged to automate.

Also, my role happens to be in Belgium and I love living in Europe. My wife loves it. Our cats love it. We feel very fortunate to be surrounded by beer, different languages and cool people. The chill but enthusiastic members of the European tech community are so down to earth and amazing to be around, too. It’s heaven.

What’s something about you that no one knows about?

I used to work for Ani Difranco. Well, she was my client. I wrote a really long story about it back in 2004. Warning: the writing is cheesy and reminds me of fanfic.

What do you listen to while you work?

I’m totally addicted to Spotify and have over 100 public playlists. Overall, the heaviest in rotation are the House of Cards soundtrack, the Hackers soundtrack, Spotify’s Trapaholics and then my own Dark or Slow, which has songs that are either dark or slow (but always sexy).

What do you wish you could change about your work?

I wish that I could enjoy downtime a bit more. I often feel guilty, as though I should be doing something. Unless I’m traveling. So I guess – I wish I could enjoy downtime at home without guilt. I guess in this way, I kinda feel like Adam. There’s just so much to do.

Is there anything else you’d like to add that might be interesting to readers?

When I first won MVP back in 2015, I became totally overwhelmed and went into a 2 month depression. I was SUPER excited, but then I didn’t know how to handle the attention. It took me a while to accept that I could live up to what Microsoft saw in me.

A few things made it better. First was seeing this thread on Reddit where SharePoint MVPs pointed out that one downside to being an MVP is that people assume you know everything about the product, when really each of us have our own specialties.

I always knew people would expect me to be a PowerShell syntax and architecture expert and I’m not. That thread helped me accept this fact. I’m extremely honored that I can help make a difference in the world of SQL Server and PowerShell, however. That’s what I love and it’s my passion. But when I need syntax help, I do what other people do and ask members of the PowerShell team or other MVPs who specialize in the language itself. I usually leave it to others to debate semantics, which I totally appreciate and ultimately benefit from.

The second thing that helped was going to the MVP Summit and being surrounded by MVPs who were really supportive of my work. SQL Server MVP Aaron Nelson has always been a key person for me in this MVP adventure. He’s always so excited to introduce me to people and he nudges me along when I’m burned out or trying to avoid meeting new people because I’m shy.

And in general, the PowerShell MVPs are just an incredible group of professionals. Like, many of us have strong opinions, but at the end of the day, everyone is so respectful and accepting, even when we’re really direct or impassioned. I’m honored to be a part of such a great group; it’s been positively life changing even if my brain got off to a rocky start :)


Posted in General

PowerShell: Too Many If Elses? Use Switch.

One of my favorite things in PowerShell (and other languages) is the switch statement. It’s clean and a much better alternative to a ton of if elses. Ever had this happen?
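
Something like this, maybe (the file-extension scenario here is my own invention, not the original sample):

```powershell
# determining a delimiter from a file extension with stacked if/elseifs
$extension = ".tsv"

if ($extension -eq ".csv") { $delimiter = "," }
elseif ($extension -eq ".tsv") { $delimiter = "`t" }
elseif ($extension -eq ".psv") { $delimiter = "|" }
elseif ($extension -eq ".txt") { $delimiter = " " }
else { $delimiter = $null }
```

Three branches is fine. Ten branches of that and your eyes glaze over.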

The better way to do it is by using the switch command.
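
With switch, the same logic reads like a table (same made-up scenario as above):

```powershell
$extension = ".tsv"

switch ($extension) {
    ".csv"  { $delimiter = "," }
    ".tsv"  { $delimiter = "`t" }
    ".psv"  { $delimiter = "|" }
    ".txt"  { $delimiter = " " }
    default { $delimiter = $null }
}
```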

Just look how pretty that is. It gets better, though! My buddy Klaas Vandenberghe showed me a more succinct way to use switch.
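
As I recall it, the trick is to let switch emit the value itself and assign the whole statement at once, so the per-branch assignments disappear:

```powershell
$extension = ".tsv"

# switch outputs whatever each scriptblock returns, so assign the statement directly
$delimiter = switch ($extension) {
    ".csv"  { "," }
    ".tsv"  { "`t" }
    ".psv"  { "|" }
    default { $null }
}
```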

Even less code and makes total sense. Awesome. There’s even more to switch — the evaluations can get full on complex, so long as the evaluation ultimately equals $true. Take this example from sevecek. Well, his example with Klaas’ enhancement.
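
I can’t reproduce sevecek’s exact example here, so here’s my own stand-in to show the idea: each condition is a scriptblock, and a branch fires whenever its scriptblock evaluates to $true:

```powershell
$size = 1500

$category = switch ($size) {
    { $_ -lt 100 }                  { "small"; break }
    { $_ -ge 100 -and $_ -lt 1000 } { "medium"; break }
    { $_ -ge 1000 }                 { "large"; break }
}

$category   # large
```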

If you weren’t familiar with switch, I hope this was helpful. I know I was excited about its elegance when I was first introduced by Lee Holmes and Klaas’ enhancement made it even better.

Posted in PowerShell

Immediately Output Runspace Results to the Pipeline

In my previous post, I presented the template I use anytime I need to add multithreading to my scripts.

Recently, I had a request to add multi-threading to Read-DbaBackupHeader. This was the first runspace in which I had to output the results from the runspace to the host — usually I just needed commands to run in the background, like with bulk-inserts to SQL Server.

So two interesting things came out of this particular change. First, I made use of the time that was spent waiting for runspaces to finish.

Now, I’m basically doing this

Ultimately, though, this is what I had to do to get the output right to screen.
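
The trick is to call EndInvoke() inside the wait loop rather than after it, so each result hits the pipeline the moment its runspace finishes. Roughly like this, assuming the $runspaces collection from my runspace template (objects with Pipe and Status properties):

```powershell
# as each runspace completes, send its output straight down the pipeline
while ($runspaces.Status -ne $null) {
    $completed = $runspaces | Where-Object { $_.Status.IsCompleted -eq $true }
    foreach ($runspace in $completed) {
        $runspace.Pipe.EndInvoke($runspace.Status)   # output lands in the pipeline here
        $runspace.Pipe.Dispose()
        $runspace.Status = $null
    }
    Start-Sleep -Milliseconds 100
}
```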

Now, I came upon an issue with the $restore sometimes not returning anything. I’m not sure why; perhaps it has something to do with pooling. SQL Server didn’t return an error reading the backup header, it just returned nothing. To handle this issue, I basically restarted the runspace if the result came back as null. The code is here if you’d like to see how I handled it.

Copy/Pastable Code

I generally like to provide code that actually does something useful but in this case, it just complicated things. So this outputs text — but it does it directly to the pipeline without waiting until all runspaces have finished.
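
Here’s a self-contained sketch of that pattern (the scriptblock just emits a string after a random sleep; swap in your own workload):

```powershell
# BLOCKs 1: create and open a runspace pool, plus a collection to track runspaces
$pool = [RunspaceFactory]::CreateRunspacePool(1, ([int]$env:NUMBER_OF_PROCESSORS + 1))
$pool.Open()
$runspaces = @()

# the workhorse scriptblock: emits some text after a random wait
$scriptblock = {
    param ($name)
    Start-Sleep -Seconds (Get-Random -Minimum 1 -Maximum 3)
    "$name finished at $(Get-Date -Format HH:mm:ss)"
}

# spin up a few runspaces
foreach ($name in "one", "two", "three", "four") {
    $powershell = [PowerShell]::Create()
    $powershell.RunspacePool = $pool
    $null = $powershell.AddScript($scriptblock).AddArgument($name)
    $runspaces += [PSCustomObject]@{ Pipe = $powershell; Status = $powershell.BeginInvoke() }
}

# output each result as soon as its runspace completes - no waiting for the rest
while ($runspaces.Status -ne $null) {
    $completed = $runspaces | Where-Object { $_.Status.IsCompleted -eq $true }
    foreach ($runspace in $completed) {
        $runspace.Pipe.EndInvoke($runspace.Status)
        $runspace.Pipe.Dispose()
        $runspace.Status = $null
    }
    Start-Sleep -Milliseconds 100
}

$pool.Close()
$pool.Dispose()
```

Notice that “one” through “four” won’t necessarily come back in order — whichever finishes first prints first.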

If you’d like to see this with a couple more comments, check out the gist.

Hmm, I wonder if this runspace will be easier for others to understand. Looks like I’ll be linking this blog post in my previous one. Hope this helped!

Posted in PowerShell, SQL Server

Runspaces Simplified (as much as possible)

Last year, I was looking into multi-threading in PowerShell and, with the help of Dr. Tobias Weltner, Boe Prox and Martin Pugh ultimately decided on runspaces.

Then, I presented at psconf.eu about using runspaces to speed up SQL Server/CSV imports. Runspaces took me from about 90,000 rows per second to 230,000 rows per second on average.


Runspaces can be intimidating. I had heard about them, took a look at the code and was like “Ah, that looks complex. I’ll learn that later.” Because of this, I wanted to ease the audience into runspaces and repeatedly went over the bulk insert code to familiarize the audience with the functionality that I was eventually going to multi-thread.

It seems like that approach worked. The audience wasn’t overwhelmed (or didn’t admit to it ;)) — mission accomplished!

All of the code and the PowerPoint seen in the video can be downloaded in my directory on PSConfEU’s GitHub Repository.

Runspace Template

Ultimately, when I want to add a runspace to a command, I download code.zip from my presentation and copy/paste/edit 6-runspace-concept.ps1. When I add multi-threading to my scripts, I just copy and paste the code below, then modify. Understanding everything that’s going on isn’t immediately necessary.

If you’re using Runspaces as an end-user, try Boe Prox’s PSJobs module instead. But if you don’t want dependencies, you can paste the code below.

So that’s basically it. Adding multithreading honestly just requires a bunch of copy/pasting. I generally modify Steps 2 and 3.
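
A minimal version of that template, with the blocks marked (the bulk-insert body is replaced by a stand-in so the sketch runs anywhere):

```powershell
# stand-in inputs - the real ones come from your own import logic
$connstring = "Server=localhost;Database=tempdb;Trusted_Connection=True"
$datatable = New-Object System.Data.DataTable
$batchsize = 50000

# BLOCK 1: create and open a runspace pool (min 1, max processors + 1)
$pool = [RunspaceFactory]::CreateRunspacePool(1, ([int]$env:NUMBER_OF_PROCESSORS + 1))
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()

# BLOCK 2: create a reusable scriptblock - think of it as a function
$scriptblock = {
    param ($connstring, $datatable, $batchsize)
    # the real version bulk-inserts $datatable into SQL Server;
    # this stand-in just reports what it was handed
    "would insert $($datatable.Rows.Count) rows in batches of $batchsize"
}

# BLOCK 3: create the runspace and add it to the runspace pool
$powershell = [PowerShell]::Create()
$powershell.RunspacePool = $pool
$null = $powershell.AddScript($scriptblock).AddArgument($connstring).AddArgument($datatable).AddArgument($batchsize)

# BLOCK 4: add the runspace to the runspaces collection and start it
$runspaces += [PSCustomObject]@{ Pipe = $powershell; Status = $powershell.BeginInvoke() }

# BLOCK 5: wait for the runspaces to finish
while ($runspaces.Status.IsCompleted -contains $false) { Start-Sleep -Milliseconds 100 }

# BLOCK 6: clean up
$results = $runspaces | ForEach-Object {
    $_.Pipe.EndInvoke($_.Status)
    $_.Pipe.Dispose()
}
$pool.Close()
$pool.Dispose()

# BLOCK 7: look at $results to see any errors or any other return
$results
```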


  • BLOCK 1: Create and open a runspace pool. You don’t have to, but it increases performance so I always just leave it in. CreateRunspacePool() accepts min and max runspaces.

    I’ve found that 1 and the number of processors + 1 works best (thanks to Steffan for informing me why 5 was always my quad processor’s sweet spot). Then I play with MTA and STA and see which one works better.

    I also create a runspaces array to keep track of the runspaces. Unfortunately, you can’t do like $pool.runspaces to get the collection, so you have to make your own.

  • BLOCK 2: Create reusable scriptblock. This is the workhorse of the runspace. Think of it as a function.

  • BLOCK 3: Create the runspace and add to runspace pool.

    If you write a lot of PowerShell functions, it should be apparent what this is doing. Basically, AddScript($scriptblock) is the function name, and AddArgument($connstring), AddArgument($datatable) and AddArgument($batchsize) are the parameters.

    Note that you may find yourself passing a lot of parameters because the runspace will be mostly unaware of the variables that exist outside of the $scriptblock.

  • BLOCK 4: Add runspace to runspaces collection and start it

  • BLOCK 5: Wait for runspaces to finish

  • BLOCK 6: Clean up

  • BLOCK 7: Look at $results to see any errors or any other return

Up next

Now there are a couple Runspace commands in v5 but there’s no New-Runspace so this code still applies to v5. I generally code for v3 so this script won’t be changing much in the near future.

In the next blog post, I’ll detail a slightly different runspace, one that immediately outputs the results of the runspace to the pipeline.

Edit: Actually, I just finished that blog post and decided to paste the code here. It’s good for repetition and shows another simplified runspace, this time without comments. If you’d like more info on this runspace, check out the post.

Posted in PowerShell

Contributing to dbatools & dbareports using GitHub Desktop

Repost from dbatools.io: Recently, I presented to the Philadelphia PowerShell User Group about Contributing to Open Source PowerShell projects using GitHub Desktop. I chose this topic because a number of people have asked for step-by-step instructions on how to work with GitHub.

I love the command line, but with Git? I think it’s important to get the concepts down first, and the Desktop version makes it a lot easier to contribute to projects like dbatools, dbareports and even PowerShell itself.

In the presentation, I pretty much follow along with the tutorial on our wiki called Your First Pull Request. But! I also included some information about working with PowerShell, PowerShell modules and PowerShell-centric GitHub repositories that was not in the tutorial, so there’s still some good information to gain even if you already know how to make a Pull Request.

So if you’ve wanted to get involved in the project but were a bit hesitant because you aren’t comfortable with git, check out this informative and fun video and Get-Contributing ;)

Posted in PowerShell, SQL Server

Share VPN with OS X Sierra Internet Sharing

After finding that it suited my requirements, I finally decided on a solid VPN – F-Secure Freedome – as recommended by a friend in security. Then I needed to share my connection with my Roku, but Freedome is an application that has to be run on a phone or computer and routers aren’t supported.

Once I confirmed that sharing a Freedome encrypted connection is supported and within the TOS, I began the 6-hour journey into sharing my VPN connection with my Roku.

Try, Try Again

First, I tried simply connecting to Freedome then turning on my Mac’s Internet Sharing. I found that as soon as I successfully connected to Freedome, my Internet Sharing would stop working.

So I revisited my blog post that I wrote a few years ago, How to Setup NAT on Lion and Mountain Lion, because I figured I needed to do some command-line magic.

The commands I used there were deprecated, so I moved on and found this awesome comment on StackExchange to get me started. This comment provides a majority of the code below. But it was only mostly working. This post, Share your VPN with Mac OS X El Capitan, felt like it got me closer.

And finally, this post suggested that I setup two rules instead of one and voila! Success.

Setup the infrastructure

Set this part up however you need. I did this wired-to-wired and it worked, too.

So I connected my Mac to my primary wireless network haxx, then plugged the Mac’s Ethernet port to the upstream of my second wireless router otherhaxx. I setup the otherhaxx the way I would any other wireless router and connected my Roku wirelessly to otherhaxx.


Note that some of these IPs won’t appear until later. Namely, Internet Sharing creates the 192 subnet once it starts, and the Roku won’t get its IP until everything is set up. Also, that was my first broadband IP ever, back in 1997 :D

Now back to the Mac. I did some ifconfigs and found that my wireless connection was en1 and Freedome created an interface named utun0. Mac Internet Sharing created a network with the subnet 192.168.2.x.

I then set Freedome to connect on startup.

Ensure you add a single trailing empty line

My code formatter won’t let me add them, but in the original post, the author said a single trailing empty line in each file is required.

Create /private/etc/nat-rules

This is where I spent most of my time testing and retesting different configs. This right here is the key to sharing your VPN connection with Internet Sharing. I tried doing it without explicitly listing the subnet but was unable to get it to work.
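
For reference, the working version ended up roughly like this — two NAT rules with the Internet Sharing subnet listed explicitly (en1, utun0 and 192.168.2.0/24 are from my setup; check ifconfig for yours):

```
nat on en1 from 192.168.2.0/24 to any -> (en1)
nat on utun0 from 192.168.2.0/24 to any -> (utun0)
```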

Create /usr/local/nat-pf.sh

This file is cool because it’s universal; you really only need to edit nat-rules when you need to make modifications. Oh, also, one of my Macs refused to acknowledge that net.inet.ip.fw.enable existed and it still works anyway. I don’t think that particular entry matters.
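
The script itself is just a few lines — a sketch of mine, run as root:

```
#!/bin/sh
# enable IP forwarding and (re)load the NAT rules into pf
sysctl -w net.inet.ip.forwarding=1
pfctl -d                             # stop pf if it's already running
pfctl -F all                         # flush any existing rules
pfctl -f /private/etc/nat-rules -e   # load our rules and enable pf
```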

Create /Library/LaunchDaemons/org.user.natpf.plist

No idea if this works, but it seems to. I should reboot and test ;)
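
A minimal launchd plist along these lines should do it — it just runs the script at load:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>org.user.natpf</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/nat-pf.sh</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
```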

Clean up and launch the daemon

Now that the files have been created, ensure they’ve got the proper owners and permissions.
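
Something like this covers the ownership, permissions and the initial load:

```
sudo chown root:wheel /private/etc/nat-rules /usr/local/nat-pf.sh /Library/LaunchDaemons/org.user.natpf.plist
sudo chmod 755 /usr/local/nat-pf.sh
sudo launchctl load /Library/LaunchDaemons/org.user.natpf.plist
```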

Didn’t work?

I ended up editing /private/etc/nat-rules and executing /usr/local/nat-pf.sh more times than I can count. Mess with it until it works. Ensure you’re using the appropriate device names by examining the output of ifconfig.

Posted in Networking, OS X & iPhone

Guest blogger Cláudio Silva: Why Expand-SqlTLogResponsibly?

Hey yall! I invited one of the most active dbatools developers, Cláudio Silva, to do a guest post about his motivation for joining the team and why he built this particular command. So without further ado… here’s Cláudio :D

Hi everyone, my name is Cláudio Silva. I’m a MS SQL Server DBA and have worked with SQL Server since version 2000. I worked as a SQL developer for 8 years before becoming a full-time DBA. I’m also a PowerShell enthusiast!

I was invited by none other than PowerShell MVP Chrissy LeMaire (thank you, Chrissy!) to share why I came up with the idea of developing this function in PowerShell, and why I gave it away to the community.

I will start with the second part. When I first discovered the dbatools project, I was pleasantly surprised that such a useful project existed. Remember when SQL Server 2005 mainstream support ended last April? Well, like many other organizations, we will have to migrate, so I decided to analyse dbatools, test it and get to know its limitations.

When I saw that it is an open source project, and that Chrissy was soon coming to Portugal to present about migrations at TugaIT, I thought: I want to share my opinions with her and be part of the team. There is so much fun work we can do to help the community that has been helping me over the last years. This is my opportunity to give back some love! And here I am.

Now, on to what really matters: why did I create a PowerShell function (Expand-SqlTLogResponsibly) to help SQL Server database transaction logs grow responsibly?

A little background

Transaction logs are one of the most important parts of every database, and they have a performance side that should not be neglected.

The transaction log is composed of one or more VLFs (Virtual Log Files). Having a transaction log with too many VLFs can hurt your database performance in many ways:


  • Transaction log backups may slow down
  • Database recovery can take too much time (I have seen with my own eyes recovery take hours after a database shutdown, just because the transaction log had more than 1000 VLFs)
  • It can also affect insert/update/delete performance

Ok, but what is a Virtual Log File, and how are they created?

When we tell SQL Server to grow the transaction log file (or auto-growth kicks in), SQL Server divides that amount of space into several “logical files” (VLFs) of the same size. The number of VLFs varies with the quantity of space being added.
Read more about it in the following link from Paul Randal (b), and also note that this algorithm changed in SQL Server 2014 (Important change to VLF creation algorithm in SQL Server 2014).

How can I see how much VLFs I have right now?

You can use the command DBCC LOGINFO (‘<database name>’) to find out how many VLFs your transaction log has. Each row returned is one VLF.

What do we have to do to fix this?

We have to do 2 major steps:

  1. Shrink the transaction log file. We need to take the database RECOVERY MODEL into account, because if the database is using the FULL recovery model, we need to back up the transaction log first, and only then can we run the DBCC command “DBCC SHRINKFILE (‘<logfile_name>’, TRUNCATEONLY)”. Afterwards, check your transaction log size and/or number of VLFs to know whether you have to repeat this step (BACKUP/SHRINKFILE).
  2. Grow the file manually to a specific size, using:

ALTER DATABASE <databasename> MODIFY FILE (NAME = ‘<transactionloglogicalfilename>’, SIZE = <newtotalsize>)

I see, but for the second step: if I want to grow the file by 50GB (because that was the size I saw before starting step 1), should I do it all at once? Or should I execute the command many times with smaller sizes until I reach 50GB?

This is a very good question. If you have too few VLFs for a 50GB log file, when does a VLF become available to be re-used? And if we want to shrink the file (in order to grow it again), we can only do that by removing VLFs that are marked for reuse.

If we grow the file by 50GB at once (prior to SQL Server 2014), we will get 16 VLFs of roughly 3200MB each, which means a VLF can’t be reused before 3200MB has been used (this might not be bad if you never need to shrink again). That is why our function suggests growing the file in 8GB chunks (each chunk creates 16 VLFs of roughly 512MB) until the desired size is reached (for target sizes of 16GB or higher). For our 50GB example, that would be 8 – 16 – 24 – 32 – 40 – 48 – 50: we have to execute the command seven times to reach the value we want.

Now that I am more familiar with the concepts: why did you convert these steps into a PowerShell function?

The quick answer is to automate all of these steps! Call the function, then sit back and watch it do all the work for you!

Really, just type the instance, database and your desired target size, and we will figure out the best chunk size for your needs. You won’t need to choose the chunk size, nor write and call the ALTER DATABASE command 7 times (picking up the last example).

NOTE: If you want, you can specify the -IncrementSizeMB parameter to set the desired chunk size.
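
So a call might look something like this (only -IncrementSizeMB is confirmed above; the other parameter names here are illustrative, so run Get-Help Expand-SqlTLogResponsibly for the real ones):

```powershell
# grow db1's transaction log to 50GB (51200MB) in responsibly sized chunks
Expand-SqlTLogResponsibly -SqlServer sql2016 -Database db1 -TargetLogSizeMB 51200
```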

Does this include the backups of the transaction log?

YES! Jason Squires has contributed code for the backup/shrinkfile part! Which means that if you do your backups to the file system, you can specify the path and the function will do ALL that work for you!

With a single command, we can get up to 5 log backups plus X ALTER DATABASE commands executed without any further action needed.

Please visit the Expand-SqlTLogResponsibly dbatools page to view examples and give it a try!

If you have any doubts or want to share your thoughts, drop me a line in the comments or send me a DM on Twitter.

Thanks for reading!


Posted in PowerShell, SQL Server