As outlined in the blog post decreasing module import times, we’ve taken multiple approaches to reduce the import times, including combining all of the commands into one large allcommands.ps1 file.
Last week, I took it a step further.
I don’t know the in-depth internals of Import-Module, but I know that importing a DLL filled with C# cmdlets is extremely fast. For instance, Microsoft’s
SqlServer module imports 100 commands in less than a second. Sometimes it’s closer to half a second!
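You can check numbers like this yourself with Measure-Command (a fresh PowerShell session gives the most honest result, since assemblies stay cached once loaded):

```powershell
# Time a module import; run in a fresh session for a cold-start number
$elapsed = Measure-Command { Import-Module SqlServer }
'{0:N0} ms' -f $elapsed.TotalMilliseconds
```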
I wondered if we could somehow compile our commands into a C# binary, but that seemed far-fetched. One thing we could do, though, is use compression! It works for SQL Server backups; could it work for PowerShell?
Turns out that the approach worked! I believe this is due to the performance benefits of streaming and reduced I/O. Note that this technique is part of a multi-pronged approach, which includes using runspaces and avoiding Get-ChildItem.
Here’s how I did it. First, each time I publish the module, I rebuild allcommands.ps1 then zip it. This also reduces the size of our module on disk, since the uncompressed ps1 is over 5MB and the zip is less than 1MB 👍
Set-Content -Encoding UTF8 -Path C:\github\dbatools\allcommands.ps1 -Value "### DO NOT EDIT THIS FILE DIRECTLY ###"
Get-ChildItem -Path "C:\github\dbatools\functions\*.ps1" -Recurse | Get-Content | Add-Content C:\github\dbatools\allcommands.ps1
Get-ChildItem -Path "C:\github\dbatools\internal\functions\*.ps1" -Recurse | Get-Content | Add-Content C:\github\dbatools\allcommands.ps1
Remove-Item -Path C:\github\dbatools\allcommands.zip -ErrorAction Ignore
Compress-Archive -Path C:\github\dbatools\allcommands.ps1 -DestinationPath C:\github\dbatools\allcommands.zip
Remove-Item -Path C:\github\dbatools\allcommands.ps1 -ErrorAction Ignore
Next, I added the following to our module file. This code is run each time the module imports. You’ll notice that it opens the zip and streams it right in as a script block.
Add-Type -AssemblyName System.IO.Compression.FileSystem
# Open the zip and stream its single .ps1 entry straight into memory
$zip = [System.IO.Compression.ZipFile]::OpenRead((Resolve-Path -Path "$script:PSModuleRoot\allcommands.zip"))
$stream = $zip.Entries[0].Open()
$reader = New-Object IO.StreamReader($stream)
# Execute the decompressed text as a script block in the module's scope
$ExecutionContext.InvokeCommand.InvokeScript($false, ([scriptblock]::Create($reader.ReadToEnd())), $null, $null)
$reader.Close()
$zip.Dispose()
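If you want to try the round trip yourself without touching the dbatools repo, here’s a minimal, self-contained sketch using hypothetical temp-folder paths (demo.ps1 and Get-Demo are made up for illustration):

```powershell
# Hypothetical temp-path demo of the compress-then-stream technique
$src  = Join-Path ([IO.Path]::GetTempPath()) 'demo.ps1'
$dest = Join-Path ([IO.Path]::GetTempPath()) 'demo.zip'
Set-Content -Path $src -Value 'function Get-Demo { 42 }'
Compress-Archive -Path $src -DestinationPath $dest -Force

Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip    = [System.IO.Compression.ZipFile]::OpenRead($dest)
$reader = New-Object IO.StreamReader($zip.Entries[0].Open())
# Dot-source the decompressed text so its functions land in this scope
. ([scriptblock]::Create($reader.ReadToEnd()))
$reader.Close()
$zip.Dispose()

Get-Demo   # → 42, defined from inside the zip without extracting to disk
```

The only real difference from the module code above is the scoping: the module uses InvokeScript so the commands land in the module’s scope rather than the caller’s.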
What’s the trade-off? More CPU usage on your part for the moment it takes to decompress and stream the file (though you’ll save on I/O), and for me, an extra (automated) publish step.
Ultimately, this approach shaved off about a third of our import time. If you’re looking to squeeze as much speed out of your import as possible, compression can help. And don’t forget, if you have super slow imports, it may be your Execution Policy.
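To see what policy your session is running under (stricter policies like AllSigned can add signature-checking overhead to every file loaded at import time):

```powershell
# Show the effective execution policy at each scope
Get-ExecutionPolicy -List
Get-ExecutionPolicy   # the one that wins for this session
```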