Creating a WCF Net.TCP Service and Client using PowerShell

I’m currently attending Regis University for my grad degree, and just finished up a networking class. For my final, I chose to explore the Windows Net.TCP Port Sharing Service. I had this glorious vision of translating some C# code to 100% PowerShell code, but eventually gave in to the fact that PowerShell can’t natively declare the attributed service contract interfaces that WCF requires, so a bit of the code had to be written in C#.

Ultimately, I also learned that Net.TCP port sharing is really just a port proxy facilitated by SMSvcHost.exe, and the differentiator, instead of being a port number, is a namespace. Below is the code I used for proof of concept. You may find it useful when testing your own C# code.
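
A condensed sketch of that proof of concept (IGreeter/Greeter and the URI are illustrative placeholders, not necessarily the original names; the Net.Tcp Port Sharing Service must be running before the host opens):

    # WCF contracts must be compiled C#; PowerShell can't declare the
    # [ServiceContract] interface natively, so compile it inline.
    $source = '
    using System.ServiceModel;
    [ServiceContract]
    public interface IGreeter
    {
        [OperationContract]
        string SayHello(string name);
    }
    public class Greeter : IGreeter
    {
        public string SayHello(string name) { return "Hello, " + name; }
    }'
    Add-Type -ReferencedAssemblies System.ServiceModel -TypeDefinition $source

    # With port sharing, the namespace in the URI, not the port, differentiates services
    $uri = New-Object Uri 'net.tcp://localhost:808/PoC/Greeter'
    $binding = New-Object System.ServiceModel.NetTcpBinding
    $binding.PortSharingEnabled = $true   # SMSvcHost.exe proxies the connection

    $svcHost = New-Object System.ServiceModel.ServiceHost([Greeter], [Uri[]]@($uri))
    [void]$svcHost.AddServiceEndpoint([IGreeter], $binding, $uri)
    $svcHost.Open()

    # Client side: a ChannelFactory pointed at the same namespace
    $address = New-Object System.ServiceModel.EndpointAddress $uri
    $factory = New-Object 'System.ServiceModel.ChannelFactory[IGreeter]' ($binding, $address)
    $client = $factory.CreateChannel()
    $client.SayHello('PowerShell')

    $factory.Close()
    $svcHost.Close()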

Thanks go to Allan and this user for getting me started with the WCF PowerShell code.

Posted in PowerShell

Securely Administer Your Remote Windows Network using RDP over SSL

Back in 2013, I wrote a blog post about setting up RD Gateway in Windows 2012 using an AD domain certificate. This post is aimed at Windows 2012 R2. There isn’t much difference, but in this tutorial, I’ll demonstrate how to set up RD Gateway with a globally recognized SSL certificate.

Like my previous post about setting up an SSL VPN on Windows 2012 R2, I strongly suggest you forgo self-signed and even Enterprise AD certificates, and just use a certificate from cheapsslsecurity.com. This prevents non-domain devices from having to install your CA’s root cert. Getting a legitimate cert can take as little as 5 minutes, costs just $5.99 per year, and can be obtained in 12 easy steps.

Overall, there are three major steps to getting this going:

  1. Obtain and install your SSL certificate
  2. Install & Configure RD Gateway
  3. Set up your client.

Install the SSL Certificate

Step 1

Follow my tutorial for getting a legit $5.99 cert, down to creating the .pfx file.

Remember to use the external hostname of your RD Gateway server. Say, for example, rdgateway.acme.com instead of rdgateway.ad.local.

Step 2

Import your PFX to the local machine’s Certificate store. To do this, run certlm.msc -> Personal -> Certificates -> Right-click, All Tasks -> Import -> Next -> Select your cert -> Enter your password -> Next -> Finish.
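
If you’d rather script it, the PKI module that ships with Windows 8 / Server 2012 and up can do the same in one line (the path is a placeholder):

    # Import the PFX into the Local Machine Personal store
    $password = Read-Host -AsSecureString -Prompt 'PFX password'
    Import-PfxCertificate -FilePath C:\certs\rdgateway.pfx -CertStoreLocation Cert:\LocalMachine\My -Password $password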

[Screenshot: importing the certificate into the Local Machine Personal store]

Install and configure RD Gateway

Step 1

Add the Remote Desktop Services role. Server Manager -> Manage -> Add Roles and Features
-> Role-based or feature-based installation.

You will be tempted to check the other one, but don’t. That’s for virtual desktop deployment.

[Screenshot: choosing Role-based or feature-based installation]

Step 2

Click Remote Desktop Services.

[Screenshot: selecting the Remote Desktop Services role]

Step 3

Click Next a few times, until you’re on the Role Services window. Check only Remote Desktop Gateway.
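
For the GUI-averse, these wizard steps collapse to a one-liner:

    # Installs the RD Gateway role service plus its admin tools
    Install-WindowsFeature RDS-Gateway -IncludeManagementTools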

[Screenshot: the Role Services window with only Remote Desktop Gateway checked]

Posted in Security, Windows

Setup an SSTP SSL VPN in Windows Server 2012 R2

So here’s what’s awesome about Secure Socket Tunneling Protocol (SSTP) SSL VPNs: they give your connecting client an IP address and make it a full-on part of the network. And this is all done over port 443, a commonly used port which is often left open on firewalls. SSTP SSL VPNs are not like some fake “SSL VPNs” that just give users a webpage and some sort of RDP session.

It’s also relatively easy to set up. While there are a lot of tutorials that show how to set up SSTP SSL VPNs using AD CA generated certificates, I strongly suggest you forgo that, and just use a globally recognized certificate. This prevents outside users from having to install your CA’s root cert. It also prevents them from having to make a registry change if your CRL is not published and available online. All around, a $5.99 cert that can be obtained in 12 steps is well worth the time and money invested.

This tutorial will cover how to easily setup an SSTP SSL VPN in Windows 2012 R2 using a legit cert. If you want to use your own domain’s cert, there are other websites that provide step-by-steps. advancedhomeserver.com is my preferred tutorial.

Overall, there are four major steps to this:

  1. Install the appropriate certificate
  2. Setup Routing and Remote Access
  3. Configure NPS (Optional)
  4. Set up your client.

Install the SSL Certificate

Step 1

First, follow my tutorial for getting a legit $5.99 cert, down to creating the .pfx file.

Step 2

Import your PFX to the local machine’s Certificate store. To do this, run certlm.msc -> Personal -> Certificates -> Right-click, All Tasks -> Import -> Next -> Select your cert -> Enter your password -> Next -> Finish.

[Screenshot: importing the certificate into the Local Machine Personal store]

Install and configure the RRAS role

Step 1

Add the Remote Access role. Server Manager -> Manage -> Add Roles and Features -> Remote Access.

[Screenshot: selecting the Remote Access role]

Step 2

Click Next a couple times, then just click DirectAccess and VPN. DirectAccess seems cool, but it’s only intended for mobile domain-joined computers, which I’m not looking to support.
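
As with RD Gateway, there’s a one-line PowerShell equivalent if you’d rather skip the wizard:

    # Installs the RRAS VPN / DirectAccess role service and admin tools
    Install-WindowsFeature DirectAccess-VPN -IncludeManagementTools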

[Screenshot: selecting DirectAccess and VPN (RAS)]

Posted in Security, Windows

Super Cheap SSL Certs for Your Home Lab or Small Business

I <3 cheapsslsecurity.com. They (and probably others) sell globally recognized $5.99 annual certs. Now, all of my lab stuff (Lync/RD Gateway/NetScaler/SSL VPN) is encrypted, and I no longer have to manually install my Domain CA’s root cert on my phones/other devices. They don’t have an affiliate program, so I’m not getting any money from this blog post. I just want to spread the joy.

Certs can be purchased using Visa or PayPal, and the whole process takes about 5 minutes. And now that you can verify your identity using email, gone are the days of faxing incorporation paperwork to Certificate Authorities. Even the Certificate Signing Request process has been simplified and can be completed online at cheapsslsecurity.com or the open source CSRgenerator.com.
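
If you’d rather not trust a website with your CSR, certreq.exe (built into Windows) can generate one locally; the INF values below are placeholders for your own hostname:

    # request.inf describes the key pair and subject for the CSR
    $inf = '
    [NewRequest]
    Subject = "CN=rdgateway.acme.com"
    KeyLength = 2048
    Exportable = TRUE
    MachineKeySet = TRUE
    RequestType = PKCS10'
    Set-Content -Path .\request.inf -Value $inf

    certreq -new .\request.inf .\request.csr
    Get-Content .\request.csr   # paste this into the CA's order form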

Now to get your own legitimate $5.99 SSL Cert in 12 steps.

Step 1

You can choose different vendors, but I always use Comodo because why not.

[Screenshot: choosing a certificate vendor]

Posted in Security

Using .NET DataTable Computes and Column Expressions in PowerShell

As a SQL Server DBA, I’m used to SQL being able to perform aliasing, super fast subselects, and easy aggregation. My most recent project, Invoke-Locate.ps1, however, uses SQLite because of its portability and speed when performing simple selects and inserts. The downside is that SQLite doesn’t handle some subselects very well, and I was left with manipulating data on the client-side.

Once I ported locate, I wanted to use the data locate gathered to replicate the results of another favorite tool of mine in Linux, du, which displays current disk usage information. This is helpful when you want to find out what’s using all of your disk space. Specifically, I wanted to replicate the output seen when performing du -all -h.

[Screenshot: du -all -h output on Linux]

For the PowerShell translation, I ultimately used $datacolumn.Expression and $datatable.Compute to help aggregate information about disk space usage when the SQL-equivalent subselects proved too resource intensive for SQLite. There was a pure SQLite solution using temp tables, but the code was ugly, and tasks like this seem to be exactly what Expressions and Computes were made for.

[Screenshot: the PowerShell du-style output]

You may notice that du says 69MB, while my output shows 68MB. This is because of slightly different rounding. Windows Explorer shows the size of the folder as 68.3MB.

Below I outline what I’ve learned about DataTable Computes and Column Expressions and how I used them.

What is DataTable Compute?

DataTable Computes aggregate data within an entire table in the format $datatable.Compute($expression, $filter). Think of the filter as a WHERE clause, and the expression like a SELECT Agg(). Microsoft provides the following example (translated to PowerShell). In this example, the computation would show the SUM of all Orders where the EmpID column equals 5.
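
In PowerShell, that example is a one-liner (the Orders table with its Total and EmpID columns comes from Microsoft’s docs):

    # SUM(Total) over rows WHERE EmpID = 5
    $sum = $datatable.Compute('Sum(Total)', 'EmpID = 5')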

I had multiple columns to compute, so I added new columns to my data table, and populated each new column with the computed information.
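The loop looked roughly like this (a reconstruction; the column names match the sample project later in this post, and paths containing apostrophes would need extra escaping):

    foreach ($row in $datatable.Rows) {
        # every file whose directory starts with this row's directory
        $filter = "directory LIKE '$($row.directory)%'"
        $row.totalkb = $datatable.Compute('Sum(kilobytes)', $filter)
        $row.totalmb = $datatable.Compute('Sum(megabytes)', $filter)
        $row.totalgb = $datatable.Compute('Sum(gigabytes)', $filter)
    }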

This code basically translates to “For every directory, give me the sum of kilobytes, megabytes and gigabytes of all of its subdirectories, and add this data to the row’s totalkb, totalmb, and totalgb columns.”

What is a DataColumn Expression?

Column expressions are intended to filter rows, calculate the values in a column, or create aggregate columns. They provide functionality akin to WHERE, SUM(), or CASE in SQL statements. Microsoft’s data column expression page is actually pretty thorough and provides the following straightforward examples:
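
(Adapted here to PowerShell; the column names come from Microsoft’s examples and must already exist in $datatable.)

    # A calculated column: multiply two existing columns
    $datatable.Columns['total'].Expression = 'UnitPrice * Quantity'
    # Concatenation and type conversion work too
    $datatable.Columns['label'].Expression = "'Order ' + Convert(OrderID, 'System.String')"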

Here’s another simple example: say you have a column in your datatable that contains file sizes in bytes. Column Expressions can be used to display the file size as kilobytes, megabytes, and gigabytes.
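
For instance, a kilobytes column (megabytes and gigabytes follow the same pattern; the sample project below uses all three):

    # The third argument to Columns.Add is the expression
    [void]$datatable.Columns.Add('kilobytes', [double], 'bytes / 1024.0')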

Sample Project

In this sample project, we’ll populate a data table with file information, then use Compute and Expression to display the results in our desired format.

Create Basic DataTable and Populate with Data

Here, we’ll create a DataTable that has just three columns: name, directory and bytes.
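
A minimal version (C:\temp is an arbitrary example path):

    $datatable = New-Object System.Data.DataTable
    [void]$datatable.Columns.Add('name', [string])
    [void]$datatable.Columns.Add('directory', [string])
    [void]$datatable.Columns.Add('bytes', [int64])

    # Fill it from the file system
    Get-ChildItem C:\temp -Recurse -File | ForEach-Object {
        [void]$datatable.Rows.Add($_.Name, $_.DirectoryName, $_.Length)
    }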

[Screenshot: the freshly populated DataTable]

Now Transform The Data

Here are simple expressions that concatenate strings, and change bytes into kilobytes, etc. In my own PowerShell project, Invoke-Locate.ps1, I actually used a SQLite view to accomplish similar results.

Create columns which display the full path using directory and filename.
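
(One expression column; the backslash is just part of the string literal.)

    [void]$datatable.Columns.Add('fullname', [string], "directory + '\' + name")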

Create columns to display bytes as kilobytes, megabytes, and gigabytes.
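
(Same pattern, one column per unit.)

    [void]$datatable.Columns.Add('kilobytes', [double], 'bytes / 1024.0')
    [void]$datatable.Columns.Add('megabytes', [double], 'bytes / 1048576.0')
    [void]$datatable.Columns.Add('gigabytes', [double], 'bytes / 1073741824.0')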

Here’s what the datatable looks like now:

[Screenshot: the DataTable with the new expression columns]

Create new columns that will contain the sizes of each directory
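
(These are plain columns with no expression; the Computes below fill them.)

    foreach ($name in 'totalkb', 'totalmb', 'totalgb') {
        [void]$datatable.Columns.Add($name, [double])
    }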

Populate new totals columns using DataTable Computes

In this step, I’m performing the equivalent of a subselect in SQL, using the Compute loop shown earlier in this post. With DataTables, however, each column must be populated row by row, because Computes are run against an entire table, so you can’t just have one summarizing column.

Now the datatable has useful information, but it’s not quite clean enough. Nor does it look like the Linux du command, which is what I was after.

[Screenshot: the DataTable with the totals columns populated]

Use the totals data to make a single column which displays human readable file sizes

Now to use a slightly more advanced Column Expression: the IIF. Here, I created a new column, then used the IIF statement, which “gets one of two values depending on the result of a logical expression.” The code displays the simplified size of a file. If the size is less than 1 MB (< 1025kb), then show the result in KB. Otherwise, if the size is less than 1GB, show it in MB, otherwise, show it in GB.
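
A reconstruction of that expression (rounding is omitted for brevity):

    $expr = "IIF(totalkb < 1025, Convert(totalkb, 'System.String') + 'K', " +
            "IIF(totalmb < 1025, Convert(totalmb, 'System.String') + 'M', " +
            "Convert(totalgb, 'System.String') + 'G'))"
    [void]$datatable.Columns.Add('size', [string], $expr)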

The Output

[Screenshot: the final, human-readable output]

If you recall, we started this table with only 3 columns, and through the use of Expressions and Computes, used those three columns to produce more useful output. To see all the code in one shot, check out the gist.

Posted in PowerShell, SQL Server

Working with Basic .NET DataSets in PowerShell

This is mostly for my reference, but you can use it, too :)

Create Basic Datatable
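
A minimal sketch (the column names are arbitrary):

    $dt = New-Object System.Data.DataTable
    [void]$dt.Columns.Add('artist')   # untyped columns default to string
    [void]$dt.Columns.Add('album')
    [void]$dt.Rows.Add('The Cure', 'Disintegration')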


Create Basic DataSet (Collection of Tables)
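
(Continuing from the table above.)

    $ds = New-Object System.Data.DataSet
    $ds.Tables.Add($dt)   # Add(DataTable) returns nothing
    $ds.Tables[0]         # retrieve by index, or by name if the table was named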


Selecting
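
(Select takes a filter and an optional sort, like a tiny WHERE/ORDER BY.)

    $rows = $dt.Select("artist = 'The Cure'", 'album ASC')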

Create Datatable from CSV

Using this method, you can add 140k rows a second.
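
One way to get that kind of rate is a StreamReader loop (this sketch assumes a header row and no embedded delimiters):

    $dt = New-Object System.Data.DataTable
    $reader = New-Object System.IO.StreamReader 'C:\temp\music.csv'
    foreach ($columnname in $reader.ReadLine().Split(',')) {
        [void]$dt.Columns.Add($columnname)
    }
    while (($line = $reader.ReadLine()) -ne $null) {
        [void]$dt.Rows.Add($line.Split(','))
    }
    $reader.Close()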

Managing Duplicates
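
(DataView.ToTable with $true returns DISTINCT rows over the named columns.)

    $distinct = $dt.DefaultView.ToTable($true, 'artist', 'album')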

Create new table using another datatable’s schema
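
(Clone copies the schema only; Copy would copy schema plus data.)

    $newdt = $dt.Clone()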

Filling Datatables with data from SQL Server
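
A SqlDataAdapter sketch (the connection string and query are illustrative):

    $connstring = 'Server=localhost;Database=master;Integrated Security=true'
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter('SELECT name FROM sys.databases', $connstring)
    $dt = New-Object System.Data.DataTable
    [void]$adapter.Fill($dt)   # Fill opens and closes the connection itself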

Posted in PowerShell

Find Duplicates in PowerShell Datatable using LINQ

I don’t have a practical application for this method, but I wanted to figure out how to do it anyway. Doing so was frustrating but kinda fun. It forced me to learn about lambdas in PowerShell, which I’ll probably use at some point in the future. Nivot Ink had some straightforward syntax that helped me, along with Matt Graeber’s post on PowerShellMagazine.

Essentially, I was looking to recreate the following SQL statement in LINQ: SELECT ColumnName, COUNT(*) as DupeCount FROM TableName GROUP BY ColumnName HAVING COUNT(*) > 1. The PowerShell LINQ equivalent is basically this:
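
(A reconstruction, expanded step by step in the sample project below, where $datatable is built.)

    # GROUP BY the key, then HAVING COUNT(*) > 1
    $rows   = $datatable.Select()   # a typed DataRow[], which LINQ accepts
    $key    = [Func[System.Data.DataRow, string]] { param($r) "$($r.artist)|$($r.album)" }
    $groups = [System.Linq.Enumerable]::GroupBy($rows, $key)
    $dupes  = $groups | Where-Object { @($_).Count -gt 1 }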

Sample Project

In this sample project, we’ll create some fake data, then query it using LINQ. I tested this on larger data sets and found it got exponentially slower. The OleDbConnection method I wrote about earlier still smokes this technique. Then again, when it comes to .NET, I have little idea of what I’m doing, so maybe it can be optimized. I’m trying to figure out a way to query this list with CreateQuery, too.

Create Datatable with Dupes
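
(The artist/album values are arbitrary.)

    $datatable = New-Object System.Data.DataTable
    foreach ($columnname in 'artist', 'album') {
        [void]$datatable.Columns.Add($columnname)
    }
    [void]$datatable.Rows.Add('The Cure', 'Disintegration')
    [void]$datatable.Rows.Add('The Cure', 'Disintegration')   # the duplicate
    [void]$datatable.Rows.Add('The Cure', 'Wish')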

Prep the dupe check

First we have to decide what columns to examine. In this example, an identical artist+album pair constitutes a duplicate.
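
(Both columns are concatenated into a single grouping key.)

    $key = [Func[System.Data.DataRow, string]] { param($r) "$($r.artist)|$($r.album)" }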

Populate new datatable with results and clean up

The above code was prep work; now we’re going to actually execute the query and populate a datatable with the results. Unfortunately, there’s no good way to convert a LINQ result to a DataTable, so we’ll just use .Clone() to build the new table’s schema, and then perform an ImportRow.
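
(A sketch of that execute-and-import step.)

    $rows      = $datatable.Select()
    $groups    = [System.Linq.Enumerable]::GroupBy($rows, $key)
    $dupetable = $datatable.Clone()   # schema only, no rows
    foreach ($group in ($groups | Where-Object { @($_).Count -gt 1 })) {
        foreach ($row in $group) { $dupetable.ImportRow($row) }
    }
    $dupetable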

And voila, the results

[Screenshot: the duplicate rows returned]

Want to see the code all in one shot? Check out the gist.

Posted in PowerShell

High-Performance Techniques for Importing CSV to SQL Server

If you’ve ever tried to use PowerShell’s Import-CSV with large files, you know that it can exhaust all of your RAM. Previously, I created a script on ScriptCenter that used an alternative technique to import large CSV files, and even imported them into SQL Server at a rate of 1.7 million rows a minute. I’ve now gotten that number up to a massive 3.2 million rows a minute for non-indexed tables and 2.6 million rows a minute for tables with clustered indexes.

Note: I’ve added the VB.NET code equivalent at the end of this post, courtesy of a fellow redditor. The VB.NET executable imported about 4 million rows a minute.

Importing into SQL Server

Two of the fastest ways to import data are bcp and SqlBulkCopy. BCP is finicky, and requires a separate download. My intention is to create a script that can be used on any machine running PowerShell v3 and above, so these scripts use SqlBulkCopy. The catch is that SqlBulkCopy.WriteToServer() requires a DataTable, DataRow[], or IDataReader as input. This means your plain-text CSV data must be transformed into one of these objects prior to being passed to WriteToServer().

I’ve tested three different methods for transforming CSV data to a consumable object. The first method uses SqlBulkCopy, StreamReader, and DataTable batches. The second streams an OleDbConnection query result set to SqlBulkCopy. The third uses SqlBulkCopy, VisualBasic.FileIO.TextFieldParser, and DataTable batches. I actually tested logparser, too, but while logparser is amazing (and way, way faster) for parsing files, it doesn’t take advantage of SqlBulkCopy, so these approaches actually outperform logparser (and my tests with SSIS, for that matter).

[Chart: performance comparison of logparser and the OleDb approach]

Setup

My lab is a VMware virtualized lab which consists of a Windows 8.1 machine with 8GB RAM, and a SQL Server 2014 instance with 4GB RAM (poor thing). Both are on the same ESX host, which sports a directly connected Western Digital VelociRaptor 1TB. The database’s recovery model is set to SIMPLE.

The 1.1 GB CSV in this example contains 9.8 million+ rows of tab-delimited longitude and latitude data. A zip containing the CSV is available for download at geonames.org.

First, I had to create the SQL table.
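
A hedged sketch (the real table mirrors every column in the geonames file; the connection string and column list here are placeholders):

    $connstring = 'Data Source=localhost;Integrated Security=true;Initial Catalog=tempdb'
    $conn = New-Object System.Data.SqlClient.SqlConnection $connstring
    $conn.Open()
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = '
    CREATE TABLE allcountries (
        geonameid int,
        name nvarchar(200),
        latitude float,
        longitude float
        -- remaining geonames columns omitted in this sketch
    )'
    [void]$cmd.ExecuteNonQuery()
    $conn.Close()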

Then I tested the various methods. You’ll probably notice that these scripts don’t have much error handling. I wanted to keep the code easy to read and straightforward. Finalized scripts with error handling and all that will be placed on ScriptCenter shortly.

StreamReader to DataTable batches

This one is by far the fastest, at up to 3.2 million rows per minute on a non-indexed table.

[Screenshot: 3.2 million rows per minute into a non-indexed table]

Realistically, you’ll probably be importing to an indexed table, which performs up to 2.6 million rows per minute.

[Screenshot: 2.6 million rows per minute into a table with a clustered index]

Basically, the script performs the following:

  1. Creates the SQL Bulk Copy connection
  2. Creates the PowerShell datatable, along with its columns
  3. Reads the CSV file using System.IO.StreamReader
  4. Using ReadLine(), loads the CSV data row by row into the datatable
  5. Performs the bulk import every x number of rows
  6. Empties the datatable
  7. Rinse and repeat starting at step 4 until the end of the CSV file

In order to avoid having to specify column names each time you import a new CSV, the SQL table column order and CSV column order must be the same. If you need to rearrange your columns, I suggest creating a view and importing to the view.

If you’re wondering why batching was used, it’s because the datatable seems to get exponentially slower as it grows above a few hundred thousand rows. I actually wasn’t able to import all 9 million rows into the datatable before my memory was exhausted. I tested various batch sizes, and 50k worked best for me.

The downside to using this script is that it doesn't handle embedded delimiters well. So if you have a comma delimiter, and your data contains "This Works","This, doesn't", then it will fail on "This, doesn't." You can address this with a regex (as seen below, after the code).

The Script
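
A condensed version of the technique (the connection string, table name, and the assumption of a header row are placeholders; the polished script lives on ScriptCenter):

    $connstring = 'Data Source=localhost;Integrated Security=true;Initial Catalog=tempdb'
    $csvfile    = 'C:\temp\allcountries.txt'
    $delimiter  = "`t"
    $batchsize  = 50000

    # 1. The bulk copy connection; TableLock helps large loads
    $bulkcopy = New-Object Data.SqlClient.SqlBulkCopy($connstring, [Data.SqlClient.SqlBulkCopyOptions]::TableLock)
    $bulkcopy.DestinationTableName = 'allcountries'
    $bulkcopy.BatchSize = $batchsize

    # 2. The datatable; column order must match the SQL table
    $datatable = New-Object System.Data.DataTable
    $reader = New-Object System.IO.StreamReader $csvfile
    foreach ($columnname in $reader.ReadLine().Split($delimiter)) {
        [void]$datatable.Columns.Add()
    }

    # 3-7. Read, load, flush every $batchsize rows, and repeat
    $i = 0
    while (($line = $reader.ReadLine()) -ne $null) {
        [void]$datatable.Rows.Add($line.Split($delimiter))
        $i++
        if (($i % $batchsize) -eq 0) {
            $bulkcopy.WriteToServer($datatable)
            $datatable.Clear()
        }
    }
    if ($datatable.Rows.Count -gt 0) {
        $bulkcopy.WriteToServer($datatable)   # flush the remainder
        $datatable.Clear()
    }
    $reader.Close()
    $bulkcopy.Close()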

As I mentioned earlier, you can use regex if you have delimiters embedded in quotes. This slows down the script by a fair degree, but here’s some code to get you started.
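
(The same loop as above, with the Split swapped for a quote-aware regex.)

    # Split on the delimiter only when it's outside double quotes
    $pattern = ',(?=(?:[^"]*"[^"]*")*[^"]*$)'
    while (($line = $reader.ReadLine()) -ne $null) {
        [void]$datatable.Rows.Add([regex]::Split($line, $pattern))
    }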

Streaming a 32-bit OleDbConnection

What I like about this method is that it’s still high performance (~2 million rows/minute) and you can select subsets of data from CSV if you wish. Just change the CommandText to have a where clause, for example: $cmd.CommandText = "SELECT count(*) FROM [$tablename] WHERE Category='Cajun'". If your CSV doesn’t have column names, you can use the built-in column names F1, F2, F3, etc.

Here are the results from a million row subset:
[Screenshot: results from the million-row subset]

The downside to this script is that you have to use an x86 shell, but because it streams the data rather than loading it into RAM, performance isn’t impacted. Switching to another shell may just get a little complex if you embed this script in a function. I’ve mitigated this to the best of my ability by telling it to call the 32-bit shell with the current script name and arguments.

The Script
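
A sketch of the approach (note the Jet text driver’s quirks: the Data Source is the directory, and the table name is the file name with its dot swapped for a #; connection strings are placeholders):

    # Relaunch in the 32-bit shell if necessary; the Jet text driver is x86-only
    if ($env:PROCESSOR_ARCHITECTURE -ne 'x86') {
        & "$env:windir\SysWOW64\WindowsPowerShell\v1.0\powershell.exe" -File $MyInvocation.MyCommand.Path @args
        return
    }

    $connstring = 'Data Source=localhost;Integrated Security=true;Initial Catalog=tempdb'
    $csvdir     = 'C:\temp'
    $tablename  = 'allcountries#txt'   # allcountries.txt

    $oleconnstring = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=$csvdir;" +
                     "Extended Properties='text;HDR=Yes;FMT=TabDelimited'"
    $oleconn = New-Object System.Data.OleDb.OleDbConnection $oleconnstring
    $cmd = New-Object System.Data.OleDb.OleDbCommand("SELECT * FROM [$tablename]", $oleconn)
    $oleconn.Open()
    $datareader = $cmd.ExecuteReader()

    # WriteToServer streams the reader; nothing accumulates in RAM
    $bulkcopy = New-Object Data.SqlClient.SqlBulkCopy $connstring
    $bulkcopy.DestinationTableName = 'allcountries'
    $bulkcopy.BulkCopyTimeout = 0   # no timeout for the big import
    $bulkcopy.WriteToServer($datareader)

    $datareader.Close()
    $oleconn.Close()
    $bulkcopy.Close()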

VisualBasic.FileIO.TextFieldParser to DataTable batches

This technique is the slowest, but if you’ve got embedded delimiters and need to stay in x64, you may want to consider it.

[Screenshot: TextFieldParser import results]

The Script
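
The parser setup looks like this (batching into $datatable and SqlBulkCopy works exactly as in the first script):

    # TextFieldParser ships in Microsoft.VisualBasic and honors quoted fields
    Add-Type -AssemblyName Microsoft.VisualBasic
    $parser = New-Object Microsoft.VisualBasic.FileIO.TextFieldParser 'C:\temp\allcountries.txt'
    $parser.TextFieldType = [Microsoft.VisualBasic.FileIO.FieldType]::Delimited
    $parser.SetDelimiters("`t")
    $parser.HasFieldsEnclosedInQuotes = $true

    while (-not $parser.EndOfData) {
        [void]$datatable.Rows.Add($parser.ReadFields())
        # flush to SqlBulkCopy every $batchsize rows, as before
    }
    $parser.Close()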

VB.NET Equivalent

Using a VB.NET compiled exe imports about 4 million rows a minute.

Conclusion

There are a number of tools that can be used to import CSV files into SQL Server. If you’re looking to use PowerShell, StreamReader will give you the fastest results, but if your data has embedded delimiters, this may not be the method for you. Using OleDbConnection is a powerful, fast way to import entire CSVs or subsets of CSV data, but requires the use of an x86 shell, unless you download the Microsoft Access Database Engine 2010 Redistributable, which provides a 64-bit text driver. Microsoft.VisualBasic.FileIO.TextFieldParser is slower, but works in 64-bit and processes embedded delimiters well.

Posted in PowerShell, SQL Server