Because we use PowerShell to manage systems in the enterprise, we want to be as efficient as possible. Looking for techniques to eke out more efficiency is crucial, especially as our tasks scale. What you do for 10 Active Directory user accounts may be different for 10,000 accounts. And before I get too far into this, understand that accessing native .NET class methods and properties in PowerShell is usually faster than their cmdlet counterparts.
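As a quick illustration of that last point, compare reading a large file with a cmdlet versus calling the underlying .NET method directly. This is a generic sketch, not from the original article, and the file path is hypothetical:

```powershell
# Cmdlet: convenient, but carries parameter binding and pipeline overhead
Measure-Command { $lines = Get-Content -Path 'C:\temp\big.log' }

# .NET static method: typically noticeably faster on large files
Measure-Command { $lines = [System.IO.File]::ReadAllLines('C:\temp\big.log') }
```

The cmdlet emits each line to the pipeline individually, while the .NET call returns the whole array in one operation, which is where most of the difference comes from.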

If you want real performance, you could write your own C# application. If you have the skills to do either of these, you probably aren't reading this article. I am also of the opinion that if your job is that performance sensitive, PowerShell might not be the right tool for the job. There is always going to be a little overhead when using PowerShell cmdlets, but that's the trade-off we accept for ease of use. My plan is to look at a variety of techniques for doing something in parallel. This article is an extension of PowerShell Management at Scale.

For my scenario, I want to use some Active Directory cmdlets and search for items in parallel. I'm testing from a Windows 8.1 domain member desktop. My test network is probably more resource constrained than yours, so your results may differ.

I want to use Get-ADUser to retrieve a list of user names.

$names = 'jfrost','adeco','jeff','rgbiv','ashowers','mflowers'

If you look at help for Get-ADUser, you'll see that the -Identity parameter does not accept an array of items.

The Active Directory identity parameter does not accept an array of items. (Image Credit: Jeff Hicks)

So a command like this will fail.

Get-ADUser -Identity $names

We can see that Identity accepts pipeline input, however. Piping $names to Get-ADUser and measuring with Measure-Command took 311 ms. The ActiveDirectory module was already loaded. I could instead use a ForEach enumeration and loop through each name.

foreach ($name in $names) { Get-ADUser -Identity $name }

This took 113 ms for me. I know we're not talking about much of a difference, but my set of names is small. Still, you can see that even a simple pipeline involves some overhead.

Let's up the ante and test with a list of 1,000 names.
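The article doesn't show how $list was built. One plausible way to grab 1,000 account names for this kind of test (an assumption on my part, not the author's code) would be:

```powershell
# Hypothetical setup: pull up to 1,000 sAMAccountNames from the domain
$list = Get-ADUser -Filter * -ResultSetSize 1000 |
        Select-Object -ExpandProperty SamAccountName
```

However $list was produced, what matters for the timings below is that it is a plain array of user name strings.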

Measure-Command { $list | Get-ADUser }

This took 4.53 seconds using the pipeline.

Measure-Command { $a = @() ; foreach ($name in $list) { $a += Get-ADUser -Identity $name } }

This took a tad longer at 5.34 seconds, but that's not too far off from our previous approach. Now that we've tried using the cmdlet a couple of ways, let's see about wrapping it up in something like a workflow.

Workflow DemoAD {
    Param([string[]]$Name)
    foreach -parallel ($user in $Name) {
        Get-ADUser -Identity $user
    }
}

The -Parallel option, valid only in a workflow, will throttle at 32 items at a time by default. Let's invoke this locally and measure how long it takes to complete.
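If 32 concurrent items is too aggressive for your environment, Windows PowerShell 4.0 added a -ThrottleLimit option to foreach -parallel. A variant of the workflow (my sketch, not measured in this article) might look like:

```powershell
Workflow DemoADThrottled {
    Param([string[]]$Name)
    # Limit the workflow to 8 concurrent activities instead of the default 32
    foreach -parallel -throttlelimit 8 ($user in $Name) {
        Get-ADUser -Identity $user
    }
}
```

Lowering the throttle trades speed for a lighter load on the domain controller, which may matter more than raw time in production.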

measure-command { $x = demoad $list }

Workflows also incur an overhead price. This took almost seven minutes to complete! In this situation, a workflow is probably not a good idea.


But I was running the workflow locally, and it still had to make numerous network connections to the domain controller. Perhaps I'll get better results invoking the workflow on the domain controller.

measure-command { $y = demoad $list -PSComputerName chi-dc04 }

That's better. That took 1 minute 50 seconds; eliminating a lot of network overhead is my guess. Since running this remotely seems to be an improvement, let's try it with remoting.

I already know that using a PSSession will improve performance, so let's jump right to that.

$sess = New-PSSession -ComputerName chi-dc04

There are a few ways we could run this in parallel.

Measure-Command { foreach ($name in $list) { $z += Invoke-Command { Get-ADUser -Identity $using:name } -Session $sess -HideComputerName } }

This wasn't too bad; the process took 24 seconds. Although keep in mind each result is getting deserialized back across the wire, and essentially this is still getting the user accounts sequentially. Since I started out with good results piping the list of names to Get-ADUser, let's see if running it in the PSSession is any better.

Measure-Command { $z += Invoke-Command -ScriptBlock { param([string[]]$names) $names | Get-ADUser } -Session $sess -HideComputerName -ArgumentList @(,$list) }

Completed all 1,001 names in 4.42 seconds. As a side note, you may be wondering why I didn't simply pipe to the Get-ADUser command like this:

$using:list | Get-ADUser

It appears that PowerShell doesn't like doing that unless the variable is being used as an explicit parameter value. I had to resort to a parameterized scriptblock and pass $list as a parameter value. Due to another quirk with Invoke-Command, I had to define the -ArgumentList value as an array of values even though it is the only argument.

When I tried using the following line of code:

-ArgumentList @($list)

PowerShell only sent the first name from $list.
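The unary comma trick is easy to demonstrate without remoting, since Invoke-Command can run a scriptblock locally. -ArgumentList unrolls an array into individual arguments, so an array must be wrapped in an outer array to arrive intact. A minimal sketch with a short sample list:

```powershell
$list = 'jfrost','adeco','jeff'

# Without the leading comma, the array unrolls into three separate arguments
# and only the first one binds to $names (the rest land in $args)
Invoke-Command -ScriptBlock { param([string[]]$names) $names.Count } -ArgumentList @($list)

# The unary comma wraps $list so the whole array passes as a single argument
Invoke-Command -ScriptBlock { param([string[]]$names) $names.Count } -ArgumentList @(,$list)
```

The first call reports a count of 1, the second a count of 3, which is exactly the behavior described above.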


What did we learn? In terms of doing something in parallel, just about all of the techniques were no better than using the pipeline and letting Get-ADUser do its thing. That's not to say every cmdlet will give the same result. Active Directory is probably optimized for finding single user accounts by identity. Filtering may be a different story. Next time, we'll look at Get-ADComputer and see what is involved in searching multiple locations in parallel. More than likely, if we get good results there, we can apply the same techniques to Get-ADUser.

The post An Introduction to Parallel PowerShell Processing appeared first on Petri.