I currently work with a client on an environment orchestration product written entirely in PowerShell. It's in charge of bringing up various test environments based on various application stacks. It sits at around 50,000+ lines of code across all of its modules and depends heavily on connecting to remote virtual machines.

From day 1, this product was built using a combination of techniques to deliver configuration changes to remote virtual machines. The majority of the time Invoke-Command was used, but there are dozens of cases where the ComputerName parameter was used with various cmdlets like Get-Service, Get-CimInstance, Get-WmiObject, etc.

The ad-hoc method of connecting to remote VMs has worked fine for years since the virtual machines were being built from a standardized VMM template and were all within a single internal network with no firewall restrictions. This meant that you could connect over WinRM with Invoke-Command, over DCOM with Get-WmiObject, enumerate files with Get-ChildItem and a UNC path, or use any other way of remotely interacting with configuration items on the VMs.
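
To make that mix concrete, here's a rough sketch of the kinds of calls in play; the server name and arguments are placeholders, not the product's actual code.

# Hypothetical illustrations of the ad-hoc remoting patterns described above

# WinRM (TCP 5985/5986) via Invoke-Command
Invoke-Command -ComputerName REMOTESERVER -ScriptBlock { Get-Service -Name BITS }

# RPC/DCOM via the ComputerName parameter on legacy cmdlets (Windows PowerShell only)
Get-Service -Name BITS -ComputerName REMOTESERVER
Get-WmiObject -Class Win32_OperatingSystem -ComputerName REMOTESERVER

# SMB (TCP 445) via a UNC path to the administrative share
Get-ChildItem -Path \\REMOTESERVER\c$\Folder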

We're now attempting to build these environments in places like a firewalled off DMZ, Azure, and AWS. These are different environments that have different restrictions and different methods of properly managing the VMs remotely.

Due to these differences, I now have to re-architect the way the product talks to remote VMs. We need to deploy a VM in Azure, AWS, or internally the same way: New-VirtualMachine. Doing this requires a standard approach to communicating with each of these VM instances, and since each VM instance is Windows, the way to do that is through WinRM sessions.

Everything needs to go through a WinRM session. I'm talking about wrapping all of these remote calls in a scriptblock and executing them locally on the remote VM over that session.
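
Here's a minimal sketch of what that wrapping might look like, assuming a hypothetical helper named Invoke-VmCommand (not part of the actual product):

# Hypothetical wrapper: every remote call funnels through a single WinRM session
function Invoke-VmCommand {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory)]
        [string]$ComputerName,

        [Parameter(Mandatory)]
        [scriptblock]$ScriptBlock,

        [pscredential]$Credential
    )

    # Build one PSSession (WinRM, TCP 5985/5986) and reuse it for the call
    $sessionParams = @{ ComputerName = $ComputerName }
    if ($Credential) { $sessionParams.Credential = $Credential }
    $session = New-PSSession @sessionParams

    try {
        # The scriptblock runs locally on the remote VM over the one open port
        Invoke-Command -Session $session -ScriptBlock $ScriptBlock
    } finally {
        Remove-PSSession -Session $session
    }
}

A call like Invoke-VmCommand -ComputerName REMOTESERVER -ScriptBlock { Get-Service -Name BITS } then behaves the same whether the VM lives internally, in a DMZ, in Azure, or in AWS, because only WinRM is ever in play.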

For example, a typical way to enumerate files on a remote machine is to use Get-ChildItem and point it at a UNC path.

Get-ChildItem \\REMOTESERVER\c$\Folder

No big deal, right? It works just fine. Now, are all the ports required for that open through a DMZ? Does the host firewall on your Azure VMs have those ports open? You get my point.

What about copying files over SMB? Copy-Item C:\Folder \\REMOTESERVER\c$ is pretty straightforward. True. But, again, SMB is a different protocol that requires different open ports. Why not just use Send-File or the new ToSession parameter on Copy-Item to transfer the bits over WinRM?
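
For example, with PowerShell 5.0 or later, a copy over an existing WinRM session might look like this; the server name, source, and destination paths are placeholders:

# Transfer files over the WinRM session instead of SMB (requires PowerShell 5.0+)
$session = New-PSSession -ComputerName REMOTESERVER
Copy-Item -Path C:\Folder -Destination C:\ -ToSession $session -Recurse
Remove-PSSession -Session $session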

Just because you can point commands directly to remote machines doesn't necessarily mean you should.

Instead, why not use Invoke-Command and execute the same command locally on the remote machine?

Invoke-Command -ComputerName REMOTESERVER -ScriptBlock { Get-ChildItem C:\Folder }

Notice that it requires a little more typing but gives you one distinct advantage: standardization. Using Invoke-Command to execute all remote code means you only have to open a single port across all of your VM instances and network configurations. It also allows you to apply a single configuration across all your VMs and ensure the code will run successfully.
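
As a sketch of what that standardization buys you, once everything goes over WinRM a single session can carry every configuration step. The server name, service name, and file paths below are placeholders:

# One WinRM session serves every remote operation
$session = New-PSSession -ComputerName REMOTESERVER

Invoke-Command -Session $session -ScriptBlock { Get-ChildItem -Path C:\Folder }
Invoke-Command -Session $session -ScriptBlock { Restart-Service -Name BITS }
Copy-Item -Path C:\Folder\config.xml -Destination C:\Folder -ToSession $session

Remove-PSSession -Session $session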

The next time you're working on a PowerShell project that might need to work in more than a single environment, consider your remote communication strategy carefully.
