Are you struggling to find the perfect PowerShell module for your unique scenario? With thousands of modules available, it might feel like you should just settle for what’s out there. Well, that could leave your solution incomplete or inefficient. Why not start creating modules instead?
This guide will walk you through creating modules to build robust, reusable solutions tailored to your needs.
Turn your scripts into powerful building blocks you can reuse across different projects!
Building a Computer Inventory Module
In this guide, we’ll create a PowerShell module for gathering computer hardware information. This module will help system administrators collect and report on memory, storage, and processor details across multiple systems.
Our module will feature:
- Functions to gather specific hardware information
- Remote system support using PowerShell sessions
- Standardized output format for consistent reporting
This practical example demonstrates essential module development concepts while creating a useful tool for system administration.
Setting Up the PowerShell Module
Managing your scripts across multiple systems can quickly become chaotic. When your workflows turn into an uphill battle, PowerShell modules come in handy: a module is a structured way to group and reuse scripts, saving time and reducing errors.
Let’s combine key concepts to build a PowerShell module.
Start by creating the module directory and defining the module itself to organize your work.
```powershell
## Create the module directory in the all-user location
mkdir 'C:\Program Files\PowerShell\Modules\ComputerInventory'

## Create the module to hold the module functions
Set-Content -Path 'C:\Program Files\PowerShell\Modules\ComputerInventory\ComputerInventory.psm1' -Value ''
```
The `Set-Content` command creates a module named `ComputerInventory` in the all-user path. This location is chosen because it makes the module accessible to anyone logging onto the machine, which is crucial in enterprise environments where multiple users need access to the same PowerShell functionality. Unlike user-specific locations, this centralized path ensures consistent module availability and easier management across the system.
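If you want to confirm that this location is one of the paths PowerShell actually searches, you can inspect the `PSModulePath` environment variable. This quick check is optional and not required for the module to work:

```powershell
## List every directory PowerShell searches for modules;
## 'C:\Program Files\PowerShell\Modules' should appear in the output
$env:PSModulePath -split [System.IO.Path]::PathSeparator
```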
Verify the module’s availability:
```powershell
## The module is already showing up as available
Get-Module ComputerInventory -ListAvailable
```
Although the module is still just an empty shell, this confirms that PowerShell can discover it and will load it properly later.
Scaffolding Functions
A well-structured module is critical, but what’s inside makes it truly useful. Scaffolding your module functions up front keeps the layout obvious, so you spend your time being productive rather than figuring out what each part does.
Open the module in a text editor like VS Code, then scaffold functions.
Start by creating placeholder functions with descriptive names.
```powershell
function Get-MemoryInfo {
    [CmdletBinding()]
    param()
}

function Get-StorageInfo {
    [CmdletBinding()]
    param()
}

function Get-ProcessorInfo {
    [CmdletBinding()]
    param()
}
```
The function names follow PowerShell’s standardized verb-noun naming convention. In this case, the functions are named:
- Get-MemoryInfo
- Get-StorageInfo
- Get-ProcessorInfo
Each function name starts with the verb “Get” (indicating it retrieves information) followed by a noun that describes what information it retrieves (Memory, Storage, or Processor).
This naming convention is important in PowerShell because it makes functions predictable and easier to understand – users can quickly grasp what a function does just by looking at its name.
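PowerShell also ships with a cmdlet for checking which verbs are approved. If you’re ever unsure whether a verb follows the convention, a quick check looks like this:

```powershell
## Confirm that 'Get' is on the list of approved PowerShell verbs
Get-Verb -Verb Get
```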
Verify their existence by running the following:
```powershell
Get-Command -Module ComputerInventory
```
When you run the command `Get-Command -Module ComputerInventory`, you would see output similar to this:
```powershell
CommandType     Name                    Version    Source
-----------     ----                    -------    ------
Function        Get-MemoryInfo          1.0.0      ComputerInventory
Function        Get-ProcessorInfo       1.0.0      ComputerInventory
Function        Get-StorageInfo         1.0.0      ComputerInventory
```
This command lists all the functions available in the ComputerInventory module, which includes the three functions we created: Get-MemoryInfo, Get-StorageInfo, and Get-ProcessorInfo.
At this stage, the module includes function shells. Let’s enhance these functions by defining consistent output using custom objects.
Standardized Output with Custom Objects
Inconsistent outputs across scripts can turn a simple task into a nightmare of data parsing and troubleshooting. In professional PowerShell development, ensuring consistent outputs is a cornerstone of effective scripting.
Standardizing output with custom objects helps maintain consistency across functions.
In the following script:
- The custom objects include `ComputerName`, `HardwareCategory`, and `Info` properties.
- The `HardwareCategory` property groups similar hardware types, and `ComputerName` is designed for multi-computer scalability.
```powershell
function Get-MemoryInfo {
    [CmdletBinding()]
    param()

    $outObject = @{
        'ComputerName'     = ''
        'HardwareCategory' = 'Memory'
        'Info'             = $null
    }

    $outObject
}

function Get-StorageInfo {
    [CmdletBinding()]
    param()

    $outObject = @{
        'ComputerName'     = ''
        'HardwareCategory' = 'Storage'
        'Info'             = $null
    }

    $outObject
}

function Get-ProcessorInfo {
    [CmdletBinding()]
    param()

    $outObject = @{
        'ComputerName'     = ''
        'HardwareCategory' = 'Processor'
        'Info'             = $null
    }

    $outObject
}
```
First, let’s re-import the module to make sure we have the latest version:
```powershell
Import-Module ComputerInventory -Force
```
Now you can run the functions to see their output:
```powershell
PS> Get-MemoryInfo

Name                           Value
----                           -----
Info
HardwareCategory               Memory
ComputerName

PS> Get-StorageInfo

Name                           Value
----                           -----
Info
HardwareCategory               Storage
ComputerName

PS> Get-ProcessorInfo

Name                           Value
----                           -----
Info
HardwareCategory               Processor
ComputerName
```
Each function returns a hashtable with an empty ComputerName and a null Info property, but with its respective hardware category defined.
Adding a Session Parameter for Remote Support
Imagine needing to run your scripts across dozens or even hundreds of computers. If each function required manually specifying a computer name, it would be cumbersome and error-prone. Fortunately, PowerShell Remoting provides a solution.
Instead of a `ComputerName` parameter, use a `Session` parameter to leverage PowerShell Remoting:
```powershell
function Get-MemoryInfo {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [System.Management.Automation.Runspaces.PSSession]$Session
    )

    $outObject = @{
        'ComputerName'     = $Session.ComputerName
        'HardwareCategory' = 'Memory'
        'Info'             = $null
    }

    $outObject
}

function Get-StorageInfo {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [System.Management.Automation.Runspaces.PSSession]$Session
    )

    $outObject = @{
        'ComputerName'     = $Session.ComputerName
        'HardwareCategory' = 'Storage'
        'Info'             = $null
    }

    $outObject
}

function Get-ProcessorInfo {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [System.Management.Automation.Runspaces.PSSession]$Session
    )

    $outObject = @{
        'ComputerName'     = $Session.ComputerName
        'HardwareCategory' = 'Processor'
        'Info'             = $null
    }

    $outObject
}
```
This parameter ensures flexibility when scaling to multiple systems.
The Session parameter is designed to use PowerShell Remoting for executing commands on remote computers. Here’s what makes it powerful:
- It’s defined as a mandatory parameter that accepts a PSSession object (specifically of type System.Management.Automation.Runspaces.PSSession)
- The Session parameter automatically provides the computer name through $Session.ComputerName, which gets populated in the output object
This approach offers several advantages:
- It allows for efficient scaling when working with multiple systems
- Instead of creating new connections for each command, you can reuse the same session for multiple operations, which is more efficient than establishing individual connections for each function call
- You can test the functions by creating a single PSSession and using it across all the inventory functions, for example by creating a test session with $testSession = New-PSSession -ComputerName SRV2 (see the sketch after this list)
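To make the benefit concrete, here’s a hypothetical sketch of how Get-MemoryInfo might eventually use that session to populate the Info property. The Invoke-Command call and the Win32_PhysicalMemory CIM query are illustrative assumptions, not part of the module we’re building in this post:

```powershell
function Get-MemoryInfo {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [System.Management.Automation.Runspaces.PSSession]$Session
    )

    $outObject = @{
        'ComputerName'     = $Session.ComputerName
        'HardwareCategory' = 'Memory'
        'Info'             = $null
    }

    ## Hypothetical example: reuse the existing session to run the query remotely
    ## instead of opening a brand-new connection for every function call
    $outObject.Info = Invoke-Command -Session $Session -ScriptBlock {
        Get-CimInstance -ClassName Win32_PhysicalMemory |
            Select-Object -Property Manufacturer, Capacity
    }

    $outObject
}
```

For now, though, the functions simply return the scaffolded output object.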
Save and re-import the module:
```powershell
## ipmo is the built-in alias for Import-Module
ipmo ComputerInventory -Force
```
Testing the Functions
How do you ensure that a module works after building it? Testing is essential to confirm that your module’s functions perform as expected and return accurate data. Skipping this step could lead to surprises in production environments.
Establish a remote session and test the module:
```powershell
$testSession = New-PSSession -ComputerName SRV2

Get-MemoryInfo -Session $testSession
Get-StorageInfo -Session $testSession
Get-ProcessorInfo -Session $testSession
```
Each function should return an object with the expected properties and the correct computer name. These functions form the foundation of a robust inventory tool.
Based on the code shown, when you test these functions with a remote session, the output would look something like this:
```powershell
PS> $testSession = New-PSSession -ComputerName SRV2
PS> Get-MemoryInfo -Session $testSession

Name                           Value
----                           -----
Info
HardwareCategory               Memory
ComputerName                   SRV2

PS> Get-StorageInfo -Session $testSession

Name                           Value
----                           -----
Info
HardwareCategory               Storage
ComputerName                   SRV2

PS> Get-ProcessorInfo -Session $testSession

Name                           Value
----                           -----
Info
HardwareCategory               Processor
ComputerName                   SRV2
```
Each function returns a hashtable containing the computer name (from the session), the specific hardware category, and an Info field (currently null but designed to hold the actual hardware information).
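Once you’ve finished testing, it’s worth closing the remote session so it doesn’t linger on the remote computer. A minimal cleanup step (assuming $testSession is still in scope):

```powershell
## Close the test session created earlier with New-PSSession
Remove-PSSession -Session $testSession
```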
Conclusion
In this article, you’ve learned why creating your own PowerShell modules is essential for tackling unique challenges that no off-the-shelf module can address. We explored how custom modules can be a game-changer for specialized configurations or processes within your environment.
This is just the beginning of our journey with the ComputerInventory module. In upcoming blog posts, we’ll expand this foundation by adding real hardware information gathering capabilities, error handling, and advanced remote management features.
Stay tuned as we transform this basic framework into a powerful tool for system administrators!