PowerShell Script Provider

Write your own PowerShell provider using only script; no C# required. The provider is backed by a Windows PowerShell 2.0 module, which may be pure script, binary or a mix of both.

Debugging is as easy as with any ordinary .ps1 script file:

All functions in the backing module share the same signatures as the provider methods documented on MSDN. This means you can go to the MSDN documentation on providers to learn how to write the corresponding script.
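As a flavour of what that looks like, here is a hypothetical sketch of a backing-module function whose signature mirrors ItemCmdletProvider.GetItem(string path) from MSDN; the exact names and conventions are dictated by the project's templates, so treat this as illustrative only:

```powershell
# hypothetical sketch: backing-module function mirroring
# ItemCmdletProvider.GetItem(string path) as documented on MSDN
function GetItem {
    param([string]$path)

    # emit the matching item from this module's backing store
    # ($script:store is an illustrative hashtable, as in the demo provider)
    if ($script:store.ContainsKey($path)) {
        $script:store[$path]
    }
}
```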

Current Release PSProvider 0.4

Samples and Templates

Roadmap

0.1
  • ContainerCmdletProvider support through "ModuleBoundProvider" provider
  • Includes a demo provider that navigates a Hashtable
  • Can be debugged in the debugger of your choice: console, ISE, PowerGUI.
0.2
  • NavigationCmdletProvider support
  • Providers renamed to ContainerScriptProvider and TreeScriptProvider
  • Container Sample & Tree Template modules
  • Supports: Clear-Item, Copy-Item, Get-Item, Invoke-Item, Move-Item, New-Item, Remove-Item, Rename-Item, Set-Item
0.3
  • IContentCmdletProvider support
  • New Commands: New-ContentReader, New-ContentWriter implement IContentReader, IContentWriter
  • Adds support for: Add-Content, Clear-Content, Get-Content, Set-Content
0.4 (Current Release)
  • IPropertyCmdletProvider support
  • Adds support for: Clear-ItemProperty, Copy-ItemProperty, Get-ItemProperty, Move-ItemProperty, New-ItemProperty, Remove-ItemProperty, Rename-ItemProperty, Set-ItemProperty
0.5
  • Dynamic Parameter support
0.6
  • Security Interfaces
  • Adds support for: Get-ACL, Set-ACL

http://psprovider.codeplex.com/

PowerShell ISE Hacking: Change default save encoding to ASCII

If you want to do this, you won’t really need much of an explanation as to why I’m posting it. As for the rest of you, never mind; stick with your big-endian Unicode (hint: the PowerShell console and other console applications prefer ASCII) :)

First up, create yourself a PowerShell ISE profile script that will be loaded by default when ISE starts (and/or when you open a new “tab”):

# run this one-liner from within ISE through the interactive window (command pane):
if (-not (test-path $profile)) { md -force (split-path $profile); "" > $profile; psedit $profile }

Now, put this one-liner (well, it could fit on one line) in your $profile:

# watch for changes to the Files collection of the current Tab
register-objectevent $psise.CurrentPowerShellTab.Files collectionchanged -action {
    # iterate ISEFile objects
    $event.sender | % {
         # set private field which holds default encoding to ASCII
         $_.gettype().getfield("encoding","nonpublic,instance").setvalue($_, [text.encoding]::ascii)
    }
}
Every time the tab's Files collection changes, the handler sets the default save encoding to ASCII for all files in that tab. As the profile is loaded in each tab, all files in all tabs will default to ASCII when saving. No more "Save As" annoyances; just hit Save and ASCII will be used for the encoding. "Save As" will still let you save as Unicode if you wish.
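To confirm the hack has taken effect, the ISEFile object exposes a read-only Encoding property you can inspect from the command pane; a quick check:

```powershell
# open a new file, then inspect its default save encoding
$psise.CurrentFile.Encoding.EncodingName
```

If the event handler ran, this should report US-ASCII rather than a Unicode encoding.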

Have fun!

PowerShell 2.0 – PSCX Labs: Invoke-Reflector

This is a lot of fun if you spend a lot of time tinkering around with APIs in PowerShell. This function (included in the upcoming PSCX 2.0, alias: refl) will let you open Lutz Roeder’s Reflector for any Type or Cmdlet. Reflector will automatically load the correct Assembly and will highlight the relevant Type, without you having to do diddley-squat. Examples and help will display with -?

function Invoke-Reflector {
<#
    .SYNOPSIS
        Quickly load Reflector, with the specified Type or Command selected.
    .DESCRIPTION
        Quickly load Reflector, with the specified Type or Command selected. The function will also
        ensure that Reflector has the Type or Command's containing Assembly loaded.
    .EXAMPLE
        # Opens System.String in Reflector. Will load its Assembly into Reflector if required.
        ps> [string] | invoke-reflector
    .EXAMPLE
        # Opens GetChildItemCommand in Reflector. Will load its Assembly into Reflector if required.
        ps> gcm ls | invoke-reflector
    .EXAMPLE
        # Opens GetChildItemCommand in Reflector. Will load its Assembly into Reflector if required.
        ps> invoke-reflector dir
    .PARAMETER CommandName
        Accepts name of command. Does not accept pipeline input.
    .PARAMETER CommandInfo
        Accepts output from Get-Command (gcm). Accepts pipeline input.
    .PARAMETER Type
        Accepts a System.Type (System.RuntimeType). Accepts pipeline input.
    .PARAMETER ReflectorPath
        Optional. Defaults to Reflector.exe's location if it is found in your $ENV:PATH. If not found, you must specify.
    .INPUTS
        [System.Type]
        [System.Management.Automation.CommandInfo]
    .OUTPUTS
        None
#>
     [cmdletbinding(defaultparametersetname="name")]
     param(
         [parameter(
            parametersetname="name",
            position=0,
            mandatory=$true
         )]
         [validatenotnullorempty()]
         [string]$CommandName,

         [parameter(
            parametersetname="command",
            position=0,
            valuefrompipeline=$true,
            mandatory=$true
         )]
         [validatenotnull()]
         [management.automation.commandinfo]$CommandInfo,

         [parameter(
            parametersetname="type",
            position=0,
            valuefrompipeline=$true,
            mandatory=$true
         )]
         [validatenotnull()]
         [type]$Type,

         [parameter(
            position=1
         )]
         [validatenotnullorempty()]
         [string]$ReflectorPath = $((gcm reflector.exe -ea 0).definition)
     )

        # no process block; i only want
        # a single reflector instance

        if ($ReflectorPath -and (test-path $reflectorpath)) {

            $typeName = $null
            $assemblyLocation = $null

            switch ($pscmdlet.parametersetname) {

                 { "name","command" -contains $_ } {

                    if ($CommandName) {
                        $CommandInfo = gcm $CommandName -ea 0
                    } else {
                        $CommandName = $CommandInfo.Name
                    }

                    if ($CommandInfo -is [management.automation.aliasinfo]) {

                        # expand aliases
                        while ($CommandInfo.CommandType -eq "Alias") {
                            $CommandInfo = gcm $CommandInfo.Definition
                        }
                    }

                    # can only reflect cmdlets, obviously.
                    if ($CommandInfo.CommandType -eq "Cmdlet") {

                        $typeName = $commandinfo.implementingtype.fullname
                        $assemblyLocation = $commandinfo.implementingtype.assembly.location

                    } elseif ($CommandInfo) {
                        write-warning "$CommandInfo is not a Cmdlet."
                    } else {
                        write-warning "Cmdlet $CommandName does not exist in current scope. Have you loaded its containing module or snap-in?"
                    }
                }

                "type" {
                    $typeName = $type.fullname
                    $assemblyLocation = $type.assembly.location
                }
            } # end switch


            if ($typeName -and $assemblyLocation) {
                & $reflectorPath /select:$typeName $assemblyLocation
            }

        } else {
            write-warning "Unable to find Reflector.exe. Please specify full path via -ReflectorPath."
        }
}

Have fun!

PowerShell 2.0 – Developer Essentials #1 – Initializing a Runspace with a Module

These days I'm incredibly busy both in my professional and private life, so I’ve not had a lot of time to construct the usual meaty posts I like to write. Instead I figured I could write a series of short – very short – posts centered around the little tricks that you would need to be an efficient developer when targeting PowerShell. Here's the first tip: how to create a Runspace and have one or more Module(s) preloaded. The RunspaceInvoke class is a handy wrapper that will do most of the plumbing for you if you just want to run scripts or commands. If you want to manually construct your own Pipeline instances then you must work with the Runspace class directly.

    InitialSessionState initial = InitialSessionState.CreateDefault();
    initial.ImportPSModule(new[] { modulePathOrModuleName1, ... });
    Runspace runspace = RunspaceFactory.CreateRunspace(initial);
    runspace.Open();
    RunspaceInvoke invoker = new RunspaceInvoke(runspace);
    Collection<PSObject> results = invoker.Invoke("...");

Have fun!

SharePoint Resources & Localization – What, Where and Why?

This is just a short post to remind myself (and you, frustrated googlers/bingers) about the different ways resources are defined and accessed within SharePoint. While some .NET applications use embedded resources or satellite assemblies, SharePoint has a preference for raw RESX files. These resx files are dumped in one of two places:

12\CONFIG\RESOURCES

- application-level "global" resources
- propagated at site definition instantiation
- later modifications require stsadm -o copyappbincontent
- copied to each web application's <approot>\App_GlobalResources
- accessed via HttpContext.GetGlobalResourceObject / TemplateControl.GetGlobalResourceObject
- http://msdn.microsoft.com/en-us/library/system.web.httpcontext.getglobalresourceobject.aspx
- declaratively accessible <asp:foo runat="server" text="<%$ Resources: myapp.core, ResKeyName %>" />
   (where myapp.core represents myapp.core.resx)

12\RESOURCES
- farm-level global resources
- available to all applications
- remain in 12\RESOURCES; not copied anywhere
- typically used programmatically via SPUtility.GetLocalizedString (Microsoft.SharePoint.Utilities)
- http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.utilities.sputility.getlocalizedstring.aspx
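To sanity-check a farm-level resource lookup, you can call SPUtility.GetLocalizedString(source, defaultResourceFile, language) directly from PowerShell on a server box; a sketch, where the resource key name is illustrative:

```powershell
# load the SharePoint object model (must run on a SharePoint server)
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

# resolve a key from 12\RESOURCES\core.resx for English (LCID 1033);
# the "$Resources:file,key;" source syntax names the resx file and key
# (the key below is a hypothetical example)
[Microsoft.SharePoint.Utilities.SPUtility]::GetLocalizedString(
    '$Resources:core,SomeResourceKey;', 'core', 1033)
```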

PowerShell 2.0 – Partial Application of Functions and Cmdlets

This is unashamedly a post for developers, in particular those with an interest in functional languages. With the advent of PowerShell 2.0, some of you may have noticed that ScriptBlocks - which I suppose could also be called anonymous functions or lambdas - gained a new method: GetNewClosure. Closures are one of the essential tools for functional programming, something I’ve been trying to learn more about over the last few years. I don’t really have an opportunity to use it at work other than the hybrid trickery available in C# 3.0, but I have been tinkering a lot with PowerShell 2.0 to see if some of the tricks of the functional trade could be implemented. It’s just a shell language, but there are some nice features in there that enable a wide variety of funky stuff.

Partial Application

In a nutshell, partial application of a function is when you pass in only some of the parameters and get a function back that can accept the remaining parameters:

# define a simple function
function test {
    param($a, $b, $c);
    "a: $a; b: $b; c:$c"
}

# partially apply with -c parameter
$f = merge-parameter (gcm test) -c 5

# partially apply with -c and -a then execute with -b (papp is an alias)
& (papp (papp (gcm test) -c 3) -a 2) -b 7

# partially apply the get-command cmdlet with -commandtype
# and assign the result to a new function
si function:get-function (papp (gcm get-command) -commandtype function)

This is by no means a complete implementation of a partial application framework for PowerShell. The merge-parameter function (aliased to papp) currently only works with the default parameter set and does not mirror any of the parametric attributes of the applied function or cmdlet. I'm not saying it couldn't do that, but this is purely a proof of concept. The module is listed below and is also available from PoshCode at http://poshcode.org/1687

# save as functional.psm1 and drop into your module path
Set-StrictMode -Version 2

$commonParameters = @("Verbose",
                      "Debug",
                      "ErrorAction",
                      "WarningAction",
                      "ErrorVariable",
                      "WarningVariable",
                      "OutVariable",
                      "OutBuffer")

<#
.SYNOPSIS
    Support function for partially-applied cmdlets and functions.
#>
function Get-ParameterDictionary {
    [outputtype([Management.Automation.RuntimeDefinedParameterDictionary])]
    [cmdletbinding()]
    param(
        [validatenotnull()]
        [management.automation.commandinfo]$CommandInfo,
        [validatenotnull()]
        [management.automation.pscmdlet]$PSCmdletContext = $PSCmdlet
    )
    
    # dictionary to hold dynamic parameters
    $rdpd = new-object Management.Automation.RuntimeDefinedParameterDictionary

    try {
        # grab parameters from function
        if ($CommandInfo.parametersets.count -gt 1) {
            $parameters = $CommandInfo.ParameterSets[[string]$CommandInfo.DefaultParameterSet].parameters
        } else {
            $parameters = $CommandInfo.parameters.getenumerator() | % {$CommandInfo.parameters[$_.key]}
        }        
                
        $parameters | % {
            
            write-verbose "testing $($_.name)"
                                    
            # skip common parameters        
            if ($commonParameters -like $_.Name) {                                  
                
                write-verbose "skipping common parameter $($_.name)"
                
            } else {
                
                $rdp = new-object management.automation.runtimedefinedparameter
                $rdp.Name = $_.Name
                $rdp.ParameterType = $_.ParameterType
                
                # tag new parameters to match this function's parameterset
                $pa = new-object system.management.automation.parameterattribute
                $pa.ParameterSetName = $PSCmdletContext.ParameterSetName
                $rdp.Attributes.Add($pa)
                
                $rdpd.add($_.Name, $rdp)
            }
            
        }
    } catch {
    
        Write-Warning "Error getting parameter dictionary: $_"
    }
    
    # return
    $rdpd
}

<#
.SYNOPSIS
    Function that accepts a FunctionInfo or CmdletInfo reference and one or more parameters
    and returns a FunctionInfo bound to those parameter(s) and their value(s.)
.DESCRIPTION
    Function that accepts a FunctionInfo or CmdletInfo reference and one or more parameters
    and returns a FunctionInfo bound to those parameter(s) and their value(s.)
    
    Any parameters "merged" into the function are removed from the available parameters for
    future invocations. Multiple chained merge-parameter calls are permitted.
.EXAMPLE

    First, we define a simple function:
    
    function test {
        param($a, $b, $c, $d);
        "a: $a; b: $b; c:$c; d:$d"
    }
    
    Now we merge -b parameter into functioninfo with the static value of 5, returning a new
    functioninfo:
    
    ps> $x = merge-parameter (gcm test) -b 5
    
    We execute the new functioninfo with the & (call) operator, passing in the remaining 
    arguments:
    
    ps> & $x -a 2 -c 4 -d 9
    a: 2; b: 5; c: 4; d: 9
    
    Now we merge two new parameters in, -c with the value 3 and -d with 5:
    
    ps> $y = merge-parameter $x -c 3 -d 5
    
    Again we call $y with the remaining named parameter -a:
    
    ps> & $y -a 2
    a: 2; b: 5; c: 3; d: 5
.EXAMPLE

    Cmdlets can also be subject to partial application. In this case we create a new
    function with the returned functioninfo:
    
    ps> si function:get-function (merge-parameter (gcm get-command) -commandtype function)
    ps> get-function
                
.PARAMETER _CommandInfo
    The FunctionInfo or CmdletInfo into which to merge (apply) parameter(s.)
    
    The parameter is named with a leading underscore character to prevent parameter
    collisions when exposing the targetted command's parameters and dynamic parameters.
.INPUTS
    FunctionInfo or CmdletInfo
.OUTPUTS
    FunctionInfo
#>
function Merge-Parameter {    
    [OutputType([Management.Automation.FunctionInfo])]
    [CmdletBinding()]
    param(
        [parameter(position=0, mandatory=$true)]
        [validatenotnull()]
        [validatescript({
            ($_ -is [management.automation.functioninfo]) -or `
            ($_ -is [management.automation.cmdletinfo])
        })]
        [management.automation.commandinfo]$_Command
    )
    
    dynamicparam {
        # strict mode compatible check for parameter
        if ((test-path variable:_command)) {
            # attach input functioninfo's parameters to self
            Get-ParameterDictionary $_Command $PSCmdlet
        }
    }

    begin {
        write-verbose "merge-parameter: begin"
        
        # copy our bound parameters, except common ones              
        $mergedParameters = new-object 'collections.generic.dictionary[string,object]' $PSBoundParameters
        
        # remove our parameters, leaving only target function/CommandInfo's args to curry in
        $mergedParameters.remove("_Command") > $null
        
        # remove common parameters
        $commonParameters | % {
            if ($mergedParameters.ContainsKey($_)) {
                $mergedParameters.Remove($_)  > $null
            }
        }
    }
    
    process {
        write-verbose "merge-parameter: process"
        
        # temporary function name
        $temp = [guid]::NewGuid()

        $target = $_Command

        # splat our fixed named parameter(s) and then splat remaining args
        $partial = {
            [cmdletbinding()]
            param()
            
            # begin dynamicparam
            dynamicparam
            {                
                $targetRdpd = Get-ParameterDictionary $target $PSCmdlet
        
                # remove fixed parameters
                $mergedParameters.keys | % {
                    $targetRdpd.remove($_) > $null
                }
                $targetRdpd
            }
            begin {
                write-verbose "i have $($mergedParameters.count) fixed parameter(s)."
                write-verbose "i have $($targetrdpd.count) remaining parameter(s)"
            }
            # end dynamicparam
            process {
                $boundParameters = $PSCmdlet.MyInvocation.BoundParameters
                
                # remove common parameters (verbose, whatif etc)
                $commonParameters | % {
                    if ($boundParameters.ContainsKey($_)) {
                        $boundParameters.Remove($_)  > $null
                    }
                }
                
                # invoke command with fixed parameters and passed parameters (all named)
                . $target @mergedParameters @boundParameters
                
                if ($args) {
                    write-warning "received $($args.count) arg(s) not part of function."
                }
            }
        }
        
        # emit function/CommandInfo
        new-item -Path function:$temp -Value $partial.GetNewClosure()
    }
    
    end {
        # cleanup
        rm function:$temp
    }    
}

new-alias papp Merge-Parameter -force

Export-ModuleMember -Alias papp -Function Merge-Parameter, Get-ParameterDictionary

Have fun[ctional]!

PowerShell – The Patchwork of Paths, PSPaths and ProviderPaths

Paths in PowerShell are tough to understand, at first. PowerShell paths (PSPaths, not to be confused with Win32 paths), in their absolute forms, come in two distinct flavours:

  • Provider-qualified: FileSystem::c:\temp\foo.txt
  • Drive-qualified: c:\temp\foo.txt

It's very easy to get confused between provider-internal paths (the ProviderPath property of a resolved PathInfo, which is the portion after the :: in the provider-qualified path above) and drive-qualified paths, since they look the same if you only look at the default FileSystem provider drives. That is to say, the PSDrive has the same name (C) as the native backing store, the Windows file system (C:). So, to make the differences easier to see, create yourself a new PSDrive:

ps c:\> new-psdrive temp filesystem c:\temp\
ps c:\> cd temp:
ps temp:\>

Now, let's look at this again:

  • Provider-qualified: FileSystem::c:\temp\foo.txt
  • Drive-qualified: temp:\foo.txt

It's a bit easier this time to see what's different. The text to the right of the provider name (everything after the ::) is the ProviderPath.

So, your goals for writing a generalized provider-friendly Cmdlet (or advanced function) that accepts paths are:

  • Define a LiteralPath path parameter aliased to PSPath
  • Define a Path parameter (which will resolve wildcards / glob)
  • Assume you are receiving PSPaths, NOT native provider-paths

Point number three is especially important. Also, LiteralPath and Path should obviously belong to mutually exclusive parameter sets.
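Those goals translate into a param block along these lines; a minimal sketch for an advanced function (the function name is illustrative, and attribute details are trimmed):

```powershell
function Test-MyPath {
    [CmdletBinding(DefaultParameterSetName="Path")]
    param(
        # wildcard-capable PSPath(s), to be resolved/globbed by the function
        [Parameter(ParameterSetName="Path", Position=0, Mandatory=$true,
            ValueFromPipeline=$true)]
        [string[]]$Path,

        # literal PSPath(s); the PSPath alias lets output from
        # Get-ChildItem and friends bind by property name
        [Parameter(ParameterSetName="LiteralPath", Mandatory=$true,
            ValueFromPipelineByPropertyName=$true)]
        [Alias("PSPath")]
        [string[]]$LiteralPath
    )
    process {
        # resolve the incoming PSPaths to native provider paths here
    }
}
```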

Relative Paths

A good question is: how do we deal with relative paths being passed to a Cmdlet? As you should assume all paths being given to you are PSPaths, let's look at what the Cmdlet below does:

ps temp:\> write-zip -literalpath foo.txt

The command should assume foo.txt is relative to the current drive, so the path should be resolved immediately in the ProcessRecord or EndProcessing block like this (using the scripting API here to demonstrate):

$provider = $null;
$drive = $null
$providerPath = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath("foo.txt", [ref]$provider, [ref]$drive)

Now you have everything you need to recreate the two absolute forms of PSPaths, and you also have the native absolute ProviderPath. To create a provider-qualified PSPath for foo.txt, use $provider.Name + "::" + $providerPath. If $drive is not null (your current location might be provider-qualified, in which case $drive will be null) then you should use $drive.Name + ":\" + $drive.CurrentLocation + "\" + "foo.txt" to get a drive-qualified PSPath.
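Putting that composition into script form (run from the temp: drive created earlier; this just restates the formulas in the preceding paragraph):

```powershell
$provider = $null
$drive = $null
$providerPath = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath(
    "foo.txt", [ref]$provider, [ref]$drive)

# provider-qualified PSPath, e.g. FileSystem::C:\temp\foo.txt
$providerQualified = $provider.Name + "::" + $providerPath

# drive-qualified PSPath, e.g. temp:\foo.txt (only when a drive is in play)
if ($drive) {
    $driveQualified = $drive.Name + ":\" + $drive.CurrentLocation + "\" + "foo.txt"
}
```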

Have fun!

PowerShell - Why are keys in Hashtables sorted randomly?

PowerShell guru and admin extraordinaire Jeff Hicks asked a great question that many a Windows administrator has probably asked themselves when working with Hashtables in PowerShell: why do the keys/values come out in a different order than they were put in? This is a great question, and the answer lies in computer science theory, particularly computational complexity theory. Rather than bore you with a ton of nonsense about O(1), O(n) and other propeller-head drivel, I figured I could explain it in terms everyone should be able to understand.

Hashtables, Buckets and HashCodes = Rolodex, Index Cards and Surnames

Yes, if the light hasn’t gone on yet, it will soon. Every .NET object includes a method called GetHashCode. This method returns a number that represents the identity of the object in a kind of fuzzy way. I say “fuzzy” because the hash code for a given instance of an object can be different on different platforms (xp, vista, 2003 etc) or on different bitness (64 vs 32 bit.) This method is used by hashtable to get the “surname” of an object. Instead of Index Cards, a Hashtable uses “buckets” to separate groups of objects. Any given Hashcode will naturally fall into a particular bucket as the function (result) of a high-speed optimized algorithm, much like any given surname naturally falls under a particular letter of the alphabet. Finally, it should be clear to you that using index cards [Hashtable buckets] is way faster than flicking through an unsorted folder [randomized list.]

Just like a Rolodex, the order you add names to it doesn’t dictate the order they are in as you flip through the cards sequentially, which is analogous to sending a Hashtable to out-default.
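You can watch this play out in the shell (the actual enumeration order varies with the keys and the platform, so no particular output is guaranteed):

```powershell
# add five keys in a known order
$h = @{}
"alpha","bravo","charlie","delta","echo" | % { $h[$_] = $_.Length }

# enumeration order follows hash buckets, not insertion order
$h.Keys

# the bucket is chosen from the key's hash code (the "surname")
"alpha".GetHashCode()
```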

PowerShell 2.0 - About Dynamic Parameters

Did you know that when you run Get-Help against a cmdlet to find out about its parameters, you might not be getting the whole truth? Certain cmdlets, especially those that operate on providers (FileSystem, Certificate, Registry etc.), can adopt new parameters on the fly, depending on the path they are acting on. For example, when you use Get-Content on the file system (drive C: etc.), it gains three new parameters in addition to the static ones listed by Get-Help (but more about this later): Delimiter, Encoding and Wait.

Determining Dynamic Parameters using Get-Help

Get-Help has a new parameter, –Path, which lets you give the help system some context for determining dynamic parameters:

-Path <string>
    Gets help that explains how the cmdlet works in the specified provider path. Enter a Windows PowerShell provider path.

    This parameter gets a customized version of a cmdlet help topic that explains how the cmdlet works in the specified Windows PowerShell provider path. This parameter is effective only for help about a provider cmdlet and only when the provider includes a custom version of the provider cmdlet help topic.

    To see the custom cmdlet help for a provider path, go to the provider path location and enter a Get-Help command or, from any path location, use the Path parameter of Get-Help to specify the provider path. For more information, see about_Providers.
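For example, to see the FileSystem-flavoured help for Get-Content without leaving your current location (assuming the provider ships a customized help topic):

```powershell
# provider-specific help, using a filesystem path for context
get-help get-content -path c:\
```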

Determining Dynamic Parameters using Get-Command

Get-Command has a new parameter, –ArgumentList, which acts similarly: it unveils what dynamic parameters might be attached to a cmdlet for a given parameter set and path/literal path, where available on the chosen cmdlet. I’ve written a simple function that takes a cmdlet name as an argument and displays all of the dynamic parameters available for that cmdlet for each distinct provider:

# this function will pass a drive name in position 0 as an unnamed argument
# most path-oriented cmdlets accept this binding
function Get-DynamicParameter {
    param(        
        [string]$command
    ) 
  
    $parameters = @{}
    get-psdrive | sort -unique provider | % {
        $parameters[$_.provider.name] = gcm $command -args "$($_.name):" | % {
            $c = $_; @($_.parameters.keys) | sort | ? {
                $c.parameters[$_].isdynamic
            }
        }
    }
    $parameters    
}

Example use:

PS> Get-DynamicParameter get-content

Name                           Value
----                           -----
Alias
FileSystem                     {Delimiter, Encoding, Wait}
AssemblyCache
Registry
Environment
WSMan
Certificate
FeedStore
Function
Variable
PscxSettings

NOTE: when you don’t pass any context arguments to Get-Command via –ArgumentList, it will take your current location as the context for dynamic parameters (if any are found.) So running Get-Command for Get-Content from c:\ instead of hklm:\ will give you the additional parameters Delimiter, Encoding and Wait.

Have fun!

PowerShell 2.0 – Introducing the PModem File Transfer Protocol

One of the things that never quite sat well with me about the remoting feature in PowerShell 2.0 is that while you can “telnet” to remote systems with Enter-PSSession, import commands and do all sorts of cool tricks, there is no way to send or retrieve files from the console. It seems like such a waste: you set up WinRM with SSL and Kerberos and get this nice encrypted channel, but if you want to transfer files you have to fall back to file shares, remote desktop or classic FTP.

Back in the “good ole’ days” of BBSs and FidoNet, people used simple protocols like XModem (advancing to YModem and then ZModem) or Kermit that worked by streaming character data directly to your terminal. It wasn’t fast or particularly efficient, but it got the job done. I thought I’d take a crack at building something similar for PowerShell, and this first 0.5 release is the fruit of this weekend’s tinkering. At the moment it can only “pull” a file to the local system from a remote session, but the next release will allow “pushing” a file from a local system to a remote session.

The reason I named it after XModem is that it works in a similar way: files are not “pulled” from the remote server by the client, but instead are “pushed” by the remote end, just as X[YZ]Modem file transfers were initiated by the remote end. I’ll not spoil the fun by explaining how it works, but I think you’ll enjoy pulling it apart. It’s in module format and is implemented in pure script.

Requirements

  • PowerShell 2.0 installed on both client and server with remoting enabled to the location of the file being transferred.

E.g. if you want to grab a file using Get-RemoteFile from a remote server, you must be able to create a valid PSSession to it with the New-PSSession cmdlet. When Send-LocalFile is implemented, you’ll need remoting enabled in the other direction too.

  • The PMODEM module must be findable on both the client and server via import-module and must be the same version.

Here’s the Get-RemoteFile function help (via –?):

NAME
    Get-RemoteFile

SYNOPSIS
    Retrieves a file from a remote computer via a supplied PSSession.


SYNTAX
    Get-RemoteFile [-Session] <pssession> [-RemoteFile] <string> [[-LocalPath] <string>] [[-PacketSize] <int32>]
	[-PassThru] [-AsJob] [<commonparameters>]


DESCRIPTION
    Retrieves a remote file from a server via a supplied PSSession. All communication
    is performed out-of-band, yet inside the secure WinRM channel.

    No other ports, file shares or any other special configuration is needed. However,
    the PMODEM module must be on the remote computer and findable in its $ENV:PSModulePath;
    the protocol versions must also match on both ends. You will be warned of any
    misconfiguration(s).

    When not running asynchronously, progress records are generated.


RELATED LINKS

REMARKS
    To see the examples, type: "get-help Get-RemoteFile -examples".
    For more information, type: "get-help Get-RemoteFile -detailed".
    For technical information, type: "get-help Get-RemoteFile -full".

Things coming in later releases: wildcards/multiple file support, compression and integration via proxy functions (copy-item/move-item/remove-item/rename-item etc).
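Based on the syntax above, a typical pull session looks like this (the server name and file paths are illustrative):

```powershell
# create a session to the remote machine (remoting must be enabled there)
$s = new-pssession -computername server01

# pull the remote file into the current directory, emitting the local FileInfo
get-remotefile -session $s -remotefile 'c:\logs\app.log' -localpath . -passthru
```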

Download PModem

Grab pmodem-0.5.zip and unzip it into a folder in your $ENV:PSModulePath on the client and server computers you want to use PMODEM on.

Have fun!

About the author

Irish. PowerShell MVP, .NET/ASP.NET/SharePoint developer, budding architect. Montrealer. Opinionated. Montreal, Quebec.
