In a PowerShell class I'm taking, we were introduced to the `BEGIN {}` and `END {}` blocks of a PowerShell function, and how you can use them to do things like manage connections.
This got me thinking:
- Imagine I was establishing some sort of object that had a large memory footprint, or took some noticeable time to generate. Do the `BEGIN {}` and `END {}` blocks reduce any memory/processing costs, especially if my function is being used in a pipeline?
- Why would you want to use this instead of `try`/`catch`/`finally`? If you establish a connection and some part of the code fails, wouldn't you want a `finally` block to close the connection?
CodePudding user response:
Re 1:

`begin` and `end` blocks inside a script or function execute once per call, namely before and after pipeline input is processed. Pipeline input itself is processed in the `process` block, which is invoked once for each pipeline input object - see about_Functions.
Therefore, it makes sense to:
- place one-time initializations that need not vary depending on a given pipeline input object - such as a database connection - in the `begin` block
- perform any necessary cleanup in the `end` block.
The caveat is that your `end` block may situationally not get to execute, namely if:
- a downstream cmdlet (one in a later pipeline segment) causes a terminating error, or
- the `Select-Object` cmdlet stops the pipeline prematurely via its `-First` parameter.
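As a quick illustration of the second case (a minimal sketch, not part of the original answer), piping through `Select-Object -First` stops the upstream script block before its `end` block ever runs:

```powershell
# Sketch: the end block is skipped when Select-Object stops the pipeline early.
1..5 | & {
  param([Parameter(ValueFromPipeline)] $InputObject)
  begin   { Write-Host 'begin ran' }
  process { $InputObject }           # emit each input object downstream
  end     { Write-Host 'end ran' }   # never reached: -First 2 stops the pipeline
} | Select-Object -First 2
```

Only `begin ran` and the first two values appear; any cleanup placed in `end` would silently be skipped.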
There is a pending improvement (as of this writing):
- A post-v7.2 version will allow use of a `clean` block (the name may change) that can be used for cleanup that is guaranteed to execute - see RFC #294.
Re 2:

A `try`/`catch`/`finally` statement can only be placed inside one of the three available blocks (`begin`, `process`, `end`), and only inside that block is `finally` then guaranteed to execute.

Using none of the three blocks means that the function's body is implicitly placed in the `end` block, i.e. it is only ever executed once per call. (Such functions are usually not designed for pipeline input.)
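A quick way to see the once-per-call behavior (a minimal sketch, not part of the original answer): a function without explicit blocks runs its body a single time, after all pipeline input has been collected in the automatic `$input` enumerator:

```powershell
function Test-ImplicitEnd {
  # No begin/process/end blocks: the body is implicitly the end block,
  # so it runs once, with all pipeline input available via $input.
  "body ran once; collected input: $($input -join ', ')"
}

1, 2, 3 | Test-ImplicitEnd   # emits a single line, not three
```

`Test-ImplicitEnd` is a hypothetical name used for illustration only.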
The above implies that you cannot use a single `try`/`catch`/`finally` statement for once-per-call initialization and cleanup if you also want to support per-object processing via `process`, because the initialization and cleanup then have to live in different blocks (`begin` vs. `end`).

Once the future `clean` block mentioned above is available, cleanup should be moved into the `clean` block for guaranteed execution.
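For reference, a sketch of what that could look like, assuming the `clean` block as proposed in RFC #294 (the name and exact semantics may differ in the shipped version):

```powershell
# Hypothetical sketch based on RFC #294's proposed clean block.
1, 2, 3 | & {
  param([Parameter(ValueFromPipeline)] $InputObject)
  begin   { $streamWriter = [System.IO.StreamWriter] "$PWD/sample.txt" }
  process { $streamWriter.WriteLine($InputObject) }
  # clean is proposed to run even if the pipeline stops prematurely,
  # unlike end, making it the natural home for guaranteed cleanup.
  clean   { if ($streamWriter) { $streamWriter.Dispose() } }
}
```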
Example:

This sample code shows the use of the `begin`/`process`/`end` blocks in an advanced script block (`{ ... }`) invoked with `&`, for brevity (what is inside `& { ... }` can be used as-is as a function or script body).
It creates and opens an output file in the `begin` block, writes each pipeline input object to it in the `process` block, then closes the file in the `end` block (this example is contrived; in real life, you'd simply use `Set-Content`).
```powershell
1, 2, 3 | & {
  param([Parameter(ValueFromPipeline)] $InputObject)
  begin {
    Write-Verbose "Initializing: Opening output file (stream writer)"
    try {
      $streamWriter = [System.IO.StreamWriter] "$PWD/sample.txt"
    } catch { throw } # Abort, if an exception occurred.
  }
  process {
    Write-Verbose "Saving to file: $InputObject"
    $streamWriter.WriteLine($InputObject)
  }
  end {
    Write-Verbose "Cleaning up: Closing file"
    $streamWriter.Dispose()
  }
} -Verbose
```
The results are saved in file `sample.txt` in the current directory (be sure to remove it manually), and the following verbose output prints to the display, illustrating how the blocks are invoked in response to the pipeline input (`1, 2, 3`):
```
VERBOSE: Initializing: Opening output file (stream writer)
VERBOSE: Saving to file: 1
VERBOSE: Saving to file: 2
VERBOSE: Saving to file: 3
VERBOSE: Cleaning up: Closing file
```
This blog post contains a more detailed discussion.

Thanks, Hazrelle