I know Code Review exists for 'speed up' questions, but I also need to fix a problem in my script.
I migrated from Command Prompt (.bat) to PowerShell (.ps1) because I find Command Prompt hard to use for scripting complex things. I've heard that PowerShell may have more overhead than Command Prompt, but as long as it is fast enough, I don't care.
Here are links to the two files, sdn.old.bat and sdn.new.ps1. I put them on a paste site because they are quite long.
Here is my problem.
This part takes a very long time to run.
$logs_loc = @(
    "$Env:LocalAppdata"
    "$Env:Appdata"
)
ForEach ($item in $logs_loc) {
    Get-ChildItem -Path "$item\*" -Recurse -Force -Include *.log *.log.txt | Remove-Item -Force
}
It takes ~5 seconds and I don't know why. It also doesn't do anything: this code should remove all *.log or *.log.txt files under %Appdata% and %LocalAppdata%, but it doesn't delete anything. I tested it with blank *.log and *.log.txt files placed at random locations, but they remain after the run.
I haven't tested the other parts of my script, so there may be other problems too...
TL;DR
- I don't know why my code doesn't work, and it also takes too long to run.
- Is there anything that can 'improve' the speed?
CodePudding user response:
The issue is that your -Include parameter is interpreting *.log *.log.txt as a single pattern, but we want it to match either pattern, so just separate them with a comma (,), i.e. *.log, *.log.txt.
$logs_loc = @(
    "$Env:LocalAppdata"
    "$Env:Appdata"
)
ForEach ($item in $logs_loc) {
    Get-ChildItem -Path "$item\*" -Recurse -Force -Include *.log, *.log.txt | Remove-Item -Force
}
As for speed, this seems like an operation you won't be performing often, so I'm not sure it's worth investing much time into making it faster than it is; mine takes 10 seconds. I suggest asking Code Review for that bit!
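If you want to time it yourself, Measure-Command can wrap the loop. A minimal sketch, assuming the same two folders as your script (Remove-Item is left out so the timing run doesn't delete anything):

Measure-Command {
    foreach ($item in $env:LocalAppdata, $env:Appdata) {
        # Same enumeration as above, but only listing, not deleting.
        Get-ChildItem -Path "$item\*" -Recurse -Force -Include *.log, *.log.txt
    }
} | Select-Object -ExpandProperty TotalSeconds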
CodePudding user response:
Otter's helpful answer already explains the issue with your current code: the -Include parameter takes a string[] (string array) as its argument, so if you want to pass multiple filters to the parameter you need to separate them with a comma (,). See about_Arrays for details.
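For example, a small sketch that collects the patterns in an array variable first and previews the deletions with -WhatIf (the variable name is just for illustration):

$patterns = '*.log', '*.log.txt'   # a [string[]] array of wildcard patterns
# -WhatIf only reports what would be removed; drop it to actually delete.
Get-ChildItem -Path "$env:AppData\*" -Recurse -Force -Include $patterns | Remove-Item -Force -WhatIf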
As for improving the efficiency of your code, you would need to make .NET API calls to IO.DirectoryInfo, as zett42 points out in a comment. For handling the folder recursion you can use a Queue<T>:
$env:LocalAppdata, $env:Appdata | & {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline)]
        [string] $Path
    )

    # Collect the root folders coming from the pipeline in a queue,
    # then walk them iteratively instead of recursing.
    begin { $queue = [Collections.Generic.Queue[IO.DirectoryInfo]]::new() }
    process { $queue.Enqueue($Path) }
    end {
        while ($queue.Count) {
            $dir = $queue.Dequeue()
            # Emit the matching files; they are piped to Remove-Item below.
            foreach ($filter in '*.log', '*.log.txt') {
                $dir.EnumerateFiles($filter)
            }
            # Queue the subfolders so they are processed on later iterations.
            foreach ($i in $dir.EnumerateDirectories()) {
                $queue.Enqueue($i)
            }
        }
    }
} -ErrorAction SilentlyContinue | Remove-Item -Force
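Note the design: the queue turns the folder recursion into an iterative, breadth-first walk. Each folder is dequeued once, its matching files are streamed straight to Remove-Item, and its subfolders are queued for a later iteration, so no recursive function calls are needed.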