I have a function:
function LocateIP ([String] $IPPool, [String] $IP) {
    # True if the pool text contains the IP as a substring
    return $IPPool.Contains($IP)
}
where $IPPool is the content of a file holding lots of IPs and $IP is, of course, the IP to look for. The function needs to return true if $IP is inside the IP pool.
This works fine on its own; the problem arises when I iterate over a file of IPs and run LocateIP on each line. If the file holds more than 50k IPs, checking line by line takes a long time, and it only gets worse as the file grows. Is there another way that would let me work with bigger files?
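For reference, the loop looks roughly like this (file names are placeholders):

$pool = Get-Content .\pool.txt -Raw
foreach ($ip in Get-Content .\ips-to-check.txt) {
    LocateIP -IPPool $pool -IP $ip
}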
CodePudding user response:
I imagine there are plenty of ways to speed it up:
- Switch from ForEach-Object { } to the foreach () { } language statement.
- Don't call a PowerShell function for every line. Instead of calling LocateIP, run the $line.Contains($IP) test directly; PowerShell function calls have a lot of overhead.
- Avoid using Get-Content to read the file when you could use [System.IO.File]::ReadAllLines() instead. (The first three points are combined in the sketch after this list.)
- Push all the work down to a faster cmdlet such as Select-String -Pattern ([regex]::Escape($IP)) -Path file.txt (see the second example below).
- IP addresses form a tree structure; if you need to load the file once and then do lots of checks, you could build a structure with faster lookup performance (see the last sketch below).
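A minimal sketch combining the first three points, assuming the pool lives at C:\data\pool.txt (placeholder path):

# Read every line at once, then test with the foreach statement and a
# direct .Contains() call, with no per-line function invocation.
# Note: .NET resolves relative paths against the process working
# directory, so pass an absolute path to ReadAllLines().
$found = $false
foreach ($line in [System.IO.File]::ReadAllLines('C:\data\pool.txt')) {
    if ($line.Contains($IP)) { $found = $true; break }
}
$found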
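For the Select-String route, the -Quiet switch returns $true when a match is found, which is all the original function needs (path is a placeholder):

# $true if any line contains the IP; the [bool] cast guards against
# a null result when nothing matches
[bool](Select-String -Path 'C:\data\pool.txt' -Pattern ([regex]::Escape($IP)) -Quiet)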
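For the last point, the simplest fast-lookup structure is a HashSet rather than an actual trie. This is a sketch under the assumption that each line of the pool file is exactly one IP (paths are placeholders):

# Build the set once; membership checks are O(1) afterwards.
$pool = [System.Collections.Generic.HashSet[string]]::new(
    [string[]][System.IO.File]::ReadAllLines('C:\data\pool.txt'))
foreach ($ip in [System.IO.File]::ReadAllLines('C:\data\ips-to-check.txt')) {
    # Note: exact-match lookup per line, unlike the substring test above
    if ($pool.Contains($ip)) { "$ip is in the pool" }
}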