So I have this script that loops through the Windows Security log to check if there has been any activity on an account in the last 7 days (the retention on the log is 7 days - it would be a bonus to get a day limiter into the script). However, each run takes about 6 hours (1.2 million events). The fact that listing them inside the Event Viewer only takes a couple of seconds makes me believe the code can be written in a more optimized way. Any insight on this?
Code:
$startTime = Get-Date
$filter = @{LogName='Security';ProviderName='Microsoft-Windows-Security-Auditing'}
$i = 0
$entries = Get-WinEvent -FilterHashtable $filter -ComputerName localhost | ForEach-Object {
    $eventXml = ([xml]$_.ToXml()).Event
    $userName = ($eventXml.EventData.Data | Where-Object { $_.Name -eq 'TargetUserName' }).'#text'
    $computer = ($eventXml.EventData.Data | Where-Object { $_.Name -eq 'WorkstationName' }).'#text'
    if ($userName -match "Username1" -or $userName -match "Username2" -or $userName -match "Username3") {
        [PSCustomObject]@{
            Time     = [DateTime]$eventXml.System.TimeCreated.SystemTime
            UserName = $userName
            Computer = $computer
        }
    }
    $i++
    Write-Progress -Activity "Scanning Win Events..." -Status "Scanned: $i"
}
$filetime = Get-Date -Format "ddMMyyyyHHmm"
$entries | Out-File "C:\Temp\UsedAccounts$filetime.txt"
$endTime = Get-Date
'Duration: {0:mm} min {0:ss} sec' -f ($endTime - $startTime)
CodePudding user response:
As commented, there are some ways to speed things up:
- Add an event ID to the filter instead of asking for all event types. Also, not all events will have a TargetUserName item.
- Change the ForEach-Object loop into a foreach(), which is faster than piping.
- Do not write out stuff with Write-Progress (or Write-Host) inside the loop.
# filter on logon events, because not all events would have a 'TargetUserName' item
$filter = @{LogName='Security'; ProviderName='Microsoft-Windows-Security-Auditing'; ID=4624}
$entries = foreach ($entry in (Get-WinEvent -FilterHashtable $filter -ComputerName localhost)) {
    $eventXml = ([xml]$entry.ToXml()).Event
    $userName = ($eventXml.EventData.Data | Where-Object { $_.Name -eq 'TargetUserName' }).'#text'
    # if you need whole-word matching, change to
    # $userName -match '\b(Username1|Username2|Username3)\b'
    if ($userName -match 'Username1|Username2|Username3') {
        $computer = ($eventXml.EventData.Data | Where-Object { $_.Name -eq 'WorkstationName' }).'#text'
        # output an object
        [PSCustomObject]@{
            Time     = [DateTime]$eventXml.System.TimeCreated.SystemTime
            UserName = $userName
            Computer = $computer
        }
    }
    # don't waste time writing unnecessary stuff in the loop with
    # Write-Progress or Write-Host
}
# now output the objects to a structured Csv file you can
# double-click to open in Excel
$filetime = Get-Date -Format "ddMMyyyyHHmm"
$entries | Export-Csv -Path "C:\Temp\UsedAccounts$filetime.csv" -UseCulture -NoTypeInformation
Almost forgot: you can limit the events to the last 7 days by extending the filter:
$filter = @{
    LogName      = 'Security'
    ProviderName = 'Microsoft-Windows-Security-Auditing'
    ID           = 4624
    StartTime    = (Get-Date).AddDays(-7).Date
}
CodePudding user response:
Theo was faster than me, but in addition to his suggestions I would recommend using -FilterXml. If you switch to XML for your filter instead of a hashtable, you can definitely improve your filter options, including specifying how recent your results are.
$filter = @'
<QueryList>
  <Query Id="0" Path="Security">
    <Select Path="Security">*[System[TimeCreated[@SystemTime&gt;='2021-09-15T17:39:00.000Z']]]</Select>
  </Query>
</QueryList>
'@
$entries = Get-WinEvent -FilterXml $filter
That would get everything in the Security log from today at 10:39 AM to now. You could set that to whatever date/time you want. Is it fast? No, it really isn't. Loading 10 minutes of log data, just over 8200 entries for me, took 1 minute 18 seconds. Is it faster than grabbing the entire log? Oh heck yeah it is! I have not allowed it to load the 109k entries my system has (I have a 100MB limit on the file), but I imagine it would be proportionately longer.
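Rather than hard-coding the timestamp in the query, you can build the XML with a dynamically computed UTC timestamp (a sketch under the same filter as above; note the switch to a double-quoted here-string so the $since variable expands):

$since = (Get-Date).AddDays(-7).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
$filter = @"
<QueryList>
  <Query Id="0" Path="Security">
    <Select Path="Security">*[System[TimeCreated[@SystemTime&gt;='$since']]]</Select>
  </Query>
</QueryList>
"@
$entries = Get-WinEvent -FilterXml $filter

That gives you the "last 7 days" limiter from the question without editing the query each run.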
Past that, I would really consider just running a regex match against the message of the event rather than converting everything to XML. Admittedly, the bulk of your time is just getting the events into PowerShell, but processing them takes time too, and (on my dev box at least) converting to XML and parsing it takes about 60x longer than running a regex match against the message. Sure, that's just 30 milliseconds vs 1829, but that's to reduce 8.2k events down to 8; with 1.2 million events the difference becomes a bit more significant. Since this is always being run on the local machine, you may as well just use $env:COMPUTERNAME rather than pull the computer name from each event. So here's what I would do after you get your event log entries:
$filtered = foreach ($entry in $entries) {
    if ($entry.Message -match "(?ms)Target Subject:.*?Account Name:\s+(user1|user2|user3)") {
        [PSCustomObject]@{
            TimeCreated = $entry.TimeCreated
            User        = $Matches[1]
            Computer    = $env:COMPUTERNAME
        }
    }
}
$filetime = Get-Date -Format "ddMMyyyyHHmm"
$filtered | Out-File "C:\Temp\UsedAccounts$filetime.txt"
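If you want to reproduce the XML-vs-regex timing comparison on your own box, a rough sketch with Measure-Command looks like this (it assumes $entries already holds events loaded by Get-WinEvent; the regex here is just an illustrative pattern, not the exact one from the post):

$xmlTime = Measure-Command {
    foreach ($e in $entries) { $null = ([xml]$e.ToXml()).Event }
}
$regexTime = Measure-Command {
    foreach ($e in $entries) { $null = $e.Message -match 'Account Name:\s+\S+' }
}
'XML parse: {0:N0} ms  Regex match: {1:N0} ms' -f $xmlTime.TotalMilliseconds, $regexTime.TotalMilliseconds

The ratio will vary with event size and hardware, but it should make clear which processing step is worth avoiding at 1.2 million events.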