I am building a net6.0 application that has to interact with an external device communicating over an RS232 serial port.
The external device uses a client-server style protocol to communicate with the application, where the size and some header-like parts of each message packet are known beforehand.
In my first attempt, I polled the serial port in an infinite while loop, which worked fine, although it would take quite a long time to synchronize (approx. 30 seconds).
I tried to work around that and move to a more "event-driven approach", reading the data via the DataReceived event.
While it seemed that I was getting data back, the actual contents of the buffer were significantly different from what I expected: much bigger in size (I expected approx. 10-15 bytes at most, but got around 140 bytes).
I read the remarks on the second link provided, and they seem somewhat ambiguous:
- The operating system decides when to raise an event
- An event is not necessarily raised upon each byte arrival
My questions are:
When is the DataReceived event triggered? Could it be that the OS buffers the received data and delivers it as a batch? For example, one "request" from the RS232 device might be 12 bytes and the next one 14 bytes, etc., so that by the time I try to access the data in the buffer there is a much bigger amount of bytes in it?
Is there a way to configure the application or the OS (I am not sure how portable such a solution would be) so that whenever the RS232 device sends any kind of payload (for example either 12 bytes or 14 bytes, etc.), an event is explicitly triggered?
Thank you very much for your time!
CodePudding user response:
What you are looking for is probably this:
Use the BytesToRead property to determine how much data is left to read in the buffer. The DataReceived event usually fires when an EOC (end-of-communication) character is sent by the client. It also fires once enough bytes have accumulated in the input buffer (this is what the ReceivedBytesThreshold property controls), and sometimes it fires for every single byte.
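As a side note on your second question: you can raise ReceivedBytesThreshold (default is 1) so the event does not fire for every single byte. A minimal sketch of that, with a hypothetical port name and settings; note this does not guarantee exactly one event per message:

    using System.IO.Ports;

    var port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
    // Fire DataReceived only once at least 12 bytes are buffered.
    port.ReceivedBytesThreshold = 12;
    port.Open();

A handler that collects bytes until the end-of-communication marker arrives could look like this (VB.NET):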
    Private Sub _SerialPortReader_DataReceived(sender As Object, e As SerialDataReceivedEventArgs) Handles _SerialPortReader.DataReceived
        ' The sender is the SerialPort that raised the event.
        Dim currentSP As SerialPort = DirectCast(sender, SerialPort)
        Dim strBuilder As New System.Text.StringBuilder()
        Dim incomingText As String = ""
        currentSP.ReadTimeout = 1000
        Do
            Try
                ' Read one byte at a time and append it as a character.
                strBuilder.Append(Convert.ToChar(currentSP.ReadByte()))
                incomingText = strBuilder.ToString()
                ' Stop once the end-of-communication marker has arrived.
                If incomingText.Contains(CustomSharedFunctions.GetStringFromDecimal(_EndofCommunicationString)) Then Exit Do
            Catch ex As TimeoutException
                ' No further byte arrived within ReadTimeout.
                Exit Do
            End Try
        Loop
        ' incomingText now contains the full message sent to you. Remember this
        ' event is raised on a background thread, so you might want to use Invoke() here.
    End Sub
The equivalent code in C#:
    private void _SerialPortReader_DataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        // The sender is the SerialPort that raised the event. A direct cast
        // is needed here; Convert.ChangeType returns object and will not
        // compile as an implicit assignment in C#.
        SerialPort currentSP = (SerialPort)sender;
        var strBuilder = new System.Text.StringBuilder();
        string incomingText = "";
        currentSP.ReadTimeout = 1000;
        do
        {
            try
            {
                // Read one byte at a time and append it as a character.
                strBuilder.Append((char)currentSP.ReadByte());
                incomingText = strBuilder.ToString();
                // Stop once the end-of-communication marker has arrived.
                if (incomingText.Contains(CustomSharedFunctions.GetStringFromDecimal(_EndofCommunicationString)))
                    break;
            }
            catch (TimeoutException)
            {
                // No further byte arrived within ReadTimeout.
                break;
            }
        }
        while (true);
        // incomingText now contains the full message; as above, this runs on a
        // background thread.
    }
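For completeness, a minimal sketch of wiring the handler up (the port name and settings are assumptions, adjust them to your device):

    using System.IO.Ports;

    var _SerialPortReader = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
    _SerialPortReader.DataReceived += _SerialPortReader_DataReceived;
    _SerialPortReader.Open();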
CodePudding user response:
Maybe you should not care about how/when the OS manages its buffers: that is usually governed by a subtle balance of buffer sizes and timeouts, tuned for performance.
As I understand it, you are working at the application level, so since you are using a given protocol, you should rather focus on detecting ends of frames than on single bytes.
I would suggest building a kind of parser/adapter on top of the incoming data to detect well-formed / split frames (exposed through a dedicated/custom event, to be elegant/robust?), as sketched below.
Protocols are just a matter of nested layers/frames down to bits/bytes...
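A minimal sketch of such a parser in C#, assuming a hypothetical frame layout where byte 0 is a fixed header byte and byte 1 is the payload length; your real protocol will differ, but the buffering idea stays the same:

    using System;
    using System.Collections.Generic;

    public class FrameParser
    {
        // Hypothetical framing: [0x02 header][length][payload...].
        // Adjust the constants and layout to the actual device protocol.
        private const byte Header = 0x02;
        private readonly List<byte> _buffer = new();

        // Dedicated event raised once per complete, well-formed frame,
        // regardless of how the OS chunked the incoming bytes.
        public event Action<byte[]>? FrameReceived;

        // Feed whatever DataReceived hands you, in whatever chunk sizes.
        public void Feed(byte[] chunk)
        {
            _buffer.AddRange(chunk);
            while (true)
            {
                // Discard noise until a header byte is found.
                int start = _buffer.IndexOf(Header);
                if (start < 0) { _buffer.Clear(); return; }
                if (start > 0) _buffer.RemoveRange(0, start);

                // Wait until the length byte and the full payload are present.
                if (_buffer.Count < 2) return;
                int frameLength = 2 + _buffer[1];
                if (_buffer.Count < frameLength) return;

                byte[] frame = _buffer.GetRange(0, frameLength).ToArray();
                _buffer.RemoveRange(0, frameLength);
                FrameReceived?.Invoke(frame);
            }
        }
    }

Inside the DataReceived handler you would then simply drain BytesToRead into Feed() and let subscribers of FrameReceived deal with whole messages only.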