Server Timeout C#


So, I'm developing a client-side application in C# that connects to a server-side application (also written in C#). To begin with, I am just trying to get the applications to communicate with one another successfully. Currently, I have both the client and server running on the same device.


Server Side


On the server side, I'm using a TcpListener to accept a socket, print a debug message when a client connects, receive a request, and send a response. The code can be found below:

Server-Side Code:

while (true)
{
    // Accept a new connection
    Socket socket = socketListener.AcceptSocket();
    if (socket.Connected)
    {
        Console.WriteLine("\nClient Connected!!\n==================\nClient IP {0}\n", socket.RemoteEndPoint);
        
        // Make a byte array and receive data from the client
        byte[] receive = new byte[1024];
        int bytesReceived = socket.Receive(receive, receive.Length, 0);

        // Convert only the bytes actually received to a string
        string buffer = Encoding.ASCII.GetString(receive, 0, bytesReceived);

        
        string response = "Test response";
     
        int numBytes = 0;
        try
        {
            if (socket.Connected)
            {
                if ((numBytes = socket.Send(data, data.Length, 0)) == -1)
                    Console.WriteLine("Socket Error: cannot send packet");
                else
                    Console.WriteLine("No. of bytes sent {0}", numBytes);
            }
            else
            {
                Console.WriteLine("Connection Dropped...");
            }
        }
        catch (Exception e)
        {
            Console.WriteLine("An exception has occurred: "   e.ToString());
        }            
    }
}

Client Side


On the client side, I'm using a TcpClient to connect to the server by IP address (in this case 127.0.0.1), open a NetworkStream, send a request, and read the response.

Client-Side Code:

private static readonly TcpClient socket = new TcpClient();

private const string IP = "127.0.0.1";
private const int PORT = 46495;

static void Main(string[] args)
{
    try
    {
        socket.Connect(IP, PORT);
    }
    catch (Exception)
    {
        Console.WriteLine("Error connecting to the server.");
        return;
    }

    NetworkStream stream = socket.GetStream();
    stream.ReadTimeout = 2000;

    string request = "Test Request";
    byte[] bytes = Encoding.UTF8.GetBytes(request);
    stream.Write(bytes, 0, bytes.Length);

    StreamReader reader = new StreamReader(stream, Encoding.UTF8);

    try
    {
        string response = reader.ReadToEnd();
        Console.WriteLine(response);
    }
    catch(Exception e)
    {
        Console.WriteLine(e);
    }
}

The Output


On the server side, everything appears to be fine. The client connects successfully with the expected IP address, I get the expected request, and the correct response appears to have been sent successfully.

The client side is where it gets more complicated. Where I would expect the "Test Response" response, I instead get a SocketException that, from what I understand, indicates a timeout. The full output can be found below:

System.IO.IOException: Unable to read data from the transport connection: A connection attempt failed because the connected party did not properly respond after a period of time, or an established connection failed because the connected host has failed to respond...
 ---> System.Net.Sockets.SocketException (10060): A connection attempt failed because the connected party did not properly respond after a period of time, or an established connection failed because the connected host has failed to respond.
   at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   --- End of inner exception stack trace ---
   at System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.IO.StreamReader.ReadBuffer()
   at System.IO.StreamReader.ReadToEnd()
   at Client.Client.Main(String[] args) in C:\Dev\Project Orange Sunshine\Project Orange Sunshine\Client\Client.cs:line 38

What I have tried


To begin I wanted to ensure that my server was in fact sending a response in the first place. To test this, I tried accessing the server application through a web browser. Sure enough, I got a blank page with the expected "Test Response" text in the top left corner. This, to me, indicates my server application is working as expected.

Through some googling, I have found a variety of answers to similar questions stating that it is likely that the Windows Defender Firewall is blocking the port that is being used. For testing purposes, I tried disabling the firewall entirely for private networks such as the one that I am on. This didn't change anything, unfortunately.

I feel like I am missing something obvious and any input would be greatly appreciated.

Cheers!

CodePudding user response:

StreamReader.ReadToEnd() on a NetworkStream only returns once the "end" of the stream is reached. That never happens in your example, because the server keeps the connection open after sending, so the underlying read eventually hits the 2000 ms ReadTimeout you set on the stream and throws.
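For illustration, ReadToEnd would return normally if the server signalled end-of-stream, e.g. by shutting down its side of the connection once the response has been sent (your server never does this, so the client just keeps waiting):

// Server side, after the Send call: no more data will follow on this connection
socket.Shutdown(SocketShutdown.Send);
socket.Close();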

You should fix this by using the lower-level NetworkStream.Read method to read from the stream:

var buffer = new byte[4096];
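// A single Read returns as soon as some data is available and may return fewer bytes than the buffer size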
var bytesRead = stream.Read(buffer, 0, buffer.Length);
Console.WriteLine("Read {0} bytes", bytesRead);
string response = Encoding.UTF8.GetString(buffer, 0, bytesRead);
Console.WriteLine(response);

To make this test program more robust, you will also need to introduce "framing", i.e., some way for the server to indicate to the client that it can stop reading. This can be a terminator suffix, such as \r\n used by HTTP, or a length prefix that is sent upfront to tell the client how many more bytes to read.
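As a rough sketch of the length-prefix approach (assuming the same Socket/NetworkStream setup as in your code; the 4-byte length and the ReadExactly helper are just illustrative, and BitConverter's native byte order is fine here only because both ends run on the same machine):

// Server side: send a 4-byte length prefix, then the payload
byte[] payload = Encoding.UTF8.GetBytes("Test response");
socket.Send(BitConverter.GetBytes(payload.Length));
socket.Send(payload);

// Client side: read the 4-byte length, then keep reading until the whole
// payload has arrived (a single Read may return fewer bytes than requested)
static byte[] ReadExactly(NetworkStream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new IOException("Connection closed before the full message arrived.");
        offset += read;
    }
    return buffer;
}

int length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
string message = Encoding.UTF8.GetString(ReadExactly(stream, length));
Console.WriteLine(message);

With a terminator suffix instead, the client would read until it sees the agreed-upon delimiter rather than a byte count.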
