What method for live sending Python numpy data to Unity 3D?

Time:08-18

My goal for a project is to send some Python data live to Unity 3D.

More detailed: I'm processing a live webcam stream with OpenCV and Numpy to compute some values, and I (ideally) want to send those values continuously to Unity3D while the webcam and the Python script are running. A GameObject in Unity is then transformed live according to the received values.

I got this to work only statically so far by using a localhost Python socket and C# System.Net.Sockets: When pressing a key while my Python script is running, the socket will open and send data. To receive, I then have to press a GUI button in Unity and afterwards the socket closes.

My problem: Using the socket inside my Python while-loop while streaming the webcam makes my Python script freeze (in my static setup I worked around this freeze by sending random data back from Unity to Python), and I obviously also don't want to have to press a button in Unity every time I want to receive anything.

My question: What method is suited to continuously sending data from OpenCV (Python) to Unity? To be honest, I don't really know what I should look into, and I haven't found any similar issue on here so far. I've heard about things like Python UDP/TCP sockets or C# threads; is this something to go after? Or are sockets the thing to stick with and try to fix?

CodePudding user response:

You can simply do this: take the picture byte stream from a RenderTexture (Unity) and send it to a Python server over WebSockets. Sample code for connecting to the WebSocket:

// Assumes a WebSocket plugin exposing WebSocketFactory (e.g. the Unity WebGL websocket wrapper)
WebSocket ws;
public Camera cam;   // the camera whose targetTexture is captured

void Start() {
    ws = WebSocketFactory.CreateInstance("ws://localhost:12345");
    // ...

    // Connect to server
    ws.Connect();

    // Capture a frame every 0.2 s, starting after 1 s
    InvokeRepeating("Capture", 1f, 0.2f);
}

public void Capture() {
    // Remember the active render texture and render the camera into its target
    RenderTexture activeRenderTexture = RenderTexture.active;
    RenderTexture.active = cam.targetTexture;

    cam.Render();

    // Copy the rendered pixels into a Texture2D
    Texture2D image = new Texture2D(cam.targetTexture.width, cam.targetTexture.height);
    image.ReadPixels(new Rect(0, 0, cam.targetTexture.width, cam.targetTexture.height), 0, 0);
    image.Apply();
    RenderTexture.active = activeRenderTexture;

    // Encode to PNG and send over the websocket
    byte[] bytes = image.EncodeToPNG();
    Destroy(image);

    ws.Send(bytes);
}

Some important notes:

1- This sample encodes the images to PNG before sending. You can do that or not; it's your choice!

2- In this solution, everything runs on the main thread in Unity. Thus, it may lag and your program's frame rate may drop. Also, the way you encode matters (PNG payloads are huge!).

3- If this solution doesn't work, you can also try streaming with FMETP STREAM.
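For completeness, the Python side of this answer could be a small websocket server that receives the PNG bytes Unity sends. This is a minimal sketch, assuming the third-party `websockets` package (handler signature for websockets ≥ 10) and `numpy`; the port matches the `ws://localhost:12345` URL above, and the `cv2.imdecode` line (commented out) shows where OpenCV would decode the frame:

```python
import asyncio
import numpy as np

async def handle(ws):
    # Each websocket message is one PNG frame sent from Unity
    async for message in ws:
        buf = np.frombuffer(message, dtype=np.uint8)
        # frame = cv2.imdecode(buf, cv2.IMREAD_COLOR)  # with opencv-python installed
        print(f"received {len(message)} bytes -> buffer of {buf.size} values")

async def main():
    import websockets  # third-party: pip install websockets
    async with websockets.serve(handle, "localhost", 12345):
        await asyncio.Future()  # run forever

# Start the receive loop (runs until interrupted):
# asyncio.run(main())
```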

CodePudding user response:

There are multiple ways to send your numpy array from Python and receive it in Unity. One of the simplest is to convert your numpy array to a string, send it through a socket to Unity, and then convert it back to float in Unity. A sample for sending a string:

# python code
import socket 

UDP_IP = "127.0.0.1"
UDP_PORT = 5050

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

msg = "sample string!"

# Python 3 sockets send bytes, so encode the string first
sock.sendto(msg.encode("utf-8"), (UDP_IP, UDP_PORT))

//C# code
int port = 5050;
UdpClient client = new UdpClient(port);
IPEndPoint anyIP = new IPEndPoint(IPAddress.Parse("0.0.0.0"), port);

byte[] data = client.Receive(ref anyIP);
string text = Encoding.UTF8.GetString(data);
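Regarding the freeze in the question: a UDP `sendto` on an unconnected socket does not wait for a receiver, so it can sit inside the webcam while-loop without blocking it. A minimal sketch of that sender side (the `send_values` helper, the value list, and the port are illustrative, not from the original script):

```python
import socket

UDP_IP = "127.0.0.1"
UDP_PORT = 5050

def send_values(sock, values, addr=(UDP_IP, UDP_PORT)):
    """Encode a sequence of floats as a comma-separated string and send one UDP datagram."""
    msg = ",".join(f"{v:.4f}" for v in values)
    sock.sendto(msg.encode("utf-8"), addr)
    return msg

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

# In the real script this call would sit inside the OpenCV while-loop,
# with `values` computed from the current webcam frame:
send_values(sock, [1.0, 2.5, 3.25])
```

On the Unity side, the received string can then be split on commas and parsed back into floats.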

Another way, which may be a better approach (but I didn't try it myself), is to serialize your numpy array directly. Since an ndarray is a binary construct, you can use ndarray.tobytes() and then deserialize it in C#.

But for deserialization, pay attention to your numpy data types (int, float, etc.) and to the endianness of your machine (little endian / big endian).
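On the Python side, that binary route might look like this. A sketch with made-up values: the explicit `<f8` dtype pins the bytes to little-endian float64, which matches a `double[]` on a little-endian machine (e.g. x86), and `struct` is used here only to sanity-check the layout:

```python
import struct
import numpy as np

arr = np.array([1.5, -2.25, 3.0], dtype="<f8")  # little-endian float64
payload = arr.tobytes()  # raw bytes to send over the socket

# Sanity check: the same bytes read back as three little-endian doubles
values = struct.unpack("<3d", payload)
```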

An example of deserializing in C# (an array of bytes to an array of doubles), using the `data` received above:

double[] values = new double[data.Length / sizeof(double)];
Buffer.BlockCopy(data, 0, values, 0, data.Length);

You can see this question for more info.
