What options exist to stream video from iOS to browser?


I'm looking for a way to implement real-time streaming of video (and optionally audio) from an iOS device to a browser. In this case the iOS device is the server and the browser is the client.

Video resolution must be in the range 800x600 to 1920x1080. Probably the most important criterion is latency, which should be under 500 ms.

I've tried a few approaches so far.

1. HLS

Server: Objective-C, AVFoundation, UIKit, custom HTTP-server implementation

Client: JS, VIDEO tag

Works well. Streams smoothly. The VIDEO tag in the browser handles the incoming video stream out of the box. This is great! However, it has lag that is hard to minimize. It feels like this protocol was built for non-interactive video streaming, something like Twitch, where a few seconds of lag is fine. I tried enabling Low-Latency HLS: a lot of requests, a lot of hassle with the playlist. Let me know if this is the right option and I just have to push harder in this direction.
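For reference, client-side playback is roughly this (the playlist URL is a placeholder; Safari plays HLS natively, other browsers would need an MSE-based player such as hls.js):

```js
// Rough playback sketch. The playlist URL is a placeholder for whatever the
// custom HTTP server on the iOS device actually serves.
const video = document.querySelector('video');
const playlistUrl = 'http://ios-device.local:8080/stream.m3u8';

if (video.canPlayType('application/vnd.apple.mpegurl')) {
  // Safari plays HLS natively.
  video.src = playlistUrl;
  video.play();
} else if (window.Hls && Hls.isSupported()) {
  // Other browsers need an MSE shim such as hls.js.
  const hls = new Hls();
  hls.loadSource(playlistUrl);
  hls.attachMedia(video);
}
```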

2. Compress every frame into JPEG and send to a browser via WebSockets

Server: Objective-C, AVFoundation, UIKit, custom HTTP-server implementation, WebSockets server

Client: JS, rendering via IMG tag

Works super-fast and super-smooth. Latency is 20-30 ms! However, when I receive a frame in the browser, I have to load it from a Blob via a base64-encoded URL. At the start all of this works fast and smoothly, but after a while the browser starts to slow down and lag. I'm not sure why; I haven't investigated too deeply yet. Another issue is that frames compressed as JPEGs are much larger (60-120 KB per frame) than the MP4 video stream of HLS. This means more data is pumped through Wi-Fi, and other Wi-Fi consumers start to struggle. This approach works but doesn't feel like a perfect solution.
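Simplified, the rendering loop on the client is something like this (the WebSocket URL and element id are placeholders):

```js
// Placeholder endpoint; the real one is served by the custom server on the device.
const ws = new WebSocket('ws://ios-device.local:9090/frames');
const img = document.getElementById('frame');

ws.onmessage = (event) => {
  // Each message is one JPEG frame, delivered as a Blob.
  const reader = new FileReader();
  reader.onload = () => {
    img.src = reader.result; // base64 data URL: "data:image/jpeg;base64,..."
  };
  reader.readAsDataURL(event.data);
};
```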

Any ideas or hints (frameworks, protocols, libraries, approaches, etc.) are appreciated!

CodePudding user response:

  1. HLS

… It feels like this protocol was built for non-interactive video streaming …

Yep, that's right. The whole point of HLS was to use generic HTTP servers as media streaming infrastructure rather than proprietary streaming servers. As you've seen, that comes with several tradeoffs. The biggest problem is that the media is chunked into segments, which naturally causes latency of at least one segment's duration. In practice it ends up being a couple of segments' worth: with the commonly recommended 6-second segments and a player that starts a few segments behind the live edge, you're already roughly 18 seconds or more behind.

"Low latency" HLS is a hack to return to the methods we had before HLS, with servers that just stream content from the origin, in a way compatible with all the HLS stuff we have to deal with now.

  2. Compress every frame into JPEG and send to a browser via WebSockets

In this case, you've essentially recreated a video codec and added the overhead of WebSockets. Also, by base64-encoding the frames rather than sending them as binary, you're adding extra CPU and memory load, as well as ~33% bandwidth overhead (every 3 bytes of JPEG become 4 characters of text).

If you really wanted to go this route, you could simply use MediaRecorder with an HTTP PUT request: stream the recorder's output to the server and have the server relay it on to the client over HTTP. The client then just needs a <video> tag referencing some URL on the server, and nothing special for playback. You'd get nice low latency without all the overhead and hassle.
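Something like this on the sending side, if the capture happened in a browser context (your capture side is native iOS, so this only sketches the shape of the idea; the ingest URL is made up):

```js
// Capture camera + mic and push small chunks to the server as they become available.
async function startSending() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8,opus' });

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) {
      // Hypothetical ingest endpoint; the server appends each chunk and relays
      // the resulting stream to viewers over plain HTTP.
      fetch('https://example.com/ingest/session-123', { method: 'PUT', body: event.data });
    }
  };

  recorder.start(250); // emit a chunk roughly every 250 ms
}
```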

However, don't go that route. What if the bandwidth drops out? What if some packets are lost and you need to re-sync? How will you set up communication between the two ends to continually adjust quality, handle buffering, negotiate codecs, and so on? What if peer-to-peer connections are advantageous?

Use WebRTC

It's a full stack purpose-built for maintaining low latency. Libraries are available for just about any stack on just about any platform, and it works in browsers.
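For a feel of the browser side, here's a minimal receiving sketch; the signaling channel (how you exchange the offer/answer and ICE candidates with the iOS sender) is app-specific and omitted:

```js
// Minimal WebRTC receiver in the browser. Signaling is app-specific and not shown.
const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });

pc.ontrack = (event) => {
  // Attach the incoming media stream to a <video autoplay playsinline id="remote"> element.
  const video = document.getElementById('remote');
  if (video.srcObject !== event.streams[0]) {
    video.srcObject = event.streams[0];
  }
};

// Called with the sender's SDP offer, received over your signaling channel.
async function handleOffer(offerSdp, sendAnswerBack) {
  await pc.setRemoteDescription({ type: 'offer', sdp: offerSdp });
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  sendAnswerBack(pc.localDescription.sdp); // deliver the answer back to the iOS side
}
```

On the iOS side, Google's native WebRTC library (or a wrapper around it) would handle the corresponding sender role.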

Rather than reinventing all of this, you can take advantage of what's there.

The downside is complexity... it isn't easy to get started with, but it's well worth it for most low-latency use cases.
