I am using a library to do RTMP streaming from a USB/UVC camera on Android. The library I am using is this one. It works just fine for Android USB cameras that output RGB. However, I am working with a thermal camera whose frame format is YUYV, so the initial camera image looks like the one below:
This image format is YUV and needs some processing before it is viewable. I'd therefore like to know how I can grab the frames from the camera, apply processing, and then update the preview.
My problem is that, when streaming from my device, the endpoint/device receiving the stream also sees this green image, since it still needs processing, and I don't know how to correct this on Android.
When connecting on Windows, I could process the image and make it viewable with the following steps:
import cv2
import numpy as np

cap = cv2.VideoCapture(-1)
# Ask OpenCV for the raw frames instead of auto-converted RGB
cap.set(cv2.CAP_PROP_CONVERT_RGB, 0)
while True:
    ret, frame = cap.read()
    frame = frame.reshape(292, 384, 2)
    # Remove the last 4 extra rows
    frame = frame[0:288, ...]
    # Reinterpret each little-endian byte pair as one uint16 sample
    dt = np.dtype(('<u2', [('x', np.uint8, 2)]))
    frame = frame.view(dtype=dt).astype(np.float32)
    # Normalize for display; imshow expects float images in [0, 1]
    gray = cv2.normalize(frame, None, 0, 1, cv2.NORM_MINMAX)
    cv2.imshow('frame', gray)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
This outputs a 16-bit greyscale frame.
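From what I understand, the equivalent byte-pairing and normalization step on Android would look roughly like the sketch below. This is just my own minimal attempt, not code from the streaming library: it assumes a little-endian 16-bit buffer (two bytes per pixel, as in the Python script) and uses a min/max normalization to produce a displayable 8-bit greyscale Bitmap.
import android.graphics.Bitmap

// Minimal sketch: combine each little-endian byte pair of a 16-bit frame
// into one sample, then normalize to an 8-bit greyscale Bitmap.
// Assumes raw.size >= 2 * width * height.
fun rawToGreyscaleBitmap(raw: ByteArray, width: Int, height: Int): Bitmap {
    val values = IntArray(width * height)
    var min = Int.MAX_VALUE
    var max = Int.MIN_VALUE
    for (i in values.indices) {
        val lo = raw[2 * i].toInt() and 0xFF
        val hi = raw[2 * i + 1].toInt() and 0xFF
        val v = (hi shl 8) or lo // unsigned 16-bit sample
        values[i] = v
        if (v < min) min = v
        if (v > max) max = v
    }
    val range = (max - min).coerceAtLeast(1)
    val pixels = IntArray(width * height)
    for (i in values.indices) {
        // Scale to 0..255 and expand to an opaque grey ARGB pixel
        val g = (values[i] - min) * 255 / range
        pixels[i] = (0xFF shl 24) or (g shl 16) or (g shl 8) or g
    }
    return Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888)
}
The min/max normalization here is my own choice, replacing the float conversion in the Python code; I don't know if it is the right mapping for this particular camera.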
Unfortunately, I do not have much experience with image processing on Android, so I am not sure whether I have to repeat the same steps from the Python script on Android, or whether there is an alternative/easier way to fix the frames being streamed. The streaming library uses an OpenGlView to display the frames:
<com.pedro.rtplibrary.view.OpenGlView
android:id="@ id/openglview"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:keepAspectRatio="true"
/>
This is the stream service that handles the camera detection:
import android.app.Notification
import android.app.NotificationChannel
import android.app.NotificationManager
import android.app.Service
import android.content.Context
import android.content.Intent
import android.hardware.usb.UsbDevice
import android.os.Binder
import android.os.Build
import android.os.IBinder
import android.util.Log
import androidx.annotation.RequiresApi
import androidx.core.app.NotificationCompat
import com.pedro.rtplibrary.view.OpenGlView
import com.serenegiant.usb.USBMonitor
import com.serenegiant.usb.UVCCamera
import net.ossrs.rtmp.ConnectCheckerRtmp
/**
* Basic RTMP/RTSP service streaming implementation with a UVC camera
*/
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
class StreamService : Service() {
companion object {
private const val TAG = "RtpService"
private const val channelId = "rtpStreamChannel"
private const val notifyId = 123456
private const val width = 384
private const val height = 292
var openGlView: OpenGlView? = null
}
val isStreaming: Boolean get() = endpoint != null
private var endpoint: String? = null
private var rtmpUSB: RtmpUSB? = null
private var uvcCamera: UVCCamera? = null
private var usbMonitor: USBMonitor? = null
private val notificationManager: NotificationManager by lazy { getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager }
override fun onCreate() {
super.onCreate()
Log.e(TAG, "RTP service create")
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val channel = NotificationChannel(channelId, channelId, NotificationManager.IMPORTANCE_HIGH)
notificationManager.createNotificationChannel(channel)
}
keepAliveTrick()
}
private fun keepAliveTrick() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val notification = NotificationCompat.Builder(this, channelId)
.setOngoing(true)
.setContentTitle("")
.setContentText("").build()
startForeground(1, notification)
} else {
startForeground(1, Notification())
}
}
override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
Log.e(TAG, "RTP service started")
usbMonitor = USBMonitor(this, onDeviceConnectListener).apply {
register()
}
return START_STICKY
}
private fun prepareStreamRtp() {
stopStream()
stopPreview()
rtmpUSB = if (openGlView == null) {
RtmpUSB(this, connectCheckerRtmp)
} else {
RtmpUSB(openGlView, connectCheckerRtmp)
}
}
fun startStreamRtp(endpoint: String): Boolean {
if (rtmpUSB?.isStreaming == false) {
this.endpoint = endpoint
if (rtmpUSB!!.prepareVideo(width, height, 30, 4000 * 1024, false, 0, uvcCamera) && rtmpUSB!!.prepareAudio()) {
rtmpUSB!!.startStream(uvcCamera, endpoint)
return true
}
}
return false
}
fun setView(view: OpenGlView) {
openGlView = view
rtmpUSB?.replaceView(openGlView, uvcCamera)
}
fun setView(context: Context) {
openGlView = null
rtmpUSB?.replaceView(context, uvcCamera)
}
fun startPreview() {
rtmpUSB?.startPreview(uvcCamera, width, height)
}
private val connectCheckerRtmp = object : ConnectCheckerRtmp {
override fun onConnectionSuccessRtmp() {
showNotification("Stream started")
Log.e(TAG, "RTP connection success")
}
// Stubs for the remaining ConnectCheckerRtmp callbacks so the object
// compiles; the exact method set depends on the library version
override fun onConnectionFailedRtmp(reason: String) {}
override fun onNewBitrateRtmp(bitrate: Long) {}
override fun onDisconnectRtmp() {}
override fun onAuthErrorRtmp() {}
override fun onAuthSuccessRtmp() {}
}
private val onDeviceConnectListener = object : USBMonitor.OnDeviceConnectListener {
override fun onAttach(device: UsbDevice?) {
usbMonitor!!.requestPermission(device)
}
override fun onConnect(device: UsbDevice?, ctrlBlock: USBMonitor.UsbControlBlock?, createNew: Boolean) {
val camera = UVCCamera()
camera.open(ctrlBlock)
try {
camera.setPreviewSize(width, height, UVCCamera.FRAME_FORMAT_YUYV)
} catch (e: IllegalArgumentException) {
camera.destroy()
try {
camera.setPreviewSize(width, height, UVCCamera.DEFAULT_PREVIEW_MODE)
} catch (e1: IllegalArgumentException) {
return
}
}
uvcCamera = camera
prepareStreamRtp()
rtmpUSB!!.startPreview(uvcCamera, width, height)
endpoint?.let { startStreamRtp(it) }
}
// Stubs for the remaining OnDeviceConnectListener callbacks
override fun onDettach(device: UsbDevice?) {}
override fun onDisconnect(device: UsbDevice?, ctrlBlock: USBMonitor.UsbControlBlock?) {}
override fun onCancel(device: UsbDevice?) {}
}
}
The MainActivity that calls the stream service to do the streaming:
import android.Manifest.permission.CAMERA
import android.Manifest.permission.READ_EXTERNAL_STORAGE
import android.Manifest.permission.RECORD_AUDIO
import android.Manifest.permission.WRITE_EXTERNAL_STORAGE
import android.content.ComponentName
import android.content.Context
import android.content.Intent
import android.content.ServiceConnection
import android.os.Bundle
import android.os.IBinder
import android.util.Log
import android.view.SurfaceHolder
import android.view.View
import android.widget.Button
import androidx.activity.viewModels
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat.requestPermissions
import com.pedro.rtplibrary.view.OpenGlView
import dagger.hilt.android.AndroidEntryPoint
import dev.alejandrorosas.apptemplate.MainViewModel.ViewState
import dev.alejandrorosas.streamlib.StreamService
import org.opencv.android.OpenCVLoader
@AndroidEntryPoint
class MainActivity : AppCompatActivity(R.layout.activity_main), SurfaceHolder.Callback, ServiceConnection {
private val viewModel by viewModels<MainViewModel>()
private var mService: StreamService? = null
override fun onCreate(savedInstanceState: Bundle?) {
Log.d("OPENCV", "OPENCV Loading Status ${OpenCVLoader.initDebug()}")
super.onCreate(savedInstanceState)
StreamService.openGlView = findViewById(R.id.openglview)
startService(getServiceIntent())
viewModel.serviceLiveEvent.observe(this) { mService?.let(it) }
viewModel.getViewState().observe(this) { render(it) }
findViewById<View>(R.id.settings_button).setOnClickListener { startActivity(Intent(this, SettingsActivity::class.java)) }
findViewById<OpenGlView>(R.id.openglview).holder.addCallback(this)
findViewById<Button>(R.id.start_stop_stream).setOnClickListener { viewModel.onStreamControlButtonClick() }
requestPermissions(this, arrayOf(READ_EXTERNAL_STORAGE, RECORD_AUDIO, CAMERA, WRITE_EXTERNAL_STORAGE), 1)
}
private fun render(viewState: ViewState) {
findViewById<Button>(R.id.start_stop_stream).setText(viewState.streamButtonText)
}
private fun getServiceIntent(): Intent {
return Intent(this, StreamService::class.java).also {
bindService(it, this, Context.BIND_AUTO_CREATE)
}
}
override fun surfaceChanged(holder: SurfaceHolder, p1: Int, p2: Int, p3: Int) {
mService?.let {
it.setView(findViewById<OpenGlView>(R.id.openglview))
it.startPreview()
}
}
override fun surfaceCreated(holder: SurfaceHolder) {
}
override fun surfaceDestroyed(holder: SurfaceHolder) {
}
override fun onServiceConnected(name: ComponentName?, binder: IBinder?) {
// mService is assigned from the service binder here (cast not shown)
}
override fun onServiceDisconnected(name: ComponentName?) {
mService = null
}
}
I implemented the native frame callback that receives the frames from the USB camera as below, in USBBase.java:
public void startPreview(final UVCCamera uvcCamera, int width, int height) {
Log.e(TAG, "handleStartPreview:mUVCCamera" uvcCamera " mIsPreviewing:");
if ((uvcCamera == null)) return;
Log.e(TAG, "handleStartPreview2 ");
if (!isStreaming() && !onPreview && !(glInterface instanceof OffScreenGlThread)) {
uvcCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_RGBX);
//uvcCamera.setValue(UVCCamera.CTRL_ZOOM_ABS, 0x8800);
glInterface.setEncoderSize(width, height);
glInterface.setRotation(0);
glInterface.start();
uvcCamera.setPreviewTexture(glInterface.getSurfaceTexture());
uvcCamera.startPreview();
onPreview = true;
} else {
Log.e(TAG, "Streaming or preview started, ignored");
}
}
// grabs the frame from the camera
private byte[] FrameData = new byte[384 * 292 * 4];
private final IFrameCallback mIFrameCallback = new IFrameCallback() {
@Override
public void onFrame(final ByteBuffer frameData) {
Log.d(TAG, "mIFrameCallback: onFrame------");
frameData.get(FrameData, 0, frameData.capacity());
}
};
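To check what the callback actually delivers, I believe the RGBX buffer can be wrapped in a Bitmap along the lines of the sketch below (my own code, assuming the 384x292 size from above; ARGB_8888 is also 4 bytes per pixel, so the bytes copy across directly):
import android.graphics.Bitmap
import java.nio.ByteBuffer

// Sketch: wrap the RGBX bytes from the frame callback in a Bitmap so the
// frame can be inspected or post-processed on the CPU.
// Assumes frameData.size == width * height * 4.
fun rgbxToBitmap(frameData: ByteArray, width: Int, height: Int): Bitmap {
    val bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888)
    // Both formats are 4 bytes per pixel; the bytes are copied as-is,
    // so the X byte ends up in the alpha channel
    bitmap.copyPixelsFromBuffer(ByteBuffer.wrap(frameData))
    return bitmap
}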
However, I don't understand whether this is the data I have to process to get a viewable image streamed through the OpenGlView, or how I can correctly process the YUYV frame and then update my view.
In short, I would like to know how I can fix this green image being captured by my camera.
CodePudding user response:
I found the following great example of converting a YUYV image from a camera to an RGB bitmap:
https://study.marearts.com/2014/12/yuyv-to-rgb-and-rgb-to-yuyv-using.html
In this example, you can see how to convert images from YUYV to RGB, as well as from RGB back to YUYV, after which you can display the finished image on the screen.
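Translated to Kotlin, the per-pixel math from that example looks roughly like the sketch below. It is the standard BT.601 integer approximation, not code taken from the streaming library, and it assumes an even frame width (each 4-byte group Y0 U Y1 V yields two horizontally adjacent pixels):
// Sketch: convert a YUYV (YUY2) frame to packed ARGB pixels.
// Assumes yuyv.size == width * height * 2 and an even width.
fun yuyvToArgb(yuyv: ByteArray, width: Int, height: Int): IntArray {
    val argb = IntArray(width * height)
    var inIdx = 0
    var outIdx = 0
    while (outIdx < argb.size) {
        val y0 = (yuyv[inIdx].toInt() and 0xFF) - 16
        val u = (yuyv[inIdx + 1].toInt() and 0xFF) - 128
        val y1 = (yuyv[inIdx + 2].toInt() and 0xFF) - 16
        val v = (yuyv[inIdx + 3].toInt() and 0xFF) - 128
        // The two luma samples share one pair of chroma samples
        argb[outIdx] = yuvPixel(y0, u, v)
        argb[outIdx + 1] = yuvPixel(y1, u, v)
        inIdx += 4
        outIdx += 2
    }
    return argb
}

// BT.601 integer approximation with clamping to the 0..255 range.
private fun yuvPixel(y: Int, u: Int, v: Int): Int {
    val c = 298 * y + 128
    val r = ((c + 409 * v) shr 8).coerceIn(0, 255)
    val g = ((c - 100 * u - 208 * v) shr 8).coerceIn(0, 255)
    val b = ((c + 516 * u) shr 8).coerceIn(0, 255)
    return (0xFF shl 24) or (r shl 16) or (g shl 8) or b
}
The resulting IntArray can be handed to Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888) for display.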
I also found a ready-made project that converts a YUV image to RGB; maybe it will help you: