Recording MJPEG stream

Having searched almost the whole internet, I'm asking if someone can help.

To the point: the video image of LEO is an MJPEG stream. It should be recorded via a record button. Another button should stop the recording and display it as a preview in a window (in the LEO UI), essentially buffered. If the recording is OK, you can download it or delete it with a button. A new recording should overwrite the old one.

Basically everything as in the WebRTC example of UV4L. I adapted that example and fit it into my UI. Everything works, but I can no longer communicate with ROS. No idea why.

I just played around with the VLC player and can record any stream. That's pretty good, but it requires more bandwidth (x2, ~12 Mbps). Another solution would still be nice.

@Blazej_Sowa ? Hm hm? :smiley:
@Django: actually, it's always easier to record on your controlling device. The only issue is that you get a compressed and laggy image. Even on Turtle (when we used UV4L) we didn't record on the Rover either.

The solution depends on what you are trying to accomplish.
I highly encourage you to utilize the fact that the stream is published to ROS topics and take a look at rosbag.

The way I would do this is to write a ROS node in Python that provides start recording, stop recording and play services, and add corresponding buttons to the Web UI. When you press the record button, the UI would call the start recording service and the Python script would start recording the image topic (and possibly some other ones) to a bag file somewhere inside the filesystem of the Raspberry Pi. That way, you don't use any additional bandwidth and you are guaranteed that the recorded stream won't be laggy. Then, when you stop the recording and click play, the script would start playing the bag file and you could preview the stream in your UI.
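A minimal sketch of such a node, assuming ROS 1 with `rospy`, `std_srvs`, and the `rosbag` CLI available on the Rover. The service names, the `/camera/image_raw` topic, and the bag path are my assumptions for illustration; adjust them to the actual LEO topic names.

```python
#!/usr/bin/env python
# Hypothetical rosbag-recording node sketch (ROS 1).
# Topic name, service names and bag path are assumptions, not LEO's real ones.
import subprocess

BAG_PATH = "/home/pi/recording.bag"  # -O overwrites, so a new recording replaces the old one


def record_cmd(topics, bag_path=BAG_PATH):
    """Build the 'rosbag record' command line for the given topics."""
    return ["rosbag", "record", "-O", bag_path] + list(topics)


def play_cmd(bag_path=BAG_PATH):
    """Build the 'rosbag play' command line."""
    return ["rosbag", "play", bag_path]


class BagRecorder(object):
    def __init__(self, topics):
        self.topics = topics
        self.proc = None

    def start(self):
        # Launch rosbag record as a child process (no-op if already recording).
        if self.proc is None:
            self.proc = subprocess.Popen(record_cmd(self.topics))

    def stop(self):
        # SIGTERM lets rosbag close the bag file cleanly.
        if self.proc is not None:
            self.proc.terminate()
            self.proc.wait()
            self.proc = None

    def play(self):
        # Replay the bag; subscribers (e.g. the Web UI preview) see the topic again.
        subprocess.Popen(play_cmd())


def main():
    # ROS-specific part, kept in main() so the helpers above are importable
    # without a ROS installation.
    import rospy
    from std_srvs.srv import Trigger, TriggerResponse

    recorder = BagRecorder(["/camera/image_raw"])
    rospy.init_node("bag_recorder")

    def as_service(fn):
        def handler(_req):
            fn()
            return TriggerResponse(success=True, message="")
        return handler

    rospy.Service("~start_recording", Trigger, as_service(recorder.start))
    rospy.Service("~stop_recording", Trigger, as_service(recorder.stop))
    rospy.Service("~play", Trigger, as_service(recorder.play))
    rospy.spin()
```

The Web UI buttons would then just call the three `Trigger` services (e.g. via roslibjs). Run `main()` on the rover; from the command line the same thing is simply `rosbag record -O <file> <topic>` and `rosbag play <file>`.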
