Does an RTSP Stream Use Resources If Not in Use?

I’m using a cloud-based security camera with modified firmware to enable RTSP, even though RTSP isn’t officially supported. RTSP is currently enabled, but I’m not actively using it. I do occasionally connect to the stream, so I’d prefer not to disable it.

Does the RTSP stream video and use extra resources even when nothing is connected to it, or does it only start streaming after something authenticates?

Edit: The camera in question is a Wyze Cam V3.

Not going to mention the camera, or at least the brand?

Seems unlikely you are going to get any good answers without those details.

My guess would be no: if you aren’t actively using RTSP, simply having it available shouldn’t use extra processing power on the camera.
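
For what it’s worth, RTSP is a request/response control protocol: a server normally doesn’t start pushing RTP media until a client completes the DESCRIBE/SETUP/PLAY handshake, so an idle listener is mostly just an open TCP port (the camera is presumably still encoding for its cloud service regardless). If you want to confirm the listener is alive without starting a stream, here’s a minimal sketch that sends a single RTSP OPTIONS request; the IP address and port are placeholders for whatever your camera uses:

```python
import socket

# Hypothetical camera address; RTSP servers conventionally listen on TCP 554.
CAMERA_HOST = "192.168.1.50"
CAMERA_PORT = 554

def rtsp_options(host: str, port: int, timeout: float = 5.0) -> str:
    """Send a single RTSP OPTIONS request and return the raw response.

    OPTIONS only asks the server which methods it supports; it does not
    begin a DESCRIBE/SETUP/PLAY session, so no media should be encoded
    or transmitted as a result of this probe.
    """
    request = (
        f"OPTIONS rtsp://{host}:{port}/ RTSP/1.0\r\n"
        "CSeq: 1\r\n"
        "User-Agent: rtsp-liveness-probe\r\n"
        "\r\n"
    )
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(request.encode("ascii"))
        return sock.recv(4096).decode("ascii", errors="replace")

if __name__ == "__main__":
    print(rtsp_options(CAMERA_HOST, CAMERA_PORT))
    # A healthy server typically replies with something like:
    #   RTSP/1.0 200 OK
    #   CSeq: 1
    #   Public: OPTIONS, DESCRIBE, SETUP, PLAY, TEARDOWN
```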

I added the camera name to the original post (Wyze Cam V3).

I’m hesitant to promote a company I don’t fully endorse.

I’m glad to hear that about RTSP, but I’m concerned about an issue that keeps occurring, which might be because the camera is handling more than it can manage.

The camera occasionally becomes unresponsive for 3-5 minutes. During this time, it doesn’t power cycle and remains online, responding to authentication and cloud-based settings changes. However, the video feed doesn’t display, RTSP times out, and it fails to record to the SD card. It’s set to continuous recording, but there’s a noticeable gap during these periods when it doesn’t function properly.
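
If it helps to pin down when those gaps happen, here’s a rough watchdog sketch that polls the stream once a minute with ffprobe and logs whether it answered. The URL, credentials, and stream path are placeholders for whatever you configured in the RTSP firmware, and it assumes ffprobe is installed on the machine running it:

```python
import subprocess
import time
from datetime import datetime

# Placeholder URL; substitute the user, password, IP, and path set in the
# camera's RTSP settings.
RTSP_URL = "rtsp://user:password@192.168.1.50/live"

def stream_is_up(url: str, timeout_s: int = 15) -> bool:
    """Return True if ffprobe can read stream metadata before the timeout."""
    try:
        result = subprocess.run(
            [
                "ffprobe",
                "-v", "error",
                "-rtsp_transport", "tcp",
                "-show_entries", "stream=codec_name",
                "-of", "csv=p=0",
                url,
            ],
            capture_output=True,
            timeout=timeout_s,
        )
        return result.returncode == 0 and bool(result.stdout.strip())
    except subprocess.TimeoutExpired:
        return False

if __name__ == "__main__":
    while True:
        status = "up" if stream_is_up(RTSP_URL) else "DOWN"
        print(f"{datetime.now().isoformat(timespec='seconds')} stream {status}", flush=True)
        time.sleep(60)
```

Correlating the DOWN entries with the gaps in the SD card recording would at least tell you whether the outages line up with the RTSP timeouts or happen independently.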