This example turns a live webcam feed into annotated video using YOLO object detection. WebRTC serves as the transport layer for low-latency, real-time streaming.

πŸš€ Try this Example

View the complete source code on GitHub. Steps to run:
  1. Install fal:
pip install fal
  2. Authenticate:
fal auth login
  3. Clone the demos repo and install dependencies:
git clone https://github.com/fal-ai-community/fal-demos.git
cd fal-demos
pip install -e .
  4. Run the backend (local dev):
fal run fal_demos/video/yolo_webcam_webrtc/yolo.py::WebcamWebRtc
  5. Run the frontend:
cd fal_demos/video/yolo_webcam_webrtc/frontend
npm install
FAL_KEY=myfalkey npm run dev
Open the Vite dev server in your browser and set the Endpoint field to the full real-time endpoint (for example: myuser/myapp/webrtc).
This deployment flow is the same as in other serverless examples, such as Deploy Text-to-Image Model, except that requests go to a real-time endpoint.

Deploy to fal

To host the backend, deploy the app and use the resulting endpoint in the UI:
fal deploy fal_demos/video/yolo_webcam_webrtc/yolo.py::WebcamWebRtc
The real-time endpoint will be your-username/your-app/webrtc.

How it works

  • The browser streams your webcam to the app in real time (WebRTC transport).
  • The backend runs YOLO on each frame and draws detection boxes.
  • The annotated stream is sent back to the browser over the same connection.
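The per-frame round trip described above (detect, filter by confidence, annotate, return) can be sketched in Python. This is a toy stand-in, not the app's actual code: the `Detection` class, the dict-based frame, and `annotate_frame` are illustrative inventions, and the real backend would run a YOLO model and draw boxes onto actual pixels before sending the frame back.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float
    box: tuple  # (x1, y1, x2, y2) in pixel coordinates

def annotate_frame(frame, detections, min_confidence=0.5):
    # Keep only confident detections and attach their boxes to the frame.
    # `frame` is a dict standing in for a video frame; the real app would
    # draw the boxes onto the image pixels (e.g. with OpenCV) before
    # returning the frame over the same WebRTC connection.
    kept = [d for d in detections if d.confidence >= min_confidence]
    frame["boxes"] = [(d.label, d.box) for d in kept]
    return frame

detections = [
    Detection("person", 0.92, (10, 20, 110, 220)),
    Detection("cat", 0.31, (200, 40, 260, 90)),  # below threshold, dropped
]
annotated = annotate_frame({"pixels": None}, detections)
print(annotated["boxes"])  # [('person', (10, 20, 110, 220))]
```

Running this per incoming frame, as fast as frames arrive, is what keeps the annotated stream in sync with the webcam feed.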