System Requirements - RTMP Server
Below is a rough guideline for server hardware requirements for your RTMP server.
For 10 Concurrent Streams
- CPU:
  • 2 cores (modern dual‑core at ~2.5+ GHz should be sufficient)
- RAM:
  • 4 GB minimum (8 GB recommended for extra buffering and system overhead)
- Network:
  • 1 Gbps NIC
    - Estimated Bandwidth: ~40 Mbps total (10 streams × 4 Mbps)
    - Plenty of headroom is available with a 1 Gbps link
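The bandwidth estimate above is simple arithmetic (streams × per‑stream bitrate). A minimal sketch of that calculation, assuming the ~4 Mbps per‑stream figure used throughout this guide:

```python
# Minimal sketch: total ingest bandwidth for N concurrent RTMP streams.
# The 4 Mbps default matches the per-stream bitrate assumed in this guide;
# substitute your actual encoder bitrate.

def estimated_bandwidth_mbps(streams: int, bitrate_mbps: float = 4.0) -> float:
    """Total ingest bandwidth in Mbps (streams x per-stream bitrate)."""
    return streams * bitrate_mbps

print(estimated_bandwidth_mbps(10))  # 40.0 -> ~40 Mbps, trivial for a 1 Gbps link
```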
For 100 Concurrent Streams
- CPU:
  • 4–8 cores
    - For ingest only, a quad‑core 3.0+ GHz processor may suffice
    - If any processing (e.g., repackaging) is added, lean toward 8 cores
- RAM:
  • 8 GB minimum (16 GB recommended to comfortably handle buffering, connection management, and OS overhead)
- Network:
  • 1 Gbps NIC might be borderline if streams are high quality
    - Estimated Bandwidth: ~400 Mbps total (100 streams × 4 Mbps)
    - For extra reliability and headroom, consider a 10 Gbps NIC or NIC bonding
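Whether a NIC is "borderline" comes down to how much of its capacity the estimated traffic consumes. A small illustrative check using the totals above (the utilization figures in the comments are examples, not hard thresholds):

```python
# Illustrative only: fraction of NIC capacity consumed by the estimated
# ingest traffic. Protocol overhead and burstiness push real usage higher.

def nic_utilization(total_mbps: float, nic_capacity_mbps: float) -> float:
    """Fraction of NIC capacity used by the estimated ingest bandwidth."""
    return total_mbps / nic_capacity_mbps

print(f"{nic_utilization(400, 1_000):.0%}")   # 40% of a 1 Gbps NIC -> borderline once overhead is added
print(f"{nic_utilization(400, 10_000):.0%}")  # 4% of a 10 Gbps NIC -> ample headroom
```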
For 1,000 Concurrent Streams
- CPU:
  • 8–16 cores
    - For pure ingest, 8 cores might work if optimized
    - If you perform any transcoding or heavy processing, 16+ cores are recommended
- RAM:
  • 16 GB minimum (32 GB recommended to accommodate higher buffering, connection management, and any additional processing tasks)
- Network:
  • 10 Gbps NIC (or aggregated/multiple NICs)
    - Estimated Bandwidth: ~4 Gbps total (1,000 streams × 4 Mbps)
    - A 10 Gbps connection provides the necessary headroom and stability
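For scripted capacity planning it can help to encode these tiers as data. The sketch below mirrors the recommendations in this guide (ingest only, ~4 Mbps per stream); the tier boundaries and values are starting points, not guarantees:

```python
# Sketch: the sizing tiers above as a lookup table.
# Columns: max streams, CPU cores, RAM GB (min), RAM GB (recommended), NIC Gbps.
SIZING_TIERS = [
    (10,     2,  4,  8,  1),
    (100,    8,  8, 16,  1),   # consider 10 Gbps or NIC bonding for headroom
    (1_000, 16, 16, 32, 10),
]

def recommend(streams: int) -> dict:
    """Return the smallest tier that covers the requested stream count."""
    for max_streams, cores, ram_min, ram_rec, nic_gbps in SIZING_TIERS:
        if streams <= max_streams:
            return {"cpu_cores": cores, "ram_gb": (ram_min, ram_rec), "nic_gbps": nic_gbps}
    raise ValueError("Beyond 1,000 streams, plan for multiple load-balanced servers.")

print(recommend(250))  # falls into the 1,000-stream tier
```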
Key Considerations
- Processing Load:
  The above recommendations assume minimal CPU load per stream (i.e., simple ingest with no transcoding).
- Network Overhead:
  Real‑world conditions (protocol overhead, burstiness, etc.) can push bandwidth requirements higher, so it is wise to over‑dimension network capacity relative to the calculated total.
- Scalability:
  In production, consider load balancing across multiple servers if you expect to consistently approach these limits, and put monitoring in place so you can adjust resources as needed (a simple distribution sketch follows this list).
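One common way to spread ingest load is to map each stream key deterministically to one of several servers. A hypothetical sketch (the hostnames and the hashing approach are illustrative; production setups often rely on a dedicated load balancer or DNS-based distribution):

```python
# Hypothetical sketch: distribute incoming streams across several RTMP ingest
# servers by hashing the stream key. Hostnames below are placeholders.
import hashlib

INGEST_SERVERS = ["rtmp-1.example.com", "rtmp-2.example.com", "rtmp-3.example.com"]

def pick_server(stream_key: str) -> str:
    """Deterministically map a stream key to one ingest server."""
    digest = hashlib.sha256(stream_key.encode("utf-8")).hexdigest()
    return INGEST_SERVERS[int(digest, 16) % len(INGEST_SERVERS)]

print(pick_server("event-main-camera"))  # the same key always lands on the same server
```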
These guidelines provide a starting point to help you size your hardware. Actual requirements can vary significantly depending on your exact situation.