FAQ

General

What platforms does VASTreaming support?

VASTreaming supports Windows (10, 11, Server 2016+), Linux (Ubuntu 20.04+, Debian 11+, RHEL 8+), Android (8.0+), macOS (12+), and iOS (12.2+). The .NET libraries support .NET Framework 4.5+, .NET Standard 2.0, and .NET 6–10. C++ libraries are available for all supported platforms. See Supported Platforms for the full compatibility matrix.

What UI frameworks are supported?

VASTreaming provides video rendering controls for .NET MAUI, WPF, WinForms, and limited support for ASP.NET Core Razor and Blazor. The C++ SDK supports native Android views, UIKit, SwiftUI, and AppKit. See Supported Platforms for details.

Where should I start?

The quickest way to get started is to grab the demo projects from GitHub and get a free 30-day demo license. Then read the documentation in this order: Terminology, Supported Platforms, List of Libraries, Basic Principles, Core Primitives. See Sample Applications for detailed descriptions of each demo project.

Initialization

What initialization is required before using the SDK?

Three steps are required at application startup:

  1. Set the license key: VAST.Common.License.Key = "YOUR_LICENSE_KEY";
  2. Configure logging: set VAST.Common.Log.LogFileName and VAST.Common.Log.LogLevel
  3. Synchronize the clock: call VAST.Common.NtpTime.Sync()

The license key must be set before any other VASTreaming operation. At application exit, call VAST.Media.MediaGlobal.Uninitialize() to stop internal monitoring threads. See Sample Applications for initialization examples.
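Taken together, the startup and shutdown steps above can be sketched as follows. The property and method names are the ones quoted above; the surrounding scaffolding (such as appFolderPath) is illustrative:

```csharp
// At application startup — license first, then logging, then clock sync.
VAST.Common.License.Key = "YOUR_LICENSE_KEY";

VAST.Common.Log.LogFileName = System.IO.Path.Combine(appFolderPath, "app.log");
VAST.Common.Log.LogLevel = VAST.Common.Log.Level.Debug;

VAST.Common.NtpTime.Sync();

// ... run the application ...

// At application exit — stop internal monitoring threads.
VAST.Media.MediaGlobal.Uninitialize();
```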

Why is NTP synchronization needed?

NtpTime.Sync() synchronizes the internal clock with an NTP server for accurate timestamp generation. This is important for multi-source synchronization, recording, and live streaming scenarios where correct timing ensures proper audio/video alignment.

Protocols

Which streaming protocols are supported?

VASTreaming supports RTSP, RTP/RTCP, RTMP, SRT, HLS, LL-HLS, MPEG-DASH, WebRTC, WebTransport, NDI, MJPEG over HTTP, TS over HTTP, and multicast UDP.

Which protocol should I use for low-latency streaming?

For the lowest latency on LAN, use SRT (100–200 ms) or RTSP/RTMP/WebRTC (200–300 ms). SRT latency can be reduced below 100 ms using custom protocol extensions. For browser-based playback, WebRTC provides the lowest latency. Avoid HLS (15–30 seconds) and MPEG-DASH (5–20 seconds) when latency is critical. LL-HLS reduces HLS latency to 2–5 seconds. See Library Performance for detailed benchmarks.

Which protocol should I use for large-scale delivery?

HLS and MPEG-DASH are designed for large-scale HTTP-based delivery and can handle ~15,000 concurrent sessions on a single server. RTSP supports ~900 concurrent sessions, RTMP ~600. WebRTC is not suitable for high-load servers due to the overhead of peer connections. See Library Performance for throughput benchmarks.

Can I translate between protocols?

Yes. The multi-protocol StreamingServer ingests media from any supported source and delivers it to clients using their preferred protocol. For example, an RTSP camera feed can be served simultaneously as HLS, MPEG-DASH, and WebRTC. See Basic Principles for the server architecture.

Does VASTreaming support VOD (Video on Demand)?

Yes. VOD streaming is supported via HLS, MPEG-DASH, and RTSP. VOD publishing points serve pre-recorded content with seeking and pause support. See Basic Principles for details on publishing point types.

Why does WebRTC playback fail even though the stream works over RTSP or HLS?

WebRTC is very restrictive about codec compatibility. The Google native WebRTC library only supports a limited set of codecs and profiles — for example, H.264 Constrained Baseline profile is widely supported, while Main or High profiles may be rejected by the browser. Audio codecs are also negotiated during the SDP exchange and must match what both peers support. Because of this, simple pass-through of an existing stream often does not work with WebRTC, and transcoding is necessary in many cases to re-encode the media into a compatible format and profile.
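For reference, the H.264 profile a browser is willing to accept is visible in the SDP it generates during negotiation. Constrained Baseline level 3.1 appears as profile-level-id=42e01f; a stream encoded with Main or High profile will not match such an offer and must be transcoded:

```
a=rtpmap:102 H264/90000
a=fmtp:102 level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e01f
```

Comparing the camera's declared profile (from its SDP or SPS) against the browser's offer is a quick way to confirm whether transcoding is required.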

Codecs and Formats

Which video and audio codecs are supported?

Video: H.264, H.265/HEVC, MPEG-1/2/4, Motion JPEG. Audio: AAC, MP3, AC-3, E-AC-3, Opus, PCM variants, G.711/722/723/726/729, AMR, GSM. Container formats include MP4, Transport Stream, MP3, WAV, and FLV. See Terminology for codec descriptions.

Why are there no built-in codecs on Linux?

Linux does not have a built-in media framework like Media Foundation (Windows), VideoToolbox (macOS/iOS), or MediaCodec (Android). To encode or decode media on Linux, you must use FFmpeg or NVIDIA CUDA. Set the preferred framework explicitly:

// FFmpeg (CPU-based, all Linux systems)
parameters.PreferredVideoFramework = VAST.Common.MediaFramework.FFmpeg;

// NVIDIA CUDA (requires an NVIDIA GPU)
parameters.PreferredVideoFramework = VAST.Common.MediaFramework.CUDA;

Which decoder framework should I use?

By default, VASTreaming uses the platform's built-in framework. On Windows, Media Foundation provides hardware-accelerated decoding. FFmpeg is available on all platforms. NVIDIA CUDA can be used for GPU decoding on NVIDIA hardware on Windows and Linux. If you set the framework to Unknown, the library tries all available frameworks until one succeeds. See Core Primitives for decoder factory details.

Hardware Acceleration

How do I enable hardware acceleration?

Set AllowHardwareAcceleration = true in playback parameters or encoder/decoder parameters. On Windows, this uses Media Foundation and Direct3D. On macOS and iOS, VideoToolbox is used. On Android, MediaCodec provides hardware codecs. See Supported Platforms for the platform-specific acceleration table.
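Combining this with the framework selection described above, a typical configuration looks like the sketch below. The parameters instance stands for whichever playback or encoder/decoder parameter object you are configuring; both property names are the ones quoted in this FAQ:

```csharp
// Use hardware-accelerated codecs where the platform supports them.
parameters.AllowHardwareAcceleration = true;
// Unknown lets the library try each available framework until one succeeds.
parameters.PreferredVideoFramework = VAST.Common.MediaFramework.Unknown;
```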

What performance improvement does hardware acceleration provide?

Hardware acceleration dramatically increases throughput. For example, H.264 1080p decoding scales from 1–2 streams on CPU to 8–16 on a consumer GPU, ~40 on Tesla T4, and ~130 on Tesla A16. See Library Performance for detailed benchmarks.

Architecture

What is the source/sink pattern?

VASTreaming uses a pipeline model where media flows from sources to sinks. Sources include cameras, network clients, and file readers. Sinks include servers, file writers, and renderers. A single source can feed multiple sinks simultaneously. MediaSession automates connecting sources to sinks. See Basic Principles for details.
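The pattern can be pictured with a short sketch. Apart from MediaSession itself, the factory and method names below are placeholders chosen for illustration, not the documented API:

```csharp
// One source feeding two sinks at once (illustrative names).
var source = CreateRtspSource("rtsp://camera-ip/stream"); // placeholder factory
var fileSink = CreateMp4Writer("recording.mp4");          // placeholder factory
var serverSink = CreatePublishingPointSink("/live");      // placeholder factory

var session = new VAST.Media.MediaSession();              // automates source-to-sink wiring
session.AddSource(source);                                // placeholder method names
session.AddSink(fileSink);
session.AddSink(serverSink);
session.Start();
```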

Can I reuse source and sink objects after stopping?

No. VASTreaming state machine objects are not reusable. Once an object reaches a terminal state (Closed or Error), it must be disposed. Create a new instance to continue. See Core Primitives for state transition diagrams.

What is a publishing point?

A publishing point is a named endpoint on a streaming server that serves media content to clients. It maps a publishing path to a media source and supports multiple concurrent clients viewing the same stream. Publishing points can be created automatically when a publisher connects or manually in code. See Core Primitives for details.

Memory Management

How does VersatileBuffer reference counting work?

VersatileBuffer uses reference counting to manage pre-allocated memory buffers efficiently. Call AddRef() when retaining a buffer received from external code (e.g., an event handler). Call Release() when you are done with the buffer. Do not call AddRef() on buffers obtained from MediaGlobal.LockBuffer() or Clone() — these already have their reference count incremented. See Core Primitives for code examples.
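The two rules can be illustrated as follows. AddRef, Release, and MediaGlobal.LockBuffer are the calls described above; the event handler signature and the LockBuffer argument are assumptions for the sketch:

```csharp
// Rule 1: retain a buffer received from external code before keeping it.
void OnFrameReceived(VersatileBuffer buffer)
{
    buffer.AddRef();        // we will use the buffer after the handler returns
    queue.Enqueue(buffer);
}

void Consume()
{
    var buffer = queue.Dequeue();
    // ... process the buffer ...
    buffer.Release();       // balances the earlier AddRef
}

// Rule 2: buffers from LockBuffer()/Clone() are already retained — no AddRef.
var copy = VAST.Media.MediaGlobal.LockBuffer(size); // argument illustrative
// ... fill and use the buffer ...
copy.Release();             // a single Release when done
```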

IP Cameras

How do I connect to IP cameras?

Use RTSP to receive camera streams directly, or use ONVIF for device discovery, stream enumeration, and PTZ control. SearchAsync discovers ONVIF-compatible cameras on the local network. After discovery, connect using OnvifClient2 (Profile 2) or OnvifClient1 (Profile 1) to enumerate available stream URIs. See the WinForms App Demo for a working ONVIF example.
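A discovery-and-connect flow, sketched with the names mentioned above — the exact signatures and the GetStreamUris helper are assumptions; see the WinForms App Demo for real usage:

```csharp
// Discover ONVIF-compatible cameras on the local network (signature illustrative).
var devices = await VAST.Onvif.Discovery.SearchAsync();

foreach (var device in devices)
{
    // Try Profile 2 first; fall back to OnvifClient1 for older cameras.
    var client = new OnvifClient2(device.Uri, username, password);
    var streamUris = await client.GetStreamUris(); // placeholder method name
    // ... pick a stream URI and open it over RTSP ...
}
```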

My IP camera connects but no video is received. What should I do?

If the camera connects successfully but media is not coming (no video in the player or no streaming in the server), try adding the vast-transport=TCP parameter to the URI:

rtsp://camera-ip/stream?vast-transport=TCP

By default, RTSP uses UDP for media transport, which can be blocked by firewalls or fail on certain network configurations. The vast-transport=TCP parameter forces RTP interleaved transport over the same TCP connection as the RTSP control channel.

Authentication fails when credentials contain special characters. How do I fix this?

If the username or password contains special characters (such as @, :, /, #, %, or spaces), they must be URL-encoded in the URI. For example, if the password is p@ss:word, encode it as p%40ss%3Aword:

rtsp://admin:p%40ss%3Aword@camera-ip/stream

Use Uri.EscapeDataString to encode credentials programmatically:

string uri = $"rtsp://{Uri.EscapeDataString(username)}:{Uri.EscapeDataString(password)}@{host}/stream";

My IP camera is not connecting. What should I do?

IP cameras are known for protocol and compatibility issues across manufacturers. Enable debug logging (VAST.Common.Log.LogLevel = VAST.Common.Log.Level.Debug) and contact VASTreaming support with the log file. VASTreaming provides fixes and workarounds for specific camera models. See Support for contact details.

Demos

Where can I find demo projects?

Selected demo projects are available on GitHub. You can obtain a free 30-day demo license to run them. See Sample Applications for a full list of available demos.

Why does the server demo require administrator privileges on Windows?

On Windows, opening an HTTP listener on a specific port requires administrator privileges unless a URL reservation has been configured. The server demo uses HttpListener or similar APIs that bind to http://+:port/, which Windows restricts to elevated processes by default. You can either run the application as administrator, or create a URL reservation for a specific user using netsh:

netsh http add urlacl url=http://+:8088/ user=DOMAIN\username

After adding the reservation, the server can be started without administrator privileges.

How do I enable HTTPS on Windows?

On Windows, HTTPS requires binding an SSL certificate to the listening port using netsh. First, obtain the certificate thumbprint from the certificate store, then register it:

:: reserve the URL so the server can bind without elevation (see previous question)
netsh http add urlacl url=https://+:<https-port>/ user=Everyone
:: bind the SSL certificate to the HTTPS port
netsh http add sslcert ipport=0.0.0.0:<https-port> certhash=<thumbprint> appid={<application-guid>}

The sslcert binding is mandatory — without it, the server cannot accept HTTPS connections. The appid can be any GUID identifying your application.

Logging and Debugging

How do I enable logging?

Set the log file path and level at application startup:

VAST.Common.Log.LogFileName = System.IO.Path.Combine(appFolderPath, "VAST.Demo.Streaming.log");
VAST.Common.Log.LogLevel = VAST.Common.Log.Level.Debug;

Use Debug level when investigating issues or requesting help from support.

How do I send logs to VASTreaming support?

The SDK includes a built-in log upload method:

await VAST.Common.License.SendLog("description of the issue");

A valid license key must be configured for this feature to work. Alternatively, attach the log file manually when contacting support@vastreaming.net.

Licensing and Payment

What payment methods are accepted?

The following payment options are available:

  • Bank transfer — no surcharge
  • Cryptocurrency — no surcharge
  • PayPal — 4% processing surcharge
  • Credit card — 4% processing surcharge

Contact info@vastreaming.net with your preferred payment method and we will send you further instructions.

Is there a free trial?

Yes. You can get a free 30-day demo license to evaluate the SDK using the demo projects on GitHub.

Support

How do I get support?

Send support requests, bug reports, and feature requests to support@vastreaming.net. Include the SDK version, target platform and framework, steps to reproduce the issue, and log output. Straightforward bug fixes are typically delivered within 24 hours. See Support for full details.

Can I request new features?

Yes. Simple features that would benefit other customers can be implemented for free. Complex features specific to your requirements are available as custom development. Contact support@vastreaming.net to discuss. See Support for details.
