It should come as no surprise to anyone that, when looking at total global internet traffic, the largest share is generated by video streaming services.
According to Sandvine's Global Internet Phenomena Report 2024 (global aggregated data), on fixed networks the Video category represents approximately 39% of downstream traffic. Within the report's “Content Categories”, On-Demand Streaming generates the most traffic, accounting for around 54% of total downstream volume.
Complementing Sandvine’s data, the Ericsson Mobility Report (2025) estimates that video already represents 76% of total traffic on mobile networks, while Cloudflare Radar reports 19% year-over-year growth in global internet traffic, driven primarily by the consumption of high-resolution multimedia content on leading platforms such as YouTube and Netflix.
It is important to emphasize that each source relies on its own sample sets and measurement methodologies. There is no single, absolutely authoritative source that measures “100% of the internet”. That said, all sources point in the same direction: internet traffic continues to grow year after year, and one of the main drivers—without question—is video consumption, regardless of device.
Sandvine publishes a table titled “Downstream Traffic – Top Apps 2022”, in which Netflix ranks #1 with approximately 14.93% of global downstream traffic (according to its measurement).
In “Top Video Apps 2022”, Netflix again appears at around 14.92%, Amazon Prime Video at ~2.83%, and HBO Max appears with approximately 1.64% in that ranking.
Important note: these figures reflect Sandvine’s methodology and sample for that specific period. They do not represent “100% of the global internet,” but rather measurements from networks where Sandvine has visibility.
Regional Snapshot (Sandvine GIPR 2024, 2023 Data by Region)
The GIPR 2024 shows “Top Apps by Downstream Volume 2023” by region (Americas / Europe / APAC). Examples include:
*It is possible that Amazon Prime aggregates e-commerce traffic in addition to Prime Video, as no explicit clarification is provided in GIPR 2024.
Netflix Open Connect is Netflix’s proprietary CDN: a network of servers and interconnections designed to deliver video over the internet as efficiently as possible. Netflix describes the core idea of a CDN as “bringing content closer to the user” and delivering it over HTTP/HTTPS.
Netflix positions Open Connect as an ISP-facing program built on two main pillars:
A Simple Analogy
Your home (the user) requests a “product” (a video segment).
Instead of always shipping it from “a warehouse in another country,” Netflix places micro-warehouses (OCAs) nearby (within the ISP or at an IXP), reducing distance and congestion.
BGP (Border Gateway Protocol)
What it is: the protocol used by internet operators (ASNs) to exchange routing information: “to reach these IPs, go this way.”
Analogy: a “GPS between cities” that decides which highways lead to each destination.
In Open Connect, Netflix uses BGP for traffic steering: determining which clients (IP prefixes) “see” a given OCA as a valid destination.
Prefixes / CIDR
A prefix (e.g., 203.0.113.0/24) is a block of IP addresses.
Analogy: a neighborhood or ZIP code. Instead of referring to a single house, you refer to an entire area.
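The prefix idea can be made concrete with a short sketch. The snippet below (using Python's standard `ipaddress` module; the prefixes and next-hop labels are invented for illustration, not real Netflix or ISP data) shows longest-prefix matching: when a client IP falls inside several announced blocks, the most specific one wins — the same rule BGP routers apply.

```python
import ipaddress

# Hypothetical routing table: CIDR prefixes mapped to a "next hop".
# These prefixes and labels are illustrative only.
routes = {
    ipaddress.ip_network("203.0.113.0/24"): "via-IXP",
    ipaddress.ip_network("203.0.113.128/25"): "via-embedded-OCA",  # more specific block
}

def longest_prefix_match(ip: str) -> str:
    """Return the next hop for the most specific prefix containing `ip`."""
    addr = ipaddress.ip_address(ip)
    matches = [net for net in routes if addr in net]
    if not matches:
        return "default"
    # Higher prefixlen = narrower block = more specific match.
    best = max(matches, key=lambda net: net.prefixlen)
    return routes[best]

print(longest_prefix_match("203.0.113.200"))  # inside the /25 → "via-embedded-OCA"
print(longest_prefix_match("203.0.113.10"))   # only inside the /24 → "via-IXP"
```

Announcing a more specific prefix is one way an ISP can steer part of a "neighborhood" toward a closer warehouse while the rest keeps its existing route.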
Control Plane vs. Data Plane
Open Connect clearly separates two domains:
Analogy: the data plane is the trucks delivering goods; the control plane is the logistics center assigning routes and warehouses.
Data Plane: Open Connect Appliances (OCAs)
OCAs are servers that deliver video files from high-capacity disks/SSDs with very high throughput. Netflix publishes high-level specifications such as:
Netflix also describes built-in fault tolerance (automatic disk disablement, redundant power supplies, etc.).
Control Plane: The “Brain”
The control plane decides which OCA should serve each client at any given moment and provides the client with URLs to download content from that OCA.
Netflix explicitly documents that OCAs are not meant to serve “anyone.” Instead, it defines a model in which traffic direction is explicitly determined by both the ISP and Netflix.
In practice:
Why this matters: it allows Netflix traffic to be localized within the ISP network (less transit, less congestion, better QoE), while giving the ISP control over distribution by region, access network, or capacity.
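A toy model of this steering decision may help. The sketch below is an assumption-laden simplification, not Netflix's actual algorithm: OCA names, serving prefixes, and the URL scheme are all invented. It shows the control-plane idea of picking the most specific healthy appliance for a client and handing back a URL, with a generic edge as fallback.

```python
import ipaddress

# Hypothetical OCA inventory. Names, prefixes, and health flags are invented.
OCAS = [
    {"name": "oca-isp-1", "serves": ipaddress.ip_network("198.51.100.0/24"), "healthy": True},
    {"name": "oca-ixp-1", "serves": ipaddress.ip_network("198.51.0.0/16"),   "healthy": True},
]

def steer(client_ip: str, title_id: str) -> str:
    """Pick the most specific healthy OCA for this client; return a download URL."""
    addr = ipaddress.ip_address(client_ip)
    candidates = [o for o in OCAS if o["healthy"] and addr in o["serves"]]
    if not candidates:
        # No local OCA covers this client: fall back to a generic edge (hypothetical URL).
        return f"https://cdn.example/{title_id}"
    best = max(candidates, key=lambda o: o["serves"].prefixlen)
    return f"https://{best['name']}.example/{title_id}"

print(steer("198.51.100.7", "title-123"))  # the embedded /24 OCA wins over the IXP /16
```

In the real system the decision also weighs capacity, fill state, and ISP-configured policy; the point here is only that the prefix advertised for a client determines which OCA it "sees".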
Netflix explains that Open Connect consists of:
Translated into practical terms:
VOD has a major advantage: much of the catalog can be pre-positioned.
In its deployment guides, Netflix addresses capacity planning, fill operations, and monitoring, and explicitly describes the dependency on the control plane for OCAs to remain “serving.”
The simple idea: if you know which titles will be popular, you can load them close to users before peak hours.
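That pre-positioning idea can be sketched as a greedy fill planner. Everything below is a toy under stated assumptions: the titles, predicted play counts, sizes, and capacity are made up, and real fill logic is far richer (windows, deltas, tiering). It only illustrates "load the most popular titles until the disk is full, before peak hours."

```python
# Toy fill planner: pre-position the most popular titles within limited disk space.
# All numbers are invented for illustration.
predicted_plays = {"title-A": 90_000, "title-B": 40_000, "title-C": 75_000, "title-D": 5_000}
title_size_gb  = {"title-A": 40,     "title-B": 25,     "title-C": 60,     "title-D": 30}

def plan_fill(capacity_gb: int) -> list[str]:
    """Greedily select titles by predicted plays until capacity runs out."""
    plan, used = [], 0
    for title in sorted(predicted_plays, key=predicted_plays.get, reverse=True):
        if used + title_size_gb[title] <= capacity_gb:
            plan.append(title)
            used += title_size_gb[title]
    return plan

print(plan_fill(100))  # → ['title-A', 'title-C'] under these toy numbers
```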
It is worth noting that for live events, which Netflix is increasingly streaming, the behavior is different and more reactive. Netflix itself details its live streaming architecture in its blog and technical articles.
When Netflix accounts for ~15% of global downstream traffic (according to Sandvine H1 2022), and streaming dominates downstream volume overall, advertising is competing for stability against the heaviest and most rebuffer-sensitive flows.
The more “unique” a creative or manifest is per user (tracking, macros, IDs), the harder it is to cache. Open Connect excels when objects are reusable and pre-positionable; for ads, this means designing for reuse (same media across many users, minimal manifest variability, etc.).
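The cacheability cost of per-user uniqueness is easy to demonstrate with a toy simulation (the request counts and creative name are invented): when tracking macros make every URL unique per user, every request is a cache miss, while a shared cache key serves all but the first request from cache.

```python
# Toy cache simulation: 1,000 users requesting the same ad creative.
def simulate(requests, keyed_per_user: bool) -> int:
    """Count cache hits when the cache key does / does not include the user ID."""
    cache, hits = set(), 0
    for user_id, creative in requests:
        key = (creative, user_id) if keyed_per_user else creative
        if key in cache:
            hits += 1
        else:
            cache.add(key)  # cache miss: fetch from origin, then store
    return hits

reqs = [(u, "ad-creative-1.mp4") for u in range(1000)]
print(simulate(reqs, keyed_per_user=True))   # 0 cache hits: every URL is unique
print(simulate(reqs, keyed_per_user=False))  # 999 cache hits: fetched once, reused
```

This is why keeping media URLs stable and pushing per-user variability into sidecar beacons, rather than into the media itself, pays off on any CDN.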
Netflix provides a Partner Portal with reporting focused on routes and prefixes (BGP route performance) for partners.
In saturated markets, it is no longer enough to offer large volumes of content. Consistency matters. Friction must be removed. Availability should not vary by country. This agreement gives Netflix a competitive advantage that is not creative, but logistical and experiential.
This is exactly the type of tooling that helps answer questions like: “Why does buffering spike in this ISP during ad pods?”
Netflix built Open Connect because, at its scale, video delivery stops being a technical detail and becomes part of the product itself. When millions of users press play at the same time—especially during prime time—the biggest enemy is not the catalog; it is congestion, latency, and the inefficiency of moving massive volumes of data across long paths or through intermediaries.
Bringing Content Closer to the User to Improve QoE
The core idea of any CDN is “put the video close.” With Open Connect, Netflix can:
Route Control and Load Steering
With a proprietary CDN, Netflix can coordinate closely with ISPs on a highly controlled delivery model:
Economic Efficiency at Scale
At Netflix scale, paying third-party CDNs on a per-GB basis becomes structurally inefficient. Open Connect allows Netflix to:
Pre-Positioned Catalog
VOD (unlike live) allows anticipation:
Observability and Operations
Netflix needs precise network-level visibility into where experience degrades:
The overarching conclusion is that streaming—especially long-form video on CTV—is not just “content.” It is network and infrastructure. At Netflix scale, user experience depends more on where the bytes are and how they travel than on any superficial player optimization. That is why Open Connect exists: a proprietary CDN designed to bring content closer to viewers, reduce backbone dependency, and control delivery paths through interconnection and peering (IXPs or private links) and routing decisions (with BGP acting as a “GPS between networks”).
The practical outcome is a more stable QoE: faster startup times and fewer stalls during prime time.
The same reality applies to CTV and AdTech. Whether you use SSAI or CSAI, advertising lives within the same network budget and depends on the same delivery stability as content. When delivery degrades, the first symptoms appear as ad load failures, rebuffering, measurement mismatches, and pods that fail to render on time.
The key lesson for the industry is clear: to scale video and monetization without breaking the experience, systems must be designed with caching, routing, and interconnection in mind—object reuse, minimal manifest variability, and observability by ISP and route—because in modern streaming, delivery infrastructure is part of the product.
At tvads, we have a professional team that can advise you in this field and guide you in any area of your streaming advertising business, or even operate it on your behalf if necessary.