So, you want to stream live. You’ve done lots of research on options for a Content Delivery Network (CDN) and which software you’re going to use. Now you need to get the video signal from your source (camera or production switcher) into the webcasting computer system. You need a capture device, but you’re overwhelmed by the wide choice of devices, with costs ranging from under $100 to over $1,000. What’s the difference? What matters for you?

What is a capture device, anyway?

Most modern laptops have a webcam which makes live video available to software on that system. Within the camera, there is a chip that converts the analog visual world into digital data. The system sees the webcam as a video device which can be used by software such as Skype or Google Hangouts. It can also be used by most webcasting software, like Telestream Wirecast or vMix.

The unfortunate fact is that your little webcam doesn’t give you the ability to connect an external video source. This is where a third-party capture device comes to the rescue. It allows you to connect an external video feed to your computer, and makes that source available as a video/audio device selection in your software.
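
If you’re curious what “available as a video device” looks like in practice, here is a minimal sketch – assuming Python with the OpenCV package, which is just one of many ways to talk to a capture device – that opens the first video device the operating system exposes and reports its resolution. Webcasting software does essentially the same thing behind its source-selection menu.

```python
# Minimal sketch: open the first video device the OS exposes and read one frame.
# Assumes Python with the opencv-python package installed; a capture device that
# presents itself as a standard video device shows up just like a webcam.
import cv2

cap = cv2.VideoCapture(0)          # device index 0 = first video device
if not cap.isOpened():
    raise RuntimeError("No video device found at index 0")

ok, frame = cap.read()             # grab a single frame
if ok:
    height, width = frame.shape[:2]
    print(f"Capturing at {width} x {height}")

cap.release()
```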

Alphabet soup warning: The video industry has no end to the acronyms it uses.

The Signal

Over the last ten years, the video industry has transformed from Standard Definition (SD), to High Definition (HD), to Ultra High Definition (Ultra HD or UHD). Your capture device needs to support the signal format which you have available as an input. This may be as simple as an HDMI consumer camera that supports 1080p or 720p. But what does this mean?

It all started with the National Television System Committee (NTSC), which set the original TV standards in the 1940s and early 1950s. They specified that in North America a TV image would contain 525 horizontal lines (about 483 of which are visible) and would be interlaced, meaning that only half of the lines – alternating ones – are rendered in each pass[i]. Today this early standard is known as Standard Definition (SD).

The DVD was one of the industry’s first mainstream implementations of digital video: it specified a method for converting analog video signals into digital data and storing them on DVD media. NTSC DVDs store a 720x480 image (roughly equivalent to 640x480 with square pixels), commonly referred to as 480i – the “i” meaning it is interlaced.

While the DVD was revolutionary at the time, the next wave of change – fully digital TV – reached even further and enabled higher resolutions. The new specifications provided for the whole TV ecosystem to be digital from camera to display, including over-the-air broadcast signals. The new digital TV rules only provide guidance on how signals are handled, as opposed to the single mandated resolution of the early NTSC standard. This allows for a broad range of signals, but also causes plenty of confusion, particularly around what we refer to as “HD”.

The easiest way to think about a signal format is by its resolution. Common designations for HD are 1080p and 720p. The number is the frame height in pixels, while the letter “i” or “p” indicates the interlacing method (or lack thereof). As above, “i” means interlaced: alternating lines are rendered in each pass. “p” means progressive: there is no interlacing, and all lines are rendered in every frame. As you would guess, progressive delivers a better picture, because each frame contains a complete image – unlike interlaced, which is like looking at a picture through a fast-flickering window blind. However, at the same rate, progressive content consumes roughly twice the data of an interlaced signal. This is especially important when the signal is broadcast over the air, but less so when it is being recorded.
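
To put a rough number on that “twice the data” claim, here is a back-of-the-envelope calculation (plain Python, purely illustrative) comparing 1080i at 60 fields per second with 1080p at 60 frames per second:

```python
# Rough illustration of interlaced vs. progressive raw data at 1080 resolution.
# 1080i at 60 fields/s draws 540 alternating lines per field;
# 1080p at 60 frames/s draws all 1080 lines every frame.
width, height = 1920, 1080
rate = 60  # fields per second (interlaced) or frames per second (progressive)

interlaced_pixels_per_sec = width * (height // 2) * rate   # half the lines per pass
progressive_pixels_per_sec = width * height * rate         # every line, every frame

print(interlaced_pixels_per_sec)    # 62,208,000 pixels/s
print(progressive_pixels_per_sec)   # 124,416,000 pixels/s -> twice the raw data
```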

Below is a chart which outlines common digital signals and resolutions.

Digital television standards

| Standard | Resolution | Aspect ratio (H:V)[ii] | Pixels/frame |
|---|---|---|---|
| Video CD | 352 × 240 (NTSC) | 4:3 | 84,480 |
| | 352 × 288 (PAL) | 4:3 | 101,376 |
| SVCD | 480 × 480 (NTSC) | 4:3 or 16:9 | 230,400 |
| | 480 × 576 (PAL) | 4:3 or 16:9 | 276,480 |
| SDTV 480i, EDTV 480p, SMPTE 293M | 640 × 480 | 4:3 or 16:9 | 307,200 |
| | 852 × 480 | 4:3 or 16:9 | 408,960 |
| DVD | 720 × 480 (NTSC) | 4:3 or 16:9 | 345,600 |
| | 720 × 576 (PAL) | 4:3 or 16:9 | 414,720 |
| 720p (HDTV) | 1280 × 720 | 16:9 | 921,600 |
| 1080i, 1080p (HDTV, Blu-ray) | 1920 × 1080 | 16:9 | 2,073,600 |
| 2160p (UHDTV) | 3840 × 2160 | 16:9 | 8,294,400 |
| 4320p (UHDTV) | 7680 × 4320 | 16:9 | 33,177,600 |

The last aspect of the signal is just that – the frame “aspect” ratio. As you can see in the chart above, 4:3 and 16:9 are the most commonly used. The aspect ratio describes the shape of the viewing frame as the ratio of width to height, and together with the frame height it determines the final resolution of the frame. Standard Definition is most commonly 4:3, while HD is 16:9.
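
If you want to sanity-check the chart, the arithmetic is simple: width = height × aspect ratio, and pixels per frame = width × height. A small illustrative Python snippet:

```python
# Deriving frame width and pixels/frame from height and aspect ratio.
def frame_stats(height, aspect_w, aspect_h):
    width = height * aspect_w // aspect_h
    return width, width * height

print(frame_stats(480, 4, 3))     # (640, 307200)   -> SDTV 480
print(frame_stats(720, 16, 9))    # (1280, 921600)  -> 720p
print(frame_stats(1080, 16, 9))   # (1920, 2073600) -> 1080p
print(frame_stats(2160, 16, 9))   # (3840, 8294400) -> UHD 2160p
```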

Fun fact: 4K is the newest term in the video industry. It was originally coined by Digital Cinema Initiatives (think NTSC for movies) and refers to a horizontal resolution of roughly 4000 pixels; DCI’s specification is 4096 x 2160. By comparison, Ultra HD (UHD), the standard used for consumer 4K TVs, is 3840 x 2160. We could also call Full HD “2K” at its 1920x1080. But we don’t.

The Connections

Figuratively, and sometimes literally, there are two sides to a capture device: the input side, which accepts the incoming signal, and the output side, which connects the device to the computer and delivers the digitized signal.

The input side comes in many variations. The most common connections are HDMI for consumer sources and SDI for professional sources. However, there are several other input types as well: DVI, VGA, component and composite, to name a few.

The connection to the computer falls into two main categories: internal and external devices. External devices usually connect via the Universal Serial Bus (USB)[iii]. This is the easiest method to add a capture device to your system. Most modern systems support USB 3.0, which allows for connecting HD – and sometimes Ultra HD – devices. There are two device types in this class: devices that need their own software (drivers) to work, and ones that support plug-and-play (no drivers needed). The latter is obviously preferred and easier to use.
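
As a rough illustration of what “plug and play” means in practice – assuming Python with OpenCV and a device at index 0, which may differ on your system – your software can simply request an HD format from the device and check what it actually negotiated:

```python
# Request 1080p30 from a USB capture device and verify what it actually delivers.
# Device index 0 is an assumption; plug-and-play devices appear without extra drivers.
import cv2

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)
cap.set(cv2.CAP_PROP_FPS, 30)

actual = (cap.get(cv2.CAP_PROP_FRAME_WIDTH),
          cap.get(cv2.CAP_PROP_FRAME_HEIGHT),
          cap.get(cv2.CAP_PROP_FPS))
print("Device negotiated:", actual)   # e.g. (1920.0, 1080.0, 30.0) if supported
cap.release()
```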

The second category is internal devices. They usually connect through a PCI Express (PCIe)[iv] slot inside a desktop or rack system. These cards have the benefit of connecting directly to the system bus, which enables higher performance and allows for multiple input channels on one device. Video switching software packages like Wirecast or vMix are a perfect match for cards with multiple capture channels for multi-camera webcast production.
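
A multi-channel card simply shows up as several video devices, one per input. Here is a hedged sketch (Python/OpenCV again; the four-channel count and device indexes are assumptions) of how software might enumerate them:

```python
# Probe the first few device indexes; a four-channel capture card typically
# exposes four separate video devices to the operating system.
import cv2

channels = []
for index in range(4):                 # assumption: up to four inputs
    cap = cv2.VideoCapture(index)
    if cap.isOpened():
        channels.append((index, cap))
    else:
        cap.release()

print(f"Found {len(channels)} capture channel(s)")
for index, cap in channels:
    ok, frame = cap.read()
    if ok:
        print(f"Channel {index}: {frame.shape[1]} x {frame.shape[0]}")
    cap.release()
```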

That’s a lot of info! What’s right for you?

Whew. Now that you understand the ins and outs of capture devices, you can focus on your needs and decide what type of device is best for your application. This leads to a few questions about your specific setup:

How many video sources do you need – a single camera, or multiple sources?

This is the first question you must answer. If your project only requires a single camera (or video source), then a USB device will likely be your best bet. We recommend the Magewell USB Capture line of driver-free devices. It’s as simple as it can get – truly plug and play. The USB Capture family is available with HDMI or SDI input. The USB Capture Plus line comes in HDMI, SDI and DVI models, with signal loop-through (out) on the HDMI and SDI versions, and analog audio input and output.

If you need more than one source available to your software, then a PCIe capture card would be best. Magewell has a complete line of PCIe capture cards with a vast array of input options, available in one-, two- and four-channel versions.

Does your solution need to be portable?

Again, a USB device might be the best option if portability is important. Otherwise, you might want to consider a purpose-built system like the StreamDynamics StreamMini X4. This compact, small-form-factor system incorporates the Magewell Pro Capture Quad card, which provides four live camera sources to your software.

What type of output connectivity does your camera (or cameras) or switcher support?

In general, the most common digital signal is HDMI, which is available from both consumer and professional cameras. SDI is also a common professional video output. The biggest difference between HDMI and SDI is that SDI cables can run much longer distances before losing signal strength. Depending on the age of your equipment, it may also support several other signal types and connections (such as composite, S-Video or component video, and balanced or unbalanced analog audio). Once you know the output type and signal format you will be using, select a capture device that supports it.

It’s also worth noting the All-in-One class of devices, which have a single capture channel but let you select from a broad array of A/V input types, from analog to digital and SD to HD. There is even a 4K All-in-One card which spans capture resolutions from SD to 4K.

Below is a chart of signal types and the connectors that are commonly used with them.

| Signal (full name) | Acronym | Connector | Type | Max resolution (W × H @ Hz) |
|---|---|---|---|---|
| Composite video | CVBS | RCA or BNC | Analog | 720 × 576i @ 50; 720 × 480i @ 59.94 |
| S-Video | S-Video or Y/C | 4-pin Mini-DIN, or 2 BNC or RCA connectors | Analog | 720 × 576i @ 50; 720 × 480i @ 59.94 |
| Video Graphics Array | VGA | DE-15/HD-15 (canonical); RGB or RGBHV on separate BNC connectors; Mini-VGA; DVI/Mini-DVI/Micro-DVI | Analog | 2048 × 1536 @ 85 |
| Component video | YPbPr | 3 RCA or BNC connectors; D-Terminal | Analog | 1920 × 1080 @ 60 |
| Digital Visual Interface | DVI | DVI, Mini-DVI, Micro-DVI | Both | 2560 × 1600 @ 60; 3840 × 2400 @ 33 |
| Apple Display Connector | ADC | ADC | Both | 2560 × 1600 @ 60 |
| Serial digital interface | SDI | BNC | Digital | 480i, 576i, 480p, 576p, 720p, 1080i, 1080p (143 Mbit/s to 2.970 Gbit/s, depending on variant) |
| High-Definition Multimedia Interface | HDMI | 19-pin HDMI Type A/C | Digital | 2560 × 1600 @ 75; 4096 × 2160 @ 60 (version 2.0) |
| DisplayPort | DP | 20-pin (external); 32-pin (internal) | Digital | 2560 × 1600 @ 75; 8192 × 4320 @ 60 (version 1.3) |

Conclusion

Selecting a capture device can be as easy as 1 – 2 – 3:  

    • Select your input type and video format
    • Select the number of channels you desire
    • Select the type of device interface – USB or PCIe

Just remember:

    • USB = portable and easy
    • PCIe = performance and multiple channels

I know that was a lot of info – I hope you found it helpful in your live streaming journey. And don’t forget that we’re here to help! Let us know your questions and feedback in the comment section below, and we’ll get back to you.

Footnotes:

[i] Interlacing was a way of cheating, since there wasn’t enough over-the-air bandwidth to transmit full frames. The alternating lines were split into two fields, and the phosphors in old tube TVs glowed long enough for the alternating lines to blend together. Interlacing is still used in HD applications to lower the amount of data needed for the same effective resolution. While displays no longer use phosphors, each field is on screen for such a short time that your eye cannot tell it is only seeing half the image at once.

[ii] Horizontal : Vertical

[iii] Some systems also have a Thunderbolt connection, which is popular on Apple Mac systems.

[iv] There are also small-form-factor versions: Mini PCIe and the newer M.2 standard.
