The following is an article written by Wes Simpson for DV.com. The original posting can be found here. So thank you, Wes and DV, for this very relevant posting for the readers of the Marcellus Blog:
Flash video is a technology that is widely used on Web sites throughout the Internet for delivering video (and other) signals to a wide variety of devices.
Flash video files and live streams can be displayed on handheld devices (such as the new Droid from Verizon Wireless), all types of laptops, netbooks and desktop PCs as well as a growing number of televisions that have Ethernet ports for displaying content from sites such as YouTube. Originally created by a company called Macromedia, Flash is now a product of Adobe Systems.
WHY USE FLASH?
Flash’s big advantage is the widespread distribution of the software required to play Flash content on a variety of devices throughout the online world. Adobe claims on their Web site that 98 percent of Internet-enabled desktops worldwide have the Flash player installed, and so do hundreds of millions of other devices.
Having the support of the dominant online video providers is obviously a big plus for market penetration, but Flash has a number of technical benefits as well.
Flash is particularly strong in Web sites that combine both vector animation and bit-mapped raster graphics. Vector graphics require much less data compared to sending frame after frame of data describing the changes to hundreds of pixels, even using advanced compression technologies.
Flash also supports H.264 compression, which is used around the world for video recording and delivery. Ideally, this would allow video that was already compressed using H.264 to be directly imported and streamed without much manipulation, or for video to be exported easily to other players.
Unfortunately, Flash’s advanced stream management techniques, coupled with the variable bit-rate control methods used, make this impractical. Instead, videos normally require transcoding before they can be streamed using Flash, and similar manipulations are needed to export Flash files for playback in other formats. This is a double-edged sword — Flash can deliver high-quality images to users on many different platforms, but it has the drawback of requiring unique file structures.
HOW DOES IT WORK?
Flash video is delivered to clients by way of a Flash server, either from a pre-recorded file or as a live stream. In either case, the process is quite similar.
It begins when a user decides to view Flash content. The user can request the content through a Web page hosted by the Flash server, or, much more commonly, the request is made through another Web page that redirects the user’s request for content to the Flash server.
In either case, the user’s device (client) sends two messages to the server to initiate the client-server handshake. Adobe recently published the protocol used in this process, called RTMP (for Real Time Messaging Protocol). Once this handshake is complete, the resulting connection is used to deliver video, audio and other content.
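To make the handshake concrete, here is a rough sketch in Python of how a client could construct its opening RTMP handshake messages, based on the structure Adobe published (a one-byte version message called C0, a 1,536-byte C1 packet, and a C2 packet that echoes the server's S1). This only builds the byte structures; a real client would also send them over TCP port 1935 and read the server's S0/S1/S2 replies.

```python
import os
import struct
import time

RTMP_VERSION = 3          # plain (unencrypted) RTMP
HANDSHAKE_SIZE = 1536     # fixed size of the C1/S1 and C2/S2 packets

def build_c0_c1():
    """Build the client's opening handshake messages per the RTMP spec."""
    c0 = bytes([RTMP_VERSION])                 # C0: a single version byte
    timestamp = int(time.time()) & 0xFFFFFFFF
    # C1: 4-byte timestamp, 4 zero bytes, then 1,528 bytes of random filler
    c1 = struct.pack(">II", timestamp, 0) + os.urandom(HANDSHAKE_SIZE - 8)
    return c0, c1

def build_c2(s1: bytes) -> bytes:
    """C2 simply echoes the server's S1 packet back to the server."""
    assert len(s1) == HANDSHAKE_SIZE
    return s1

c0, c1 = build_c0_c1()
print(len(c0), len(c1))   # 1 1536
```

Once C2 and S2 have been exchanged, both sides switch to RTMP's chunked message format, which carries the audio, video and control data.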
Decoding and displaying the content on the user’s device is performed by Adobe Flash Player software that must be installed on the user’s device before display or playback can begin. This software is typically configured as a plug-in to a Web browser, which means that the browser is responsible for activating the player software and for facilitating the flow of data between the server and the player.
One major benefit of plug-ins is that they can be updated without requiring any changes to the browser software, allowing for rapid innovation. In addition, a plug-in can contain proprietary code or tools for handling encrypted content that do not need to be incorporated into the browser.
Flash plug-ins also support the ActionScript Virtual Machine (AVM), a software construct that implements a set of common, well-defined behaviors, enabling developers to write a single application that can run on a variety of different platforms, such as Windows, Mac OS and Linux.
Scripts can contain a number of actions for many different types of behavior, such as user interaction, stream management, Web site access and other functions. The AVM gives Flash a lot of power that can be harnessed by developers to create rich multimedia experiences and even to create games that are written entirely in ActionScript.
CREATING THE VIDEO
Creating a Flash video can be deceptively simple — many Web sites will accept video files in any of a number of different source formats and create a Web page containing that video in a matter of minutes. Similarly, self-contained, portable Webcasting appliances can be used to convert live video signals directly into streams. However, several steps must occur to produce the final product — either a live stream or a file that can be hosted on a server and streamed on demand.
The first step in the process is acquisition, where the source video signal is brought into the appliance for processing. When the signal is a composite/component/SDI video source originating from a camera, tape machine or similar device, this process is known as capture, wherein the video signal is fed into a specially designed interface board that converts the signal into a form suitable for further processing within the appliance.
In the case of video content that is already in a file, the process begins by copying the video clip or file into the appliance, which is often called file capture or upload.
The next step in the process is compression. This needs to be done using one of the compression formats supported by Flash technology, such as those provided by Sorenson, On2 or the standard H.264 format (available since the launch of Version 9 of Adobe Flash). Scaling is often also done during this step, whereby the size of the original video frame is adjusted to fit the size of the destination video device, either through stretching or squeezing the video image (normally) or through cropping (rarely).
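The arithmetic behind the scaling choice is simple. The following illustrative Python sketch (the function names are my own, not from any Flash tool) contrasts the two approaches: stretching simply adopts the destination frame size, while cropping trims the source so its aspect ratio matches the destination before scaling.

```python
def scale_stretch(src_w, src_h, dst_w, dst_h):
    """Stretch/squeeze: the output adopts the destination frame size,
    possibly distorting the picture's aspect ratio."""
    return dst_w, dst_h

def scale_crop(src_w, src_h, dst_w, dst_h):
    """Crop: keep the destination aspect ratio by trimming the source,
    then scale the trimmed region to the destination size.
    Returns the (width, height) of the source region that survives."""
    src_ar = src_w / src_h
    dst_ar = dst_w / dst_h
    if src_ar > dst_ar:        # source too wide: trim the sides
        return int(src_h * dst_ar), src_h
    else:                      # source too tall: trim top and bottom
        return src_w, int(src_w / dst_ar)

# A 16:9 HD source cropped to fit a 4:3 320x240 target loses its sides:
print(scale_crop(1280, 720, 320, 240))   # (960, 720)
```

In practice encoders make this trade-off for you, but the same aspect-ratio comparison is what decides which pixels survive.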
The final step in the process is to apply a “wrapper” to the video content that helps the playback device understand how the video and related audio or other content is to be interpreted. This wrapper contains information (metadata) about the video image format; lists the compression codecs used to create the data; and describes any other signals such as audio or text that will form part of the output of the viewer. Wrappers, which are also called “containers,” provide a common format for communicating the relevant information about the stream, thereby enabling the playback device to quickly and easily determine how the bits within the stream are to be decoded and displayed.
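As a small illustration of what a wrapper looks like at the byte level, here is a Python sketch of the header of the FLV container, one of the formats used for Flash video. Per Adobe's published FLV specification, the file starts with the signature "FLV", a version byte, a flags byte indicating whether audio and video are present, and the header size; this sketch builds and parses just that header, not a complete file.

```python
import struct

def build_flv_header(has_audio=True, has_video=True) -> bytes:
    """Build the 9-byte FLV file header plus the first PreviousTagSize field."""
    flags = (0x04 if has_audio else 0) | (0x01 if has_video else 0)
    header = b"FLV" + bytes([1, flags]) + struct.pack(">I", 9)
    return header + struct.pack(">I", 0)   # PreviousTagSize0 is always 0

def parse_flv_header(data: bytes):
    """Return (version, has_audio, has_video), or raise if not an FLV file."""
    if data[:3] != b"FLV":
        raise ValueError("not an FLV container")
    version, flags = data[3], data[4]
    return version, bool(flags & 0x04), bool(flags & 0x01)

print(parse_flv_header(build_flv_header()))   # (1, True, True)
```

A player reads these few bytes first, which is exactly the "quick and easy" determination the wrapper exists to enable; the codec details for each audio and video tag follow later in the file.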
DELIVERING THE VIDEO
Two methods are frequently used for delivering Flash video to viewers: real-time streaming and on-demand streaming. With real-time streaming, a video signal is delivered to one or more viewers from a single source, which can be live or pre-recorded video. This technique is often used for broadcasting live news on Web sites such as CNN.com, and the viewer "tunes in" to the ongoing program while it is playing.
In contrast, with on-demand streaming, each viewer receives a stream that is custom-delivered to his or her viewing device, on a time schedule that is controlled by the viewer, who can pause, rewind and fast-forward the video. YouTube and many other sites use this technique. Two different, but related, technologies are used to support these two delivery methods.
In real-time streaming, the big challenge is to make a copy of the source stream for each viewer who is currently watching. Because the public Internet and many private networks are not multicast-enabled, each viewer's device must receive a unique sequence of data packets addressed specifically to that device's IP address.
To accomplish this task, a “reflecting server” is used to take a single incoming stream and generate multiple output streams.
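The replication idea can be sketched in a few lines of Python. This toy model (my own construction, not any vendor's product) keeps a per-client queue and copies each incoming packet once per connected client; a real reflecting server does the same thing over TCP/RTMP connections, with per-client flow control and buffering.

```python
class ReflectingServer:
    """Toy fan-out: one incoming stream of packets is copied to every
    connected client's queue, so each viewer receives a unicast copy."""

    def __init__(self):
        self.clients = {}          # client_id -> list of pending packets

    def connect(self, client_id):
        self.clients[client_id] = []

    def disconnect(self, client_id):
        self.clients.pop(client_id, None)

    def ingest(self, packet):
        # Replicate the single source packet once per connected client.
        for queue in self.clients.values():
            queue.append(packet)

server = ReflectingServer()
server.connect("alice")
server.connect("bob")
server.ingest(b"frame-1")
print(len(server.clients["alice"]), len(server.clients["bob"]))   # 1 1
```

Note that the server's outbound bandwidth grows linearly with the audience, which is why large Webcasts typically rent reflecting capacity from a content delivery network rather than host it themselves.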
As shown in Figure 1 (above), video is fed from a source to a Webcasting device, which captures the video, compresses it, and places it into the Flash stream wrapper and format. The reflecting server then replicates this stream for each client device.
In parallel, a portal Web site is often set up that serves as a landing page for user devices to get information about available streams. Clients that navigate to this page are redirected to the reflecting server to actually receive the stream.
The process for on-demand streaming is shown in Figure 2 (above). First, the video content must be created and uploaded to a Flash server. Inside the Flash server, content files are transcoded as necessary into the final Flash streaming format and stored. In parallel, the author of the content will often create a Web page that contains an ActionScript that tells viewers about the available content and gives them controls to begin video playback.
These Web pages will redirect the client devices to the Flash server to actually receive the streams. Typically, these Web pages will display a thumbnail that is a single frame selected from the video to illustrate the contents.
A large ecosystem of software and systems has grown up to serve the Flash video market. One of the easiest ways to begin real-time streaming is to rent or purchase Webcasting equipment that is available from a number of suppliers. These self-contained units have video and audio inputs and Ethernet outputs that deliver one or more fully compliant Flash streams. Also, the services of a reflecting server can be purchased or rented as needed to replicate the stream to many viewers. Companies using this approach can get on the air quickly without a huge expenditure of time or money for purchasing systems and training. So what are you waiting for?
Again – thank you Wes and DV.com