
Real Time Multimedia - Real-Time Networked Multimedia, Real-time Streaming Media Protocols


Wenjun Zeng
University of Missouri, Columbia, MO, USA

Junqiang Lan
Harmonic, Inc., NY Design Center, NY, USA

Definition: Real-time multimedia refers to applications in which multimedia data has to be delivered and rendered in real time; it can be broadly classified into interactive multimedia and streaming media.

Multimedia is a term that describes multiple forms of information, including audio, video, graphics, animation, images, text, etc. The best examples are continuous media such as animation, audio and video, which are time-based, i.e., each audio sample or video frame has a timestamp associated with it, representing its presentation time. Multimedia data has to be presented in a continuous fashion, in accordance with these timestamps. For example, video is typically rendered at 30 frames per second to give viewers the illusion of smooth motion. As a result, multimedia applications typically have a real-time constraint, i.e., media data has to be delivered and rendered in real time.
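The sketch below illustrates how a renderer might use such per-frame timestamps to schedule presentation. The `frames` structure and the `render` routine are hypothetical, and a real player would also have to cope with clock drift and late frames.

```python
import time

def play_clip(frames):
    """Present time-stamped frames at their scheduled presentation times.

    `frames` is assumed to be a sequence of (timestamp_seconds, frame_data)
    pairs, with timestamps measured from the start of the clip.
    """
    start = time.monotonic()
    for timestamp, frame in frames:
        # Wait until this frame's presentation time is reached.
        delay = (start + timestamp) - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        render(frame)

def render(frame):
    # Placeholder for an actual display routine.
    print(f"presented at {time.monotonic():.3f}s")

# A 3-second clip at 30 frames per second.
clip = [(i / 30.0, f"frame-{i}") for i in range(90)]
play_clip(clip)
```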

Today, with the advances of digital media and networking technologies, multimedia has become an indispensable feature of the Internet. Animation, audio and video clips have become increasingly popular on the Internet. A large number of distributed multimedia applications have been created, including Internet telephony, Internet videoconferencing, Internet collaboration that combines video, audio and whiteboard, Internet TV, on-demand streaming and broadcasting, distance learning, distributed simulation, entertainment and gaming, multimedia messaging, etc.

Multimedia data, unlike traditional data, exhibits several unique characteristics. First, multimedia applications usually require much higher bandwidth than traditional textual applications. A typical 25-second movie clip with a resolution of 320×240 could take 2.3 megabytes, which is equivalent to about 1000 screens of textual data. Second, most multimedia applications have stringent delay constraints and require real-time delivery. Audio and video data must be played back continuously at the rate at which they were sampled. If the data does not arrive in time, playback stalls, and the resulting artifacts are easily noticed by human ears and eyes. Third, a multimedia data stream is usually bursty due to the dynamics of different segments of the media. For most multimedia applications, the receiver has a limited buffer. The bursty data stream, if not smoothed, may overflow or underflow the application buffer. When data arrives too fast, the buffer overflows and some data packets are lost, resulting in poor quality. When data arrives too slowly, the buffer underflows and the application starves, causing playback to freeze. Multimedia data is also power-hungry to process, synchronous, loss-tolerant, composed of components of different importance, and highly adaptable. Some of these characteristics, such as loss tolerance, prioritized components and adaptability, can in fact be exploited in a real-time multimedia communication system.
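The figures quoted above work out roughly as follows; the 2.3 MB size refers to a compressed clip, and the 80×25-character text screen used for comparison is an assumption.

```python
# Back-of-the-envelope check of the figures quoted above.
clip_bytes = 2.3e6            # 2.3 MB, compressed 320x240 clip
duration_s = 25

avg_bitrate_kbps = clip_bytes * 8 / duration_s / 1000
print(f"average bit rate: {avg_bitrate_kbps:.0f} kbit/s")     # ~736 kbit/s

chars_per_screen = 80 * 25    # assumed text screen, one byte per character
print(f"equivalent text screens: {clip_bytes / chars_per_screen:.0f}")  # ~1150
```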

Contrary to the high-bandwidth, real-time and bursty nature of multimedia data, in reality networks are typically shared by thousands or even millions of users, and have limited bandwidth, unpredictable delay and variable availability. For example, the Internet provides only a best-effort service, i.e., data packets can be lost, re-ordered, or delayed for a long time. As a result, advanced networking technologies have been designed specifically for the efficient delivery of multimedia data. There is typically a trade-off between delay and quality. Different applications may require different levels of quality of service (QoS).

Real-Time Networked Multimedia

Real-time multimedia can be broadly classified into interactive multimedia and streaming media. Interactive multimedia applications include Internet telephony, Internet videoconferencing, Internet collaboration, Internet gaming, etc. In interactive multimedia applications, the delay constraint is very stringent in order to achieve interactivity. For example, in Internet telephony, human beings can only tolerate a latency of about 250 milliseconds. This poses an extremely challenging problem for interactive multimedia applications over the Internet, which provides only a best-effort service. Over the years, great efforts have been made to facilitate the development of interactive multimedia applications over the Internet. For example, Microsoft Research’s ConferenceXP research platform [1] supports the development of real-time collaboration and videoconferencing applications by delivering high-quality, low-latency audio and video over broadband connections that support multicast.

The second class of networked multimedia technology is streaming media. Streaming media technology enables the real-time or on-demand distribution of audio, video and multimedia on the Internet. Streaming media is the transfer of digital media in such a way that it is received and consumed as a continuous real-time stream. Streamed data is transmitted by a server application and received and rendered in real time by client applications. These client applications can start playing back audio and video as soon as enough data has been received and stored in the receiver’s buffer. There can be up to a few seconds of startup delay, i.e., the delay between when the server starts streaming the data and when the client starts the playback. Some of the popular streaming media products are Microsoft’s Windows Media Player and RealNetworks’ RealPlayer for Internet streaming, and PacketVideo’s embedded media player for wireless streaming to embedded devices such as the next generation of multimedia phones.
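A minimal sketch of the client-side buffering behavior described above, under assumed parameters (here a three-second startup threshold); a real player adapts this threshold and also handles rebuffering after an underflow.

```python
import collections

class PlayoutBuffer:
    """Sketch of a client-side playout buffer with a startup delay.

    The client keeps receiving media chunks but starts playback only after
    `startup_s` seconds of media have been buffered, trading a short startup
    delay for resilience against bursty arrivals.
    """
    def __init__(self, startup_s=3.0):
        self.startup_s = startup_s
        self.buffered_s = 0.0
        self.chunks = collections.deque()
        self.playing = False

    def on_chunk_received(self, chunk, duration_s):
        self.chunks.append((chunk, duration_s))
        self.buffered_s += duration_s
        if not self.playing and self.buffered_s >= self.startup_s:
            self.playing = True          # startup delay has elapsed

    def next_chunk(self):
        if self.playing and self.chunks:
            chunk, duration_s = self.chunks.popleft()
            self.buffered_s -= duration_s
            return chunk
        return None                      # underflow: playback would stall
```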

Standards have been developed to facilitate the interoperability of products from different vendors for both interactive multimedia and real-time streaming media applications. These standards are briefly described below.

Real-time Streaming Media Protocols

Real-time media delivery requires a bounded end-to-end delay to guarantee that live audio and video can be received and presented continuously. For this reason, underlying protocols other than TCP are typically used for streaming media: TCP is designed for reliable transmission, its frequent retransmissions may violate the real-time delay constraint, and it is not suitable for IP multicast. The most commonly used transport protocol for real-time streaming is the User Datagram Protocol (UDP). UDP is an unreliable protocol; it guarantees neither that packets are delivered nor that they arrive in order. It is the higher layer’s responsibility to recover from lost, duplicated, and out-of-order packets.
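As a simple illustration of this division of labor, the sketch below tags each UDP payload with an application-level sequence number so that the receiver can detect loss, duplication and reordering; the framing and the recovery policy are assumptions, not part of UDP itself.

```python
import socket
import struct

SEQ_HEADER = struct.Struct("!I")   # 4-byte sequence number, network byte order

def send_media(sock, addr, payloads):
    """Prefix each UDP payload with an application-level sequence number."""
    for seq, payload in enumerate(payloads):
        sock.sendto(SEQ_HEADER.pack(seq) + payload, addr)

def receive_media(sock):
    """Yield payloads in order, reporting gaps; drop duplicates and late packets."""
    expected = 0
    while True:
        packet, _ = sock.recvfrom(65535)
        (seq,) = SEQ_HEADER.unpack_from(packet)
        if seq < expected:
            continue                                   # duplicate or late packet
        if seq > expected:
            print(f"packets {expected}..{seq - 1} missing or reordered")
        expected = seq + 1
        yield seq, packet[SEQ_HEADER.size:]
```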

RTSP/RTP/RTCP

UDP is a general-purpose transport-layer protocol, so the Real-Time Transport Protocol (RTP) was designed on top of it to address some of the specific problems of transporting real-time media. RTP is both an IETF Proposed Standard (RFC 1889) [13] and an ITU standard (H.225.0). It defines a packet format for multimedia data streams. RTP enables the end system to identify the type of data being transmitted, determine in what order the packets of data should be presented, and synchronize media streams from different sources. While RTP does not provide any mechanism to ensure timely delivery or provide other quality-of-service guarantees, it is augmented by a control protocol (RTCP, the Real-time Transport Control Protocol) [14] that enables the system to monitor the quality of the data distribution. As part of the RTP protocol, RTCP also provides control and identification mechanisms for RTP transmissions.
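A minimal sketch of packing the 12-byte fixed RTP header defined in RFC 1889 (version, marker, payload type, sequence number, timestamp, SSRC); the payload type, timestamp increment and SSRC values used in the example are illustrative only.

```python
import struct

RTP_HEADER = struct.Struct("!BBHII")   # 12-byte fixed RTP header (RFC 1889)

def build_rtp_packet(payload, payload_type, seq, timestamp, ssrc, marker=0):
    """Build a minimal RTP packet: version 2, no padding, no extension, no CSRCs."""
    byte0 = 2 << 6                                    # V=2, P=0, X=0, CC=0
    byte1 = ((marker & 0x1) << 7) | (payload_type & 0x7F)
    header = RTP_HEADER.pack(byte0, byte1, seq & 0xFFFF,
                             timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)
    return header + payload

# Example: one packet of a hypothetical audio stream (payload type 0 = PCMU).
pkt = build_rtp_packet(b"\x00" * 160, payload_type=0,
                       seq=1, timestamp=160, ssrc=0x1234ABCD)
```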

RTSP (Real Time Streaming Protocol, RFC 2326) [15] is a client-server multimedia presentation control protocol, designed to address the need for efficient delivery of streamed multimedia over IP networks. It leverages existing web infrastructure (for example, inheriting authentication and PICS (Platform for Internet Content Selection) from HTTP) and works well for both large audiences and single-viewer media-on-demand. RealNetworks, Netscape Communications and Columbia University jointly developed RTSP within the MMUSIC working group of the IETF, and it was published as a Proposed Standard in April 1998. RTSP provides mechanisms for time-based access to any part of the media. In addition, RTSP is designed to control multicast delivery of streams, and is well suited to fully multicast solutions.
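The sketch below shows the shape of a basic RTSP/1.0 exchange (DESCRIBE, SETUP, PLAY) against a hypothetical server; a real client would parse the SDP returned by DESCRIBE and echo the Session header from the SETUP response in subsequent requests.

```python
import socket

def rtsp_request(sock, method, url, cseq, extra_headers=()):
    """Send one RTSP/1.0 request and return the raw response text (sketch)."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}", *extra_headers, "", ""]
    sock.sendall("\r\n".join(lines).encode("ascii"))
    return sock.recv(4096).decode("ascii", errors="replace")

# Hypothetical server and presentation URL; 554 is the default RTSP port.
url = "rtsp://media.example.com/movie"
with socket.create_connection(("media.example.com", 554)) as sock:
    print(rtsp_request(sock, "DESCRIBE", url, 1, ["Accept: application/sdp"]))
    print(rtsp_request(sock, "SETUP", url + "/track1", 2,
                       ["Transport: RTP/AVP;unicast;client_port=5000-5001"]))
    print(rtsp_request(sock, "PLAY", url, 3, ["Range: npt=0-"]))
```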

SMIL

SMIL (pronounced “smile”) stands for Synchronized Multimedia Integration Language. It is a markup language developed by the World Wide Web Consortium (W3C) that enables Web developers to divide multimedia content into separate files and streams (audio, video, text, and images), send them to a user’s computer individually, and then have them displayed together as if they were a single multimedia stream. By using a single timeline for all of the media on a page, their display can be properly time-coordinated and synchronized.
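An illustrative SMIL presentation along these lines; the layout, regions and source file names are placeholders, and the `<par>` element plays its child media in parallel on the shared timeline.

```xml
<!-- Illustrative SMIL presentation; file names and layout are placeholders. -->
<smil>
  <head>
    <layout>
      <root-layout width="320" height="260"/>
      <region id="video" left="0" top="0" width="320" height="240"/>
      <region id="caption" left="0" top="240" width="320" height="20"/>
    </layout>
  </head>
  <body>
    <par>  <!-- play video, audio and captions in parallel, synchronized -->
      <video src="lecture.rm" region="video"/>
      <audio src="lecture-audio.rm"/>
      <text src="captions.txt" region="caption"/>
    </par>
  </body>
</smil>
```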
