Intelligent streaming technology and streaming media playback


BY: WGSCD, 2005-2

Intelligent streaming technology (SureStream)

Today the 28.8 Kbps modem is the baseline Internet connection, alongside cable modem, ADSL, DSS, ISDN and other access types, so content providers must either limit the quality of the media or limit the number of connections. According to RealNetworks site statistics, actual connection rates run from roughly 10 Kbps to 26 Kbps in a bell-shaped distribution that peaks around 20 Kbps. This means that if a content provider encodes at a fixed 20 Kbps, a large number of users will not receive a good-quality signal; the media stream may stall and the client will have to rebuffer until enough data has been received.

One solution is for the server to reduce the data sent to the client to prevent rebuffering; in RealSystem 5.0 this method is called "video stream thinning". Its limitation is that a RealVideo file is designed for a single data rate: the stream can only be stretched to a lower rate by dropping internal frames, which lowers quality, and the farther the actual rate falls from the original encoding rate, the worse the quality becomes. Another solution is to create multiple files for different connection rates and have the server send the file that matches the user's connection. This makes production and management harder, and because the user's connection is dynamic, the server cannot adapt in real time. Intelligent streaming overcomes the problems of bandwidth negotiation and thinning in two ways: first, it establishes an encoding framework that allows several streams of different rates to be encoded at the same time and packed into one file; second, it uses a client/server mechanism to detect bandwidth changes.

Because of differences in software, hardware, and transmission speed, users view audio and video content at different bandwidths. To meet this requirement, Progressive Networks encodes the media data at several rates and saves them in a single file, called an intelligent stream (SureStream) file, which is an extension of the ordinary stream file. When a client issues a request it reports its bandwidth capacity to the server, and the media server delivers the appropriate stream from the SureStream file according to that bandwidth. The user therefore sees the highest quality the connection can carry, the producer only has to maintain a single file, and the media server switches streams automatically as the available bandwidth changes. Intelligent streaming also describes the bandwidth characteristics that vary on the real-world Internet, delivers high-quality media reliably, and provides a solution for content authoring in mixed connection environments. Its streaming features are as follows:

Create a single file for all connection-rate environments

Deliver media at different rates in a mixed-connection environment

Switch seamlessly to another rate as network conditions change

Give priority to key frames; audio is more important than partial frame data

Remain backward compatible with older versions of RealPlayer

Intelligent streaming is realized in RealSystem G2 through the so-called Adaptive Stream Management (ASM) API. ASM describes the types of streamed data and helps the system make intelligent decisions about which kind of packet to send. File-format and broadcast plug-ins define ASM rules. In the simplest form, a rule assigns predefined properties, such as average bandwidth, to a group of packets. In more advanced forms, ASM rules let the plug-in change how packets are transmitted as network conditions change. Each ASM rule can carry a condition; for example, a rule might apply when the client bandwidth is between 5,000 and 15,000 bps and packet loss is below 2.5%. Such a condition describes the client's current network connection, and the client subscribes to the rule that matches it. The properties defined in the rule help RealServer transmit the packets effectively; when network conditions change, the client subscribes to a different rule.
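The exact ASM rulebook syntax is not given in this article, so the following VB.NET sketch only illustrates the idea of rule subscription under assumed rule values (the AsmRule structure, its field names, and the numbers are hypothetical): the client measures its bandwidth and packet loss and subscribes to the first rule whose condition matches.

Imports System

Module AsmRuleSketch
    ' Hypothetical ASM-style rule: a subscription condition (bandwidth range in
    ' bps, maximum tolerated packet loss in percent) plus the average bandwidth
    ' of the sub-stream the server sends while the rule is active.
    Private Structure AsmRule
        Public MinBandwidth As Integer
        Public MaxBandwidth As Integer
        Public MaxLossPercent As Double
        Public StreamBandwidth As Integer
    End Structure

    ' Return the index of the first rule whose condition matches the client's
    ' measured connection, or -1 if no rule matches.
    Private Function Subscribe(rules() As AsmRule, bandwidth As Integer, lossPercent As Double) As Integer
        For i As Integer = 0 To rules.Length - 1
            If bandwidth >= rules(i).MinBandwidth AndAlso bandwidth < rules(i).MaxBandwidth AndAlso lossPercent <= rules(i).MaxLossPercent Then
                Return i
            End If
        Next
        Return -1
    End Function

    Sub Main()
        Dim rules() As AsmRule = {
            New AsmRule With {.MinBandwidth = 5000, .MaxBandwidth = 15000, .MaxLossPercent = 2.5, .StreamBandwidth = 8000},
            New AsmRule With {.MinBandwidth = 15000, .MaxBandwidth = 40000, .MaxLossPercent = 2.5, .StreamBandwidth = 20000}
        }
        ' A 12,000 bps connection with 1% loss matches the first rule.
        Console.WriteLine("Subscribed rule: {0}", Subscribe(rules, 12000, 1.0))
    End Sub
End Module

In the real system it is the server, not client code like the above, that uses the subscribed rule's properties to decide which packets to send.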

Network protocols supporting streaming media transfer

Real-time transport protocols RTP and RTCP

RTP (Real-time Transport Protocol) is a transport protocol for multimedia data streams on the Internet. RTP is defined for one-to-one and one-to-many transmission, and its purpose is to provide timing information and to keep streams synchronized. RTP usually carries data over UDP, but it can also work over other protocols such as TCP or ATM. When an application starts an RTP session it uses two ports: one for RTP and one for RTCP. RTP itself does not guarantee that packets are delivered reliably or in order, nor does it provide flow control or congestion control; it relies on RTCP for these. The RTP algorithms are usually implemented not as a separate network layer but as part of the application code.

RTCP (Real-time Transport Control Protocol) works together with RTP to provide flow control and congestion control. During an RTP session each participant periodically sends RTCP packets containing statistics such as the number of packets sent and the number of packets lost, so the server can dynamically change the transmission rate or even the payload type. Used together, RTP and RTCP optimize transmission efficiency with effective feedback and minimal overhead, which makes them particularly suitable for transmitting real-time data over the network.
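As a concrete illustration (not taken from the article), RFC 3550 defines a fixed 12-byte RTP header carrying the version, payload type, sequence number, timestamp and synchronization source (SSRC). The VB.NET sketch below parses those fields from a received packet; the sample packet bytes are made up for the demonstration.

Imports System

Module RtpHeaderSketch
    ' Minimal view of the fixed 12-byte RTP header (RFC 3550).
    Public Structure RtpHeader
        Public Version As Integer        ' 2 bits
        Public PayloadType As Integer    ' 7 bits
        Public SequenceNumber As Integer ' 16 bits, increments by one per packet
        Public Timestamp As Long         ' 32 bits, media sampling clock
        Public Ssrc As Long              ' 32 bits, synchronization source id
    End Structure

    ' Parse the fixed header from the first 12 bytes of an RTP packet.
    Public Function ParseRtpHeader(packet() As Byte) As RtpHeader
        Dim h As RtpHeader
        h.Version = (packet(0) >> 6) And &H3
        h.PayloadType = packet(1) And &H7F
        h.SequenceNumber = (CInt(packet(2)) << 8) Or packet(3)
        h.Timestamp = (CLng(packet(4)) << 24) Or (CLng(packet(5)) << 16) Or (CLng(packet(6)) << 8) Or packet(7)
        h.Ssrc = (CLng(packet(8)) << 24) Or (CLng(packet(9)) << 16) Or (CLng(packet(10)) << 8) Or packet(11)
        Return h
    End Function

    Sub Main()
        ' Made-up packet: version 2, payload type 0 (PCMU), sequence 1,
        ' timestamp 160, SSRC &H12345678.
        Dim pkt() As Byte = {&H80, &H0, &H0, &H1, &H0, &H0, &H0, &HA0, &H12, &H34, &H56, &H78}
        Dim h As RtpHeader = ParseRtpHeader(pkt)
        Console.WriteLine("V={0} PT={1} Seq={2} TS={3} SSRC={4:X}", h.Version, h.PayloadType, h.SequenceNumber, h.Timestamp, h.Ssrc)
    End Sub
End Module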

Real-time streaming protocol RTSP

RTSP (Real Time Streaming Protocol) was proposed by RealNetworks and Netscape and defines how one-to-many applications can transfer multimedia data efficiently over IP networks. In the architecture RTSP sits above RTP and RTCP and uses TCP or RTP to complete the data transfer. Whereas HTTP transmits HTML, RTSP transmits multimedia data. An HTTP request is issued by the client and the server responds; with RTSP both the client and the server can issue requests, that is, RTSP can be bidirectional.
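To make the request/response style concrete, the hedged VB.NET sketch below opens a TCP connection to a hypothetical RTSP server and sends an OPTIONS request; a full session would continue with DESCRIBE, SETUP, PLAY, PAUSE and TEARDOWN. The host name, URL and port usage are assumptions for illustration, not details from the article.

Imports System
Imports System.Net.Sockets
Imports System.Text

Module RtspOptionsSketch
    Sub Main()
        ' Hypothetical RTSP server and presentation URL.
        Dim host As String = "rtsp.example.com"
        Dim url As String = "rtsp://rtsp.example.com/sample.rm"

        ' RTSP is text based, much like HTTP, and conventionally uses TCP port 554.
        Using client As New TcpClient(host, 554)
            Dim stream As NetworkStream = client.GetStream()

            ' Ask the server which methods it supports.
            Dim request As String =
                "OPTIONS " & url & " RTSP/1.0" & vbCrLf &
                "CSeq: 1" & vbCrLf &
                "User-Agent: RtspOptionsSketch" & vbCrLf & vbCrLf
            Dim requestBytes() As Byte = Encoding.ASCII.GetBytes(request)
            stream.Write(requestBytes, 0, requestBytes.Length)

            ' Read and print the start of the reply, e.g. "RTSP/1.0 200 OK" plus headers.
            Dim buffer(4095) As Byte
            Dim count As Integer = stream.Read(buffer, 0, buffer.Length)
            Console.WriteLine(Encoding.ASCII.GetString(buffer, 0, count))
        End Using
    End Sub
End Module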

Resource reservation protocol RSVP

Because audio and video streams are sensitive to network delay, delivering high-quality audio and video requires more from the network than bandwidth alone. RSVP (Resource Reservation Protocol) is a resource reservation protocol for the Internet; with RSVP a portion of the network resources (that is, bandwidth) can be reserved for a stream to provide it with QoS. RSVP has been integrated into some experimental systems, such as the network video-conferencing tool Vic.

Streaming media playback

Unicast

In unicast, a separate data channel is created between the client and the media server, and every packet the server sends is delivered to only one client. Each user must send a separate request to the media server, and the media server must send every user their own copy of the data. This massive redundancy places a heavy load on the server, lengthens response times, and can even stop playback; it also forces operators to buy more hardware and bandwidth to guarantee a given quality of service.
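The redundancy is easy to see in code. In the hedged VB.NET sketch below (the client addresses, port, and payload are hypothetical), a unicast server has to send a separate copy of the same packet to every client endpoint.

Imports System
Imports System.Net
Imports System.Net.Sockets
Imports System.Text

Module UnicastSendSketch
    Sub Main()
        ' Hypothetical clients that each requested the same stream.
        Dim clients() As IPEndPoint = {
            New IPEndPoint(IPAddress.Parse("192.168.0.10"), 5004),
            New IPEndPoint(IPAddress.Parse("192.168.0.11"), 5004),
            New IPEndPoint(IPAddress.Parse("192.168.0.12"), 5004)
        }
        Dim packet() As Byte = Encoding.ASCII.GetBytes("media packet")

        Using sender As New UdpClient()
            ' With unicast the same bytes go onto the network once per client.
            For Each client As IPEndPoint In clients
                sender.Send(packet, packet.Length, client)
            Next
        End Using
        Console.WriteLine("Sent {0} separate copies of the same packet.", clients.Length)
    End Sub
End Module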

Multicast

IP multicast technology builds a network with multicast capability, allowing a router to copy a packet onto multiple channels at once. In multicast mode a single server can send a continuous data stream to tens of thousands of clients at the same time: the media server only needs to send each packet once, and all the clients that requested the stream share the same packets. Information can be delivered to clients at any address while the total number of packets transmitted on the network is reduced, so network utilization improves greatly and cost drops sharply.
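The client side of multicast can be sketched as follows in VB.NET (the group address 239.0.0.1 and port 5004 are assumptions): every interested receiver joins the same multicast group and reads the single copy of each datagram that the server sends to that group.

Imports System
Imports System.Net
Imports System.Net.Sockets

Module MulticastReceiveSketch
    Sub Main()
        ' Hypothetical multicast group and port used by the media server.
        Dim group As IPAddress = IPAddress.Parse("239.0.0.1")
        Dim port As Integer = 5004

        Using receiver As New UdpClient(port)
            ' Joining the group asks the local router (via IGMP) to forward
            ' the group's packets to this host.
            receiver.JoinMulticastGroup(group)

            Dim remote As New IPEndPoint(IPAddress.Any, 0)
            ' Receive one datagram; a real player would loop and decode.
            Dim data() As Byte = receiver.Receive(remote)
            Console.WriteLine("Got {0} bytes from {1}", data.Length, remote)

            receiver.DropMulticastGroup(group)
        End Using
    End Sub
End Module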

On-demand and broadcast

An on-demand connection is an active connection between a client and a server. In an on-demand connection the user initiates the client connection by selecting a content item and can start, stop, rewind, fast-forward, or pause the stream. On-demand connections give the user maximum control over the stream, but because every client needs its own connection to the server, this mode can quickly use up the entire network bandwidth.

Broadcast means the user receives the stream passively. During a broadcast the client receives the stream but cannot control it; for example, the user cannot pause, fast-forward, or rewind. In broadcast mode a single copy of each packet is sent to every user on the network whether they need it or not, while in unicast mode multiple copies have to be made and sent point-to-point to each user who wants them; both transmission modes therefore waste network bandwidth. Multicast combines the strengths of the two and overcomes their weaknesses: a single copy of each packet is sent, and it reaches only the clients that requested it. Multicast neither duplicates packets on the network nor delivers them to clients that do not need them, so it keeps the bandwidth consumed by a multimedia application on the network to a minimum.

Turning to a concrete programming question: suppose you have a dynamically growing array of voice data (bytes); how can the voice data in this array be played back continuously?

' MCI-based playback, left commented out in the original:
' Private Declare Function mciExecute Lib "winmm.dll" Alias "mciExecute" (ByVal lpstrCommand As String) As Long
' Private Declare Function mciSendString Lib "winmm.dll" Alias "mciSendStringA" (ByVal lpstrCommand As String, ByVal lpstrReturnString As String, ByVal uReturnLength As Long, ByVal hWndCallback As Long) As Long
'----------------------------------------------------------------------
' The following is for capturing and playing audio through the sound card
' (or a virtual audio device). The waveIn APIs that need to be declared are
' listed below; pointer parameters are passed ByRef so the DLL receives
' their addresses.
Declare Function waveInOpen Lib "winmm.dll" (lphWaveIn As Long, ByVal uDeviceID As Long, lpFormat As WAVEFORMAT, ByVal dwCallback As Long, ByVal dwInstance As Long, ByVal dwFlags As Long) As Long
Declare Function waveInPrepareHeader Lib "winmm.dll" (ByVal hWaveIn As Long, lpWaveInHdr As WAVEHDR, ByVal uSize As Long) As Long
Declare Function waveInReset Lib "winmm.dll" (ByVal hWaveIn As Long) As Long
Declare Function waveInStart Lib "winmm.dll" (ByVal hWaveIn As Long) As Long
Declare Function waveInStop Lib "winmm.dll" (ByVal hWaveIn As Long) As Long
Declare Function waveInUnprepareHeader Lib "winmm.dll" (ByVal hWaveIn As Long, lpWaveInHdr As WAVEHDR, ByVal uSize As Long) As Long
Declare Function waveInClose Lib "winmm.dll" (ByVal hWaveIn As Long) As Long
Declare Function waveInAddBuffer Lib "winmm.dll" (ByVal hWaveIn As Long, lpWaveInHdr As WAVEHDR, ByVal uSize As Long) As Long
' Copies memory from a raw pointer into a structure or variable.
Declare Sub CopyStructFromPtr Lib "kernel32" Alias "RtlMoveMemory" (struct As Any, ByVal ptr As Long, ByVal cb As Long)

' Memory allocation APIs used for the capture buffers:
Declare Function GlobalAlloc Lib "kernel32" (ByVal wFlags As Long, ByVal dwBytes As Long) As Long
Declare Function GlobalLock Lib "kernel32" (ByVal hMem As Long) As Long
Declare Function GlobalFree Lib "kernel32" (ByVal hMem As Long) As Long
' High-resolution multimedia timer APIs:
Public Declare Function timeSetEvent Lib "winmm.dll" (ByVal uDelay As Long, ByVal uResolution As Long, ByVal lpFunction As Any, ByVal dwUser As Long, ByVal uFlags As Long) As Long
Private Declare Function timeBeginPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
Private Declare Function timeEndPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
Private Declare Function timeKillEvent Lib "winmm.dll" (ByVal uID As Long) As Long

' Structures
Public Structure WAVEHDR
    Dim lpData As Long            ' pointer to the audio data buffer
    Dim dwBufferLength As Long    ' length of the buffer in bytes
    Dim dwBytesRecorded As Long   ' number of bytes actually recorded
    Dim dwUser As Long
    Dim dwFlags As Long           ' WHDR_DONE is set here when the buffer is filled
    Dim dwLoops As Long
    Dim lpNext As Long
    Dim Reserved As Long
End Structure

Public Structure WAVEFORMAT
    Dim wFormatTag As Integer
    Dim nChannels As Integer
    Dim nSamplesPerSec As Long
    Dim nAvgBytesPerSec As Long
    Dim nBlockAlign As Integer
    Dim wBitsPerSample As Integer
    Dim cbSize As Integer
End Structure

' Constants
Const SND_ASYNC = &H1
Const SND_NODEFAULT = &H2
Const Flags& = SND_ASYNC Or SND_NODEFAULT
Const WHDR_DONE = &H1

The remaining declarations are the precision-timer APIs and three memory-operation APIs. The idea is to allocate two (or more) buffers in memory with those APIs and record into them in a loop, using CopyStructFromPtr() to copy out the data you need; you can also work on the buffers directly. Note that the APIs above do not include the playback (waveOut) APIs. For more detail, refer to streaming technology and the RTP/RTCP protocols.

Please credit the original source when reposting: https://www.9cbs.com/read-54256.html
