1. Streaming Media

Streaming media is usually defined as continuous image and sound information that is compressed and then delivered so that users can begin watching or listening while the download is still in progress, rather than waiting for the entire compressed file to arrive. The term also refers to the specific file formats this technique supports: compressed stream files that are transmitted over a network and decoded by software on a personal computer. MCI (Media Control Interface) was originally proposed by Microsoft for Windows, but with the rapid development of multimedia technology and the proliferation of compression algorithms, MCI has become increasingly inadequate; most obviously, it does not support the new media formats that have appeared in recent years, such as the compression algorithms used for DVD, while lower-level multimedia libraries such as Microsoft's VfW (Video for Windows) are cumbersome to program against. What is the alternative? As the successor to MCI, Microsoft introduced DirectShow, a technology built on DirectX (which includes DirectDraw, DirectSound, and Direct3D). DirectShow is the media layer above DirectX; it supports decoding and playback of local or network media files in a variety of video and audio compression formats, can capture multimedia streams from devices, and can process streams encoded with many different compression algorithms. Supported formats include the MPEG audio and video standards, Audio Video Interleave (AVI), WAVE, MIDI, and the Advanced Streaming Format (ASF). DirectShow treats media data as a multimedia stream. Working with streams in an application reduces programming complexity: DirectShow automatically negotiates the conversion from the data source to the application, and the stream interfaces provide a uniform, predictable method of data access and control, so an application need not consider the original source or format of the media data it plays.
2. Understanding DirectX

DirectX is a programming environment for multimedia applications and hardware acceleration; it is Microsoft's effort to make Windows the best platform for all kinds of multimedia. DirectX is now part of Microsoft's SDK, and Windows 98 and Windows 2000 ship with DirectX integrated, indicating that it has become part of the operating system. DirectX is a set of APIs (application programming interfaces); each DirectX component is the sum of the APIs it exposes, and through them an application can access the computer's hardware directly. In this way an application can take advantage of a hardware accelerator; when no hardware accelerator is available, DirectX can emulate one in software and still provide a capable multimedia environment. To understand DirectX, the system can be divided into four layers:

● Hardware/network layer: the multimedia devices themselves, including graphics accelerators, sound cards, input devices, and network communication devices;
● DirectX foundation layer: provides basic multimedia services such as DirectDraw and DirectSound;
● DirectX media layer: provides API functionality for animation, audio, video, and the like;
● Application layer: the applications and components that call DirectX services.

DirectShow is a technology built on the DirectX media layer; it is the successor to ActiveMovie (it was once called ActiveMovie 2.0). It appears as a set of API functions and ActiveX controls whose purpose is to let developers deliver high-quality audio and video over a network. It is worth mentioning that DirectShow provides an open development environment: developers can write custom components to suit their own needs.
3. The DirectShow Technology Structure

DirectShow defines how streaming data is processed using standard components called filters. A filter has an input pin, an output pin, or both. At the core of DirectShow is the filter: a pluggable standard component, implemented as a COM object, that performs one specific task. Filters can be subdivided into source filters, transform filters, renderer filters, and so on. A filter operates on the stream by reading and writing data, modifying it, and displaying it or writing it to a file. To complete an entire task, all of the required filters must be connected; together these filters form a filter graph structure, as shown in Figure 3.1:

Figure 3.1 Filter graph structure (Filter Graph)
As Figure 3.1 shows, a filter graph is a collection of filters connected in sequence through their input and output pins (PIN); the filters negotiate with one another to determine which media formats will be supported. Because DirectShow supports reconfigurable filter graph structures, the same software components can play many different types of media, and developers can extend DirectShow's media support by defining their own filters. In the filter graph, the source filter acquires data from a data source, which may be a camera, the Internet, a disk file, and so on, and passes it into the filter graph. Transform filters obtain, process, and forward media data; they include splitter filters that separate video from audio, video decompression (video transform) filters, and audio transform filters. Renderer filters present media data on hardware such as the graphics card or sound card, or deliver it anywhere media data can be accepted, such as a disk file; they include the video renderer filter, which displays the image, and the audio renderer filter, which sends audio data to the sound card. To complete a specific task, all the required filters in the graph must be connected, because the output of an upstream filter must become the input of the downstream filter. A filter has at least an input pin (Input Pin) or an output pin (Output Pin): data arrives at the input pin, and processed data is sent out through the output pin. Figure 3.2 shows a filter connection diagram:
Figure 3.2 Filter connection diagram
An application does not need to manipulate the individual filters in the filter graph, because at a higher level DirectShow provides a component called the Filter Graph Manager (FGM) that manages the connections between the filters. Streaming media data flows between the filters, and the FGM exposes a set of COM interfaces through which an application can access the filter graph, control the media stream, and receive filter events. When required, the FGM can automatically insert a suitable decoder and connect the output pin of a transform filter to the renderer filter. An application controls the activity of the filter graph by communicating with the Filter Graph Manager; the developer only needs to call API methods to control the stream, for example the Run method to start streaming media playback in the filter graph, the Pause method to pause the current playback, and the Stop method to stop playback. In addition, the Filter Graph Manager can pass event notifications up to the application layer, allowing the application to respond to events such as the completion of playback or of a seek, the end of the stream, and so on. Figure 3.3 shows an example of MPEG decoding and playback: the source filter sends the acquired multimedia data to the MPEG splitter transform filter, which has one input pin and two output pins; the video and audio streams are decoded by their respective decoders, and the resulting data is finally sent through the video renderer filter and the audio renderer filter to the graphics card and the sound card.

Figure 3.3 MPEG decoding example