Recompressing an AVI File

zhaozj · 2021-02-08

This article explains how to recompress an AVI file with DirectShow. We concentrate on video compression; the same method applies to audio compression. The work divides into the following steps.

1. Select a compression filter

There are several kinds of video and audio compressors:

a. native DirectShow filters
b. Video Compression Manager (VCM) codecs
c. Audio Compression Manager (ACM) codecs
d. DirectX Media Objects (DMOs)

In DirectShow, VCM codecs are wrapped by the AVI Compressor filter, ACM codecs are wrapped by the ACM Wrapper filter, and DMOs are wrapped by the DMO Wrapper filter. The System Device Enumerator provides a uniform way to enumerate and create all of these compressors, so we do not need to deal with the underlying details. The device-enumeration technique was described earlier, so here we only give the code:

void OnInitDialog(HWND hDlg)
{
    HRESULT hr;
    ICreateDevEnum *pSysDevEnum = NULL;
    IEnumMoniker *pEnum = NULL;
    IMoniker *pMoniker = NULL;

    hr = CoCreateInstance(CLSID_SystemDeviceEnum, NULL, CLSCTX_INPROC_SERVER,
        IID_ICreateDevEnum, (void**)&pSysDevEnum);
    hr = pSysDevEnum->CreateClassEnumerator(CLSID_VideoCompressorCategory, &pEnum, 0);
    while (S_OK == pEnum->Next(1, &pMoniker, NULL))
    {
        IPropertyBag *pPropBag = NULL;
        pMoniker->BindToStorage(0, 0, IID_IPropertyBag, (void**)&pPropBag);
        VARIANT var;
        VariantInit(&var);
        hr = pPropBag->Read(L"FriendlyName", &var, 0);
        if (SUCCEEDED(hr))
        {
            // AddString: list-box helper used by the original sample to add the
            // friendly name to the codec list.
            LRESULT iSel = AddString(GetDlgItem(hDlg, IDC_CODEC_LIST), var.bstrVal);
        }
        VariantClear(&var);
        pPropBag->Release();
        pMoniker->Release();
    }

    SendDlgItemMessage(hDlg, IDC_CODEC_LIST, LB_SETCURSEL, 0, 0);
    pSysDevEnum->Release();
    pEnum->Release();
}

To create an instance of a filter, call the IMoniker::BindToObject method. The method returns an IBaseFilter interface pointer:

IBaseFilter *pFilter = NULL;
hr = pMoniker->BindToObject(NULL, NULL, IID_IBaseFilter, (void**)&pFilter);
if (SUCCEEDED(hr))
{
    // Use the filter here.
    // Remember to release the IBaseFilter pointer when you are done.
}

2. Set the video compression properties

A video compression filter supports the IAMVideoCompression interface on its output pin. Use this interface to set compression properties such as the key-frame rate and the compression quality. First call IBaseFilter::EnumPins to find the filter's output pin, then query that pin for the interface. Not every filter supports the interface, and those that do may not support every compression property. To find out what the filter supports, call IAMVideoCompression::GetInfo. The method returns:

a. a set of capability flags
b. a description string and a version string
c. the default key-frame rate, quality, and so on

It is called as follows:

hr = pCompress->GetInfo(pszVersion, &cbVersion, pszDesc, &cbDesc,
        &lKeyFrame, &lPFrame, &dblQuality, &lCap);

The pszVersion and pszDesc parameters are wide-character buffers that receive the version and description strings. The cbVersion and cbDesc parameters receive the required buffer sizes, in bytes. The lKeyFrame, lPFrame, and dblQuality parameters receive the default key-frame rate, P-frames-per-key-frame rate, and quality. Quality is expressed as a floating-point value from 0.0 to 1.0. The lCap parameter receives a set of capability flags, defined by the CompressionCaps enumeration. Any of these parameters can be NULL. For example, to obtain just the buffer sizes, call the method the first time with the string parameters set to NULL, use the returned cbVersion and cbDesc values to allocate the buffers, and then call the method again:

int cbVersion, cbDesc;   // sizes are in bytes, not characters
hr = pCompress->GetInfo(0, &cbVersion, 0, &cbDesc, 0, 0, 0, 0);
if (SUCCEEDED(hr))
{
    WCHAR *pszVersion = new WCHAR[cbVersion / 2];
    WCHAR *pszDesc = new WCHAR[cbDesc / 2];
    hr = pCompress->GetInfo(pszVersion, 0, pszDesc, 0, 0, 0, 0, 0);
}

The lCap flags indicate which IAMVideoCompression methods the filter supports. For example, if lCap contains the CompressionCaps_CanKeyFrame flag, you can call IAMVideoCompression::get_KeyFrameRate to get the key-frame rate and IAMVideoCompression::put_KeyFrameRate to set it. If lCap does not contain the flag, only the default value can be used.

if (lCap & CompressionCaps_CanKeyFrame)
{
    hr = pCompress->get_KeyFrameRate(&lKeyFrame);
    if (FAILED(hr) || lKeyFrame < 0)
    {
        lKeyFrame = lDefaultKeyFrame;   // default value from GetInfo
    }
}

The following code searches the compressor's output pins for the IAMVideoCompression interface. If it finds the interface, it retrieves the default and actual values of the compression properties.

HRESULT hr = E_FAIL;
IEnumPins *pEnum = NULL;
IPin *pPin = NULL;
IAMVideoCompression *pCompress = NULL;

// Find the pin that supports IAMVideoCompression.
pFilter->EnumPins(&pEnum);
while (S_OK == pEnum->Next(1, &pPin, NULL))
{
    hr = pPin->QueryInterface(IID_IAMVideoCompression, (void**)&pCompress);
    pPin->Release();
    if (SUCCEEDED(hr))   // Found the interface.
    {
        break;
    }
}
if (SUCCEEDED(hr))
{
    long   lCap;                       // capability flags
    long   lKeyFrame, lPFrame;         // actual values
    double Quality;
    long   lKeyFrameDef, lPFrameDef;   // default values
    double QualityDef;

    // Get the default values and the capability flags.
    hr = pCompress->GetInfo(0, 0, 0, 0, &lKeyFrameDef, &lPFrameDef, &QualityDef, &lCap);
    if (SUCCEEDED(hr))
    {
        // Get the actual values where the filter supports them.
        if (lCap & CompressionCaps_CanKeyFrame)
        {
            hr = pCompress->get_KeyFrameRate(&lKeyFrame);
            if (FAILED(hr) || lKeyFrame < 0)
                lKeyFrame = lKeyFrameDef;
        }
        if (lCap & CompressionCaps_CanBFrame)
        {
            hr = pCompress->get_PFramesPerKeyFrame(&lPFrame);
            if (FAILED(hr) || lPFrame < 0)
                lPFrame = lPFrameDef;
        }
        if (lCap & CompressionCaps_CanQuality)
        {
            hr = pCompress->get_Quality(&Quality);
            if (FAILED(hr) || Quality < 0)
                Quality = QualityDef;
        }
    }
}

If you used the ICaptureGraphBuilder2 interface to build your filter graph (see step 3 below), you can call the ICaptureGraphBuilder2::FindInterface method to obtain the IAMVideoCompression interface instead of enumerating the pins yourself.
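The article does not show that shortcut, so here is a minimal sketch of it. It assumes pBuild is the ICaptureGraphBuilder2 pointer created in step 3 and pVComp is the compression filter after it has been added to the graph; the variable names and the key-frame and quality values are illustrative only. The put_ calls only make sense when the corresponding capability flag returned by GetInfo is set.

// Sketch only: obtain IAMVideoCompression through the Capture Graph Builder,
// then set the properties that the capability flags say are supported.
IAMVideoCompression *pCompress = NULL;
HRESULT hr = pBuild->FindInterface(
    NULL,                     // any pin category
    &MEDIATYPE_Video,         // media type
    pVComp,                   // the video compression filter
    IID_IAMVideoCompression,
    (void**)&pCompress);
if (SUCCEEDED(hr))
{
    long lKeyFrameDef, lPFrameDef, lCap;
    double QualityDef;
    hr = pCompress->GetInfo(0, 0, 0, 0, &lKeyFrameDef, &lPFrameDef, &QualityDef, &lCap);
    if (SUCCEEDED(hr))
    {
        if (lCap & CompressionCaps_CanKeyFrame)
            pCompress->put_KeyFrameRate(15);   // e.g. one key frame every 15 frames
        if (lCap & CompressionCaps_CanQuality)
            pCompress->put_Quality(0.75);      // quality in the range 0.0 - 1.0
    }
    pCompress->Release();
}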

3. Build the recompression graph

A typical AVI recompression filter graph is arranged as follows. The AVI Splitter filter pulls data from the file source (the Async File Source filter) and splits it into a video stream and an audio stream. The video decompression filter decodes the compressed video, and the video compressor recompresses it. The compressed video then goes to the AVI Mux filter. The audio stream is not compressed in this example, so it travels directly from the AVI Splitter to the AVI Mux. The AVI Mux interleaves the two streams, and the File Writer filter writes the data to disk. Note that the AVI Mux filter is needed even if the original file has no audio stream.

The easiest way to build this graph is with the Capture Graph Builder, a DirectShow component for building capture graphs and other custom graphs. Note: DirectShow has two versions of the Capture Graph Builder, with different class identifiers and interfaces. The early version has the class identifier CLSID_CaptureGraphBuilder and the interface ICaptureGraphBuilder; it is kept for compatibility with existing applications. The new version has the class identifier CLSID_CaptureGraphBuilder2 and the interface ICaptureGraphBuilder2, which is more flexible than the old one. Create the Capture Graph Builder with CoCreateInstance:

ICaptureGraphBuilder2 *pBuild = NULL;
hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC_SERVER,
    IID_ICaptureGraphBuilder2, (void**)&pBuild);

Then use the Capture Graph Builder to build the filter graph:

a. Build the rendering section of the graph, which contains the AVI Mux filter and the File Writer filter.
b. Add the source filter and the compression filter.
c. Connect the source filter to the Mux filter.

The following explains each step in detail.

Build the rendering section. To build the rendering section of the graph, call the ICaptureGraphBuilder2::SetOutputFileName method. It returns pointers to the Mux filter and the File Writer filter. The Mux pointer is needed to build the rest of the graph, but this example does not need the File Writer, so that parameter is NULL.

IBaseFilter *pMux = NULL;
pBuild->SetOutputFileName(
    &MEDIASUBTYPE_Avi,   // file type
    wszOutputFile,       // file name
    &pMux,               // receives a pointer to the multiplexer
    NULL);               // receives a pointer to the File Writer (not needed here)

When the method returns, the Mux filter has an outstanding reference count, so be sure to release it later. The Mux filter exposes two interfaces for controlling the AVI format:

IConfigInterleaving: sets the interleaving mode.
IConfigAviMux: sets the master stream and the AVI compatibility index.
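The article mentions these two interfaces but does not show them in use. As a rough sketch only, querying the Mux for them might look like the following; the master-stream index and interleaving mode are illustrative assumptions, not values from the original.

// Sketch: optional AVI Mux configuration. Assumes pMux was returned by
// SetOutputFileName as shown above; the chosen values are examples only.
IConfigAviMux *pConfigMux = NULL;
hr = pMux->QueryInterface(IID_IConfigAviMux, (void**)&pConfigMux);
if (SUCCEEDED(hr))
{
    pConfigMux->SetMasterStream(1);   // e.g. treat stream 1 (audio) as the master stream
    pConfigMux->Release();
}

IConfigInterleaving *pInterleave = NULL;
hr = pMux->QueryInterface(IID_IConfigInterleaving, (void**)&pInterleave);
if (SUCCEEDED(hr))
{
    pInterleave->put_Mode(INTERLEAVE_FULL);   // full interleaving when authoring a file
    pInterleave->Release();
}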

Add the source filter and the compression filter. Next we add the source filter and the compression filter to the graph. When you call SetOutputFileName, the Capture Graph Builder automatically creates an instance of the Filter Graph Manager. Call the ICaptureGraphBuilder2::GetFiltergraph method to get a pointer to it:

IGraphBuilder *pGraph = NULL;
pBuild->GetFiltergraph(&pGraph);

Now call IGraphBuilder::AddSourceFilter to add the Async File Source filter, and IFilterGraph::AddFilter to add the video compression filter (pVComp is the compression filter selected in step 1):

IBaseFilter *pSrc = NULL;
pGraph->AddSourceFilter(wszInputFile, L"Source Filter", &pSrc);
pGraph->AddFilter(pVComp, L"Compressor");

At this point the source filter and the compression filter are in the graph but are not yet connected to any other filters.

Connect to the Mux. The last step is to connect the source filter to the AVI Mux filter through the video compression filter. We use the ICaptureGraphBuilder2::RenderStream method, which connects an output pin of the source filter to the specified filters. The first two parameters select which pin of the source filter to connect, by pin category and media type. The Async File Source filter has only one output pin, so both parameters are NULL. The last three parameters specify the source filter, the compression filter, and the Mux filter. The following call routes the video stream through the video compression filter:

pBuild->RenderStream(
    NULL,      // pin category
    NULL,      // media type
    pSrc,      // source filter
    pVComp,    // compression filter
    pMux);     // mux filter

Assuming the source file contains an audio stream, the AVI Splitter filter will expose an audio output pin. To connect this pin, call RenderStream again:

pBuild->RenderStream(NULL, NULL, pSrc, NULL, pMux);

This time we do not specify a compression filter. Because the source filter's output pin is already connected, RenderStream searches the AVI Splitter for an unconnected output pin and connects it directly to the Mux filter. If the source file has no audio stream, this second call simply fails.

4. Write the file

To write the file, call the IMediaControl::Run method to run the filter graph, wait until writing completes, and then call IMediaControl::Stop. To display the progress, query the Mux filter for the IMediaSeeking interface. Call IMediaSeeking::GetDuration to get the duration of the file, and periodically call IMediaSeeking::GetCurrentPosition to get the current position while the graph is running. Note that normally you query the Filter Graph Manager for IMediaSeeking; writing a file is a special case, and here we query the Mux filter instead.

Seeking through the graph as a whole applies to playback, not to writing a file, which is why the Mux is queried here. The following code makes this clearer.

IMediaSeeking *pSeek = NULL;
IMediaEventEx *pEvent = NULL;
IMediaControl *pControl = NULL;
REFERENCE_TIME rtTotal;

hr = pMux->QueryInterface(IID_IMediaSeeking, (void**)&pSeek);
hr = pGraph->QueryInterface(IID_IMediaEventEx, (void**)&pEvent);
hr = pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl);

// Set up DirectShow event notification.
hr = pEvent->SetNotifyWindow((OAHWND)hwnd, WM_GRAPHNOTIFY, 0);

hr = pSeek->GetDuration(&rtTotal);
SendDlgItemMessage(hwnd, IDC_PROGRESS1, PBM_SETRANGE, 0,
    MAKELPARAM(0, rtTotal / 10000000));   // REFERENCE_TIME is in 100-ns units

// Start the timer.
UINT_PTR res = SetTimer(hwnd, nIDEvent, 100, NULL);

// Run the graph.
pControl->Run();

When the application receives a timer event, it updates the current position:

void OnTimer(HWND hDlg, IMediaSeeking *pSeek)
{
    REFERENCE_TIME rtNow;
    HRESULT hr = pSeek->GetCurrentPosition(&rtNow);
    if (SUCCEEDED(hr))
    {
        SendDlgItemMessage(hDlg, IDC_PROGRESS1, PBM_SETPOS, (WPARAM)(rtNow / 10000000), 0);
    }
}

When the application receives the DirectShow completion event, it can stop the graph, as follows:

LRESULT CALLBACK WndProc(HWND hDlg, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    /* ... */
    case WM_GRAPHNOTIFY:
        DoHandleEvent();
        break;
    /* ... */
    }
}

void DoHandleEvent()
{
    long evCode, param1, param2;
    bool bComplete = false;
    if (!pEvent) return;

    // Retrieve and process all pending events.
    while (SUCCEEDED(pEvent->GetEvent(&evCode, &param1, &param2, 0)))
    {
        pEvent->FreeEventParams(evCode, param1, param2);
        switch (evCode)
        {
        case EC_USERABORT:
        case EC_ERRORABORT:
        case EC_COMPLETE:
            bComplete = true;
            break;
        }
    }
    if (bComplete)
    {
        pControl->Stop();   // Important! You must stop the graph!

        // Turn off event notification.
        pEvent->SetNotifyWindow(NULL, 0, 0);
        pEvent->Release();
        pEvent = NULL;

        SendDlgItemMessage(hwnd, IDC_PROGRESS1, PBM_SETPOS, 0, 0);
        KillTimer(hwnd, nIDEvent);
    }
}
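The article ends here, but the interfaces acquired above still hold references once writing has finished. A minimal cleanup sketch follows; it assumes the pointer names used in the snippets above (pSeek, pControl, pMux, pVComp, pSrc, pGraph, pBuild) are still reachable, that COM was initialized with CoInitialize at startup, and that the SAFE_RELEASE macro is our own helper rather than part of DirectShow.

// Sketch: release everything the recompression code acquired.
#define SAFE_RELEASE(p) { if (p) { (p)->Release(); (p) = NULL; } }

void CleanupRecompression()
{
    SAFE_RELEASE(pSeek);     // IMediaSeeking on the AVI Mux
    SAFE_RELEASE(pControl);  // IMediaControl on the graph
    SAFE_RELEASE(pMux);      // AVI Mux returned by SetOutputFileName
    SAFE_RELEASE(pVComp);    // video compression filter
    SAFE_RELEASE(pSrc);      // Async File Source filter
    SAFE_RELEASE(pGraph);    // Filter Graph Manager
    SAFE_RELEASE(pBuild);    // Capture Graph Builder
    // pEvent was already released in DoHandleEvent above.
    CoUninitialize();        // matches the CoInitialize call made at startup
}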

When reposting, please credit the original source: https://www.9cbs.com/read-1014.html
