I am building a video playback application using Windows Media Foundation.
I have created a custom EVR mixer using the IMFTransform interface, along with the other mandatory interfaces mentioned at the link below.
https://msdn.microsoft.com/en-us/library/windows/desktop/ms701624(v=vs.85).aspx
// Create the video renderer.
hr = MFCreateVideoRendererActivate(hVideoWindow, &pActivate);

// Add the custom mixer.
hr = pActivate->SetGUID(MF_ACTIVATE_CUSTOM_VIDEO_MIXER_CLSID, CLSID_CMyCustomMixerMFT);
The EVR calls the required methods on my custom mixer, but in the end I get the error MF_E_CANNOT_CREATE_SINK.
By custom mixer I mean an MFT implementation of the mixer; I am following the MFT_Grayscale sample application from the Windows Media Foundation samples. Most of the IMFTransform implementation is copied from that sample.
https://msdn.microsoft.com/en-us/library/windows/desktop/bb970487%28v=vs.85%29.aspx?f=255&MSPPError=-2147217396
I have been stuck on this error for a long time and cannot find a way to resolve it.
**EDIT**
According to the documentation for STDMETHODIMP GetDeviceID(IID *pDeviceID):
If a mixer or presenter uses Direct3D 9, it must return the value IID_IDirect3DDevice9 in pDeviceID. The EVR's default mixer and presenter both return this value. If you write a custom mixer or presenter, it can return some other value. However, the mixer and presenter must use matching device identifiers.
The custom mixer should return a value that matches the presenter's. In my custom mixer implementation I return IID_IDirect3DDevice9 as the device ID.
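To illustrate, the implementation looks roughly like this (a minimal sketch only; CMyCustomMixerMFT is the placeholder class name from the activation snippet above):

// Minimal sketch of IMFVideoDeviceID::GetDeviceID for a Direct3D 9-based mixer.
// The default EVR presenter also reports IID_IDirect3DDevice9, so the two match.
STDMETHODIMP CMyCustomMixerMFT::GetDeviceID(IID* pDeviceID)
{
    if (pDeviceID == NULL)
        return E_POINTER;

    *pDeviceID = IID_IDirect3DDevice9;
    return S_OK;
}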
Update
There is only one audio stream and one video stream in the file I am playing.
GetStreamLimits – the input and output stream limits are both set to 1
GetStreamIDs – input ID 0 and output ID 0
AddInputStreams – I never call this method in my mixer
(A minimal sketch of these methods is shown below.)
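Roughly, those settings correspond to something like this (a minimal sketch only, assuming the single-video-stream case described above; CMyCustomMixerMFT is again a placeholder class name):

// Single input, single output: a minimal sketch matching the settings above.
STDMETHODIMP CMyCustomMixerMFT::GetStreamLimits(DWORD* pdwInputMin, DWORD* pdwInputMax,
                                                DWORD* pdwOutputMin, DWORD* pdwOutputMax)
{
    if (!pdwInputMin || !pdwInputMax || !pdwOutputMin || !pdwOutputMax)
        return E_POINTER;

    *pdwInputMin = 1;
    *pdwInputMax = 1;
    *pdwOutputMin = 1;
    *pdwOutputMax = 1;
    return S_OK;
}

STDMETHODIMP CMyCustomMixerMFT::GetStreamIDs(DWORD dwInputIDArraySize, DWORD* pdwInputIDs,
                                             DWORD dwOutputIDArraySize, DWORD* pdwOutputIDs)
{
    if (!pdwInputIDs || !pdwOutputIDs)
        return E_POINTER;

    if (dwInputIDArraySize < 1 || dwOutputIDArraySize < 1)
        return MF_E_BUFFERTOOSMALL;

    pdwInputIDs[0] = 0;   // the single input stream has ID 0
    pdwOutputIDs[0] = 0;  // the single output stream has ID 0
    return S_OK;
}

// AddInputStreams is never called, since the mixer only ever has one input stream.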
It was suggested that I use MFTrace for debugging.
Here is a possible way to implement a custom video mixer that works on Windows 7 with the Microsoft MediaSession/EVR.
I found that DXVA2 can handle two NV12 stream formats. Of course we cannot blend with an alpha stream, but it works. My graphics driver reports that DXVA2 substreams can only handle AYUV/AI44, yet NV12 works (strange). NV12 has no alpha, but if we do not overlay, we can display two videos (maybe more). I also found that CLSID_CColorConvertDMO fails to provide AYUV with the MediaSession/EVR and a custom video mixer. The colour conversion could be done inside the custom video mixer instead.
I will post this in several parts, so please be patient; it is hard to format code here. For some parts of the code you will need the common files from MFNode.
Some interfaces just return E_NOTIMPL; they are only there to check what the EVR needs. So I have omitted the code that only returns E_NOTIMPL.
The custom video mixer class:
//---------------------------------------------------------------------------------------------- // CustomVideomixer.h //---------------------------------------------------------------------------------------------- #ifndef MFTCUSTOMVIDEOmixer_H #define MFTCUSTOMVIDEOmixer_H class CCustomVideomixer : BaSEObject,public IMFVideodeviceid,public IMfgetService,public IMFTopologyServiceLookupClient,public IMFTransform,public IMFVideomixerControl,public IMFVideoProcessor,public IMFAttributes,public IMFVideomixerBitmap,public IMFVideoPositionMapper { public: // CustomVideomixer.cpp static HRESULT CreateInstance(IUnkNown*,REFIID,void**); // IUnkNown - CustomVideomixer.cpp STDMETHODIMP QueryInterface(REFIID,void**); STDMETHODIMP_(ULONG) AddRef(); STDMETHODIMP_(ULONG) Release(); // IMFVideodeviceid - CustomVideomixer.cpp STDMETHODIMP Getdeviceid(IID*); // IMfgetService - CustomVideomixer.cpp STDMETHODIMP GetService(REFGUID,LPVOID*); // IMFTopologyServiceLookupClient - CustomVideomixer.cpp STDMETHODIMP InitServicePointers(IMFTopologyServiceLookuP*); STDMETHODIMP ReleaseServicePointers(); // IMFTransform - CustomVideomixer_Transform.cpp STDMETHODIMP GetStreamLimits(DWORD*,DWORD*,DWORD*); STDMETHODIMP GetStreamCount(DWORD*,DWORD*); STDMETHODIMP GetStreamIDs(DWORD,DWORD,DWORD*); STDMETHODIMP GetInputStreamInfo(DWORD,MFT_INPUT_STREAM_INFO*); STDMETHODIMP GetoutputStreamInfo(DWORD,MFT_OUTPUT_STREAM_INFO*); STDMETHODIMP GetAttributes(IMFAttributes**); STDMETHODIMP GetInputStreamAttributes(DWORD,IMFAttributes**); STDMETHODIMP GetoutputStreamAttributes(DWORD,IMFAttributes**); STDMETHODIMP DeleteInputStream(DWORD); STDMETHODIMP AddInputStreams(DWORD,DWORD*); STDMETHODIMP GetInputAvailableType(DWORD,IMFMediaType**); STDMETHODIMP GetoutputAvailableType(DWORD,IMFMediaType**); STDMETHODIMP SetInputType(DWORD,IMFMediaType*,DWORD); STDMETHODIMP SetoutputType(DWORD,DWORD); STDMETHODIMP GetInputCurrentType(DWORD,IMFMediaType**); STDMETHODIMP GetoutputCurrentType(DWORD,IMFMediaType**); STDMETHODIMP GetInputStatus(DWORD,DWORD*); STDMETHODIMP GetoutputStatus(DWORD*); STDMETHODIMP SetoutputBounds(LONGLONG,LONGLONG); STDMETHODIMP ProcessEvent(DWORD,IMFMediaEvent*); STDMETHODIMP ProcessMessage(MFT_MESSAGE_TYPE,ULONG_PTR); STDMETHODIMP ProcessInput(DWORD,IMFSample*,DWORD); STDMETHODIMP ProcessOutput(DWORD,MFT_OUTPUT_DATA_BUFFER*,DWORD*); // IMFVideomixerControl - CustomVideomixer_mixer.cpp STDMETHODIMP GetStreamOutputRect(DWORD,MFVideonormalizedRect*); STDMETHODIMP GetStreamZOrder(DWORD,DWORD*); STDMETHODIMP SetStreamOutputRect(DWORD,const MFVideonormalizedRect*); STDMETHODIMP SetStreamZOrder(DWORD,DWORD); // IMFVideoProcessor - CustomVideomixer_mixer.cpp STDMETHODIMP GetAvailableVideoProcessorModes(UINT*,GUID**); STDMETHODIMP GetBackgroundColor(COLORREF*); STDMETHODIMP GetFilteringRange(DWORD,DXVA2_ValueRange*); STDMETHODIMP GetFilteringValue(DWORD,DXVA2_Fixed32*); STDMETHODIMP GetProcAmpRange(DWORD,DXVA2_ValueRange*); STDMETHODIMP GetProcAmpValues(DWORD,DXVA2_ProcAmpValues*); STDMETHODIMP GetVideoProcessorCaps(LPGUID,DXVA2_VideoProcessorCaps*); STDMETHODIMP GetVideoProcessorMode(LPGUID); STDMETHODIMP SetBackgroundColor(COLORREF); STDMETHODIMP SetFilteringValue(DWORD,DXVA2_Fixed32*); STDMETHODIMP SetProcAmpValues(DWORD,DXVA2_ProcAmpValues*); STDMETHODIMP SetVideoProcessorMode(LPGUID); // IMFAttributes - CustomVideomixer_Attributes.cpp STDMETHODIMP Compare(IMFAttributes*,MF_ATTRIBUTES_MATCH_TYPE,BOOL*); STDMETHODIMP CompareItem(REFGUID,REFPROPVARIANT,BOOL*); STDMETHODIMP copyAllItems(IMFAttributes*); STDMETHODIMP 
DeleteallItems(); STDMETHODIMP DeleteItem(REFGUID); STDMETHODIMP GetAllocatedBlob(REFGUID,UINT8**,UINT32*); STDMETHODIMP GetAllocatedString(REFGUID,LPWSTR*,UINT32*); STDMETHODIMP GetBlob(REFGUID,UINT8*,UINT32,UINT32*); STDMETHODIMP GetBlobSize(REFGUID,UINT32*); STDMETHODIMP GetCount(UINT32*); STDMETHODIMP GetDouble(REFGUID,double*); STDMETHODIMP GetGUID(REFGUID,GUID*); STDMETHODIMP GetItem(REFGUID,PROPVARIANT*); STDMETHODIMP GetItemByIndex(UINT32,GUID*,PROPVARIANT*); STDMETHODIMP GetItemType(REFGUID,MF_ATTRIBUTE_TYPE*); STDMETHODIMP GetString(REFGUID,LPWSTR,UINT32*); STDMETHODIMP GetStringLength(REFGUID,UINT32*); STDMETHODIMP GetUINT32(REFGUID,UINT32*); STDMETHODIMP GetUINT64(REFGUID,UINT64*); STDMETHODIMP GetUnkNown(REFGUID,LPVOID*); STDMETHODIMP LockStore(); STDMETHODIMP SetBlob(REFGUID,const UINT8*,UINT32); STDMETHODIMP SetDouble(REFGUID,double); STDMETHODIMP SetGUID(REFGUID,REFGUID); STDMETHODIMP SetItem(REFGUID,REFPROPVARIANT); STDMETHODIMP SetString(REFGUID,LPCWSTR); STDMETHODIMP SetUINT32(REFGUID,UINT32); STDMETHODIMP SetUINT64(REFGUID,UINT64); STDMETHODIMP SetUnkNown(REFGUID,IUnkNown*); STDMETHODIMP UnlockStore(); // IMFVideomixerBitmap - CustomVideomixer_Bitmap.cpp STDMETHODIMP ClearalphaBitmap(); STDMETHODIMP GetAlphaBitmapParameters(MFVideoAlphaBitmapParams*); STDMETHODIMP SetAlphaBitmap(const MFVideoAlphaBitmaP*); STDMETHODIMP UpdatealphaBitmapParameters(const MFVideoAlphaBitmapParams*); // IMFVideoPositionMapper - CustomVideomixer_Bitmap.cpp STDMETHODIMP MapOutputCoordinatetoInputStream(float,float,float*,float*); private: // CustomVideomixer.cpp CCustomVideomixer(); virtual ~CCustomVideomixer(); CriticSection m_CriticSection; volatile long m_nRefCount; CDxva2Manager m_cDxva2Manager; IMediaEventSink* m_pMediaEventSink; IMFMediaType* m_pRefInputType; IMFMediaType* m_pSubInputType; IMFMediaType* m_pOutputType; BOOL m_bDraining; DWORD m_dwInputStreamCount; BOOL m_bHaveRefOuput; BOOL m_bHaveSubOuput; // CustomVideomixer.cpp HRESULT SetD3DManager(IDirect3DDeviceManager9*); HRESULT BeginStreaming(ULONG_PTR); HRESULT Flush(); // CustomVideomixer_Type.cpp HRESULT GetoutputType(IMFMediaType**); }; #endif
CustomVideomixer.cpp:
//---------------------------------------------------------------------------------------------- // CustomVideomixer.cpp //---------------------------------------------------------------------------------------------- #include "StdAfx.h" CCustomVideomixer::CCustomVideomixer() : m_nRefCount(1),m_pMediaEventSink(NULL),m_pRefInputType(NULL),m_pSubInputType(NULL),m_pOutputType(NULL),m_bDraining(FALSE),m_dwInputStreamCount(1),m_bHaveRefOuput(FALSE),m_bHaveSubOuput(FALSE) { TRACE_TRANSFORM((L"CustomVideomixer::CTOR")); } CCustomVideomixer::~CCustomVideomixer() { TRACE_TRANSFORM((L"CustomVideomixer::DTOR")); AutoLock lock(m_CriticSection); Flush(); m_cDxva2Manager.ReleaseDxva2(); SAFE_RELEASE(m_pMediaEventSink); SAFE_RELEASE(m_pRefInputType); SAFE_RELEASE(m_pSubInputType); SAFE_RELEASE(m_pOutputType); } HRESULT CCustomVideomixer::CreateInstance(IUnkNown* pUnkOuter,REFIID iid,void** ppv) { TRACE_TRANSFORM((L"CustomVideomixer::CreateInstance")); HRESULT hr; IF_Failed_RETURN(hr = (ppv == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pUnkOuter != NULL ? CLASS_E_NOAGGREGATION : S_OK)); CCustomVideomixer* pMFT = new (std::nothrow)CCustomVideomixer; IF_Failed_RETURN(pMFT == NULL ? E_OUTOFMEMORY : S_OK); LOG_HRESULT(hr = pMFT->QueryInterface(iid,ppv)); SAFE_RELEASE(pMFT); return hr; } ULONG CCustomVideomixer::AddRef() { LONG lRef = InterlockedIncrement(&m_nRefCount); TRACE_REFCOUNT((L"CustomVideomixer::AddRef m_nRefCount = %d",lRef)); return lRef; } ULONG CCustomVideomixer::Release() { ULONG uCount = InterlockedDecrement(&m_nRefCount); TRACE_REFCOUNT((L"CustomVideomixer::Release m_nRefCount = %d",uCount)); if (uCount == 0) { delete this; } return uCount; } HRESULT CCustomVideomixer::QueryInterface(REFIID riid,void** ppv) { TRACE_TRANSFORM((L"CustomVideomixer::QI : riid = %s",GetIIDString(riid))); // IMFQualityAdvise // IEVRTrustedVideoPlugin static const QITAB qit[] = { QITABENT(CCustomVideomixer,IMFVideodeviceid),QITABENT(CCustomVideomixer,IMfgetService),IMFTopologyServiceLookupClient),IMFTransform),IMFVideomixerControl),IMFVideoProcessor),IMFAttributes),IMFVideomixerBitmap),IMFVideoPositionMapper),{ 0 } }; return QISearch(this,qit,riid,ppv); } HRESULT CCustomVideomixer::Getdeviceid(IID* pdeviceid) { TRACE_TRANSFORM((L"CustomVideomixer::Getdeviceid")); HRESULT hr; IF_Failed_RETURN(hr = (pdeviceid == NULL ? E_POINTER : S_OK)); *pdeviceid = IID_IDirect3DDevice9; return hr; } HRESULT CCustomVideomixer::GetService(REFGUID guidService,REFIID riid,LPVOID* ppvObject) { TRACE_TRANSFORM((L"CustomVideomixer::GetService : guidService = %s - riid = %s",MFServiceString(guidService),GetIIDString(riid))); HRESULT hr; IF_Failed_RETURN(hr = (ppvObject == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (guidService != MR_VIDEO_mixer_SERVICE ? MF_E_UNSUPPORTED_SERVICE : S_OK)); if (riid == IID_IMFVideomixerControl || riid == IID_IMFVideoProcessor || riid == IID_IMFTransform) { hr = QueryInterface(riid,ppvObject); } else { LOG_HRESULT(hr = MF_E_UNSUPPORTED_SERVICE); } return hr; } HRESULT CCustomVideomixer::InitServicePointers(IMFTopologyServiceLookuP* pLookup) { TRACE_TRANSFORM((L"CustomVideomixer::InitServicePointers")); // https://msdn.microsoft.com/en-us/library/windows/desktop/dd319606(v=vs.85).aspx // https://msdn.microsoft.com/en-us/library/windows/desktop/dd406901(v=vs.85).aspx HRESULT hr; IF_Failed_RETURN(hr = (pLookup == NULL ? E_POINTER : S_OK)); AutoLock lock(m_CriticSection); //IF_Failed_RETURN(hr = (IsActive() ? 
MF_E_INVALIDREQUEST : S_OK)); SAFE_RELEASE(m_pMediaEventSink); DWORD dwObjectCount = 1; (void)pLookup->LookupService(MF_SERVICE_LOOKUP_GLOBAL,MR_VIDEO_RENDER_SERVICE,IID_PPV_ARGS(&m_pMediaEventSink),&dwObjectCount); IF_Failed_RETURN(hr = (m_pMediaEventSink == NULL ? E_POINTER : S_OK)); // IMFClock* pInterface = NULL; // (void)pLookup->LookupService(MF_SERVICE_LOOKUP_GLOBAL,IID_PPV_ARGS(&pInterface),&dwObjectCount); // SAFE_RELEASE(pInterface); // IMFVideoPresenter* pInterface = NULL; // (void)pLookup->LookupService(MF_SERVICE_LOOKUP_GLOBAL,&dwObjectCount); // IF_Failed_RETURN(hr = (pInterface == NULL ? E_POINTER : S_OK)); // SAFE_RELEASE(pInterface); // IMFVideoRenderer* pInterface2 = NULL; // (void)pLookup->LookupService(MF_SERVICE_LOOKUP_GLOBAL,IID_PPV_ARGS(&pInterface2),&dwObjectCount); // IF_Failed_RETURN(hr = (pInterface2 == NULL ? E_POINTER : S_OK)); // SAFE_RELEASE(pInterface2); return hr; } HRESULT CCustomVideomixer::ReleaseServicePointers() { TRACE_TRANSFORM((L"CustomVideomixer::ReleaseServicePointers")); AutoLock lock(m_CriticSection); SAFE_RELEASE(m_pMediaEventSink); return S_OK; } HRESULT CCustomVideomixer::SetD3DManager(IDirect3DDeviceManager9* pDeviceManager) { TRACE_TRANSFORM((L"CustomVideomixer::SetD3DManager")); HRESULT hr = S_OK; m_cDxva2Manager.ReleaseDxva2(); if (pDeviceManager != NULL) { if (m_pRefInputType != NULL && m_pOutputType != NULL) IF_Failed_RETURN(hr = m_cDxva2Manager.InitDxva2(pDeviceManager,m_pOutputType,m_pRefInputType,m_pSubInputType)); } return hr; } HRESULT CCustomVideomixer::BeginStreaming(ULONG_PTR ulParam) { TRACE_TRANSFORM((L"CustomVideomixer::BeginStreaming")); HRESULT hr; IF_Failed_RETURN(hr = (m_pMediaEventSink == NULL ? E_POINTER : S_OK)); //IF_Failed_RETURN(hr = m_pMediaEventSink->Notify(EC_SAMPLE_NEEDED,ulParam,0)); IF_Failed_RETURN(hr = m_pMediaEventSink->Notify(EC_SAMPLE_NEEDED,1,0)); // MF_E_INVALIDSTREAMNUMBER // MF_E_TRANSFORM_TYPE_NOT_SET return hr; } HRESULT CCustomVideomixer::Flush() { TRACE_TRANSFORM((L"CustomVideomixer::Flush")); m_bDraining = FALSE; m_bHaveRefOuput = FALSE; m_bHaveSubOuput = FALSE; return S_OK; }
CustomVideomixer_Transform.cpp:
//---------------------------------------------------------------------------------------------- // CustomVideomixer_Transform.cpp //---------------------------------------------------------------------------------------------- #include "StdAfx.h" HRESULT CCustomVideomixer::GetStreamLimits(DWORD* pdwInputMinimum,DWORD* pdwInputMaximum,DWORD* pdwOutputMinimum,DWORD* pdwOutputMaximum) { TRACE_TRANSFORM((L"CustomVideomixer::GetStreamLimits")); HRESULT hr; IF_Failed_RETURN(hr = ((pdwInputMinimum == NULL || pdwInputMaximum == NULL || pdwOutputMinimum == NULL || pdwOutputMaximum == NULL) ? E_POINTER : S_OK)); *pdwInputMinimum = 1; *pdwInputMaximum = 16; *pdwOutputMinimum = 1; *pdwOutputMaximum = 1; return hr; } HRESULT CCustomVideomixer::GetStreamCount(DWORD* pcInputStreams,DWORD* pcOutputStreams) { TRACE_TRANSFORM((L"CustomVideomixer::GetStreamCount")); HRESULT hr; IF_Failed_RETURN(hr = ((pcInputStreams == NULL || pcOutputStreams == NULL) ? E_POINTER : S_OK)); *pcInputStreams = m_dwInputStreamCount; *pcOutputStreams = 1; return hr; } HRESULT CCustomVideomixer::GetStreamIDs(DWORD dwInputIDArraySize,DWORD* pdwInputIDs,DWORD dwOutputIDArraySize,DWORD* pdwOutputIDs) { TRACE_TRANSFORM((L"CustomVideomixer::GetStreamIDs")); HRESULT hr; IF_Failed_RETURN(hr = (dwInputIDArraySize == 0 || dwOutputIDArraySize == 0 ? MF_E_BUFFERTOOSMALL : S_OK)); IF_Failed_RETURN(hr = (pdwInputIDs == NULL || pdwOutputIDs == NULL ? E_POINTER : S_OK)); *pdwOutputIDs = 0; if (m_dwInputStreamCount == 1) *pdwInputIDs = 0; else IF_Failed_RETURN(hr = E_FAIL); return hr; } HRESULT CCustomVideomixer::GetInputStreamInfo(DWORD dwInputStreamID,MFT_INPUT_STREAM_INFO* pStreamInfo) { TRACE_TRANSFORM((L"CustomVideomixer::GetInputStreamInfo")); TRACE_TRANSFORM((L"dwInputStreamID = %d",dwInputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (pStreamInfo == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwInputStreamID > 1 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); pStreamInfo->dwFlags = MFT_INPUT_STREAM_WHOLE_SAMPLES | MFT_INPUT_STREAM_SINGLE_SAMPLE_PER_BUFFER | MFT_INPUT_STREAM_FIXED_SAMPLE_SIZE | MFT_INPUT_STREAM_DOES_NOT_ADDREF; pStreamInfo->hnsMaxLatency = 0; pStreamInfo->cbSize = 0; pStreamInfo->cbMaxLookahead = 0; pStreamInfo->cbAlignment = 0; return hr; } HRESULT CCustomVideomixer::GetoutputStreamInfo(DWORD dwOutputStreamID,MFT_OUTPUT_STREAM_INFO* pStreamInfo) { TRACE_TRANSFORM((L"CustomVideomixer::GetoutputStreamInfo")); TRACE_TRANSFORM((L"dwOutputStreamID = %d",dwOutputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (pStreamInfo == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwOutputStreamID != 0 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); AutoLock lock(m_CriticSection); pStreamInfo->dwFlags = MFT_OUTPUT_STREAM_WHOLE_SAMPLES | MFT_OUTPUT_STREAM_SINGLE_SAMPLE_PER_BUFFER | MFT_OUTPUT_STREAM_FIXED_SAMPLE_SIZE | MFT_OUTPUT_STREAM_PROVIDES_SAMPLES; pStreamInfo->cbAlignment = 0; pStreamInfo->cbSize = 0; return hr; } HRESULT CCustomVideomixer::GetAttributes(IMFAttributes** ppAttributes) { TRACE_TRANSFORM((L"CustomVideomixer::GetAttributes")); HRESULT hr; IF_Failed_RETURN(hr = (ppAttributes == NULL ? E_POINTER : S_OK)); *ppAttributes = this; (*ppAttributes)->AddRef(); return hr; } HRESULT CCustomVideomixer::GetInputStreamAttributes(DWORD dwInputStreamID,IMFAttributes** ppAttributes) { TRACE_TRANSFORM((L"CustomVideomixer::GetInputStreamAttributes")); TRACE_TRANSFORM((L"dwInputStreamID = %d",dwInputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (dwInputStreamID > 1 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); IF_Failed_RETURN(hr = (ppAttributes == NULL ? 
E_POINTER : S_OK)); *ppAttributes = this; (*ppAttributes)->AddRef(); return hr; } HRESULT CCustomVideomixer::GetoutputStreamAttributes(DWORD dwOutputStreamID,IMFAttributes** ppAttributes) { TRACE_TRANSFORM((L"CustomVideomixer::GetoutputStreamAttributes")); TRACE_TRANSFORM((L"dwOutputStreamID = %d",dwOutputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (dwOutputStreamID != 0 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); IF_Failed_RETURN(hr = (ppAttributes == NULL ? E_POINTER : S_OK)); *ppAttributes = this; (*ppAttributes)->AddRef(); return hr; } HRESULT CCustomVideomixer::DeleteInputStream(DWORD dwStreamID) { TRACE_TRANSFORM((L"CustomVideomixer::DeleteInputStream")); TRACE_TRANSFORM((L"dwStreamID = %d",dwStreamID)); if (dwStreamID == 0) return MF_E_INVALIDREQUEST; else if (dwStreamID != 1) return MF_E_INVALIDSTREAMNUMBER; else if(m_dwInputStreamCount != 2) return MF_E_INVALIDREQUEST; //MF_E_TRANSFORM_INPUT_REMAINING m_dwInputStreamCount--; return S_OK; } HRESULT CCustomVideomixer::AddInputStreams(DWORD cStreams,DWORD* adwStreamIDs) { TRACE_TRANSFORM((L"CustomVideomixer::AddInputStreams")); HRESULT hr; IF_Failed_RETURN(hr = (cStreams != 1 ? E_INVALIDARG : S_OK)); IF_Failed_RETURN(hr = (adwStreamIDs == NULL ? E_INVALIDARG : S_OK)); IF_Failed_RETURN(hr = (*adwStreamIDs != 1 ? E_INVALIDARG : S_OK)); if (m_dwInputStreamCount == 1) m_dwInputStreamCount++; else IF_Failed_RETURN(hr = E_INVALIDARG); return S_OK; } HRESULT CCustomVideomixer::GetInputAvailableType(DWORD dwInputStreamID,DWORD dwTypeIndex,IMFMediaType** ppType) { TRACE_TRANSFORM((L"CustomVideomixer::GetInputAvailableType")); TRACE_TRANSFORM((L"dwInputStreamID = %d - dwTypeIndex = %d",dwInputStreamID,dwTypeIndex)); return MF_E_NO_MORE_TYPES; } HRESULT CCustomVideomixer::GetoutputAvailableType(DWORD dwOutputStreamID,IMFMediaType** ppType) { TRACE_TRANSFORM((L"CustomVideomixer::GetoutputAvailableType")); TRACE_TRANSFORM((L"dwOutputStreamID = %d - dwTypeIndex = %d",dwOutputStreamID,dwTypeIndex)); HRESULT hr; IF_Failed_RETURN(hr = (ppType == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwOutputStreamID != 0 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); IF_Failed_RETURN(hr = (dwTypeIndex != 0 ? MF_E_NO_MORE_TYPES : S_OK)); AutoLock lock(m_CriticSection); if (m_pRefInputType == NULL) { hr = MF_E_TRANSFORM_TYPE_NOT_SET; } else { hr = GetoutputType(ppType); } return hr; } HRESULT CCustomVideomixer::SetInputType(DWORD dwInputStreamID,IMFMediaType* pType,DWORD dwFlags) { TRACE_TRANSFORM((L"CustomVideomixer::SetInputType")); TRACE_TRANSFORM((L"dwInputStreamID = %d",dwInputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (dwInputStreamID > 1 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); IF_Failed_RETURN(hr = (dwFlags & ~MFT_SET_TYPE_TEST_ONLY ? E_INVALIDARG : S_OK)); BOOL bReallySet = ((dwFlags & MFT_SET_TYPE_TEST_ONLY) == 0); TRACE_TRANSFORM((L"bReallySet = %s",bReallySet ? L"TRUE" : L"FALSE")); AutoLock lock(m_CriticSection); if (pType) { LogMediaType(pType); } else { if (dwInputStreamID == 0) SAFE_RELEASE(m_pRefInputType); else SAFE_RELEASE(m_pSubInputType); return hr; } if (bReallySet) { if (dwInputStreamID == 0) { SAFE_RELEASE(m_pRefInputType); m_pRefInputType = pType; m_pRefInputType->AddRef(); } else { SAFE_RELEASE(m_pSubInputType); m_pSubInputType = pType; m_pSubInputType->AddRef(); } } return hr; } HRESULT CCustomVideomixer::SetoutputType(DWORD dwOutputStreamID,DWORD dwFlags) { TRACE_TRANSFORM((L"CustomVideomixer::SetoutputType")); TRACE_TRANSFORM((L"dwOutputStreamID = %d",dwOutputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (dwOutputStreamID != 0 ? 
MF_E_INVALIDSTREAMNUMBER : S_OK)); IF_Failed_RETURN(hr = (dwFlags & ~MFT_SET_TYPE_TEST_ONLY ? E_INVALIDARG : S_OK)); BOOL bReallySet = ((dwFlags & MFT_SET_TYPE_TEST_ONLY) == 0); TRACE_TRANSFORM((L"bReallySet = %s",bReallySet ? L"TRUE" : L"FALSE")); AutoLock lock(m_CriticSection); if (pType) { LogMediaType(pType); } else { SAFE_RELEASE(m_pOutputType); return hr; } if (bReallySet) { SAFE_RELEASE(m_pOutputType); m_pOutputType = pType; m_pOutputType->AddRef(); } return hr; } HRESULT CCustomVideomixer::GetInputCurrentType(DWORD dwInputStreamID,IMFMediaType** ppType) { TRACE_TRANSFORM((L"CustomVideomixer::GetInputCurrentType")); TRACE_TRANSFORM((L"dwInputStreamID = %d",dwInputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (ppType == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwInputStreamID > 1 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); AutoLock lock(m_CriticSection); IMFMediaType* m_pInputType = dwInputStreamID == 0 ? m_pRefInputType : m_pSubInputType; if (!m_pInputType) { hr = MF_E_TRANSFORM_TYPE_NOT_SET; } else { // Todo : clone MediaType *ppType = m_pInputType; (*ppType)->AddRef(); } return hr; } HRESULT CCustomVideomixer::GetoutputCurrentType(DWORD dwOutputStreamID,IMFMediaType** ppType) { TRACE_TRANSFORM((L"CustomVideomixer::GetoutputCurrentType")); TRACE_TRANSFORM((L"dwOutputStreamID = %d",dwOutputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (ppType == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwOutputStreamID != 0 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); AutoLock lock(m_CriticSection); if (!m_pOutputType) { hr = MF_E_TRANSFORM_TYPE_NOT_SET; } else { // Todo : clone MediaType *ppType = m_pOutputType; (*ppType)->AddRef(); } return hr; } HRESULT CCustomVideomixer::GetInputStatus(DWORD dwInputStreamID,DWORD* pdwFlags) { TRACE_TRANSFORM((L"CustomVideomixer::GetInputStatus")); TRACE_TRANSFORM((L"dwInputStreamID = %d",dwInputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (pdwFlags == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwInputStreamID > 1 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); AutoLock lock(m_CriticSection); // I think we can always process *pdwFlags = MFT_INPUT_STATUS_ACCEPT_DATA; return hr; } HRESULT CCustomVideomixer::GetoutputStatus(DWORD* pdwFlags) { TRACE_TRANSFORM((L"CustomVideomixer::GetoutputStatus")); HRESULT hr; IF_Failed_RETURN(hr = (pdwFlags == NULL ? E_POINTER : S_OK)); AutoLock lock(m_CriticSection); /*if (m_bHaveOuput) { *pdwFlags = MFT_OUTPUT_STATUS_SAMPLE_READY; } else { *pdwFlags = 0; }*/ return hr; } HRESULT CCustomVideomixer::SetoutputBounds(LONGLONG /*hnsLowerBound*/,LONGLONG /*hnsUpperBound*/) { TRACE_TRANSFORM((L"CustomVideomixer::SetoutputBounds")); return E_NOTIMPL; } HRESULT CCustomVideomixer::ProcessEvent(DWORD /*dwInputStreamID*/,IMFMediaEvent* /*pEvent */) { TRACE_TRANSFORM((L"CustomVideomixer::ProcessEvent")); return E_NOTIMPL; } HRESULT CCustomVideomixer::ProcessMessage(MFT_MESSAGE_TYPE eMessage,ULONG_PTR ulParam) { TRACE_TRANSFORM((L"CustomVideomixer::ProcessMessage : %s (Param = %d)",MFTMessageString(eMessage),ulParam)); HRESULT hr = S_OK; AutoLock lock(m_CriticSection); switch (eMessage) { case MFT_MESSAGE_NOTIFY_BEGIN_STREAMING: //case MFT_MESSAGE_NOTIFY_START_OF_STREAM: hr = BeginStreaming(ulParam); break; case MFT_MESSAGE_COMMAND_FLUSH: case MFT_MESSAGE_NOTIFY_END_STREAMING: case MFT_MESSAGE_NOTIFY_END_OF_STREAM: hr = Flush(); break; case MFT_MESSAGE_COMMAND_DRAIN: m_bDraining = TRUE; break; case MFT_MESSAGE_SET_D3D_MANAGER: hr = SetD3DManager(reinterpret_cast<IDirect3DDeviceManager9*>(ulParam)); // hr = MF_E_UNSUPPORTED_D3D_TYPE... 
break; } return hr; } HRESULT CCustomVideomixer::ProcessInput(DWORD dwInputStreamID,IMFSample* pSample,DWORD dwFlags) { TRACE_TRANSFORM((L"CustomVideomixer::ProcessInput")); TRACE_TRANSFORM((L"dwInputStreamID = %d",dwInputStreamID)); HRESULT hr; IF_Failed_RETURN(hr = (pSample == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (dwInputStreamID > 1 ? MF_E_INVALIDSTREAMNUMBER : S_OK)); IF_Failed_RETURN(hr = (dwFlags != 0 ? E_INVALIDARG : S_OK)); AutoLock lock(m_CriticSection); if (m_bHaveRefOuput || m_bHaveSubOuput) { return MF_E_NOTACCEPTING; } if (SUCCEEDED(hr = m_cDxva2Manager.ProcessInput(pSample,dwInputStreamID))) { if (dwInputStreamID == 0) { m_bHaveRefOuput = TRUE; LOG_HRESULT(hr = m_pMediaEventSink->Notify(EC_SAMPLE_NEEDED,0)); } else { m_bHaveSubOuput = TRUE; LOG_HRESULT(hr = m_pMediaEventSink->Notify(EC_SAMPLE_NEEDED,0)); } } return hr; } HRESULT CCustomVideomixer::ProcessOutput(DWORD dwFlags,DWORD cOutputBufferCount,MFT_OUTPUT_DATA_BUFFER* pOutputSamples,DWORD* pdwStatus) { TRACE_TRANSFORM((L"CustomVideomixer::ProcessOutput")); HRESULT hr; IF_Failed_RETURN(hr = (dwFlags != 0 ? E_INVALIDARG : S_OK)); IF_Failed_RETURN(hr = (cOutputBufferCount != 1 ? E_INVALIDARG : S_OK)); IF_Failed_RETURN(hr = ((pOutputSamples == NULL || pdwStatus == NULL) ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pOutputSamples[0].dwStreamID != 0 ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pOutputSamples[0].pSample == NULL ? E_INVALIDARG : S_OK)); AutoLock lock(m_CriticSection); if (m_bHaveRefOuput || m_bHaveSubOuput) { IF_Failed_RETURN(hr = m_cDxva2Manager.ProcessOutput(pOutputSamples[0].pSample)); if(m_bHaveRefOuput) m_bHaveRefOuput = FALSE; if (m_bHaveSubOuput) m_bHaveSubOuput = FALSE; } else { return MF_E_TRANSFORM_NEED_MORE_INPUT; } return hr; }
You are trying to inject custom code, in the form of a mixer, into an object written by Microsoft that implements the IMFMediaSink interface, and you got MF_E_CANNOT_CREATE_SINK, an error message that lumps together any failure of the media sink. There could be a hundred reasons for this error. For situations like this, Microsoft developed a special tool, MFTrace, which logs the calls made inside code written by Microsoft. Moreover, it is impossible to find the cause of the error because you have not shared your code: for example, what stream limits you set in GetStreamLimits, what IDs you return from GetStreamIDs, or how your code handles calls to AddInputStreams. IMFTransform alone has 23 methods.
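MFTrace ships with the Windows SDK; a typical way to wrap the player and capture a log looks roughly like this (the SDK path and file names here are examples only, adjust them to your installation):

:: Run the player under MFTrace and write the Media Foundation trace to a log file
:: (paths and names are examples only)
"C:\Program Files (x86)\Windows Kits\10\bin\x64\mftrace.exe" -o evr_mixer_trace.log MyPlayerApp.exe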
Your question gives so little information that it is impossible to recommend anything useful.
Regards.
The rest of the code.
CustomVideomixer_Attributes.cpp:
//----------------------------------------------------------------------------------------------
// CustomVideomixer_Attributes.cpp
//----------------------------------------------------------------------------------------------
#include "StdAfx.h"

HRESULT CCustomVideomixer::GetUINT32(REFGUID guidKey, UINT32* punValue) {

    TRACE_TRANSFORM((L"CustomVideomixer::GetUINT32"));

    if (punValue == NULL)
        return E_POINTER;

    if (guidKey == MF_SA_D3D_AWARE) {
        TRACE_TRANSFORM((L"MF_SA_D3D_AWARE"));
        *punValue = TRUE;
        return S_OK;
    }
    else if (guidKey == MF_SA_REQUIRED_SAMPLE_COUNT) {
        TRACE_TRANSFORM((L"MF_SA_REQUIRED_SAMPLE_COUNT"));
        *punValue = 1;
        return S_OK;
    }
    else {
        TRACE_TRANSFORM((L"ERROR : MF_E_ATTRIBUTENOTFOUND"));
    }

    return MF_E_ATTRIBUTENOTFOUND;
}

HRESULT CCustomVideomixer::SetBlob(REFGUID guidKey, const UINT8* pBuf, UINT32 cbBufSize) {

    TRACE_TRANSFORM((L"CustomVideomixer::SetBlob"));

    if (guidKey == VIDEO_ZOOM_RECT) {
        TRACE_TRANSFORM((L"VIDEO_ZOOM_RECT"));
        return S_OK;
    }
    else {
        TRACE_TRANSFORM((L"ERROR : MF_E_ATTRIBUTENOTFOUND"));
    }

    return MF_E_ATTRIBUTENOTFOUND;
}
CustomVideomixer_Type.cpp:
//----------------------------------------------------------------------------------------------
// CustomVideomixer_Type.cpp
//----------------------------------------------------------------------------------------------
#include "StdAfx.h"

HRESULT CCustomVideomixer::GetoutputType(IMFMediaType** ppType) {

    TRACE_TRANSFORM((L"CustomVideomixer::GetoutputType"));

    HRESULT hr = S_OK;
    IMFMediaType* pOutputType = NULL;

    try {
        IF_Failed_THROW(hr = MFCreateMediaType(&pOutputType));
        IF_Failed_THROW(hr = pOutputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video));
        // MFVideoFormat_ARGB32 MFVideoFormat_RGB32
        IF_Failed_THROW(hr = pOutputType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32));
        IF_Failed_THROW(hr = pOutputType->SetUINT32(MF_MT_FIXED_SIZE_SAMPLES, TRUE));
        IF_Failed_THROW(hr = pOutputType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE));
        IF_Failed_THROW(hr = pOutputType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive));
        IF_Failed_THROW(hr = MFSetAttributeRatio(pOutputType, MF_MT_PIXEL_ASPECT_RATIO, 1, 1));

        *ppType = pOutputType;
        (*ppType)->AddRef();
    }
    catch (HRESULT) {}

    SAFE_RELEASE(pOutputType);

    return hr;
}
The DXVA2 part:
//----------------------------------------------------------------------------------------------
// Dxva2Manager.h
//----------------------------------------------------------------------------------------------
#ifndef DXVA2MANAGER_H
#define DXVA2MANAGER_H

class CDxva2Manager {

public:

    CDxva2Manager();
    ~CDxva2Manager() { ReleaseDxva2(); }

    HRESULT InitDxva2(IDirect3DDeviceManager9*, IMFMediaType*, IMFMediaType*, IMFMediaType*);
    void ReleaseDxva2();
    HRESULT ProcessInput(IMFSample*, const DWORD);
    HRESULT ProcessOutput(IMFSample*);

private:

    IDirectXVideoProcessor* m_pVideoProcessor;
    IDirect3DSurface9* m_pRefSurface9;
    IDirect3DSurface9* m_pSubSurface9;

    LONGLONG m_llDuration;
    LONGLONG m_llTime;

    UINT32 m_uiRefWidth;
    UINT32 m_uiRefheight;
    UINT32 m_uiRefLine;
    UINT32 m_uiSubWidth;
    UINT32 m_uiSubHeight;
    UINT32 m_uiSubLine;

    HRESULT GetDxva2VideoDesc(DXVA2_VideoDesc*, IMFMediaType*);
};

#endif
Dxva2Manager.cpp:
//---------------------------------------------------------------------------------------------- // Dxva2Manager.cpp //---------------------------------------------------------------------------------------------- #include "StdAfx.h" CDxva2Manager::CDxva2Manager() : m_pVideoProcessor(NULL),m_pRefSurface9(NULL),m_pSubSurface9(NULL),m_llDuration(0LL),m_llTime(0LL),m_uiRefWidth(0),m_uiRefheight(0),m_uiRefLine(0),m_uiSubWidth(0),m_uiSubHeight(0),m_uiSubLine(0) { } HRESULT CDxva2Manager::InitDxva2(IDirect3DDeviceManager9* pDeviceManager,IMFMediaType* pOutputType,IMFMediaType* pRefInputType,IMFMediaType* pSubInputType) { assert(m_pVideoProcessor == NULL); assert(m_pRefSurface9 == NULL); assert(m_pSubSurface9 == NULL); HRESULT hr; IF_Failed_RETURN(hr = (pDeviceManager == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pOutputType == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pRefInputType == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pSubInputType == NULL ? E_POINTER : S_OK)); IDirectXVideoProcessorService* pVideoProcessorService = NULL; HANDLE hD3d9Device = INVALID_HANDLE_VALUE; GUID subtype = GUID_NULL; UINT32 uiWidth = 0; UINT32 uiHeight = 0; D3DFORMAT D3DFormat = D3DFMT_UNKNowN; DXVA2_VideoDesc dxva2VideoDesc = { 0 }; UINT uiCount = 0; UINT uiStreamCount = 1; GUID* guids = NULL; try { IF_Failed_THROW(hr = pDeviceManager->OpenDeviceHandle(&hD3d9Device)); IF_Failed_THROW(hr = pDeviceManager->GetVideoService(hD3d9Device,IID_PPV_ARGS(&pVideoProcessorService))); IF_Failed_THROW(hr = GetDxva2VideoDesc(&dxva2VideoDesc,pRefInputType)); IF_Failed_THROW(hr = pVideoProcessorService->GetVideoProcessorDeviceGuids(&dxva2VideoDesc,&uiCount,&guids)); IF_Failed_THROW(hr = pVideoProcessorService->CreateVideoProcessor(guids[0],&dxva2VideoDesc,D3DFMT_X8R8G8B8,uiStreamCount,&m_pVideoProcessor)); IF_Failed_THROW(hr = pRefInputType->GetGUID(MF_MT_SUBTYPE,&subtype)); IF_Failed_THROW(hr = MfgetAttributeSize(pRefInputType,MF_MT_FRAME_SIZE,&uiWidth,&uiHeight)); if (subtype == MFVideoFormat_NV12) D3DFormat = (D3DFORMAT)D3DFMT_NV12; else IF_Failed_THROW(hr = E_FAIL); IF_Failed_THROW(hr = pVideoProcessorService->CreateSurface(uiWidth,uiHeight,D3DFormat,D3DPOOL_DEFAULT,DXVA2_VideoProcessorrendertarget,&m_pRefSurface9,NULL)); m_uiRefWidth = uiWidth; m_uiRefheight = uiHeight; m_uiRefLine = m_uiRefheight + (m_uiRefheight / 2); IF_Failed_THROW(hr = pSubInputType->GetGUID(MF_MT_SUBTYPE,&subtype)); IF_Failed_THROW(hr = MfgetAttributeSize(pSubInputType,&uiHeight)); /*if (subtype == MFVideoFormat_AYUV) D3DFormat = (D3DFORMAT)D3DFMT_AYUV; else IF_Failed_THROW(hr = E_FAIL);*/ m_uiSubWidth = uiWidth; m_uiSubHeight = uiHeight; m_uiSubLine = m_uiSubHeight + (m_uiSubHeight / 2); IF_Failed_THROW(hr = pVideoProcessorService->CreateSurface(uiWidth,&m_pSubSurface9,NULL)); } catch (HRESULT) {} CoTaskMemFree(guids); if (hD3d9Device != INVALID_HANDLE_VALUE) { LOG_HRESULT(pDeviceManager->CloseDeviceHandle(hD3d9Device)); } SAFE_RELEASE(pVideoProcessorService); return hr; } void CDxva2Manager::ReleaseDxva2() { SAFE_RELEASE(m_pVideoProcessor); SAFE_RELEASE(m_pRefSurface9); SAFE_RELEASE(m_pSubSurface9); m_llDuration = 0LL; m_llTime = 0LL; m_uiRefWidth = 0; m_uiRefheight = 0; m_uiRefLine = 0; m_uiSubWidth = 0; m_uiSubHeight = 0; m_uiSubLine = 0; } HRESULT CDxva2Manager::ProcessInput(IMFSample* pSample,const DWORD dwStreamId) { HRESULT hr = S_OK; IMFMediaBuffer* pBuffer = NULL; BYTE* pData = NULL; DWORD dwLength = 0; IDirect3DSurface9* pSurface9 = NULL; D3DLOCKED_RECT d3dRect; LONG lStride = 0; UINT32 uiWidth; UINT32 uiLine; 
IMF2DBuffer* p2DBuffer = NULL; try { if (dwStreamId == 0) { IF_Failed_THROW(hr = pSample->GetSampleTime(&m_llTime)); IF_Failed_THROW(hr = pSample->GetSampleDuration(&m_llDuration)); } IF_Failed_THROW(hr = pSample->GetBufferByIndex(0,&pBuffer)); IF_Failed_THROW(hr = pBuffer->QueryInterface(IID_PPV_ARGS(&p2DBuffer))); IF_Failed_THROW(hr = p2DBuffer->Lock2D(&pData,&lStride)); if (dwStreamId == 0) { pSurface9 = m_pRefSurface9; uiWidth = m_uiRefWidth; uiLine = m_uiRefLine; } else if (dwStreamId == 1) { pSurface9 = m_pSubSurface9; //uiWidth = m_uiSubWidth * 4; //uiLine = m_uiSubHeight; uiWidth = m_uiSubWidth; uiLine = m_uiSubLine; } IF_Failed_THROW(hr = pSurface9->LockRect(&d3dRect,NULL,0)); IF_Failed_THROW(hr = MFcopyImage((BYTE*)d3dRect.pBits,d3dRect.Pitch,pData,lStride,uiWidth,uiLine)); IF_Failed_THROW(hr = pSurface9->UnlockRect()); } catch (HRESULT) {} if (pBuffer && pData) { LOG_HRESULT(p2DBuffer->Unlock2D()); } SAFE_RELEASE(p2DBuffer); SAFE_RELEASE(pBuffer); return hr; } HRESULT CDxva2Manager::ProcessOutput(IMFSample* pSample) { HRESULT hr = S_OK; IMFMediaBuffer* pBuffer = NULL; IDirect3DSurface9* pSurface = NULL; DXVA2_VideoProcessBltParams blt = { 0 }; RECT rc = { 0,m_uiRefWidth,m_uiRefheight }; DXVA2_AYUVSample16 color; color.Cr = 0x0000; color.Cb = 0xFFFF; color.Y = 0x0000; color.Alpha = 0xFFFF; const UINT EX_COLOR_INFO[][2] = { // SDTV ITU-R BT.601 ycbcr to driver's optimal RGB range { DXVA2_VideoTransferMatrix_BT601,DXVA2_NominalRange_UnkNown },// SDTV ITU-R BT.601 ycbcr to studio RGB [16...235] { DXVA2_VideoTransferMatrix_BT601,DXVA2_NominalRange_16_235 },// SDTV ITU-R BT.601 ycbcr to computer RGB [0...255] { DXVA2_VideoTransferMatrix_BT601,DXVA2_NominalRange_0_255 },// HDTV ITU-R BT.709 ycbcr to driver's optimal RGB range { DXVA2_VideoTransferMatrix_BT709,// HDTV ITU-R BT.709 ycbcr to studio RGB [16...235] { DXVA2_VideoTransferMatrix_BT709,// HDTV ITU-R BT.709 ycbcr to computer RGB [0...255] { DXVA2_VideoTransferMatrix_BT709,DXVA2_NominalRange_0_255 } }; DXVA2_Fixed32 ProcAmpValues[4] = { 0 }; DXVA2_Fixed32 NFilterValues[6] = { 0 }; DXVA2_Fixed32 DFilterValues[6] = { 0 }; DXVA2_VideoSample samples[2] = { 0 }; UINT uiStreamCount = 2; blt.TargetFrame = m_llTime; blt.TargetRect = rc; blt.ConstrictionSize.cx = rc.right; blt.ConstrictionSize.cy = rc.bottom; blt.BackgroundColor = color; blt.DestFormat.VideoChromasubsampling = DXVA2_VideoChromasubsampling_UnkNown; blt.DestFormat.NominalRange = EX_COLOR_INFO[0][1]; blt.DestFormat.VideoTransferMatrix = DXVA2_VideoTransferMatrix_UnkNown; blt.DestFormat.VideoLighting = DXVA2_VideoLighting_dim; blt.DestFormat.VideoPrimaries = DXVA2_VideoPrimaries_BT709; blt.DestFormat.VideoTransferFunction = DXVA2_VideoTransFunc_709; blt.DestFormat.SampleFormat = DXVA2_SampleProgressiveFrame; blt.ProcAmpValues.Brightness = ProcAmpValues[0]; blt.ProcAmpValues.Contrast.Value = 1; blt.ProcAmpValues.Hue = ProcAmpValues[2]; blt.ProcAmpValues.Saturation.Value = 1; blt.Alpha = DXVA2_Fixed32OpaqueAlpha(); blt.NoiseFilterLuma.Level = NFilterValues[0]; blt.NoiseFilterLuma.Threshold = NFilterValues[1]; blt.NoiseFilterLuma.Radius = NFilterValues[2]; blt.NoiseFilterChroma.Level = NFilterValues[3]; blt.NoiseFilterChroma.Threshold = NFilterValues[4]; blt.NoiseFilterChroma.Radius = NFilterValues[5]; blt.DetailFilterLuma.Level = DFilterValues[0]; blt.DetailFilterLuma.Threshold = DFilterValues[1]; blt.DetailFilterLuma.Radius = DFilterValues[2]; blt.DetailFilterChroma.Level = DFilterValues[3]; blt.DetailFilterChroma.Threshold = DFilterValues[4]; blt.DetailFilterChroma.Radius = 
DFilterValues[5]; samples[0].Start = m_llTime; samples[0].End = m_llTime + m_llDuration; samples[0].SampleFormat.VideoChromasubsampling = DXVA2_VideoChromasubsampling_MPEG2; samples[0].SampleFormat.NominalRange = DXVA2_NominalRange_16_235; samples[0].SampleFormat.VideoTransferMatrix = EX_COLOR_INFO[0][0]; samples[0].SampleFormat.VideoLighting = DXVA2_VideoLighting_dim; samples[0].SampleFormat.VideoPrimaries = DXVA2_VideoPrimaries_BT709; samples[0].SampleFormat.VideoTransferFunction = DXVA2_VideoTransFunc_709; samples[0].SampleFormat.SampleFormat = DXVA2_SampleProgressiveFrame; samples[0].SrcSurface = m_pRefSurface9; samples[0].SrcRect = rc; rc.bottom = m_uiRefheight / 2; samples[0].DstRect = rc; samples[0].Planaralpha = DXVA2FloatToFixed(float(0xFF) / 0xFF); rc.right = m_uiSubWidth; rc.bottom = m_uiSubHeight; samples[1] = samples[0]; samples[1].SampleFormat = samples[0].SampleFormat; samples[1].SampleFormat.SampleFormat = DXVA2_SampleSubStream; samples[1].SrcSurface = m_pSubSurface9; samples[1].SrcRect = rc; rc.top = m_uiSubHeight / 2; samples[1].DstRect = rc; try { IF_Failed_THROW(hr = pSample->ConvertToContiguousBuffer(&pBuffer)); IF_Failed_THROW(hr = MfgetService(pBuffer,MR_BUFFER_SERVICE,__uuidof(IDirect3DSurface9),(void**)&pSurface)); IF_Failed_THROW(hr = m_pVideoProcessor->VideoProcessBlt(pSurface,&blt,samples,NULL)); } catch (HRESULT) {} SAFE_RELEASE(pBuffer); SAFE_RELEASE(pSurface); return hr; } HRESULT CDxva2Manager::GetDxva2VideoDesc(DXVA2_VideoDesc* dxva2VideoDesc,IMFMediaType* pRefInputType) { HRESULT hr; IF_Failed_RETURN(hr = (dxva2VideoDesc == NULL ? E_POINTER : S_OK)); IF_Failed_RETURN(hr = (pRefInputType == NULL ? E_POINTER : S_OK)); D3DFORMAT D3DFormat = D3DFMT_UNKNowN; GUID subtype = { 0 }; UINT32 uiWidth = 0; UINT32 uiHeight = 0; UINT32 uiNumerator = 0; UINT32 uiDenominator = 0; const UINT EX_COLOR_INFO[][2] = { // SDTV ITU-R BT.601 ycbcr to driver's optimal RGB range { DXVA2_VideoTransferMatrix_BT601,DXVA2_NominalRange_0_255 } }; IF_Failed_RETURN(hr = pRefInputType->GetGUID(MF_MT_SUBTYPE,&subtype)); IF_Failed_RETURN(hr = MfgetAttributeSize(pRefInputType,&uiHeight)); IF_Failed_RETURN(hr = MfgetAttributeratio(pRefInputType,MF_MT_FRAME_RATE,&uiNumerator,&uiDenominator)); if (subtype == MFVideoFormat_NV12) D3DFormat = (D3DFORMAT)D3DFMT_NV12; else IF_Failed_RETURN(hr = E_FAIL); dxva2VideoDesc->SampleWidth = uiWidth; dxva2VideoDesc->SampleHeight = uiHeight; dxva2VideoDesc->SampleFormat.VideoChromasubsampling = DXVA2_VideoChromasubsampling_MPEG2; dxva2VideoDesc->SampleFormat.NominalRange = DXVA2_NominalRange_16_235; dxva2VideoDesc->SampleFormat.VideoTransferMatrix = EX_COLOR_INFO[0][0]; dxva2VideoDesc->SampleFormat.VideoLighting = DXVA2_VideoLighting_dim; dxva2VideoDesc->SampleFormat.VideoPrimaries = DXVA2_VideoPrimaries_BT709; dxva2VideoDesc->SampleFormat.VideoTransferFunction = DXVA2_VideoTransFunc_709; dxva2VideoDesc->SampleFormat.SampleFormat = DXVA2_SampleProgressiveFrame; dxva2VideoDesc->Format = D3DFormat; dxva2VideoDesc->InputSampleFreq.Numerator = uiNumerator; dxva2VideoDesc->InputSampleFreq.Denominator = uiDenominator; dxva2VideoDesc->OutputFrameFreq.Numerator = uiNumerator; dxva2VideoDesc->OutputFrameFreq.Denominator = uiDenominator; return hr; }
Finally, the EVR/MediaSession code:
//---------------------------------------------------------------------------------------------- // Main.cpp //---------------------------------------------------------------------------------------------- #pragma once #define WIN32_LEAN_AND_MEAN #define STRICT #include <WinSDKVer.h> #include <new> #include <windows.h> //---------------------------------------------------------------------------------------------- // Common MFNode Files #ifdef _DEBUG #define MF_USE_LOGGING 1 #else #define MF_USE_LOGGING 0 #endif #include "C:ProjectMFNodeCommonMFInclude.h" // {B2F74C92-79DF-45DE-9C55-A99DE8276679} DEFINE_GUID(CLSID_CustomVideomixer,0xb2f74c92,0x79df,0x45de,0x9c,0x55,0xa9,0x9d,0xe8,0x27,0x66,0x79); #define WINDOWAPPLICATION_CLASS L"WindowApplication" // Hardcoded : change if needed #define VIDEO_WIDTH_1 320 #define VIDEO_HEIGHT_1 240 #define VIDEO_FILE_1 L"C:\Project\h264\big_buck_bunny_240p_5mb.mp4" #define VIDEO_FILE_2 L"C:\Project\h264\big_buck_bunny_240p_5mb - copie.mp4" HWND g_hWnd = NULL; HANDLE g_hSessionEvent = NULL; IMFMediaSession* g_pSession = NULL; IMFMediaSource* g_pVideoSource1 = NULL; IMFMediaSource* g_pVideoSource2 = NULL; IMFMediaSource* g_pAggregatedSource = NULL; class CCustomAsyncCallback : public IMFAsyncCallback { public: CCustomAsyncCallback() : m_nRefCount(1) {} virtual ~CCustomAsyncCallback() {} // IUnkNown STDMETHODIMP QueryInterface(REFIID riid,void** ppv) { static const QITAB qit[] = { QITABENT(CCustomAsyncCallback,IMFAsyncCallback),ppv); } STDMETHODIMP_(ULONG) AddRef() { LONG lRef = InterlockedIncrement(&m_nRefCount); return lRef; } STDMETHODIMP_(ULONG) Release() { ULONG uCount = InterlockedDecrement(&m_nRefCount); if (uCount == 0) { delete this; } return uCount; } // IMFAsyncCallback STDMETHODIMP GetParameters(DWORD*,DWORD*) { return E_NOTIMPL; } STDMETHODIMP Invoke(IMFAsyncResult* pAsyncResult) { IMFMediaEvent* pEvent = NULL; HRESULT hr = S_OK; HRESULT hrStatus; MediaEventType EventType; AutoLock lock(m_CriticSection); try { IF_Failed_THROW(hr = g_pSession->EndGetEvent(pAsyncResult,&pEvent)); IF_Failed_THROW(hr = pEvent->GetType(&EventType)); TRACE((L"Invoke %s",MFEventString(EventType))); IF_Failed_THROW(hr = pEvent->GetStatus(&hrStatus)); if (Failed(hrStatus)) { LOG_HRESULT(hr = hrStatus); LOG_HRESULT(hr = g_pSession->BeginGetEvent(this,NULL)); SAFE_RELEASE(pEvent); //SetEvent(g_hSessionEvent); return S_OK; } if (EventType == MESessionTopologyStatus) { MF_TOPOSTATUS TopoStatus = MF_TOPOSTATUS_INVALID; LOG_HRESULT(hr = pEvent->GetUINT32(MF_EVENT_TOPOLOGY_STATUS,(UINT32*)&TopoStatus)); TRACE((L"TopoStatus %s",MFTopologyStatusstring(TopoStatus))); if(TopoStatus == MF_TOPOSTATUS_READY) SetEvent(g_hSessionEvent); } if (EventType != MESessionClosed) { LOG_HRESULT(hr = g_pSession->BeginGetEvent(this,NULL)); } else { SetEvent(g_hSessionEvent); } } catch (HRESULT) {} SAFE_RELEASE(pEvent); return S_OK; } private: CriticSection m_CriticSection; volatile long m_nRefCount; }; CCustomAsyncCallback* g_pCustomAsyncCallback = NULL; void FreeMediaObject(); HRESULT ProcessVideo(); HRESULT CreateMediaSource(IMFMediaSource**,LPCWSTR); HRESULT CreateAggregatedSource(IMFMediaSource*,IMFMediaSource*,IMFMediaSource**); HRESULT CreatetopologyAggregated(IMFTopology**,IMFMediaSource*); HRESULT BuildTopology(IMFTopology*,ImfpresentationDescriptor*,IMFStreamSink*,IMFStreamSink*); HRESULT CreateSourceStreamNode(IMFMediaSource*,IMFStreamDescriptor*,IMFTopologyNode**); HRESULT CreateOutputNode(IMFStreamDescriptor*,IMFTopologyNode**,IMFStreamSink*); HRESULT Initwindow(const UINT,const UINT); 
LRESULT CALLBACK WindowApplicationMsgProc(HWND,UINT,WParaM,LParaM); void main() { HRESULT hr; LOG_HRESULT(hr = CoInitializeEx(NULL,COINIT_APARTMENTTHREADED | COINIT_disABLE_OLE1DDE)); if (SUCCEEDED(hr)) { LOG_HRESULT(hr = MFStartup(MF_VERSION,MFSTARTUP_LITE)); if (SUCCEEDED(hr)) { LOG_HRESULT(hr = ProcessVideo()); if (SUCCEEDED(hr)) { MSG msg; ZeroMemory(&msg,sizeof(MSG)); while (GetMessage(&msg,0) > 0) { TranslateMessage(&msg); dispatchMessage(&msg); } } FreeMediaObject(); LOG_HRESULT(hr = MFShutdown()); } CoUninitialize(); } } void FreeMediaObject() { HRESULT hr = S_OK; if (g_pSession != NULL) { LOG_HRESULT(hr = g_pSession->Close()); DWORD dwWaitResult = WaitForSingleObject(g_hSessionEvent,10000); if (dwWaitResult == WAIT_TIMEOUT) { assert(FALSE); } } if (g_pAggregatedSource) { g_pAggregatedSource->Shutdown(); SAFE_RELEASE(g_pAggregatedSource); } SAFE_RELEASE(g_pVideoSource1); SAFE_RELEASE(g_pVideoSource2); SAFE_RELEASE(g_pCustomAsyncCallback); if (g_pSession) { LOG_HRESULT(hr = g_pSession->Shutdown()); ULONG ulTest = g_pSession->Release(); g_pSession = NULL; assert(ulTest == 0); } if (g_hSessionEvent) { CloseHandle(g_hSessionEvent); g_hSessionEvent = NULL; } if (IsWindow(g_hWnd)) { DestroyWindow(g_hWnd); UnregisterClass(WINDOWAPPLICATION_CLASS,GetmoduleeHandle(NULL)); g_hWnd = NULL; } } HRESULT ProcessVideo() { HRESULT hr = S_OK; IMFTopology* pTopology = NULL; PROPVARIANT varStart; PropVariantinit(&varStart); varStart.vt = VT_EMPTY; try { g_pCustomAsyncCallback = new (std::nothrow)CCustomAsyncCallback(); IF_Failed_THROW(hr = (g_pCustomAsyncCallback == NULL ? E_OUTOFMEMORY : S_OK)); g_hSessionEvent = CreateEvent(NULL,FALSE,NULL); IF_Failed_THROW(hr = (g_hSessionEvent == NULL ? E_OUTOFMEMORY : S_OK)); IF_Failed_THROW(hr = CreateMediaSource(&g_pVideoSource1,VIDEO_FILE_1)); IF_Failed_THROW(hr = CreateMediaSource(&g_pVideoSource2,VIDEO_FILE_2)); IF_Failed_THROW(hr = CreateAggregatedSource(g_pVideoSource1,g_pVideoSource2,&g_pAggregatedSource)); IF_Failed_THROW(hr = CreatetopologyAggregated(&pTopology,g_pAggregatedSource)); IF_Failed_THROW(hr = MFCreateMediaSession(NULL,&g_pSession)); IF_Failed_THROW(hr = g_pSession->BeginGetEvent((IMFAsyncCallback*)g_pCustomAsyncCallback,NULL)); IF_Failed_THROW(hr = g_pSession->SetTopology(0,pTopology)); DWORD dwWaitResult = WaitForSingleObject(g_hSessionEvent,10000); if (dwWaitResult == WAIT_TIMEOUT) { IF_Failed_THROW(hr = E_FAIL); } LOG_HRESULT(hr = g_pSession->Start(&GUID_NULL,&varStart)); } catch (HRESULT) {} SAFE_RELEASE(pTopology); PropVariantClear(&varStart); return hr; } HRESULT CreateMediaSource(IMFMediaSource** ppSource,LPCWSTR szURL) { HRESULT hr = S_OK; MF_OBJECT_TYPE ObjectType = MF_OBJECT_INVALID; IMFSourceResolver* pSourceResolver = NULL; IUnkNown* pSource = NULL; try { IF_Failed_THROW(hr = MFCreateSourceResolver(&pSourceResolver)); IF_Failed_THROW(hr = pSourceResolver->CreateObjectFromURL(szURL,MF_RESOLUTION_MEDIASOURCE,&ObjectType,&pSource)); IF_Failed_THROW(hr = pSource->QueryInterface(IID_PPV_ARGS(ppSource))); } catch (HRESULT) {} SAFE_RELEASE(pSource); SAFE_RELEASE(pSourceResolver); return hr; } HRESULT CreateAggregatedSource(IMFMediaSource* pSource1,IMFMediaSource* pSource2,IMFMediaSource** ppAggregatedSource) { IMFCollection* pCollection = NULL; HRESULT hr = MFCreateCollection(&pCollection); if (SUCCEEDED(hr)) { hr = pCollection->AddElement(pSource1); } if (SUCCEEDED(hr)) { hr = pCollection->AddElement(pSource2); } if (SUCCEEDED(hr)) { hr = MFCreateAggregateSource(pCollection,ppAggregatedSource); } SAFE_RELEASE(pCollection); return hr; } 
HRESULT CreatetopologyAggregated(IMFTopology** ppTopology,IMFMediaSource* pSource) { assert(ppTopology != NULL); assert(pSource != NULL); HRESULT hr = S_OK; IMFTopology* pTopology = NULL; ImfpresentationDescriptor* pSourcePD = NULL; IMFActivate* pEvrActivate = NULL; //IMFVideoRenderer* pVideoRenderer = NULL; IMFMediaSink* pEvrSink = NULL; IMFStreamSink* pStreamSink1 = NULL; IMFStreamSink* pStreamSink2 = NULL; try { IF_Failed_THROW(hr = MFCreatetopology(&pTopology)); IF_Failed_THROW(hr = pSource->CreatePresentationDescriptor(&pSourcePD)); IF_Failed_THROW(hr = Initwindow(VIDEO_WIDTH_1,VIDEO_HEIGHT_1)); IF_Failed_THROW(hr = MFCreateVideoRendererActivate(g_hWnd,&pEvrActivate)); IF_Failed_THROW(hr = pEvrActivate->SetGUID(MF_ACTIVATE_CUSTOM_VIDEO_mixer_CLSID,CLSID_CustomVideomixer)); //IF_Failed_THROW(hr = pEvrActivate->ActivateObject(__uuidof(IMFVideoRenderer),reinterpret_cast<void**>(&pVideoRenderer))); //IF_Failed_THROW(hr = pVideoRenderer->InitializeRenderer(NULL,NULL)); IF_Failed_THROW(hr = pEvrActivate->ActivateObject(__uuidof(IMFMediaSink),reinterpret_cast<void**>(&pEvrSink))); IF_Failed_THROW(hr = pEvrSink->GetStreamSinkByIndex(0,&pStreamSink1)); IF_Failed_THROW(hr = pEvrSink->AddStreamSink(1,&pStreamSink2)); IF_Failed_THROW(hr = BuildTopology(pTopology,pSourcePD,pSource,pStreamSink1,pStreamSink2)); *ppTopology = pTopology; (*ppTopology)->AddRef(); } catch (HRESULT) {} SAFE_RELEASE(pStreamSink2); SAFE_RELEASE(pStreamSink1); SAFE_RELEASE(pEvrSink); //SAFE_RELEASE(pVideoRenderer); SAFE_RELEASE(pEvrActivate); SAFE_RELEASE(pTopology); SAFE_RELEASE(pSourcePD); return hr; } HRESULT BuildTopology(IMFTopology* pTopology,ImfpresentationDescriptor* pSourcePD,IMFMediaSource* pSource,IMFStreamSink* pStreamSink1,IMFStreamSink* pStreamSink2) { assert(pTopology != NULL); HRESULT hr = S_OK; IMFStreamDescriptor* pSourceSD = NULL; IMFTopologyNode* pSourceNode = NULL; IMFTopologyNode* pOutputNode = NULL; IMFMediaTypeHandler* pHandler = NULL; BOOL bSelected = FALSE; DWORD dwStreamCount; GUID guidMajorType = GUID_NULL; BOOL bRef = TRUE; try { IF_Failed_THROW(hr = pSourcePD->GetStreamDescriptorCount(&dwStreamCount)); for (DWORD i = 0; i < dwStreamCount; i++) { IF_Failed_THROW(hr = pSourcePD->GetStreamDescriptorByIndex(i,&bSelected,&pSourceSD)); if (bSelected) { IF_Failed_THROW(hr = pSourceSD->GetMediaTypeHandler(&pHandler)); IF_Failed_THROW(hr = pHandler->GetMajorType(&guidMajorType)); if (guidMajorType == MFMediaType_Video) { IF_Failed_THROW(hr = CreateSourceStreamNode(pSource,pSourceSD,&pSourceNode)); if (bRef) { bRef = FALSE; IF_Failed_THROW(hr = CreateOutputNode(pSourceSD,&pOutputNode,pStreamSink1)); IF_Failed_THROW(hr = pTopology->AddNode(pSourceNode)); IF_Failed_THROW(hr = pTopology->AddNode(pOutputNode)); IF_Failed_THROW(hr = pSourceNode->ConnectOutput(0,pOutputNode,0)); } else { IF_Failed_THROW(hr = CreateOutputNode(pSourceSD,pStreamSink2)); IF_Failed_THROW(hr = pTopology->AddNode(pSourceNode)); IF_Failed_THROW(hr = pTopology->AddNode(pOutputNode)); IF_Failed_THROW(hr = pSourceNode->ConnectOutput(0,0)); } } /*else if (guidMajorType == MFMediaType_Audio) { IF_Failed_THROW(hr = CreateSourceStreamNode(pSource,&pSourceNode)); IF_Failed_THROW(hr = CreateOutputNode(pSourceSD,NULL)); IF_Failed_THROW(hr = pTopology->AddNode(pSourceNode)); IF_Failed_THROW(hr = pTopology->AddNode(pOutputNode)); IF_Failed_THROW(hr = pSourceNode->ConnectOutput(0,0)); }*/ else { IF_Failed_THROW(hr = pSourcePD->deselectStream(i)); } SAFE_RELEASE(pHandler); SAFE_RELEASE(pOutputNode); SAFE_RELEASE(pSourceNode); } 
SAFE_RELEASE(pSourceSD); } } catch (HRESULT) {} SAFE_RELEASE(pHandler); SAFE_RELEASE(pOutputNode); SAFE_RELEASE(pSourceNode); SAFE_RELEASE(pSourceSD); return hr; } HRESULT CreateSourceStreamNode(IMFMediaSource* pSource,IMFStreamDescriptor* pSourceSD,IMFTopologyNode** ppNode) { if (!pSource || !pSourcePD || !pSourceSD || !ppNode) { return E_POINTER; } IMFTopologyNode* pNode = NULL; HRESULT hr = S_OK; try { IF_Failed_THROW(hr = MFCreatetopologyNode(MF_TOPOLOGY_SOURCESTREAM_NODE,&pNode)); IF_Failed_THROW(hr = pNode->SetUnkNown(MF_TOPONODE_SOURCE,pSource)); IF_Failed_THROW(hr = pNode->SetUnkNown(MF_TOPONODE_PRESENTATION_DESCRIPTOR,pSourcePD)); IF_Failed_THROW(hr = pNode->SetUnkNown(MF_TOPONODE_STREAM_DESCRIPTOR,pSourceSD)); *ppNode = pNode; (*ppNode)->AddRef(); } catch (HRESULT) {} SAFE_RELEASE(pNode); return hr; } HRESULT CreateOutputNode(IMFStreamDescriptor* pSourceSD,IMFTopologyNode** ppNode,IMFStreamSink* pStreamSink) { IMFTopologyNode* pNode = NULL; IMFMediaTypeHandler* pHandler = NULL; IMFMediaType* pMediaType = NULL; IMFActivate* pActivate = NULL; GUID guidMajorType = GUID_NULL; HRESULT hr = S_OK; try { IF_Failed_THROW(hr = pSourceSD->GetMediaTypeHandler(&pHandler)); IF_Failed_THROW(hr = pHandler->GetMajorType(&guidMajorType)); IF_Failed_THROW(hr = MFCreatetopologyNode(MF_TOPOLOGY_OUTPUT_NODE,&pNode)); if (MFMediaType_Video == guidMajorType) { IF_Failed_THROW(hr = pHandler->GetCurrentMediaType(&pMediaType)); IF_Failed_THROW(hr = pMediaType->SetUINT32(MF_MT_INTERLACE_MODE,MFVideoInterlace_Progressive)); IF_Failed_THROW(hr = pHandler->SetCurrentMediaType(pMediaType)); IF_Failed_THROW(hr = pNode->Setobject(pStreamSink)); } else if (MFMediaType_Audio == guidMajorType) { IF_Failed_THROW(hr = MFCreateAudioRendererActivate(&pActivate)); IF_Failed_THROW(hr = pNode->Setobject(pActivate)); } else { IF_Failed_THROW(hr = E_FAIL); } *ppNode = pNode; (*ppNode)->AddRef(); } catch (HRESULT) {} SAFE_RELEASE(pNode); SAFE_RELEASE(pHandler); SAFE_RELEASE(pMediaType); SAFE_RELEASE(pActivate); return hr; } HRESULT Initwindow(const UINT uiWidth,const UINT uiHeight) { WNDCLASSEX WndClassEx; WndClassEx.cbSize = sizeof(WNDCLASSEX); WndClassEx.style = CS_HREDRAW | CS_VREDRAW; WndClassEx.lpfnWndProc = WindowApplicationMsgProc; WndClassEx.cbClsExtra = 0L; WndClassEx.cbWndExtra = 0L; WndClassEx.hInstance = GetmoduleeHandle(NULL); WndClassEx.hIcon = NULL; WndClassEx.hCursor = LoadCursor(NULL,IDC_ARROW); WndClassEx.hbrBackground = NULL; WndClassEx.lpszMenuName = NULL; WndClassEx.lpszClassName = WINDOWAPPLICATION_CLASS; WndClassEx.hIconSm = NULL; if (!RegisterClassEx(&WndClassEx)) { return E_FAIL; } int iWndL = uiWidth + 8 + GetSystemMetrics(SM_CXSIzefRAME) * 2; int iWndH = uiHeight + 8 + GetSystemMetrics(SM_CYSIzefRAME) * 2 + GetSystemMetrics(SM_CYCAPTION); int iXWnd = (GetSystemMetrics(SM_CXSCREEN) - iWndL) / 2; int iYWnd = (GetSystemMetrics(SM_CYSCREEN) - iWndH) / 2; if ((g_hWnd = CreateWindowEx(WS_EX_ACCEPTFILES,WINDOWAPPLICATION_CLASS,WS_OVERLAPPEDWINDOW,iXWnd,iYWnd,iWndL,iWndH,GetDesktopWindow(),GetmoduleeHandle(NULL),NULL)) == NULL) { return E_FAIL; } RECT rc; GetClientRect(g_hWnd,&rc); // If Failed change iWndL or/and iWndH to be TRUE assert(rc.right == VIDEO_WIDTH_1 && rc.bottom == VIDEO_HEIGHT_1); ShowWindow(g_hWnd,SW_SHOW); return S_OK; } LRESULT CALLBACK WindowApplicationMsgProc(HWND hWnd,UINT msg,WParaM wParam,LParaM lParam) { if (msg == WM_PAINT) { ValidateRect(hWnd,NULL); return 0L; } else if (msg == WM_ERASEBKGND) { return 1L; } else if (msg == WM_CLOSE) { PostQuitMessage(0); return 0L; } return 
DefWindowProc(hWnd,msg,wParam,lParam); }