IMFSinkWriter: Merit validation failed for MFT (Intel Quick Sync Video H.264 Encoder MFT)

  • Question

  • Hi,

    We are trying to run a simple Reader --> Writer pipeline (transcoder, VC1 -> H264) in Media Foundation.
    The source data (VC1) is captured with our own equipment, so no "premium protected content" or similar is involved; the goal is to use the Intel® Quick Sync Video H.264 Encoder MFT.

    Looking at the Media Foundation trace log, we can see that a hardware MFT is enumerated and created, but it then fails merit validation.

    CoCreateInstance @ Created {4BE8D3C0-0515-4A37-AD55-E4BAE19AF471} Intel® Quick Sync Video H.264 Encoder MFT (c:\Program Files\Intel\Media SDK\mfx_mft_h264ve_w7_32.dll)
    MFGetMFTMerit @ Merit validation failed for MFT @06A42CA0 (hr=E_FAIL)

    We provide an IDirect3DDeviceManager9 pointer to Media Foundation when creating our source reader and sink writer, as described in the documentation.
    It's rather strange that MF wants to use the Protected Media Path. Are we supposed to pass some encoder parameters to disable this type of behavior? Any ideas?

    A standard monitor with a DVI cable is used, on Windows 7.
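    For reference, our setup follows the documented sink writer pattern (sketch only; pD3DManager and the output file name are placeholders, error handling trimmed):

```cpp
// Sketch: create a sink writer with hardware transforms enabled and a
// D3D9 device manager attached. pD3DManager is assumed to be an existing
// IDirect3DDeviceManager9*, "output.mp4" is a placeholder path.
IMFSinkWriter *pSinkWriter = NULL;
IMFAttributes *pAttributes = NULL;
HRESULT hr = MFCreateAttributes(&pAttributes, 2);
if (SUCCEEDED(hr))
    hr = pAttributes->SetUINT32(MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS, TRUE);
if (SUCCEEDED(hr))
    hr = pAttributes->SetUnknown(MF_SINK_WRITER_D3D_MANAGER, pD3DManager);
if (SUCCEEDED(hr))
    hr = MFCreateSinkWriterFromURL(L"output.mp4", NULL, pAttributes, &pSinkWriter);
```

    The source reader side uses the analogous MF_SOURCE_READER_D3D_MANAGER attribute.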

    best regards,


    Thursday, November 7, 2013 3:22 PM

All replies

  • Hi Carl,

    I am using the Sink Writer to encode raw YUV to H.264. I set MF_READWRITE_ENABLE_HARDWARE_TRANSFORMS so that the Intel Quick Sync H.264 Encoder is loaded.

    From the mftrace log, I also see "MFGetMFTMerit @ Merit validation failed for MFT @07E72FD0 (hr=80004005 E_FAIL)".

    I searched the registry with Regedit.exe; there is no merit value associated with the Intel H.264 encoder. And the Sink Writer doesn't support MFT_CODEC_MERIT_Attribute.

    May I ask how your problem was solved?

    Thank you in advance.
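    If it helps, you can check what merit (if any) an enumerated encoder reports. A sketch along these lines (error handling trimmed; for MFTs that registered no merit, the GetUINT32 call simply fails and merit stays 0):

```cpp
// Sketch: enumerate hardware H.264 encoders and print each one's
// friendly name and (optional) codec merit attribute.
#include <mfapi.h>
#include <mftransform.h>
#include <cstdio>
#pragma comment(lib, "mfplat")
#pragma comment(lib, "mfuuid")

void ListHardwareH264Encoders()   // illustrative helper, not an MF API
{
    MFT_REGISTER_TYPE_INFO outType = { MFMediaType_Video, MFVideoFormat_H264 };
    IMFActivate **ppActivate = NULL;
    UINT32 count = 0;
    HRESULT hr = MFTEnumEx(MFT_CATEGORY_VIDEO_ENCODER, MFT_ENUM_FLAG_HARDWARE,
                           NULL, &outType, &ppActivate, &count);
    for (UINT32 i = 0; SUCCEEDED(hr) && i < count; i++)
    {
        WCHAR name[256] = {};
        ppActivate[i]->GetString(MFT_FRIENDLY_NAME_Attribute, name, 256, NULL);
        UINT32 merit = 0;   // stays 0 if no merit was registered
        ppActivate[i]->GetUINT32(MFT_CODEC_MERIT_Attribute, &merit);
        wprintf(L"%s, merit = %u\n", name, merit);
        ppActivate[i]->Release();
    }
    CoTaskMemFree(ppActivate);
}
```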


    Friday, September 12, 2014 9:31 PM
  • Hi Jun and Carl,

    I hit the same wall, and it seems we are among the few lucky ones who encounter this "merit validation failed".
    Did you manage to solve this problem?

    Could you kindly share any solution?

    Thank you.


    Saturday, September 20, 2014 12:17 PM
    I am seeing the same thing, and you point out in a different thread that this happens whenever you use mftrace.exe with Intel Quick Sync. I am also seeing it on Windows 7; I am not sure whether it only happens on Windows 7.
    Friday, June 3, 2016 6:00 PM
  • Hi

    From what I read, Adam and persskog, you are both on Windows 7, right? Do you have an NVIDIA card, or are you on integrated graphics only? Everyone else, please post your OS and graphics card as well. Somehow my gut tells me that you guys have an NVIDIA card in your systems...

    I recently encountered a similar MFT enumeration and unlocking problem on a system that has integrated graphics (an AMD APU with an R7) but whose main graphics adapter was an NVIDIA card. The problem is that NVIDIA has good marketing, but they are 2-3 years late to the party with their MFTs. While AMD and Intel have been registering their encoder MFTs with their driver packages since Windows 8.0, NVIDIA only started with Windows 10...

    If you have an NVIDIA card and you are not on Windows 10, the problem is as follows:

    Normally, MFTs are installed and registered with the graphics card's driver package. Since NVIDIA's MFTs are only registered under Windows 10, they do not get enumerated by MFTEnumEx on any OS below that. However, the method will still enumerate the hardware encoders and decoders of the integrated graphics adapter (IGPU). The catch is that you can create an instance of the integrated encoder or decoder with CoCreateInstance, but as soon as you call any method on it or try to set an attribute (like the async unlock, for example, which is needed to unlock asynchronous/hardware MFTs), you get E_FAIL or E_UNEXPECTED.

    The IGPU's MFTs get enumerated because the driver package installed them, but the adapter is not the main graphics adapter. Normally the drivers of dedicated GPUs register their own MFTs and override whatever the IGPU driver may have registered for a given format. Since NVIDIA does not offer any MFTs below Windows 10, you only get the IGPU's MFTs enumerated, and they are not functional internally. You will get E_FAIL or E_UNEXPECTED if you try to use them. I had to google and test a lot, but it is a confirmed issue when you have an NVIDIA card on an OS below Windows 10.
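    The failure mode can be probed along these lines (a sketch; pActivate is assumed to come from MFTEnumEx, MF_TRANSFORM_ASYNC_UNLOCK is the documented attribute for unlocking asynchronous MFTs):

```cpp
// Sketch: activate an enumerated hardware MFT and attempt the async unlock.
// On an affected system the activation or the SetUINT32 call is where
// E_FAIL / E_UNEXPECTED shows up.
IMFTransform *pTransform = NULL;
HRESULT hr = pActivate->ActivateObject(IID_PPV_ARGS(&pTransform));
if (SUCCEEDED(hr))
{
    IMFAttributes *pAttrs = NULL;
    hr = pTransform->GetAttributes(&pAttrs);
    if (SUCCEEDED(hr))
    {
        // Unlock the asynchronous MFT; non-functional IGPU MFTs
        // typically fail right here.
        hr = pAttrs->SetUINT32(MF_TRANSFORM_ASYNC_UNLOCK, TRUE);
        pAttrs->Release();
    }
    pTransform->Release();
}
```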


    The problem described above can of course also happen with an AMD/Intel combination. It all depends on your particular mix of dedicated and integrated graphics hardware and on which operating system you are running. To summarize:

    If your CPU has an IGPU but your main graphics adapter is a dedicated graphics card, and the IGPU registers a hardware MFT while the dedicated card does not, then the enumerated MFT of the IGPU is not functional.

    As far as I knew, AMD and Intel have been installing their MFTs with their drivers since Windows 8.0. But from what I read in this thread, Intel apparently registers its encoder on Windows 7 as well. So if you are on Windows 7 with an Intel IGPU, but your main graphics adapter is a dedicated GPU from AMD or NVIDIA, then the Intel MFT won't work.


    Maybe you have a completely different problem here, but it sounded so familiar that I had to answer.

    Please post your specs, guys.



    Saturday, June 4, 2016 8:43 PM
    I forgot to add that you could simply install a Windows 10 partition somewhere on an unused drive to test this. A probably easier way, though, would be to build yourself a class-factory patch like the one mentioned in this post:

    All you need to do is determine the vendor of the primary DXGI graphics adapter and then compare it to the vendor of each MFT enumerated by MFTEnumEx. Just check case-insensitively for the strings "AMD", "INTEL" and "NVIDIA" in the MFT's friendly name, map that to an int/enum, and compare it to the primary DXGI adapter's vendor.

    This is an easy validation task.
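    The name-matching part is plain string logic; a minimal sketch (the function name and enum are my own, the numeric values are the well-known PCI vendor IDs that DXGI_ADAPTER_DESC::VendorId reports):

```cpp
#include <algorithm>
#include <cwctype>
#include <string>

// Well-known PCI vendor IDs, as reported in DXGI_ADAPTER_DESC::VendorId.
enum GpuVendor
{
    VENDOR_UNKNOWN = 0,
    VENDOR_AMD     = 0x1002,
    VENDOR_INTEL   = 0x8086,
    VENDOR_NVIDIA  = 0x10DE
};

// Map an MFT friendly name to a vendor by case-insensitive substring search.
// (Illustrative helper, not part of any API.)
GpuVendor VendorFromFriendlyName(std::wstring name)
{
    std::transform(name.begin(), name.end(), name.begin(), ::towupper);
    if (name.find(L"AMD")    != std::wstring::npos) return VENDOR_AMD;
    if (name.find(L"INTEL")  != std::wstring::npos) return VENDOR_INTEL;
    if (name.find(L"NVIDIA") != std::wstring::npos) return VENDOR_NVIDIA;
    return VENDOR_UNKNOWN;
}
```

    The result can then be compared against the VendorId field of the DXGI_ADAPTER_DESC of adapter 0.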



    Saturday, June 4, 2016 8:56 PM
  • Francis,  

    I am on a Win7 Laptop with built in Intel HD 3000 and also a Nvidia NVS 4200M display adapter.

    I assume the main graphics adapter is the built-in Intel one and the NVIDIA is used for external displays, so I'm not sure whether this exactly matches the situation you are outlining?
    Wednesday, June 8, 2016 6:34 PM
  • Adam,

    Normally, when there is a dedicated graphics card in your system, the IGPU gets disabled. This is the case at least on desktop systems. If you put a dedicated GPU into a desktop machine, most BIOSes don't just gray out the IGPU options but hide them entirely. It should be the same for notebooks, but I am not entirely sure about all cases there.

    Here, use this code to enumerate your graphics adapters and the displays attached to them (put the code in a header and call the function):

    #pragma once
    #include <windows.h>
    #include <fstream>
    #include <vector>
    #include <dxgi.h>
    #pragma comment(lib, "dxgi")
    #pragma comment(lib, "user32")
    #define VENDOR_AMD 0x1002
    #define VENDOR_INTEL 0x8086
    #define VENDOR_NVIDIA 0x10DE
    template <class T> inline void SafeRelease(T **ppT)
    {
    	if (*ppT)
    	{
    		(*ppT)->Release();
    		*ppT = NULL;
    	}
    }
    HRESULT EnumAdaptersAndDisplays(LPCWSTR pFilePath)
    {
    	UINT32 num_of_paths = 0;
    	UINT32 num_of_modes = 0;
    	UINT32 adapIndex = 0;
    	IDXGIAdapter1 *pAdapter = NULL;
    	IDXGIFactory1 *pFactory = NULL;
    	HRESULT hr = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&pFactory);
    	if (FAILED(hr)) return hr;
    	// Query the active display paths so displays can be matched to adapters by LUID.
    	if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &num_of_paths, &num_of_modes) != ERROR_SUCCESS)
    	{
    		SafeRelease(&pFactory);
    		return E_FAIL;
    	}
    	std::vector<DISPLAYCONFIG_PATH_INFO> paths(num_of_paths);
    	std::vector<DISPLAYCONFIG_MODE_INFO> pDisplayModes(num_of_modes);
    	if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &num_of_paths, paths.data(),
    		&num_of_modes, pDisplayModes.data(), NULL) != ERROR_SUCCESS)
    	{
    		SafeRelease(&pFactory);
    		return E_FAIL;
    	}
    	std::wofstream logFile(pFilePath, std::ios::app);
    	logFile << "-DXGI ADAPTER ENUMERATION-\n";
    	logFile << "(0 is the default adapter)\n\n";
    	while (pFactory->EnumAdapters1(adapIndex, &pAdapter) != DXGI_ERROR_NOT_FOUND)
    	{
    		DXGI_ADAPTER_DESC1 desc;
    		pAdapter->GetDesc1(&desc);
    		logFile << "ADAPTER " << adapIndex << " : " << desc.Description << "\n";
    		UINT32 dispIndex = 0;
    		for (UINT i = 0; i < num_of_modes; i++)
    		{
    			if (pDisplayModes[i].infoType != DISPLAYCONFIG_MODE_INFO_TYPE_TARGET)
    				continue;
    			if (pDisplayModes[i].adapterId.LowPart == desc.AdapterLuid.LowPart &&
    				pDisplayModes[i].adapterId.HighPart == desc.AdapterLuid.HighPart)
    			{
    				// Resolve the monitor's friendly name for this target.
    				DISPLAYCONFIG_TARGET_DEVICE_NAME devName = {};
    				devName.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_TARGET_NAME;
    				devName.header.size = sizeof(DISPLAYCONFIG_TARGET_DEVICE_NAME);
    				devName.header.adapterId = pDisplayModes[i].adapterId;
    				devName.header.id = pDisplayModes[i].id;
    				if (DisplayConfigGetDeviceInfo(&devName.header) == ERROR_SUCCESS)
    					logFile << "Display " << dispIndex << " = " << devName.monitorFriendlyDeviceName << "\n";
    				dispIndex++;
    			}
    		}
    		logFile << "\n";
    		SafeRelease(&pAdapter);
    		adapIndex++;
    	}
    	logFile << "__________________________\n\n";
    	SafeRelease(&pFactory);
    	return S_OK;
    }

    The function enumerates all graphics adapters and the displays belonging to them, and writes the results to a text file. The pFilePath parameter must be a full path including file name and extension (.txt).

    The NVIDIA NVS 4200M should be your default adapter.

    Post your result please.




    Saturday, June 11, 2016 11:55 AM
  • FYI the same data with pre-built tool: MediaFoundationDxgiCapabilities-x64.exe

    I have Nvidia card and onboard Intel adapter which is not connected to anything physically. Having it enabled in BIOS, I see it in Device Manager, via DXGI API, and I can leverage its H.264 capabilities.

    Saturday, June 11, 2016 7:26 PM
    Roman, why would someone build an extra tool around it? oO

    All the tool does is print out the structs from the enumeration (DXGI_ADAPTER_DESC and DXGI_OUTPUT_DESC). You can extend my posted code with five lines to add five values from those structs. I put it together in no time; it's a task of a few minutes.

    To your Intel onboard adapter :

    It's good to hear that you can still use an Intel IGPU when you have a dedicated graphics card in your system. I stated that an IGPU gets disabled when using a dedicated GPU because with AMD that is in fact the case. But it seems that with Intel it isn't, at least not in all cases (which is also what I stated above).

    My work machine is an AMD A10-7850K, and if you put a dedicated GPU into the system, all IGPU entries vanish from the ASUS BIOS. The IGPU is not usable in any way.



    Sunday, June 12, 2016 11:25 AM
    It's good to hear that you can still use an Intel IGPU when you have a dedicated graphics card in your system. I stated that an IGPU gets disabled when using a dedicated GPU because with AMD that is in fact the case. But it seems that with Intel it isn't, at least not in all cases (which is also what I stated above).

    Yes, that is exactly what I used to see: Intel's adapter is available to back the H.264 encoding MFT even though the adapter is otherwise unused (not connected to physical monitors, just enabled in the BIOS). This is the scenario I saw on several systems, and it is what I generally expect on others where I expect to find and use Intel QSV.

    Sunday, June 12, 2016 11:40 AM
    This might be because Intel has a dedicated block of transistors for encoding/decoding. NVIDIA and Intel have had an extra dedicated block on the die for decoding/encoding ever since they started offering H.264 hardware acceleration.

    AMD uses the names UVD and VCE for its decoding/encoding engines, but the computation was in fact shader-based up until Tonga/Fiji, while NVIDIA and Intel have an extra block of transistors for this purpose. It is a separate pipeline that is largely independent of the rest of the chip.

    Maybe that is why I can't use my IGPU when I have a dedicated GPU in the system. The A10-7850K has no Tonga IP; it is the last generation of GCN without dedicated transistors for encoding/decoding.

    Mmh, I expected the same behavior with Intel IGPUs, but I guess I was wrong. To be honest, I never read deeply into Intel's H.264 decoding/encoding, as I refuse to buy or use anything from this company. It was once a company of culture, with Noyce and Moore, but since the end of the '90s it has turned into a company without innovation and with questionable behavior, and I have been with AMD since then.

    Back to the topic of this thread:

    It seems it is possible to use Intel IGPUs for H.264 acceleration on Windows 7, even while having a dedicated graphics card in the system. The problem must be something else. Well, I answered this thread because the problem felt so similar to what I experienced with AMD IGPUs.



    Sunday, June 12, 2016 1:13 PM
  • fwiw, here is the output from my system:

    (0 is the default adapter)

    ADAPTER 0 : Intel(R) HD Graphics 3000
    Display 0 = 

    ADAPTER 1 : NVIDIA NVS 4200M    


    # System

     * Version: 6.1.7601, Windows 7, Service Pack 1.0, VER_SUITE_SINGLEUSERTS, VER_NT_WORKSTATION
     * Computer Name: `win7test1-PC`
     * User Name: `win7test1-PC\win7test1` 
     * Local Time: `6/13/2016 10:23:59 AM`
     * Architecture: AMD/Intel x64 (x64 Application)
     * Processors: `4`, Active Mask `0xF`
     * Page Size: `0x1000`
     * Application Address Space: `0x0000000000010000`..`0x000007FFFFFEFFFF`
     * Physical Memory: `3,979` MB
     * Committed Memory Limit: `7,956` MB
     * Application Version: ``

    # Display Devices

     * Intel(R) HD Graphics 3000
      * Instance: PCI\VEN_8086&DEV_0126&SUBSYS_21D017AA&REV_09\3&21436425&0&10
      * DEVPKEY_Device_Manufacturer: Intel Corporation
      * DEVPKEY_Device_DriverVersion:
     * NVIDIA NVS 4200M    
      * Instance: PCI\VEN_10DE&DEV_1057&SUBSYS_21D017AA&REV_A1\4&31FCFBB7&0&0008
      * DEVPKEY_Device_Manufacturer: NVIDIA
      * DEVPKEY_Device_DriverVersion:

    # DXGI 1.1 Capabilities

    ## Adapters

    ### Adapter: Intel(R) HD Graphics 3000

     * Vendor Identifier: 0x8086
     * Device Identifier: 0x0126
     * Subsystem Identifier: 0x21D017AA
     * Revision: 0x0009
     * Dedicated Video Memory: 64 MB
     * Dedicated System Memory: 0 MB
     * Shared System Memory: 1,632 MB
     * Adapter LUID: 0.00007CAB

    #### Output: \\.\DISPLAY1

     * Desktop Coordinates: (0, 0) - (1600, 900); 1600 x 900
     * Attached To Desktop: 1
     * Monitor: 0x00010001

    ### Adapter: NVIDIA NVS 4200M    

     * Vendor Identifier: 0x10DE
     * Device Identifier: 0x1057
     * Subsystem Identifier: 0x21D017AA
     * Revision: 0x00A1
     * Dedicated Video Memory: 977 MB
     * Dedicated System Memory: 0 MB
     * Shared System Memory: 1,733 MB
     * Adapter LUID: 0.00009BE3
    Monday, June 13, 2016 2:34 PM