
HDMI suddenly stopped working with Windows 11

Aurèle Blanchis 5 Reputation points
2025-11-15T10:13:03.9533333+00:00

Hello,

I have an issue with my computer: HDMI no longer works with Windows 11. I have an RTX 5070 Ti. There are two screens on DisplayPort and one TV on HDMI.

Everything had been working fine since I built this setup, but one day I simply disconnected the HDMI cable while my PC was turned on, and since then it has been impossible to display anything over HDMI.

This is not an issue with the cables, screens, or components; I tried them separately and they work fine. Moreover, I can display the BIOS over the HDMI cable, so the hardware path works. But once Windows boots, I get a black screen and no signal.

Windows doesn’t detect any screen on HDMI. DisplayPort works fine. I tried reinstalling the GPU drivers from scratch, removing several registry keys, and updating Windows 11, but nothing seems to work.

For example, I removed all entries under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers\Configuration. After a reboot, the screen connected over DisplayPort comes back to that registry key, but not the one connected over HDMI.
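Before deleting those keys again, it may help to take an inventory of what is there. Below is a minimal sketch (my own illustration, not something from this thread) that lists the per-display subkeys under that Configuration path; on non-Windows systems it simply returns an empty list:

```python
# Sketch: list the per-display subkeys under
# HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers\Configuration
# so they can be reviewed (or exported) before deleting anything.
# Windows-only; on other platforms the function returns an empty list.
import sys

def list_display_configs():
    if sys.platform != "win32":
        return []
    import winreg  # standard library, available on Windows only
    path = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers\Configuration"
    names = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path) as key:
        subkey_count = winreg.QueryInfoKey(key)[0]  # number of subkeys
        for i in range(subkey_count):
            names.append(winreg.EnumKey(key, i))
    return names

print(list_display_configs())
```

Each subkey name encodes the identity of a display configuration, so comparing the list before and after plugging in the HDMI screen shows whether Windows is creating an entry for it at all.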

Before resetting my entire Windows installation, does anyone have an idea that could help me?

Thank you !


6 answers

  1. Aurèle Blanchis 5 Reputation points
    2025-11-18T11:06:27.48+00:00

    On another forum, someone told me this:

    " Yes, a GPU can remember a monitor's EDID by reading it from the monitor and saving it to a file, which is a feature available on some professional-grade graphics cards, like NVIDIA Quadro and AMD FirePro, to force a specific EDID and overcome limitations. Standard consumer GPUs typically read the EDID dynamically each time the display is connected, and may have limitations in emulating it.

    How it works

    • Reading the EDID: When a monitor is connected, the GPU reads its EDID to learn about its supported resolutions, refresh rates, and other capabilities.

    • Saving and emulating: For professional cards, you can use software (like the NVIDIA Control Panel) to export the EDID from the display and save it to a file. The GPU can then be configured to load this file, effectively "emulating" the monitor's EDID even when it's not connected or the connection is faulty.

    • Limitations for consumer cards: Most consumer-grade GPUs (like GeForce cards) lack the built-in "EDID emulation" feature and cannot save and force a specific EDID file in the same way. They rely on dynamically reading the monitor's EDID and will not remember it.

    • Workarounds: For standard PCs, if the EDID is not read correctly, a workaround is to use a hardware EDID emulator or dummy plug to force a common resolution.

    • Default behavior: Without a working EDID, a GPU will typically default to a basic resolution like 1024x768, which may not be the monitor's native resolution and can cause image quality issues. "

    That’s exactly what I saw on my monitor: 1024×768 when my GPU was connected. So the GPU can remember the EDID.
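    For anyone who wants to check whether the data a dummy plug or capture tool reports is a plausible EDID, here is a minimal sketch (my own illustration, not from the quoted answer): a standard EDID base block is 128 bytes, starts with a fixed 8-byte header, and all 128 bytes must sum to a multiple of 256 thanks to the checksum byte at the end.

```python
# Minimal EDID sanity check (pure Python, no Windows APIs needed).
# An EDID base block is 128 bytes: a fixed 8-byte header, then data,
# then a checksum byte making the sum of all 128 bytes divisible by 256.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(block: bytes) -> bool:
    """Return True if `block` looks like a well-formed 128-byte EDID."""
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0
    )

# Build a dummy 128-byte block with a correct checksum for demonstration.
sample = bytearray(128)
sample[:8] = EDID_HEADER
sample[127] = (-sum(sample[:127])) % 256  # checksum byte

print(edid_is_valid(bytes(sample)))       # True
print(edid_is_valid(bytes(sample[:64])))  # False: truncated block
```

    A block that fails this check (bad header or checksum) would explain a GPU falling back to a safe default mode like 1024×768.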

    From what I understand, I need to use a dummy plug. I’ve already bought one and should receive it tomorrow, but I’m not sure it will support 4K, 120 Hz, VRR, and HDR10.

    Is there no other way to change the EDID? With a Windows tool or something else?

    Thank you.


  2. Aurèle Blanchis 5 Reputation points
    2025-11-18T09:24:27.96+00:00

    I ran some tests with several friends, and it turns out the issue is specifically between my GPU’s HDMI port and the three monitors that were connected during the incident.

    My GPU works perfectly with any other monitor, and my monitors work fine with any other GPU.

    No matter which PC I put my GPU in, as long as it’s connected to one of my three monitors, it will never work.

    My theory is this:

    Maybe my GPU “stored” the EDID information from my monitors and is now sending corrupted data to them. But I can’t find anything about this online — it’s like this issue has never happened to anyone else.

    (Quick reminder: when I boot, the monitor turns on (blue LED), and the panel shows a faint glow.)

    So I have many questions:

    • Is that even possible? Can a GPU remember a monitor’s EDID? Or can monitors remember a GPU?
    • Is there a way to “factory reset” a GPU using an NVIDIA tool?
    • Is there a safe way to change or reset a monitor’s EDID? Could it work?

    Thank you very much. At this point, I have no solution, and the warranty service will never believe that my GPU only fails with my monitors, so I’m in a difficult situation.

    Aurele


  3. Mario Bressan 3,190 Reputation points Independent Advisor
    2025-11-16T13:37:48.0333333+00:00

    Yes, that's probably it. I strongly recommend backing up your files first and then doing a clean reinstall of Windows to fix the issue, okay?


  4. Mario Bressan 3,190 Reputation points Independent Advisor
    2025-11-15T11:28:08.89+00:00

    Hello, my name is Mario, and I would be happy to assist you with your issue. I’m sorry for what happened!

    While reformatting (reinstalling) the operating system would likely solve the issue you're experiencing, I can assist you with some procedures to help you avoid such a drastic approach:

    Test with DDU (Display Driver Uninstaller):

    • Completely uninstall the video and audio drivers
    • Restart your computer
    • Reinstall fresh drivers from the official website

    If possible, test this graphics card on another computer.

    Please perform these tests and let me know if you're successful.

    Best regards,

    Mario B.


  5. Q&A Assist
    2025-11-15T10:13:18.6+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    It sounds like you've already tried several troubleshooting steps for your HDMI issue with Windows 11. Here are a few additional suggestions you might consider:

    1. Check Display Settings: Right-click on the desktop and select Display settings. Ensure that the HDMI display is detected. If not, try clicking on Detect to see if Windows can find the HDMI connection.
    2. Graphics Driver Update: Even though you've reinstalled the GPU drivers, ensure that you are using the latest version from the NVIDIA website specifically for your RTX 5070 Ti. Sometimes, using the manufacturer’s drivers can resolve compatibility issues.
    3. Windows Update: Ensure that your Windows 11 is fully updated. Sometimes, updates can fix underlying issues with hardware compatibility.
    4. Check Graphics Card Settings: Open the NVIDIA Control Panel and check if the HDMI output is enabled. Sometimes settings can change, and the output might not be set correctly.
    5. Reset Display Settings: If you can access the display settings through the DisplayPort, try resetting the display settings to default. This can sometimes resolve issues where the HDMI output is not recognized.
    6. BIOS Settings: Since you can see the BIOS, check if there are any settings related to graphics output that might need to be adjusted.
    7. Try Different HDMI Ports: If your graphics card has multiple HDMI outputs, try connecting to a different HDMI port to see if that resolves the issue.

    If none of these steps work, you may want to consider performing a system restore to a point when the HDMI was working, or as a last resort, resetting Windows 11.



