On another forum, someone told me this:
" Yes, a GPU can remember a monitor's EDID by reading it from the monitor and saving it to a file, which is a feature available on some professional-grade graphics cards, like NVIDIA Quadro and AMD FirePro, to force a specific EDID and overcome limitations. Standard consumer GPUs typically read the EDID dynamically each time the display is connected, and may have limitations in emulating it.
How it works
* **Reading the EDID:** When a monitor is connected, the GPU reads its EDID to learn about its supported resolutions, refresh rates, and other capabilities.
* **Saving and emulating:** On professional cards, you can use software (like the NVIDIA Control Panel) to export the EDID from the display and save it to a file. The GPU can then be configured to load this file, effectively "emulating" the monitor's EDID even when it's not connected or the connection is faulty.
* **Limitations for consumer cards:** Most consumer-grade GPUs (like GeForce cards) lack the built-in EDID-emulation feature and cannot save and force a specific EDID file in the same way. They rely on dynamically reading the monitor's EDID each time.
* **Workarounds:** For standard PCs, if the EDID is not read correctly, a workaround is to use a hardware EDID emulator or dummy plug to force a common resolution.
* **Default behavior:** Without a working EDID, a GPU will typically fall back to a basic resolution like 1024x768, which may not be the monitor's native resolution and can cause image-quality issues."
That’s exactly what I saw on my monitor: 1024×768 when my GPU was connected. So the GPU can remember the EDID.
From what I understand, I need to use a dummy plug. I’ve already bought one and should receive it tomorrow, but I’m not sure it will support 4K, 120Hz, VRR, and HDR10.
Is there no other way to change the EDID? With a Windows tool or something else?
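For what it's worth, Windows does support a software EDID override stored in the registry (tools such as Custom Resolution Utility rely on this mechanism), so a dummy plug is not the only route. One practical detail if you ever edit an exported EDID file by hand: byte 127 is a checksum, and the whole 128-byte block must sum to 0 mod 256 or the override will be ignored. A minimal Python sketch of recomputing it (the edited byte below is purely hypothetical):

```python
def with_fixed_checksum(edid: bytes) -> bytes:
    """Return the 128-byte EDID base block with byte 127 recomputed so
    that all 128 bytes sum to 0 modulo 256, as the EDID format requires."""
    assert len(edid) == 128, "base EDID block must be 128 bytes"
    block = bytearray(edid)
    block[127] = (-sum(block[:127])) % 256
    return bytes(block)

# Example: change one byte of an (otherwise zeroed, illustrative) block,
# then repair the checksum before saving the file for use as an override.
sample = bytearray(128)
sample[54] = 0x3A                      # hypothetical edit to a descriptor byte
patched = with_fixed_checksum(bytes(sample))
print(sum(patched) % 256)              # prints 0
```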
Thank you.