Graphic Cards :: Second Monitor Shows Black Screen - Cannot Display This Video Mode
Dec 3, 2015
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7 both monitors were detected and used. How can I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
Since upgrading to Windows 10 from 8.1, I've seen the following message several times.
Display driver stopped responding and has recovered. Display driver Intel HD Graphics Drivers for Windows 8(R) stopped responding and has successfully recovered.
When it happens, my laptop screen goes black for a few seconds, except for the taskbar and toast notification. I also have an external monitor that doesn't seem affected at all.
I can reproduce the error pretty consistently if I rapidly switch a Flash video back and forth between full-screen and embedded, but that's not the only time I've seen the problem.
As far as I can tell, everything continues running. Even the full-screen video continues running after the screen isn't black any more.
The weird thing is that the driver name it shows isn't the driver I'm using. I've already upgraded my display drivers to the latest available from Intel's site.
Some things I've observed:
- The information presented by Windows in Settings -> Display -> Advanced display settings shows the driver for Windows 10 that I downloaded from Intel.
- The "Intel HD Graphics Control Center" that was installed along with the driver shows the latest driver version that I downloaded from Intel.
- In Settings -> Apps & Features there's only one "Intel(R) Processor Graphics Driver" listed.
- Event Viewer shows very little useful information:
General: Display driver igfx stopped responding and has successfully recovered.
Details (XML View):

<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Display" />
    <EventID Qualifiers="0">4101</EventID>
It's conceivable that Intel updated the driver, but forgot to change the string somewhere so the error is actually for the Windows 10 driver. I don't know how to find out exactly what failed or why. It's getting that string from somewhere.
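For anyone digging further: Event ID 4101 from the "Display" provider is Windows' generic "driver stopped responding and has recovered" (TDR) event, which would be consistent with the driver name string coming from the event payload rather than from the installed driver package. As a minimal sketch, the fields can be pulled out of the XML view programmatically; the snippet below reconstructs the truncated fragment above as well-formed XML purely for illustration:

```python
import xml.etree.ElementTree as ET

# Well-formed reconstruction (for illustration) of the Event Viewer
# fragment above; the namespace is the one Windows emits in XML views.
EVENT_XML = """\
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Display" />
    <EventID Qualifiers="0">4101</EventID>
  </System>
</Event>"""

NS = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}
root = ET.fromstring(EVENT_XML)
provider = root.find("e:System/e:Provider", NS).get("Name")
event_id = int(root.find("e:System/e:EventID", NS).text)
print(provider, event_id)  # → Display 4101
```

This only confirms which event fired; the driver string itself lives in the event's message data, so exporting the full event (right-click -> Save Selected Events) is the way to see exactly where that string originates.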
So, my old 670 burnt out in a weird way: it would do basic video, but only if Nvidia drivers weren't installed; if they were installed, the system with that card was a brick. I was using a 550 while waiting on shipping; it works fine, but obviously I can't run good settings with it. My new Gigabyte GTX 960 Windforce 4GB just arrived. I let it warm up to room temperature since it had been outside in the cold for a little bit, then plugged it in. Through HDMI and both DVI ports the only output is a blank screen with a " _ " at the top and nothing else. It looks like a prompt but does nothing. It doesn't even show the BIOS as far as I can tell. Even when the old 670 had a problem, it at least did basic video and showed the BIOS.
- Windows 10 64-bit
- Intel i7 3770 @ 3.4 GHz
- Gigabyte G1.Sniper M3, BIOS F10f
- Gigabyte GTX 960 Windforce 4GB
- HP Pavilion 27xi 27"
The monitor is connected to the computer via HDMI and we can see Windows on the screen, but Windows sprawls beyond the physical edges of the screen in every direction. That means we see only the top little bit of the taskbar; the Windows button is beyond the left edge of the screen and the clock is beyond the right edge. I've tried a few different screen resolutions but the image ALWAYS sprawls so that all four edges are lost. What do I need to do so that this sprawling stops?
I have an MSI GS60 Ghost Pro gaming laptop with Intel HD Graphics 4600 and an Nvidia GTX 970M GPU, CPU i7 4720HQ, running Windows 10. I study abroad; several months ago I went back home and plugged my laptop into the TV via HDMI for gaming like normal. Yesterday I bought an ASUS VC239H external monitor and tried to do the same thing, but there is no HDMI input signal; the laptop cannot detect the monitor (in Display Settings, Intel HD Graphics settings, or Nvidia settings). This is what I tried:
- Fn+F2 and Fn+F8 keys, and Windows+P switching to duplicate, extend, and every other option
- Plugged the monitor into my friend's laptop (Windows 8) through the same HDMI cable and it works fine; same with another laptop running Windows 7, so the cable and monitor are fine
- Plugged my laptop into my neighbor's smart TV. A "connected" symbol appeared on the TV, but the screen stayed black and the laptop couldn't detect the TV
- Reinstalled the Nvidia and onboard graphics drivers and updated to the latest versions
- Did a clean install of Windows; tried Windows 8 as well, same result
- Updated my BIOS and EC
- Bought an HDMI-to-Mini-DisplayPort adapter to use the Mini DisplayPort with the HDMI cable
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it higher in the Nvidia control panel, it just reverts back to 60Hz after applying changes. I updated my graphics drivers (though I was only one version behind) and even sought a special ASUS driver for the monitor to resolve the issue, but nothing has worked.
Did Windows recently add a forced refresh-rate sync between all monitors? If so, that really sucks, because I only own one 144Hz monitor.
I installed Windows 10 and found some apps did not fit the monitor screen (too big). I went in and changed the monitor resolution to a larger size (Windows recommended 1920 x 1080, but I tried something larger), and now the monitor has a black screen with a little box moving around that says "input not supported". Not being able to use that monitor to go in and change it back, I hooked up a flat-screen TV, which worked, but Windows only showed that screen's size (which was smaller than the original monitor's).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine; it shows the resolution as 1920 x 1080. When I plugged it back into the original computer I get the same black screen with "input not supported", so I can't change it there. I suspect the issue is in the registry somewhere in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor back (Acer K272HL), and I promise I won't mess around with it again.
I am cleaning out and setting up an old-ish PC, and whenever I connect it to my TV or monitor it either says "No Input to TV" or is just black when I turn it on. I am using VGA to connect it. I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any idea why?
So I recently had to reinstall Windows 10 to resolve an issue I was having, and my Nvidia drivers were deleted in that process. When I went to turn my computer back on after I was finished, there was no display on the screen, so I took out my GPU, plugged the VGA cable into the integrated graphics, and it worked fine. That made me think I just have to install the Nvidia drivers again, but then I found out that you need the GPU in your computer to install the drivers, and my display won't work anymore with the GPU inside and no drivers for it.
I have temporarily connected a 22-inch LCD TV monitor (until I get a new one) to the PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings, but none are any good apart from one that makes the desktop look like a laptop screen.
I have reset Windows 10, updated the Intel graphics drivers to the latest, and tried a VGA lead as well, but the resolution still cannot be set correctly for the monitor.
Is it because you need a proper PC monitor to get the correct resolution?
Have a user with a Sony Vaio laptop, 15" screen, 1920x1080. They also plug an HDMI cable into a 23" Dell widescreen monitor for a second display; this 23" monitor is also 1920x1080. They are complaining that fonts on the second monitor are blurry. They tried to live with it for a week, but they're asking: can it be made clear again like it was when they had Windows 7?
Are you familiar with fonts being blurry on only one monitor when both are set to their native resolution?
I'm asking them to change the font DPI to 100%, log off and back on, and see if it's any better. For whatever reason that seemed to work on my laptop (another Sony Vaio), but my concern is they will be disappointed with legibility on the laptop screen (it makes everything tiny).
I have a Dell XPS 13 (2016) with a Dell USB-C adapter. It will not detect my NEC LCD external monitor. I have tried a Dell USB-C adapter and a D-SUB cable to the monitor. I have also tried a Monoprice USB-C adapter with a Monoprice HDMI cable. Neither work. The monitor works fine with my Mac. I have told the computer to extend and duplicate the display using the Windows-P command. I have tried to update the driver for the monitor (I have NEC's driver), but it will not install because the monitor is not detected.
So I bought a new monitor, the MG279Q, and tried plugging it into my PC via DisplayPort. The monitor comes with a DP-to-miniDP cable (from the GTX 780's DP to the monitor's miniDP), and the monitor gives no signal. However, the computer does recognize the monitor and, for example, plays the new-device-attached sound. I can also see the device described as "Generic PnP Monitor", and I can see it in Windows' own display control panel plus the Nvidia Control Panel (pics at the end). But it won't allow me to enable the monitor.
I suspected that the cable might be broken, so I tested it with my UX32LN (GeForce 840M) laptop, which has a miniDP port. I plugged it into the monitor (from the laptop's miniDP to the monitor's normal DP; the monitor has both DP and mDP) and, wow, an image straight away. So the cable should be fine, I guess. But this is a somewhat different situation, as I'm using a different port on the monitor compared to my desktop's GTX 780, and I don't have other computers or gadgets I could use to test the monitor's miniDP port.
Next I plugged the monitor in via HDMI to my desktop, and that worked just fine; it recognizes the monitor without problems. Using my old XL2411Z monitor with DVI plus the MG279Q with DP, or the MG279Q with HDMI and DP at the same time, I can see the MG279Q connected twice, but it only chooses the HDMI connection and doesn't let me change it to DP even though it shows the connection.
Here are some pics to explain it better:
The multiple-panel settings window shows the MG279Q at the top (under GTX780, the one not checked), but it won't let me click it to use it. Sometimes it lets me tick it for a second, then instantly unticks the box. Monitor no. 2 is the MG279Q via HDMI.
The Surround+PhysX window shows that I've connected the monitor via DP, but it is greyed out. It also won't let me make a Surround set with the XL2411Z + MG279Q over DP.
Changing resolution settings only displays the HDMI connected monitor.
So I tried uninstalling the Nvidia drivers and freshly reinstalling them; that did not work at all. Neither did uninstalling the Generic PnP Monitor so the computer would reinstall the display's drivers. I also did a fresh install from Win7 Pro to Win10 Pro, and no difference (I intended to do that later anyway, so I thought I might as well check; my laptop is running Win10 and its DP worked, after all). I also went into the BIOS and checked that the iGPU is disabled and the monitor setting is set to PCIE instead of Auto; no use. I would also test the GTX 780's DP on another desktop computer, but I can't. Anyway, I somewhat suspect a software/hardware compatibility issue, since the monitor is being recognized; I doubt that the monitor's miniDP or my graphics card's DP port is broken.
What should I do or try? I really want to run the monitor via DP so I can utilize the 144Hz refresh rate.
Since updating to Windows 10 I have a display settings issue. I set my display setting to 1920x1080 (recommended) apply and check 'keep settings'. When I shut down and restart the display has changed to 1366x768. Happens every time I restart.
ATI Mobility Radeon HD 4200 series
AMD Radeon HD 6300M series
For one, the ATI Mobility card has no drivers installed. I've been on the AMD website, where I downloaded an old legacy Catalyst suite built for Windows Vista/7; after installing it, there are still no drivers for this card.
For the AMD Radeon, I'm getting "Windows has stopped this device because it has reported problems. (Code 43)".
I'm starting to believe the problem with the 6300M card comes from the 4200 card not having any drivers.
I've uninstalled and reinstalled the drivers for the 6300 card, and can only get the code 43 to go away if I disable and then re-enable it through the Device Manager. After a restart though, the problem persists.
My display driver kept Crashing my Computer, so I updated it with the latest driver for Windows 10.
Now I keep getting a message saying 'Scene Selection Has Changed', which annoyingly pops up on the screen at regular intervals. (How can I stop this before it drives me crazy? Too late, it already has!)
Then I get another 'Pop Up' which says 'Display Driver Stopped Responding but has now recovered' and is something else that I don't need to know (or maybe I do?)
If that message pops up a couple of times, it Crashes the Computer which then re-boots itself (sometimes!) or else gives me a 'Black Screen'
My Nvidia graphics card is virtually brand new and always worked faultlessly with XP and Windows 7.
Could it be the Updates we keep getting or something else?
When I connect DVI or HDMI from my GTX 780 to my Acer H274HL monitor, everything works fine, but when I connect a DisplayPort cable to my GTX 780 with HDMI on the monitor end, there is no signal at all.
I am running Windows 10 64-bit Enterprise; all drivers are up to date, and so are Windows updates.
It worked fine in Windows 7, but now when I choose "duplicate display" from the side menu popup, the TV goes blank as if I've unplugged the HDMI cable. All three of the remaining display options work, including extend display and TV only, so it's not a cable issue. The laptop model is XPS 17 L702X. In Windows 7, sometimes it wouldn't autodetect, so I would have to right-click the desktop, open the Nvidia panel, and click "rigorous display detection" or something like that; then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now the Nvidia panel simply won't allow mirroring: nothing happens when I right-click on the display box that would normally show 1 with a smaller box saying 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
My computer currently has 3 monitors connected to it: 2 through the graphics card (R9 280), 1 through the integrated graphics (AMD HD 3000). Not sure when, why, or how, but the integrated graphics can never pick up what type of monitor is connected to it, so it always gives me a bunch of generic 4:3 resolutions.
I've tried uninstalling the drivers and letting them reinstall. I've tried swapping the cables around to see if I could get lucky with one of the other screens, but they all have the same issue. I've tried buying brand-new cables across the board, because sometimes people say that's the problem. But no success.
The closest thing I have is apparently a custom resolution setter (PowerStrip), but it was built with Windows 7 in mind and doesn't seem to work on Windows 10.
My question is: is there any way to fix the list so I can select the correct resolution? It's a Toshiba TV/monitor, and it seems like everything else attached to my PC (even third-party programs I tried to use to forcibly change the resolution) can spot it and hand me back the correct resolution, but I can't seem to change it. I get "Generic PnP Monitor" through the Control Panel (the only place the monitor shows up), and even then the correct driver doesn't show; it's the Microsoft generic driver.
Mobo: MSI AMD 3000 series; can't find the exact model.
I've updated my AMD drivers to the latest and so far can only set the resolution to stock ones that look horrid on the screen. Is there any program in Windows 10 to change the resolution of a screen?
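For what it's worth, a "Generic PnP Monitor" that only offers generic 4:3 modes usually means Windows never read the display's EDID, the block of bytes the monitor sends to advertise its native mode. As a minimal sketch (not a fix, just to show what's missing), here is how the preferred resolution is encoded in the first detailed timing descriptor of a standard 128-byte EDID block, assuming you can dump the raw bytes with some EDID utility; the sample descriptor below is the canonical 1080p one, used purely for illustration:

```python
def preferred_mode(edid: bytes):
    """Parse the first detailed timing descriptor (offset 54) of a
    128-byte EDID block and return the preferred resolution.

    If Windows shows only generic modes, it likely never received
    (or couldn't read) these bytes from the monitor.
    """
    d = edid[54:72]                    # first 18-byte descriptor
    pixel_clock = d[0] | (d[1] << 8)   # in 10 kHz units, little-endian
    if pixel_clock == 0:
        return None                    # not a timing descriptor
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, pixel_clock * 10  # width, height, kHz

# Sample: the standard 1920x1080@60 descriptor (148.5 MHz pixel clock).
edid = bytearray(128)
edid[54:72] = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38,
                     0x2D, 0x40, 0x58, 0x2C, 0x45, 0x00,
                     0x00, 0x00, 0x00, 0x00, 0x00, 0x1E])
print(preferred_mode(bytes(edid)))  # → (1920, 1080, 148500)
```

If a dump of the TV's EDID parses cleanly like this but Windows still shows generic modes, the problem is on the cable/port/driver path rather than the monitor itself.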
Ok so I recently updated to Windows 10 but now when I connect my laptop to my external monitor the laptop screen doesn't show up on the external monitor.
My laptop's screen is broken, so I can't see anything. The laptop output still goes to the built-in screen immediately after updating, so I could blindly work my way through the update screens, but someone would have to upload pictures of the screens that appear immediately after updating to Windows 10 so I know what I'm clicking.