Graphic Cards :: Monitor Goes Dark Blue / Green And Returns To Normal Saying Photoshop Has Stopped Working
Feb 10, 2016
Surfing the web like normal with Photoshop in the background (not doing anything in it yet). The monitor goes dark blue/green and returns to normal, saying Photoshop has stopped working, with a message in the bottom right corner saying the driver had failed and then recovered.
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it back to 144Hz in the Nvidia control panel, it just reverts to 60Hz after applying the changes. I updated my graphics drivers (though I was only one version behind) and even sought out a special ASUS driver for the monitor to resolve the issue, but nothing has worked.
Did Windows recently add a forced refresh rate sync between all monitors? If so, that really sucks because I only own one 144Hz monitor.
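For what it's worth, when a setting "reverts" like this it usually means the 144Hz mode is no longer in the mode list Windows exposes for that monitor in the multi-monitor configuration. A minimal sketch of the selection logic, using a hypothetical mode list (on Windows the real list would come from `user32.EnumDisplaySettingsW` via `ctypes`; everything below is illustrative):

```python
# Sketch: pick the highest refresh rate a display reports at a given
# resolution. The mode list here is hypothetical; on Windows you would
# fill it by enumerating modes with EnumDisplaySettingsW via ctypes.

def best_refresh(modes, width, height):
    """Return the highest refresh rate offered at the given resolution,
    or None if that resolution is not in the mode list."""
    rates = [hz for (w, h, hz) in modes if (w, h) == (width, height)]
    return max(rates) if rates else None

# Modes as a driver might report them for a 144Hz panel:
reported = [(1920, 1080, 60), (1920, 1080, 120), (1920, 1080, 144),
            (1280, 720, 60)]

print(best_refresh(reported, 1920, 1080))  # 144
```

If the same check against the real mode list only ever turns up 60Hz with a second monitor attached, the driver (not a Windows policy) is withholding the mode.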
I installed Windows 10 and found some apps did not fit the monitor screen (too big). I went in and changed the monitor resolution to a larger size (Windows recommended 1920 x 1080, but I tried something larger), and now the monitor has a black screen with a little box moving around that says "input not supported". Not being able to use that monitor to go in and change it back, I hooked up a flat-screen TV, which worked, but Windows only showed that screen's sizes (which were smaller than the original monitor's).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine; it shows the resolution as 1920 x 1080. When I plugged it back into the original computer I got the same black screen with "input not supported", so I can't change it there. I suspect the issue is with the registry somewhere in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor back (Acer K272HL), and I promise I won't mess around with it again.
I am cleaning out and setting up an old-ish PC, and whenever I connect it to my TV or monitor it either says "No Input to TV" or the screen just stays black when I turn it on. I am using VGA to connect it; I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any reason why?
I have temporarily connected a 22-inch LCD TV monitor (until I get a new one) to the PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings, but all are no good apart from one, which makes the desktop look like a laptop screen.
I have reset Windows 10 and updated the Intel graphics drivers to the latest, but the resolution still cannot be set correctly for the monitor. I have used a VGA lead as well.
Is it because you need a proper PC monitor to get the correct resolution?
Have a user with a Sony Vaio laptop, 15" screen, 1920x1080. They also plug an HDMI cable into a 23" Dell widescreen monitor for a second display. This 23" monitor is also 1920x1080. They are complaining that fonts on this second monitor are blurry. They tried to live with it for a week, but they asked: can it be made clear again, like it was when they had Windows 7?
Are you familiar with the possibility of fonts being blurry on only one monitor if both are set to their native resolution?
I'm asking them to try changing the font DPI to 100% and logging off and on, to see if it's any better. For whatever reason that seemed to work for my laptop (another Sony Vaio), but my concern is that they will be disappointed at the legibility on the laptop screen (it makes everything tiny).
I have a Dell XPS 13 (2016) with a Dell USB-C adapter. It will not detect my NEC LCD external monitor. I have tried the Dell USB-C adapter with a D-SUB cable to the monitor, and I have also tried a Monoprice USB-C adapter with a Monoprice HDMI cable. Neither works. The monitor works fine with my Mac. I have told the computer to extend and duplicate the display using the Windows+P command. I have tried to update the driver for the monitor (I have NEC's driver), but it will not install because the monitor is not detected.
I'm having this frequent problem with my desktop computer. Since updating from Windows 7 to Windows 10, every time I use my PC (at random times) the whole screen will just go blank and I'll get a little popup message in the right-hand corner saying:
"Display driver stopped responding and has recovered. Display driver AMD driver stopped responding and has successfully recovered."
I don't use my PC for anything really heavy; it just randomly happens, whether from scrolling up and down a website page or watching a YouTube video, etc. I've tried upgrading my graphics driver, but still no luck.
Computer type: PC/Desktop; System Manufacturer/Model Number: Dell Inspiron One; OS: Windows 10; CPU: AMD Athlon II X2 240; Motherboard: Dell 0DPRF9; Memory: 4GB DDR3; Graphics Card(s): AMD Mobility Radeon HD 5000 Series
It worked fine in Windows 7, but now when I choose "duplicate display" from the side menu popup, the TV goes blank as if I've unplugged the HDMI. All 3 of the remaining display options work, including the extend display and TV-only settings, so it's not a cable issue. The laptop model is an XPS 17 L702X. When I used to do this in Windows 7, sometimes it wouldn't autodetect, so I would have to right-click the desktop, open the Nvidia panel, and click "rigorous display detection" or something like that, and then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now the Nvidia panel simply won't allow mirroring; nothing happens when I right-click on the display box that would normally show 1 next to a smaller box showing 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
My computer currently has 3 monitors connected to it: 2 through the graphics card (R9 280) and 1 through the integrated graphics (AMD HD 3000). I'm not sure when, why, or how, but the integrated graphics can never pick up what type of monitor is connected to it, so it always gives me a bunch of generic 4:3 resolutions.
I've tried uninstalling the drivers and letting them reinstall. I've tried swapping around the cables to see if I could get lucky with one of the other screens, but they all have the same issues. I've tried buying brand new cables across the board, because sometimes people say that's the issue. But no success.
The closest thing I have found is a custom resolution tool (PowerStrip), but it was created with Windows 7 in mind and doesn't seem to work on Windows 10.
My question is, is there any way to fix the list so I can select the correct resolution? It's a Toshiba TV/monitor, and it seems like everything else attached to my PC (even third-party programs I tried to use to forcibly change the resolution) can spot it and hand me back the correct resolution, but I can't seem to change it. I get "Generic PnP" through the Control Panel (the only place the monitor shows up), and even there the correct driver doesn't show; it's the Microsoft generic driver.
Mobo: MSI AMD 3000 series; can't find the exact model.
I've updated my AMD drivers to the latest, and so far I can only set the resolution to the stock ones that look horrid on the screen. Is there any program in Windows 10 to change the resolution of a screen?
I have an MSI GS60 Ghost Pro gaming laptop with Intel HD Graphics 4600 and an Nvidia GTX 970M GPU, CPU i7 4720HQ, running Windows 10. I study abroad; several months ago I went back home and plugged my laptop into the TV via HDMI for gaming as normal. Yesterday I bought an ASUS VC239H external monitor and tried to do the same thing, but there is no HDMI input signal; the laptop cannot detect the monitor (including in Display Settings, the Intel HD Graphics settings, and the Nvidia settings). This is what I tried:
- Fn+F2 and Fn+F8 keys, and Windows+P switching to duplicate, extend, or any other option
- Plugged the monitor into my friend's laptop (Windows 8) through that HDMI cable and it works fine; the same with another laptop running Windows 7, so the cable and monitor are normal
- Plugged my laptop into my neighbor's smart TV: a "connected" symbol appeared on the TV, but there was a black screen instead of an image, and the laptop couldn't detect the TV
- Reinstalled the Nvidia and onboard graphics drivers and updated them to the latest
- Made a clean install of Windows; tried Windows 8 also, with the same result
- Updated my BIOS and EC
- Bought an HDMI-to-Mini DisplayPort adapter to use the Mini DisplayPort with the HDMI cable
Ok so I recently updated to Windows 10 but now when I connect my laptop to my external monitor the laptop screen doesn't show up on the external monitor.
My laptop's screen is broken, so I can't see anything. The laptop is still on the screens that appear immediately after updating, so I could blindly work my way through them, but someone would have to upload pictures of the screens that appear immediately after updating to Windows 10.
Every time I restart or boot up my PC with my monitors turned on first, I get an issue with the monitor connected to the HDMI port: it always says "Input not supported". All I do is turn the monitor off and then back on, and the issue is resolved...
Since upgrading to Windows 10 from 8.1, I've seen the following message several times.
"Display driver stopped responding and has recovered. Display driver Intel HD Graphics Drivers for Windows 8(R) stopped responding and has successfully recovered."
When it happens, my laptop screen goes black for a few seconds, except for the taskbar and toast notification. I also have an external monitor that doesn't seem affected at all.
I can reproduce the error pretty consistently if I rapidly switch a Flash video back and forth between full-screen and embedded, but that's not the only time I've seen the problem.
As far as I can tell, everything continues running. Even the full-screen video continues running after the screen isn't black any more.
The weird thing is that the driver name it shows isn't the driver I'm using. I've already upgraded my display drivers to the latest available from Intel's site.
Some things I've observed:
The information presented by Windows in Settings -> Display -> Advanced display settings shows the driver for Windows 10 that I downloaded from Intel.
The "Intel HD Graphics Control Center" that was installed along with the driver shows the latest driver version that I downloaded from Intel.
In Settings -> Apps & Features there's only one "Intel(R) Processor Graphics Driver" listed.
Event Viewer shows very little useful information:
General: Display driver igfx stopped responding and has successfully recovered.
Details (XML View):
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Display" />
    <EventID Qualifiers="0">4101</EventID>
[Code] ....
It's conceivable that Intel updated the driver, but forgot to change the string somewhere so the error is actually for the Windows 10 driver. I don't know how to find out exactly what failed or why. It's getting that string from somewhere.
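One way to pin down what Windows is actually reporting, independent of the toast's string, is to export the event from Event Viewer and pull the fields out of the XML directly. A small sketch against a reconstructed (and closed-off, so it parses) version of the snippet above; the XML body is abbreviated relative to a real exported event:

```python
# Sketch: extract the provider name and event ID from an Event Viewer
# XML export, to confirm this is display-timeout event 4101 regardless
# of what driver name the notification text shows.
import xml.etree.ElementTree as ET

xml_text = """
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="Display" />
    <EventID Qualifiers="0">4101</EventID>
  </System>
</Event>
"""

# The event XML lives in a default namespace, so lookups must qualify it.
ns = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}
root = ET.fromstring(xml_text)
provider = root.find("e:System/e:Provider", ns).get("Name")
event_id = int(root.find("e:System/e:EventID", ns).text)

print(provider, event_id)  # Display 4101
```

The provider here is the generic "Display" source, which is consistent with the theory that the string in the toast is stale metadata rather than the driver that actually failed.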
I'm running my audio via HDMI from my graphics card and it plays through the speakers built into my monitor.
When I first did the Windows 10 update I had no audio whatsoever, and I followed some suggestions on online forums that led me to disable the Realtek High Definition Audio device. Upon a reboot, my HDMI audio was working.
But then I had another weird issue. My PC is set to never sleep, as I use it as a media server, but my screens turn off automatically after 10 minutes. What I'm finding is that when I wake up my monitors, the audio stops. When I view playback devices I can't run any tests, because "the device is already in use by another application".
(Note: I have the checkbox for allowing an application to take exclusive control of a device unchecked.)
So far there are only 2 ways to restore audio: 1) restart the PC, or 2) go into Device Manager > Sound, video and game controllers > AMD High Definition Audio Device > Disable, wait a few moments, then Enable.
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors; under Windows 7 both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
This is probably the most painful issue I have with Windows 10 right now (and likely previous versions as well, but I didn't have a multi-monitor setup back then).
The monitors I have are as follows:
- 3840x2160 (4K UHD) monitor with preferred DPI: 144 (150%)
- 1920x1080 (Full HD) monitor with preferred DPI: 96 (100%)
Whenever one of these monitors is set as primary, all desktop applications displayed on the secondary monitor (it doesn't matter which) have blurry text. The exceptions are Windows Store apps like the Windows Store and Microsoft Edge, along with the taskbar/Start menu, the taskbar/Start menu settings screen, the taskbar context menu, and the desktop context menu, which pass the DPI test with flying colours, with crisp text on both monitors (occasionally a DPI-switch bug gets in, but I can mostly ignore that). The problem, as you can probably guess, is that >99% of the applications I use aren't Windows Store apps.
Here are some screenshots. The "Taskbar and Start Menu Properties" text is what the text should look like, while the Visual Studio 2015 text is an example of the text most desktop apps get. The blurry image is what happens when the UHD monitor is not the primary monitor. Attachment 48493, Attachment 48494. Note: both of these screenshots came from the 150% DPI monitor, so they're best viewed at that (144) DPI level. The 96 DPI monitor is similarly affected.
Things I've already tried:
- Reinstall the graphics driver. Did this multiple times, in fact, for unrelated reasons.
- Reinstall Windows 10 (through the Reset This PC recovery option). I did this for also-unrelated reasons, but it definitely doesn't fix this issue.
- Use the XP Explorer "fix". Merely worsens the problem.
- Adjust the ClearType options. Alleviates the issue a bit, but see the next point.
- Disable ClearType on the affected monitor. The text obviously sharpens, but it's painful to read, and a close inspection of the text reveals the issue isn't solved at all, only mitigated slightly.
- Replace the video card. I swapped in my older GTX 560 Ti, but it's obvious the problem remains. Both it and my current card are NVIDIAs though, so it's vaguely possible the drivers or the cards themselves are the cause. I don't have an ATI/AMD card (that still works, at least) to test the setup, and every Intel iGPU I have either has only one monitor output or is incapable of handling UHD resolutions.
Things I won't try:
- Setting both monitors' DPI to 96. Text would become microscopic considering the UHD monitor's actual size.
- Using the text resizing feature instead. I'm going to take a wild guess that this is not monitor-specific and would cause everything on the HD monitor to be far too large, to the point that I'd rather unplug it.
Is anyone running a multi-resolution, multi-DPI, multi-monitor setup, with or without this issue? The text is painful to read on whichever is the secondary monitor right now, and it's extremely apparent whenever the background is dark.
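For context on the setup above: the blur comes from bitmap scaling. A desktop app that isn't per-monitor DPI aware is rendered once at the primary monitor's DPI and then stretched or shrunk on the other display, and any factor other than 1.0 means resampled text. A rough sketch of the factors involved, using the DPI values from this setup:

```python
# Sketch: the bitmap-scaling factor applied to a non-per-monitor-DPI-aware
# app when it is shown on a monitor other than the one it was rendered for.
# A factor of 1.0 means pixel-perfect; anything else means resampled text.

PRIMARY_DPI = 144    # UHD monitor at 150% scaling
SECONDARY_DPI = 96   # Full HD monitor at 100% scaling

def stretch_factor(rendered_dpi, shown_dpi):
    """Factor by which the rendered bitmap is scaled on the showing monitor."""
    return shown_dpi / rendered_dpi

# App rendered for the 144 DPI primary, shown on the 96 DPI monitor:
print(stretch_factor(PRIMARY_DPI, SECONDARY_DPI))  # about 0.667, downscaled
# Swap primaries: rendered at 96 DPI, shown on the 144 DPI monitor:
print(stretch_factor(SECONDARY_DPI, PRIMARY_DPI))  # 1.5, upscaled
```

Either direction is a non-integer resample, which is why swapping which monitor is primary only moves the blur rather than fixing it.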
Just unboxed my computer from CyberpowerPC (GTX 720 graphics card), installed/updated all drivers with Driver Easy (paid for just now), and I am still having an issue with the HDMI port not working. I have tried to plug into a monitor and a TV; neither is working. I have also tried multiple HDMI cables; still no luck. My monitor has auto-switch source detection turned on; when I plug my computer into the monitor with HDMI, it continually cycles, finds no signal, and goes into sleep mode, then rinses/repeats until I either turn off the computer or the monitor, or just unplug the monitor from the computer. Currently I am using DVI (computer) to HDMI (TV) and it's working fine, so I know it's the port on the computer. Do I need to contact CyberpowerPC to send the computer back to get this issue resolved?
I have a quite good PC (i7, 16GB RAM, Nvidia Quadro K600 graphics and on-board Intel HD Graphics 4600, newest drivers). I used a 1920x1200 monitor without any issues. Now I have a UHD monitor (Dell P2715Q) which works perfectly, but I have one strange issue: File Explorer stopped working. When I try to start File Explorer, nothing happens for a few seconds (the PC's fans work louder for a while) and then Windows Explorer restarts. And that's all. It's the same on both graphics cards. The new monitor is connected via DisplayPort; for the old one I used a DVI connection.
When I connect my old monitor File Explorer works as usual (without restarting the PC, just switching monitors).
I just installed Windows 10. Now my laptop monitor won't work; only the external monitor does. I go to the graphics settings and click on multiple displays, but it states it doesn't see another device, only the external monitor. I tried disconnecting the external monitor and rebooting, but the laptop display still doesn't turn on.
The Windows Photo Viewer was perfect for me in Windows 7, so I got it working on Windows 10 too. The new Photos app that came with Windows 10 is useless to me because it doesn't zoom, or at least I can't figure out how as of yet. I just want to keep things simple with the Windows Photo Viewer.
All was OK until I calibrated my monitors & at about the same time there was a Windows update. After calibration the monitors looked much better. But when I opened the Windows Photo Viewer I noticed all photos looked extremely dark. All other photo viewing software shows the same photos normal. I just assumed it was the Windows update that caused this but it wasn't that.
Today I have figured it out that the problem is Windows Photo Viewer software is somehow clashing with my monitor color profile. If I remove the .icc profile, the photos look normal using Windows Photo Viewer. Once the monitor profile is returned, they go back to too dark. As I said, other photo viewing software is not affected by this profile, just Windows Photo Viewer.
I don't have a clue how to fix this other than returning to the default profile, which would wreck my Photoshop business. It just seems odd that the color profile only affects the Windows Photo Viewer this way and not other programs.
When I click the Start menu, the app icons are red, green, blue, or yellow, sometimes changing color, the way an antivirus icon changes from green to red when you haven't updated and back from red to green after the update. What does this color change mean in Windows 10?
So I recently had to reinstall Windows 10 to resolve an issue I was having, and my Nvidia drivers were deleted in that process. When I went to turn my computer back on after I was finished, there was no display on the screen, so I took out my GPU and plugged the VGA into the integrated graphics, and it worked fine. This made me think I just have to install the Nvidia drivers again, but then I found out that you need the GPU in your computer to install the drivers, and my display won't work anymore with the GPU inside and no drivers for it.
In the NVIDIA Control Panel, when I select my settings and click Apply, everything is OK. The next time I reboot, they change. Why are they not being saved?
I upgraded my Win7 Pro to Win10 Pro last night and lost my 3840x2160 @ 30Hz setup, being reduced back to 1920x1200. I've since done a clean install and reinstalled the latest drivers for my Samsung U28D590D and my ATI Radeon HD 5770. Nothing I have done has given me back my 4K resolution.