Graphic Cards :: Monitor Connected To HDMI Port - Input Not Supported
Aug 11, 2015
Every time I restart or boot up my PC with my monitors turned on first, I get an issue with the monitor connected to the HDMI port: it says "Input not supported". All I do is turn the monitor off and back on and the issue is resolved...
I installed Windows 10 and found some apps did not fit the monitor screen (too big). I went in and changed the monitor resolution to a larger size; Windows recommended 1920 x 1080, but I tried something larger, and now the monitor has a black screen with a little box moving around that says "input not supported". Not being able to use that monitor to go in and change it back, I hooked up a flat-screen TV, which worked, but Windows only showed that screen's size (which was smaller than the original monitor's).
I tried the monitor on another computer I have, also running Windows 10, and it worked fine; it shows the resolution as 1920 x 1080. When I plugged it back into the original computer I got the same black screen with "input not supported", so I can't change it there. I suspect the issue is in the registry somewhere in Windows 10 on the first computer, but I don't know what the problem is. I've been working on this for a whole day now. You can probably tell I'm not well versed in computer technology. I just want my old monitor back (Acer K272HL) and I promise I won't mess around with it again.
When I connect DVI or HDMI from my GTX 780 to my Acer H274HL monitor, everything works fine, but when I connect a DisplayPort cable to my GTX 780 with HDMI on the other end going to the monitor, there is no signal at all.
I am running Windows 10 64-bit Enterprise, all drivers are up to date, and Windows updates are up to date as well.
Just unboxed my computer from CyberpowerPC with a GTX 720 graphics card, installed/updated all drivers with Driver Easy (paid for it just now), and I am still having an issue with the HDMI port not working. I have tried plugging into a monitor and a TV, and neither works; I have also tried multiple HDMI cables, and it still doesn't work. My monitor has an auto-switch source feature and it's turned on; when I plug my computer into the monitor with HDMI, it continually cycles, finds no signal, and goes into sleep mode, then rinses and repeats until I either turn off the computer or the monitor, or just unplug the monitor from the computer. Currently I am using DVI (computer) to HDMI (TV) and it's working fine, so I know it's the port on the computer. Do I need to contact CyberpowerPC to send the computer back to get this issue resolved?
It worked fine in Windows 7, but now when I choose "Duplicate display" from the side menu popup, the TV goes blank as if I've unplugged the HDMI. All three of the remaining display options work, including extend display and TV only, so it's not a cable issue. The laptop model is an XPS 17 L702X. In Windows 7, sometimes it wouldn't auto-detect, so I would right-click the desktop, open the Nvidia panel, and click rigorous display detection or something like that, and then the laptop screen would blink and make the beep noise like when an HDMI cable gets connected. Now in the Nvidia panel it simply won't allow mirroring; nothing happens when I right-click on the display box that would normally show 1, with a smaller box saying 2. I don't get it, but I'd really like that feature back if it's something simple I could do.
I have temporarily connected a 22-inch LCD TV as a monitor (until I get a new one) to the PC via an HDMI lead, but I cannot get the resolution set right. The recommended native resolution makes the taskbar disappear off the screen, and the desktop icons are barely visible on the left side of the screen.
I have tried various resolution settings but none are any good, apart from one that makes the desktop look like a laptop screen.
I have reset Windows 10 and updated the Intel graphics drivers to the latest, but the resolution still cannot be set correctly for this monitor. I used a VGA lead as well.
Is it because you need a proper PC monitor to get the correct resolution?
I'm running my audio via HDMI from my graphics card and it plays through the speakers built into my monitor.
When I first did the Windows 10 update I had no audio whatsoever, and followed some suggestions on online forums that led me to disable the Realtek High Definition Audio device; after a reboot, my HDMI audio was working.
But then I had another weird issue. My PC is set to never sleep, as I use it as a media server, but my screens turn off automatically after 10 minutes. What I'm finding is that when I wake the monitors, the audio stops. When I view playback devices I can't run any tests because "the device is already in use by another application".
(Note: I have the checkbox for allowing an application to take exclusive control of the device unchecked.)
So far there are only two ways to restore audio: 1) restart the PC, or 2) go into Device Manager > Sound, video and game controllers > AMD High Definition Audio Device > Disable, wait a few moments, then Enable.
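If that disable/enable dance has to happen often, it can be scripted instead of clicking through Device Manager each time. Here is a minimal sketch, assuming the PnpDevice PowerShell cmdlets are available (Windows 8.1 and later), the friendly name matches what Device Manager shows, and it is run from an elevated prompt:

```python
import subprocess
import textwrap

# Friendly name as shown in Device Manager; adjust if yours differs.
DEVICE_NAME = "AMD High Definition Audio Device"

ps_script = textwrap.dedent(f"""
    $dev = Get-PnpDevice -Class 'MEDIA' -FriendlyName '{DEVICE_NAME}' -ErrorAction Stop
    Disable-PnpDevice -InstanceId $dev.InstanceId -Confirm:$false
    Start-Sleep -Seconds 3
    Enable-PnpDevice -InstanceId $dev.InstanceId -Confirm:$false
""")

# Requires an administrator prompt; the cmdlets refuse to run otherwise.
subprocess.run(["powershell", "-NoProfile", "-Command", ps_script], check=True)
```

This is just the same disable, wait, enable cycle described above, collapsed into one command you can bind to a shortcut.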
A friend of mine upgraded to Windows 10 this afternoon after being on Windows 7 Ultimate. Everything seemed okay until he went to change his monitor resolution: it isn't listed under the resolution settings in the Control Panel. His monitor's native resolution is 1366x768, but even when selecting 1360x768, it says his input is not supported.
This is strange considering it worked just fine on Windows 7. My only thought is that his GPU is quite old: it's an ATI Radeon HD 4600 Series, and the monitor is an Acer X193HQ. He doesn't know an awful lot about hardware, but I believe he's using a DVI cable from the PC.
So I bought a new monitor, the MG279Q, and tried plugging it into my PC via DisplayPort. The monitor comes with a DP-to-miniDP cable (from the GTX 780's DP to the monitor's miniDP), and the monitor gets no signal. However, the computer does recognize the monitor and, for example, plays the sound alert for a new device attached. I can also see the device described as "Generic PnP monitor", and I can see it in Windows' own control panel for monitors as well as the Nvidia Control Panel (pics at the end). But it won't allow me to set the monitor to be used.
I suspected that the cable might be broken, so I tested it with my UX32LN (GeForce 840M) laptop, which has a miniDP. I plugged it into the monitor (from the laptop's miniDP to the monitor's normal DP; the monitor has both DP and mDP) and got an image straight away. So the cable should be fine, I guess. But this is a somewhat different situation, as I'm using a different port on the monitor compared to my desktop's GTX 780, and I don't have other computers or gadgets I could use to test the monitor's miniDP port.
Next I plugged the monitor into my desktop via HDMI, and that worked just fine; it recognizes the monitor without problems. Using my old XL2411Z monitor with DVI plus the MG279Q with DP, or the MG279Q with HDMI and DP at the same time, I can see the MG279Q connected twice, but it only chooses the HDMI connection and doesn't let me change it to DP even though it shows the connection.
Here are some pics to explain it better:
Multiple panel settings window: it shows the MG279Q at the top (under GTX780, the one not checked), but it won't let me check it for use. Sometimes it lets me tick it for a second but instantly unticks the box. Monitor number 2 is the MG279Q via HDMI.
The Surround+PhysX window shows that I've connected the monitor via DP, but it is greyed out. It also won't let me make a Surround set with the XL2411Z + MG279Q over DP.
Changing resolution settings only displays the HDMI connected monitor.
So I tried uninstalling the Nvidia drivers and freshly installing them; that did not work at all. Neither did uninstalling the Generic PnP monitor so the computer would reinstall the display's drivers. I also did a fresh install from Win 7 Pro to Win 10 Pro with no difference (I intended to do that later anyway, so I thought I might as well check; my laptop is running Win 10 and its DP worked, after all). I also went into the BIOS and checked that the iGPU is disabled and the monitor setting is set to PCIE instead of Auto, to no avail. I would also test the GTX 780's DP on another desktop computer, but I can't. Anyway, I somewhat suspect a software/hardware compatibility issue, since the monitor is being recognized; I doubt that the monitor's miniDP or my graphics card's DP port is broken.
What should I do or try? I really want to run the monitor via DP so I can utilize the 144Hz refresh rate.
I have an Asus desktop. I plugged a working HDMI cord into my older Sony Bravia TV (a smart TV). I have a GTX 760 192-bit NVIDIA graphics card. I went to the NVIDIA Control Panel, and it won't detect the TV. I tried going through the Windows 10 64-bit option for finding connected devices, and the TV shows up, but only as a media streaming device. It almost seems like the PC itself doesn't know it has an HDMI port.
I've been running a multi-monitor setup for a while now, with an ASUS VG248QE 144Hz monitor as my main, and one or more spare monitors as secondary displays. I've set the ASUS monitor to 144Hz without issue in the past, while the other monitors stay at 60Hz.
The other night, Windows wanted to update. I figured it was the usual patch session, so I let it do its thing. However, after restarting my computer, I noticed that my ASUS monitor would only operate at 144Hz if it was the only monitor plugged into the system. When I try to set it back to 144Hz in the Nvidia Control Panel, it just reverts to 60Hz after applying the change. I updated my graphics drivers (though I was only one version behind) and even sought out a special ASUS driver for the monitor to resolve the issue, but nothing has worked.
Did Windows recently add a forced refresh rate sync between all monitors? If so, that really sucks, because I only own one 144Hz monitor.
Most people I've seen have appalling (almost unreadable) quality when using an external monitor with an HDMI cable -- and it's quite simple to fix. Films and video are usually OK, but if you are reading email or doing standard computer work, you want the monitor to give you decent results.
First, ensure that the "Sharpness" setting on the monitor is set correctly -- oversharpening makes text look terrible.
Secondly, use something like "Auto size" so the picture fits the screen properly rather than being slightly too large or too small, even if, say, 1920 x 1080 HD is selected.
Thirdly, use the correct scanning frequency -- in Europe it should be 60 or 59 Hz -- and use the "p" (progressive) choice, not the "i" (interlaced) one.
Finally, use the smart picture settings sensibly if you have them, or set the colour / tint / contrast / brightness controls properly.
Adjusting these controls really does turn an external monitor from, in some cases, an almost unreadable screen into a pleasure to use -- even on cheap 22-inch TVs.
Use the monitor's own menu -- don't do it from the PC display settings.
A typical laptop's video card will be perfectly OK on these types of monitors.
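If you want to double-check what the PC is actually sending (as opposed to what the TV's menu claims), you can read the current output mode from Windows. A minimal sketch using ctypes and the standard GetDeviceCaps call; it only reports the primary display:

```python
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

user32.SetProcessDPIAware()  # report real pixels, not DPI-scaled ones

# GetDeviceCaps index constants from wingdi.h
HORZRES, VERTRES, VREFRESH = 8, 10, 116

hdc = user32.GetDC(0)  # device context for the primary display
width = gdi32.GetDeviceCaps(hdc, HORZRES)
height = gdi32.GetDeviceCaps(hdc, VERTRES)
refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)
user32.ReleaseDC(0, hdc)

print(f"PC output: {width}x{height} @ {refresh} Hz")
```

If this already reports 1920x1080 @ 60 Hz and the picture still looks wrong, the remaining adjustment really is in the TV's own menu, as described above.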
I am cleaning out and setting up an old-ish PC, and whenever I connect it to my TV or monitor it either says "No Input to TV" or just stays black when I turn it on. I am using VGA to connect it. I tried with a GTX 650 and with just the built-in graphics, and still no luck. Any reason why?
I have a user with a Sony Vaio laptop, 15" screen, 1920x1080. They also plug an HDMI cable into a 23" Dell widescreen monitor for a second display; this 23" monitor is also 1920x1080. They are complaining that fonts on the second monitor are blurry. They tried to live with it for a week, but now they're asking whether it can be made clear again like it was when they had Windows 7.
Are you familiar with fonts being blurry on only one monitor when both are set to their native resolution?
I'm asking them to change the font DPI to 100%, log off and back on, and see if it's any better. For whatever reason that seemed to work for my laptop (another Sony Vaio), but my concern is that they will be disappointed with legibility on the laptop screen (it makes everything tiny).
I have a Dell XPS 13 (2016) with a Dell USB-C adapter. It will not detect my NEC LCD external monitor. I have tried the Dell USB-C adapter with a D-SUB cable to the monitor, and I have also tried a Monoprice USB-C adapter with a Monoprice HDMI cable. Neither works. The monitor works fine with my Mac. I have told the computer to extend and to duplicate the display using the Windows+P shortcut. I have tried to update the driver for the monitor (I have NEC's driver), but it will not install because the monitor is not detected.
My computer currently has three monitors connected to it: two through the graphics card (R9 280) and one through the integrated graphics (AMD HD 3000). Not sure when, why, or how, but the integrated graphics can never pick up what type of monitor is connected to it, so it always gives me a bunch of generic 4:3 resolutions.
I've tried uninstalling the drivers and letting them reinstall. I've tried swapping the cables around to see if I could get lucky with one of the other screens, but they all face the same issue. I've tried buying brand new cables across the board, because sometimes people say that's the issue. But no success.
The closest thing I have found is a custom resolution setter (PowerStrip), but it was created with Windows 7 in mind and doesn't seem to work on Windows 10.
My question is: is there any way to fix the list so I can select the correct resolution? It's a Toshiba TV/monitor, and it seems like everything else attached to my PC (even third-party programs I tried to use to forcibly change the resolution) can spot it and hand me back the correct resolution, but I can't seem to change it. I get "Generic PnP" through the Control Panel (the only place the monitor shows up), and even then the correct driver doesn't show; it's the Microsoft generic driver.
Mobo: MSI AMD 3000 series, can't find the exact model.
I've updated my AMD drivers to the latest, and so far I can only set the resolution to stock ones that look horrid on the screen. Is there any program in Windows 10 to change the resolution of a screen?
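For what it's worth, the mode list Windows shows (and the switch itself) ultimately goes through the EnumDisplaySettings / ChangeDisplaySettings Win32 calls, so you can inspect and set modes without PowerStrip. A rough sketch via ctypes, assuming the Toshiba is the primary display; note that if the driver really only reports generic 4:3 modes, the CDS_TEST check below will reject anything else rather than force it:

```python
import ctypes
from ctypes import wintypes

# DEVMODEW layout (display variant of the union).
class DEVMODE(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32

ENUM_CURRENT_SETTINGS = -1
CDS_TEST = 0x00000002
DISP_CHANGE_SUCCESSFUL = 0
DM_PELSWIDTH, DM_PELSHEIGHT, DM_DISPLAYFREQUENCY = 0x80000, 0x100000, 0x400000

def list_modes():
    """Print every mode the driver reports for the primary display."""
    i = 0
    dm = DEVMODE()
    while True:
        dm.dmSize = ctypes.sizeof(DEVMODE)
        if not user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
            break
        print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz, "
              f"{dm.dmBitsPerPel}-bit")
        i += 1

def set_mode(width, height, hz):
    """Ask Windows to switch the primary display to width x height @ hz."""
    dm = DEVMODE()
    dm.dmSize = ctypes.sizeof(DEVMODE)
    if not user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm)):
        raise RuntimeError("Could not read the current display settings")
    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = width, height, hz
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
    # Test first; only apply if the driver says the mode is valid.
    if user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST) != DISP_CHANGE_SUCCESSFUL:
        raise RuntimeError("The driver rejected that mode")
    user32.ChangeDisplaySettingsW(ctypes.byref(dm), 0)

if __name__ == "__main__":
    list_modes()
    # set_mode(1920, 1080, 60)  # uncomment to actually switch
```

This only targets the primary display; per-monitor targeting would need EnumDisplayDevices plus ChangeDisplaySettingsEx with the device name, which is left out here for brevity.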
I have an MSI GS60 Ghost Pro gaming laptop with Intel HD Graphics 4600 and an Nvidia GTX 970M GPU, CPU i7-4720HQ, running Windows 10. I study abroad; several months ago I went back home and plugged my laptop into the TV via HDMI for gaming like normal. Yesterday I bought an ASUS VC239H external monitor and tried to do the same thing, but there is no HDMI input signal and the laptop cannot detect the monitor (including in Display Settings, Intel HD Graphics settings, and Nvidia settings). This is what I tried:
- Fn+F2, Fn+F8 keys, and Windows+P switching to duplicate, extend, or any other option.
- Plugged the monitor into my friend's laptop running Windows 8 through that HDMI cable and it works fine; the same with another laptop running Windows 7, so the cable and monitor are fine.
- Plugged my laptop into my neighbor's smart TV. A "connected" symbol appeared on the TV but there was a black screen instead of an image, and the laptop couldn't detect the TV.
- Reinstalled the Nvidia and onboard graphics drivers and updated them to the latest.
- Made a clean install of Windows; tried Windows 8 as well, but got the same result.
- Updated my BIOS and EC.
- Bought an HDMI-to-Mini DisplayPort adapter to use the Mini DisplayPort with the HDMI cable.
Ok so I recently updated to Windows 10 but now when I connect my laptop to my external monitor the laptop screen doesn't show up on the external monitor.
My laptop's screen is broken so I can't see anything. The laptop is still on the screen it shows immediately after updating, so I could blindly work my way through the update screens, but someone would have to upload pictures of the screens that appear immediately after updating to Windows 10.
I just upgraded from Windows 7 to Windows 10, and now my computer only detects one display rather than two. I have a Dell desktop computer and two Dell monitors. Under Windows 7 both monitors were detected and used. How do I correct this? The second display shows a black screen with the message "Cannot Display This Video Mode".
I was surfing the web like normal with Photoshop in the background (not doing anything in it yet). The monitor goes dark blue/green and returns to normal, saying Photoshop has stopped working, with a message in the bottom-right corner that the driver had failed and then recovered.
This is probably the most painful issue I have with Windows 10 right now (and likely previous versions as well, but I didn't have a multi-monitor setup back then).
The monitors I have are as follows:
- 3840x2160 (4K UHD) monitor with preferred DPI: 144 (150%)
- 1920x1080 (Full HD) monitor with preferred DPI: 96 (100%)
Whenever one of these monitors is set as primary, all desktop applications displayed on the secondary monitor (it doesn't matter which) have blurry text. Exceptions are the Windows Store apps like Windows Store and Microsoft Edge, along with the Taskbar/Start Menu, the Taskbar/Start Menu settings screen, the Taskbar context menu, and the desktop context menu, which pass the DPI test with flying colours, with crisp text on both monitors (occasionally a DPI-switch bug gets in, but I can mostly ignore that). The problem, as you can probably guess, is that more than 99% of the applications I use aren't Windows Store apps.
Here are some screenshots. The "Taskbar and Start Menu Properties" text is what text should look like, while the Visual Studio 2015 text is an example of what most desktop apps get. The blurry image is what happens when the UHD monitor is not the primary monitor. Attachment 48493, Attachment 48494. Note: both of these screenshots came from the 150% DPI monitor, so they're best viewed at that (144) DPI level. The 96 DPI monitor is similarly affected.
Things I've already tried:
- Reinstalling the graphics driver. Did this multiple times, in fact, for unrelated reasons.
- Reinstalling Windows 10 (through the Reset This PC recovery option). I also did this for unrelated reasons, but it definitely doesn't fix this issue.
- Using the XP Explorer "fix". Merely worsens the problem.
- Adjusting ClearType options. Alleviates the issue a bit, but see the next point.
- Disabling ClearType on the affected monitor. The text obviously sharpens, but it's painful to read, and a close inspection of the text reveals the issue isn't solved at all, only mitigated slightly.
- Replacing the video card. I swapped in my older GTX 560 Ti, but the problem obviously remains. Both it and my current card are NVIDIAs, though, so it's vaguely possible the drivers or the cards themselves are the cause. I don't have an ATI/AMD card (that still works, at least) to test the setup, and every Intel iGPU I have either has only one monitor output or is incapable of handling UHD resolutions.
Things I won't try:
- Setting both monitors' DPI to 96. Text would become microscopic considering the UHD monitor's actual size.
- Using the text resizing feature instead. I'm going to take a wild guess that this is not monitor-specific and would make everything on the HD monitor far too large, to the point that I'd rather unplug it.
Is anyone else running a multi-resolution, multi-DPI, multi-monitor setup, with or without this issue? The text is painful to read on whichever monitor is secondary right now, and it's extremely apparent whenever the background is dark.
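For context, the split you're seeing comes down to DPI awareness: per-monitor-DPI-aware code (the Store apps, taskbar, and so on) re-renders at each monitor's DPI, while classic desktop apps that only declare system DPI awareness get bitmap-stretched on the other monitor, hence the blur. A minimal sketch of the opt-in call an application itself has to make; it only helps for programs you write or that already do this, not for fixing something like Visual Studio from the outside:

```python
import ctypes

# SetProcessDpiAwareness lives in shcore.dll (Windows 8.1 and later).
PROCESS_PER_MONITOR_DPI_AWARE = 2

shcore = ctypes.windll.shcore
# Must be called before the process creates any window; afterwards Windows
# stops bitmap-scaling it on monitors whose DPI differs from the primary's.
shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
```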
I upgraded to Windows 10 yesterday, and at first everything was fine, but when I play League of Legends my computer randomly says "input not supported" or "no signal" (or sometimes nothing at all) and then shuts down. This never happened before I upgraded to Windows 10. I went back to Windows 8.1, but it still shuts down in the middle of League games; again, this never happened before going to Windows 10. Is it my GPU? Or CPU? Or PSU?
CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor
CPU Cooler: Corsair H105 73.0 CFM Liquid CPU Cooler
Motherboard: MSI Z97-G55 SLI ATX LGA1150 Motherboard
RAM: G.Skill Ripjaws Z Series 8GB (2 x 4GB) DDR3-2400 Memory
Storage: Intel 530 Series 240GB 2.5" Solid State Drive; Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive
GPU: Asus GeForce GTX 970 4GB STRIX Video Card
PSU: Corsair CX 600W 80+ Bronze Certified Semi-Modular ATX Power Supply
Also, the SSD is my boot drive and has League on it; it has only about 35GB out of 240GB left.
After upgrading to the Windows 10 OS, I could not get my HDMI port to work. It does not get recognized at all, which is very frustrating. I run the OS on my laptop, which is an HP Pavilion. I really need it to be working. I love Windows 10, but this is a big problem.
Windows 10 plays MP3 files just fine. Connect an HDMI cable to the video output port, and then Windows 10 suddenly won't play those files: an error message pops up inside Windows Media Player and Groove saying that this file type is not supported. Also, no sound is played through the HDMI cable to my TV, and no sound is played when I click the volume bar at the bottom right.
I work for an integration company. We just installed a couple of conference rooms for a company where we have an HDMI cable running from the table, through the floor, in the walls, and to the TV. From there it breaks out into the amplifier and plays audio in the room. Normally I bring my MacBook Pro for testing, but I purchased this Windows 10 computer as a test device. I did not have a second laptop with me to test this out.
With the computer and the HDMI cable I am getting video to the TV. When I enter the TV menu and arrow around the menu options, sound works on the built-in speakers in the room, so I can confirm that the cabling is all correct. This is the identical setup that we always use. I have also plugged my iPhone into the cabling behind the TV and played music, which worked perfectly fine.
The computer, as I mentioned, would play MP3 files perfectly fine until I plugged in the HDMI cable. In fact, I can be playing an MP3 file, plug in the HDMI cable, and suddenly the computer stops playing the MP3 file and gives me an error that Windows cannot support this file type. Those same MP3s play from an SD card in the TV when I access them directly from the TV, so the files are not bad. Also, anything with audio fails to play anywhere (on the TV or locally on the laptop) when the HDMI is plugged in.
I have also gone to the Control Panel, selected the audio section, and looked at the video card / audio output selections. When I have no HDMI cable connected, the audio output option changes to the internal sound card (which is correct), and when I plug in the HDMI cable, the audio switches to the HDMI output (which is also correct).
The only two conclusions I can come up with are that this particular computer has a bad video card, or that Windows 10 is not supporting audio output. I have never tested our setups with Windows 10 before.
Last bit I'll include: this is a brand new computer. Core i5, 8GB RAM, 1TB hard drive. It's a Toshiba.
I just installed Windows 10. Now my laptop monitor won't work; only the external monitor does. I go to the graphics settings and click on multiple displays, but it states it doesn't see another device, only the external monitor. I tried disconnecting the external monitor and rebooting, but the laptop display still doesn't turn on.