Multiple displays not being recognised

[Deleted User] Posts: 4,512
Forum Member
✭✭✭
http://www.intel.com/support/graphics/sb/cs-031026.htm

I have the software in the link above, but when I plug in my second monitor it isn't detected, so the options for multiple displays don't appear.

Any idea how I can force detect the second screen?

Comments

  • chrisjr Posts: 33,282
    Forum Member
    ✭✭✭
    What kit are you using, i.e. what graphics card? Presumably it is a PC rather than a laptop? If it is a PC, are you plugging one monitor into a separate card and the other into the on-board graphics?

    If you are using a separate card then it is possible the on-board graphics might be disabled. But if you are using a separate card with dual outputs then tell us exactly how they are connected.

    One possible cause: if the card has VGA and DVI-I outputs, trying to run a VGA monitor from the DVI-I connector via an adapter at the same time as one on the separate VGA socket might not work. There could be only one VGA driver chipset, which can only drive one VGA monitor at a time. The same sort of thing could apply to DVI and HDMI sockets; you might only be able to use one at a time.

    But it would help us to be more specific if we knew exactly what you've got and how you are trying to use it.
  • [Deleted User] Posts: 4,512
    Forum Member
    ✭✭✭
    Yes it's a PC.

    Intel Pentium Dual CPU E2160 @ 1.8GHz

    It's all on-board graphics and it has a VGA socket and a DVI-I socket. I've used the VGA socket as the primary for a few years and have bought a DVI-I to VGA adapter to run a second VGA monitor, however it is just not recognising the monitor at all.
  • chrisjr Posts: 33,282
    Forum Member
    ✭✭✭
    I would not be surprised if that is the problem. As I posted, it might just have the one VGA driver chipset, wired to both the VGA socket and the VGA pins on the DVI-I connector.

    There is a chance that plugging into the VGA socket disables the DVI-I connection. Easy to check: just unplug the VGA socket and see if that makes the DVI-I socket work. If it does then I suspect you will have to use one VGA and one DVI monitor.

    Haven't seen a VGA-only monitor for years now, so I'd be surprised if neither of your monitors has DVI.
  • [Deleted User] Posts: 4,512
    Forum Member
    ✭✭✭
    Taking out the VGA doesn't activate the DVI-I socket :(

    I have both DVI and VGA monitors, but nothing I plug into it enables the 'Multiple Monitors' feature in my graphics control panel.
  • chrisjr Posts: 33,282
    Forum Member
    ✭✭✭
    OK go back to first principles.

    Use just the one monitor: plug it into the VGA socket and check that produces an image. Then use the DVI-I to VGA adapter and check that also displays an image. Then use a DVI to DVI lead to check that the DVI output works.

    If each possible output works independently then that at least tells you the problem is not with the actual outputs but with getting them to co-operate with each other (there is also a rough script at the end of this post that asks Windows what it can currently see).

    Another thing. If you have a VGA and a DVI monitor plugged in, what happens when you start up the PC? If the machine is seeing both monitors then it is usual to see both display the same image right up to the point Windows loads, so you should see any BIOS info screen and the Windows start-up graphic on both. If you do not then that may suggest the PC isn't able to drive two monitors, or there may be a BIOS setting that needs changing to enable multi-monitor support.

    It would help to know the graphics chipset used. If you don't know for certain what it is then install this

    http://www.piriform.com/speccy

    which should display the make/model of the chipset on the summary screen. Knowing what the chipset is might help determine whether it is capable of multi-monitor support.
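
    One more thing you could try, if you're comfortable running a bit of Python. This is only a rough sketch of mine (it assumes a Windows PC with Python 3 installed, and it is not part of the Intel software): it uses the standard Windows EnumDisplayDevices call to list which graphics outputs and monitors the system currently knows about, so it shows both the chipset name and whether the second screen is being seen at all.

        # Rough diagnostic sketch: list the display outputs Windows knows about
        # and any monitors attached to each one. Windows-only, Python 3,
        # no third-party packages needed.
        import ctypes
        from ctypes import wintypes

        class DISPLAY_DEVICEW(ctypes.Structure):
            # Layout of the Win32 DISPLAY_DEVICEW structure.
            _fields_ = [
                ("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128),
            ]

        DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
        DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

        user32 = ctypes.windll.user32

        def enum_devices(parent=None):
            # With parent=None this yields the graphics adapters/outputs;
            # with parent set to an adapter's DeviceName it yields the
            # monitors Windows has detected on that output.
            i = 0
            while True:
                dev = DISPLAY_DEVICEW()
                dev.cb = ctypes.sizeof(dev)
                if not user32.EnumDisplayDevicesW(parent, i, ctypes.byref(dev), 0):
                    break
                yield dev
                i += 1

        for adapter in enum_devices():
            active = bool(adapter.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
            primary = bool(adapter.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
            flags = "active" if active else "inactive"
            if primary:
                flags += ", primary"
            print(f"Output {adapter.DeviceName}: {adapter.DeviceString} [{flags}]")
            for monitor in enum_devices(adapter.DeviceName):
                print(f"    Monitor: {monitor.DeviceString}")

    If only one output ever has a monitor listed against it, whichever sockets you use, that points the same way as the boot-screen test above: the board is only driving one display, and no software setting will force-detect a second one.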