How to Disable an Integrated Graphics Card

Written by Michael Hintz
(Jupiterimages/ Images)

Disable your computer’s integrated graphics card before installing a new card’s drivers; failing to do so can cause conflicts between the two graphics cards. There are two ways to disable an integrated graphics card: through the computer’s basic input/output system (BIOS), or through Windows using Device Manager and Add or Remove Programs (or the equivalent tools in another operating system). Note that not every BIOS offers an option to disable the integrated graphics card.



Disabling the card in BIOS

  1. Restart your computer by selecting Start > Shut Down, then choosing Restart from the options menu.

  2. Press the key listed on screen to enter BIOS as the computer reboots. The correct key varies by manufacturer; if it isn’t shown, consult the computer’s manual or contact the manufacturer. Commonly, BIOS is accessed with one of the F-keys or the Delete key; press it as soon as the computer starts, before the operating system loads.

  3. Navigate the BIOS menus and look for a setting labelled "Onboard Video", "Integrated Video" or "VGA". It is usually in the Integrated Peripherals section, but its location depends on the BIOS manufacturer, and some models do not offer the option at all (if so, disable the card in Windows instead).

  4. Change the integrated graphics setting to "Disabled" or "Off", typically by pressing Enter to cycle through the available values.

  5. Save your changes and exit BIOS by pressing the F-key listed at the bottom of the screen under "Save and Exit", then selecting "Y" for yes to confirm.

Disabling the card in Windows

  1. While your computer is on, right-click "My Computer" ("Computer" in newer versions of Windows) and select "Manage".

  2. Select "Device Manager" from the list on the left.

  3. Expand "Display Adapters" in the list of devices and select the integrated graphics card. (If you’ve already installed the new card, make sure you don’t choose that one.)

  4. Right-click the integrated graphics card and select "Disable" from the menu. A pop-up will warn you that this will stop the device from functioning; click Yes.

  5. Open "Add or Remove Programs" and uninstall any old video software or drivers to prevent further conflicts between graphics cards. (Look for the brand of the integrated graphics card, most likely Intel, ATI or NVIDIA.)
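On Windows 8 and later, the Device Manager steps above can also be performed from an elevated PowerShell prompt. The sketch below is a rough equivalent, not part of the original guide: it assumes the PnpDevice cmdlets are available on your system, and the adapter name "Intel(R) HD Graphics" is only a placeholder — substitute the name of your own integrated card.

```shell
# List all display adapters and their current status,
# so you can identify the integrated card by name.
Get-PnpDevice -Class Display

# "Intel(R) HD Graphics" is a placeholder; match the name shown
# for your integrated card, NOT the newly installed one.
$igpu = Get-PnpDevice -Class Display |
    Where-Object { $_.FriendlyName -like '*Intel(R) HD Graphics*' }

# Disable the device (same effect as Disable in Device Manager).
Disable-PnpDevice -InstanceId $igpu.InstanceId -Confirm:$false

# To re-enable it later:
# Enable-PnpDevice -InstanceId $igpu.InstanceId -Confirm:$false
```

As in the manual steps, make sure your monitor is connected to the new card before disabling the integrated one, or you may lose video output.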

Tips and warnings

  • If you need to turn the integrated graphics card back on, simply reverse the steps you took to disable it: in BIOS, change the setting back to enabled; in Windows (or another operating system), use Device Manager to enable the card and reinstall any drivers you removed with "Add or Remove Programs".
  • Make sure your monitor is plugged into the new graphics card (which should in turn be seated in the computer’s PCI, AGP or PCI-E slot). Once the integrated graphics card is disabled, its video port on the back of the computer will no longer display anything.

