What are Video Card Benchmarks?

Article Details
  • Written By: M. McGee
  • Edited By: Lauren Fritsky
  • Last Modified Date: 03 November 2019
  • Copyright Protected:
    Conjecture Corporation

Benchmarking is the process of determining the capabilities of a piece of computer hardware. Video card benchmarks are the result of benchmarking done to a computer’s video card. Benchmarking may be done through a wide range of methods, such as internal monitors, specialized programs or simply observation of the results of using the hardware in a typical way. Due to the expense and complexity of modern video cards, this form of benchmarking is an extensive process.

The purpose of benchmarking is to illustrate the real-world capabilities of a piece of hardware. Manufacturers often cite numbers and speeds to show that their product is superior to others, but these figures are a broad guideline at best and, at worst, totally meaningless. Performing benchmark tests shows the hardware's actual performance in a real computer.

Of the three main benchmarking types, video cards rarely use internal monitors. Nearly all video card benchmarks fall into the other two categories: industry-standard software and real-world programs. In a typical industry-standard test, the video card is sent a series of challenges and outputs results. Many of these challenges operate entirely inside the card.


These types of video card benchmarks are still only moderately useful, as they don't reflect actual usage. Only in very rare circumstances does a video card operate without interacting with the rest of the system. The card requires information from hard drives, sequences stored in computer memory, and the results of problems given to the computer's processor. As a result, real-world testing is a very common method of benchmarking a video system.

Video card benchmarks made by running real-world programs are common across the industry. Most testers take a standard system and use it to test several cards. This system has several demanding programs installed on it, usually video games. Each card runs through the same section of each game using the same settings. The frame rate is monitored throughout the test, and the average is used as a rating for the card.
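The scoring described above can be sketched in a few lines of Python. This is a hypothetical illustration, not any real benchmark's code: the function name and the sample per-frame render times are invented, and real tools capture frame times from the graphics driver rather than from a hard-coded list.

```python
# Hypothetical sketch: turning per-frame render times (in milliseconds)
# into the average frames-per-second rating described in the article.

def average_fps(frame_times_ms):
    """Average frames per second over one benchmark run."""
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

# Invented sample: mostly ~60 fps frames, with one slow frame (a stutter).
samples = [16.7, 16.9, 17.1, 33.4, 16.8]
print(round(average_fps(samples), 1))
```

Note that averaging over total elapsed time, rather than averaging per-frame fps values, keeps a single long frame from being hidden by many fast ones.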

To make the test fair to cards with different capabilities, several different programs make up the video card benchmarks. Each program is picked for its use of video resources; if it doesn't exercise the card's abilities, there is little reason to use it. Different programs also stress different areas: a multiplayer first-person shooter tests the card's ability to render graphics quickly, action titles test detail during motion, and slower games, like role-playing games, test overall detail-rendering capabilities.
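Combining those per-game results into one figure can be sketched as below. The game names, numbers, and the use of a simple unweighted mean are all assumptions for illustration; real review sites weight and present their suites in their own ways.

```python
# Hypothetical sketch: combining per-game average frame rates into a
# single rating for a card. All names and values here are invented.

def overall_rating(per_game_fps):
    """Unweighted mean of average frame rates across the test programs."""
    if not per_game_fps:
        raise ValueError("no test results")
    return sum(per_game_fps.values()) / len(per_game_fps)

results = {
    "fast_shooter": 142.0,   # stresses raw rendering speed
    "action_title": 88.5,    # stresses detail during motion
    "role_playing": 61.3,    # stresses overall detail rendering
}
print(round(overall_rating(results), 1))
```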
