VGA vs. HDMI: What's the Difference?

If you still use VGA, you may want to consider upgrading.

by Ryan Dube. Updated on October 22, 2020.

Ryan Dube is a freelance contributor to Lifewire and former Managing Editor of MakeUseOf, senior IT analyst, and automation engineer.

The main difference between VGA and HDMI video cables and ports is that the VGA signal is analog, while HDMI is digital. VGA transmits information as a continuously varying electrical wave, while HDMI transmits data as discrete bits (on or off) at varying frequencies. There are many other differences between the two, which should help you decide which cables and converters you need.

Overall Findings

VGA
- Adapters can convert to HDMI.
- Only transmits video.
- Max refresh rate of 85 Hz.
- Max resolution of 1600x1200.

HDMI
- Supported by modern devices.
- Transmits video and audio.
- Max refresh rate of 240 Hz.
- Max resolution of 1920x1200.

Video Graphics Array (VGA) was the standard video cable for computers when it was released in 1987, and it's easily recognizable by its blue 15-pin connector. At the time, the supported resolution was 640x480, but the standard eventually expanded in stages up to Ultra Extended Graphics Array (UXGA) in 2007, which could drive 15-inch monitors at 1600x1200 pixels.

High Definition Multimedia Interface (HDMI) was developed in 2002 and soon became the new standard for computing. The main feature HDMI offered that no other video cable could was the ability to carry audio in the same cable as the video signal. The original HDMI specification supported HD video at 1920x1200 pixels and eight audio channels; later versions expanded both limits.

Few devices support VGA anymore. Most computers and TVs have an HDMI port and no VGA port.
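To put the resolution and refresh-rate ceilings above in perspective, here's a rough back-of-envelope calculation (assuming uncompressed 24-bit color and ignoring blanking intervals and link-encoding overhead) of how much raw video data each interface has to move:

```python
def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw uncompressed video data rate in gigabits per second."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# VGA's practical ceiling: UXGA (1600x1200) at 60 Hz
print(f"VGA  1600x1200 @ 60 Hz:  {video_bitrate_gbps(1600, 1200, 60):.2f} Gbps")

# HDMI 2.0 running 1080p at 240 Hz, as cited later in this article
print(f"HDMI 1920x1080 @ 240 Hz: {video_bitrate_gbps(1920, 1080, 240):.2f} Gbps")
```

That works out to roughly 2.76 Gbps for VGA's ceiling versus about 11.94 Gbps for HDMI's, more than four times the data, which is why an analog signal was never pushed that far.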
However, you might still need a VGA cable if you use older technology, such as old projectors or vintage video game consoles.

Compatibility: Modern Monitors Use HDMI

VGA
- Available on older monitors.
- Supported on older graphics cards.
- Adapters can convert to HDMI.
- Converters degrade the signal.

HDMI
- Available on newer monitors.
- Adapters can convert to VGA.
- Supported by most graphics cards.

If you still have a very old monitor with a VGA port, you'll need a VGA cable. To connect an older VGA-only computer to a modern monitor, however, you'll need a VGA to HDMI converter; if the monitor was built from 2000 through 2006, you'll likely need a VGA to DVI converter instead. Since VGA can't transmit high-definition video signals to newer displays the way HDMI can, you'll notice significantly degraded video even with a converter. If you're using a newer computer with an older monitor that has a VGA port, HDMI to VGA converters are available as well.

Audio: HDMI Supports High Definition Audio Signals

VGA
- Only transmits video.
- Requires a second audio output.
- Not supported by newer graphics cards.

HDMI
- Supports 32 audio channels.
- Supports Dolby, DTS, and DST high-resolution audio.
- Doesn't require a second audio cable.

VGA can only transmit a single video signal without any audio, while HDMI can transmit up to 32 channels of digital audio. HDMI supports most high-definition audio formats, including Dolby Digital, DTS, and DST. If you use a VGA to HDMI converter to display from an older computer on a newer monitor, you'll still need a second audio cable to carry sound. If you use an HDMI to VGA converter to display from a newer computer on an older monitor, a second audio cable is still needed if the monitor supports sound; if it doesn't, connect your computer's audio to separate speakers.

Data Transfer Speed: HDMI Is Far Superior

VGA
- Maximum refresh rate of 85 Hz.
- Less input lag.
- More signal interference.
- Not hot-pluggable.
HDMI
- Maximum refresh rate of 240 Hz.
- Slight input lag.
- Almost no signal interference.
- Hot-pluggable.

An HDMI cable has 19 or 29 pins and transmits both video and audio. HDMI 2.0 is capable of 240 Hz at 1080p resolution. VGA, on the other hand, has 15 pins and carries an analog RGB video signal, which is only capable of refresh rates from 60 Hz up to roughly 85 Hz.

Another significant difference is that you can unplug and reconnect an HDMI cable while the computer is on and the video signal is transmitting (hot-pluggable). You can't do this with VGA; you'd need to stop the video stream or turn off the computer before plugging in the VGA cable.

The one benefit of VGA's analog signal is that there's no post-processing of digital signals, which means no "input lag." With HDMI, however, the data transfer and refresh rates are so much higher that this input lag is insignificant by comparison.

VGA signals are also subject to significant interference from outside sources such as microwaves and cell phones. HDMI cables are far less susceptible, and with thick shielding they're almost completely impervious to interference.

Final Verdict

If you're using a much older computer that only has a VGA port, you'll eventually have to use a VGA to HDMI converter to drive newer displays. Even then, you'll never enjoy the much higher detail and refresh rates that a true HDMI port and cable offer. The only time you may need a VGA cable is if you're still using older devices like vintage gaming consoles; in that case, keep a VGA cable with the device, along with the required converters.

Ultimately, you'll want to upgrade your desktop or laptop to a newer one that offers the best video output possible. The latest video outputs use USB-C, but plenty of converters let you output from USB-C to HDMI displays without any signal loss.