In the old days of technology, the screen resolution or display resolution was not much of an issue. Windows came with a few preset options and, to get higher resolution, more colors or both, you would install a driver for your video card. As time went on, you could choose better video cards and better monitors as well. Today we have lots of options when it comes to displays, their quality, and the supported resolutions. In this article, we take you through a bit of history and explain all the essential concepts, including common acronyms for screen resolutions, like 1080p, 2K, QHD or 4K. Let’s get started:
It all started with IBM and CGA
Color graphics technology was first developed by IBM. CGA (the Color Graphics Adapter) came first, followed by EGA (the Enhanced Graphics Adapter) and VGA (the Video Graphics Array). Regardless of the capability of your monitor, you still had to choose from one of the few options available through your graphics card’s drivers. For the sake of nostalgia, here is a look at how things looked on a once well-known CGA display.
Image source: Wikipedia
With the advent of high definition video and the increased popularity of the 16:9 aspect ratio (we explain aspect ratios in a bit), selecting a screen resolution is not the simple affair it once was. However, this also means that there are a lot more options to choose from, with something to suit almost everyone’s preferences. Let’s look at what today’s terminology is, and what it means:
The screen is what by what?
The term “resolution” is not correct when it is used to refer to the number of pixels on a screen. That says nothing about how densely the pixels are clustered. That is covered by another metric, called PPI (Pixels Per Inch).
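To make the distinction concrete, here is a small Python sketch that computes pixel density from a screen’s pixel dimensions and its diagonal size in inches. The function name and the example screen sizes are our own, chosen for illustration:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    """Pixel density: the diagonal length in pixels divided by the diagonal in inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# Two screens with the same 1920 x 1080 resolution but different physical sizes
# have very different pixel densities:
print(round(pixels_per_inch(1920, 1080, 24)))   # ~92 PPI on a 24-inch monitor
print(round(pixels_per_inch(1920, 1080, 5.5)))  # ~401 PPI on a 5.5-inch phone
```

This is why a phone screen looks much sharper than a desktop monitor with the very same resolution: the same number of pixels is packed into a far smaller area.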
“Resolution” is technically the number of pixels per unit of area, rather than the total number of pixels. In this article, we use the term as it is commonly understood, rather than in its strictly technical sense. Since the beginning, screen resolution has been described (accurately or not) by the number of pixels arranged horizontally and vertically on a monitor. For example, 640 x 480 = 307,200 pixels. The choices available were determined by the capability of the video card, and they differed from manufacturer to manufacturer.
The resolutions built into Windows were limited, so if you did not have the driver for your video card, you would be stuck with the lower-resolution screen that Windows provided. If you have watched the old Windows Setup or installed a newer version of a video driver, you may have seen the 640 x 480 low-resolution screen for a moment or two. It was ugly, but that was the Windows default.
As monitor quality improved, Windows began offering a few more built-in options, but the burden was still mostly on the graphics card manufacturers, especially if you wanted a really high-resolution display. The more recent versions of Windows can detect the default screen resolution for your monitor and graphics card and adjust accordingly. This does not mean that what Windows chooses is always the best option, but it works, and you can change it if you wish, after you see what it looks like. If you need guidance, read Change the screen resolution and make text and icons bigger in Windows 10.
Also, if you are curious to find out the resolution of your screen, you can check it in your operating system’s display settings.
Mind your P’s and I’s about screen resolutions
You may have seen the screen resolution described as something like 720p, 1080i or 1080p. What does that mean? To begin with, the letters tell you how the picture is “painted” on the monitor. A “p” stands for progressive, and an “i” stands for interlaced.
The interlaced scan is a holdover from television and early CRT monitors. The monitor or TV screen has lines of pixels arranged horizontally across it. The lines were relatively easy to see if you got up close to an older monitor or TV, but nowadays the pixels on the screen are so small that they are hard to see even with magnification. The monitor’s electronics “paint” each screen line by line, too quickly for the eye to see. An interlaced display paints all the odd lines first, then all the even lines. Since the screen is being painted in alternate lines, flicker has always been a problem with interlaced scans. Manufacturers have tried to overcome this problem in various ways. The most common way is to increase the number of times an entire screen is painted in a second, which is called the refresh rate. The most common refresh rate was 60 times per second, which is acceptable for most people, but it could be pushed a bit higher to get rid of the flicker that some people still perceived.
As people moved away from the older CRT displays, the terminology changed from refresh rate to frame rate, because of the difference in the way an LED monitor works. The frame rate is the speed with which the monitor displays each separate frame of data. The most recent versions of Windows set the frame rate at 60 Hz (60 cycles per second), and LED screens do not flicker. Moreover, displays moved from interlaced scan to progressive scan because the new digital displays are so much faster. In a progressive scan, the lines are painted on the screen in sequence, rather than first the odd lines and then the even lines. To decode the shorthand: 1080p, for example, denotes a display with 1080 horizontal lines of vertical resolution and a progressive scan. There’s a rather eye-boggling illustration of the differences between progressive and interlaced scans on Wikipedia here: Progressive scan. For another fascinating history lesson, also read Interlaced video.
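As an illustration only (this is not how display hardware is implemented), the difference between the two scan types boils down to the order in which the line numbers are visited. A short Python sketch, using 1-based line numbers as TV line counts traditionally do:

```python
def progressive_order(lines):
    """Progressive scan: every line, top to bottom, in a single pass."""
    return list(range(1, lines + 1))

def interlaced_order(lines):
    """Interlaced scan: all odd-numbered lines first, then all even-numbered lines."""
    odd = list(range(1, lines + 1, 2))
    even = list(range(2, lines + 1, 2))
    return odd + even

print(progressive_order(6))  # [1, 2, 3, 4, 5, 6]
print(interlaced_order(6))   # [1, 3, 5, 2, 4, 6]
```

Each interlaced pass paints only half of the screen, which is exactly why flicker was a problem on interlaced displays.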
What about the numbers: 720p, 1080p, 1440p, 2K, 4K and 8K?
When high-definition TVs became the norm, manufacturers developed a shorthand to describe their display resolution. The most common numbers you see are 720p, 1080p, 1440p or 4K. As we have seen, the “p” and the “i” tell you whether it is a progressive-scan or an interlaced-scan display. Moreover, these shorthand numbers are sometimes used to describe computer monitors as well, even though, in general, a monitor is capable of a higher-definition display than a TV. The number always refers to the number of horizontal lines on the display.
Here’s how the shorthand translates:
720p = 1280 x 720 – is usually known as HD or “HD Ready” resolution
1080p = 1920 x 1080 – is usually known as FHD or “Full HD” resolution
1440p = 2560 x 1440 – is commonly known as QHD or Quad HD resolution, and it is typically seen on gaming monitors and on high-end smartphones. 1440p is four times the resolution of 720p HD or “HD ready.” To make things even more confusing, many premium smartphones feature a so-called 2960×1440 Quad HD+ resolution, which still fits into 1440p.
4K or 2160p = 3840 x 2160 – is commonly known as 4K, UHD or Ultra HD resolution. It is a huge display resolution, and it is found on premium TVs and computer monitors. 2160p is called 4K because the width is close to 4000 pixels. In other words, it offers four times the pixels of 1080p FHD or “Full HD.”
8K or 4320p = 7680 x 4320 – is known as 8K and it offers sixteen times the pixels of the regular 1080p FHD or “Full HD” resolution. For now, you see 8K only on expensive TVs from Samsung and LG. However, you can test whether your computer can render such a large amount of data by playing an 8K video sample.
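The multipliers above (four times, sixteen times) are easy to verify, because each doubling of both dimensions quadruples the pixel count. A quick Python check, using the resolutions from the list above (the dictionary layout is our own):

```python
# Name -> (width, height), taken from the shorthand list above
resolutions = {
    "720p (HD)":       (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "1440p (Quad HD)": (2560, 1440),
    "2160p (4K UHD)":  (3840, 2160),
    "4320p (8K)":      (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Doubling both width and height quadruples the total pixel count:
print(pixels["1440p (Quad HD)"] / pixels["720p (HD)"])       # 4.0
print(pixels["2160p (4K UHD)"] / pixels["1080p (Full HD)"])  # 4.0
print(pixels["4320p (8K)"] / pixels["1080p (Full HD)"])      # 16.0
```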
The problem with 2K is that it does not exist for consumer devices
In cinematography, the 2K resolution exists, and it refers to 2048 × 1080. However, in the consumer market, it would be considered 1080p. To make things worse, some display manufacturers use the term 2K for resolutions like 2560×1440, because their displays have a horizontal resolution of 2000 pixels or more. Unfortunately, that is incorrect, as this resolution is 1440p, or Quad HD, and not 2K.
Therefore, when you hear about a TV, computer monitor, smartphone or tablet having a 2K resolution, this statement is incorrect. The real resolution is likely to be something like 1440p or Quad HD.
Can you see high-resolution videos on lower resolution screens?
You might wonder whether you can watch a high-resolution video on a lower-resolution screen. For example, is it possible to use a 720p TV to watch a 1080p video? The answer is yes! Regardless of your screen’s resolution, you can watch any video on it, whether the video’s resolution is higher or lower. However, if the video you want to watch has a higher resolution than your display, your device converts the video’s resolution to one that fits your display. This is called downsampling.
For example, if you want to watch a video with a 4K resolution on a 720p screen, that video is shown at 720p resolution, because that is all that your screen can offer.
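A crude way to picture downsampling is to keep only every Nth pixel in each direction. Real video players use far smarter filtering than this, but the nearest-neighbor sketch below (our own illustration, not any player’s actual code) shows the idea of fitting a 2160-line frame onto a 720-line screen:

```python
def downsample(frame, factor):
    """Nearest-neighbor downsampling: keep every `factor`-th pixel
    in both directions. Real players use much better filters."""
    return [row[::factor] for row in frame[::factor]]

# A dummy 2160 x 3840 "frame" (rows of pixel values), scaled down
# by a factor of 3 to fit a 720p screen:
frame_4k = [[0] * 3840 for _ in range(2160)]
frame_720p = downsample(frame_4k, 3)
print(len(frame_720p), len(frame_720p[0]))  # 720 1280
```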
What is the Aspect Ratio?
The term aspect ratio was initially used in motion pictures, indicating how wide the picture was in relation to its height. Movies were initially in 4:3 aspect ratio, and this carried over into television and early computer displays. Motion picture aspect ratio changed much more quickly to a wider screen, which meant that, when movies were shown on TV, they had to be cropped or the image had to be manipulated in other ways to fit the TV screen.
As display technology improved, TV and monitor manufacturers began to move toward widescreen displays as well. Originally “widescreen” referred to anything wider than the typical 4:3 display, but it quickly came to mean a 16:10 ratio and later 16:9. Nowadays, nearly all computer monitors and TVs are only available in widescreen, and TV broadcasts and web pages have adapted to match.
Until 2010, 16:10 was the most popular aspect ratio for widescreen computer displays. However, with the rise in popularity of high-definition televisions, which used resolutions such as 720p and 1080p and made those terms synonymous with high definition, 16:9 became the standard high-definition aspect ratio.
Depending on the aspect ratio of your display, you can use only resolutions that are specific to its width and height. Some of the most common resolutions that can be used for each aspect ratio are the following:
- 4:3 aspect ratio resolutions: 640×480, 800×600, 960×720, 1024×768, 1280×960, 1400×1050, 1440×1080, 1600×1200, 1856×1392, 1920×1440, and 2048×1536.
- 16:10 aspect ratio resolutions: 1280×800, 1440×900, 1680×1050, 1920×1200, and 2560×1600.
- 16:9 aspect ratio resolutions: 1024×576, 1152×648, 1280×720 (HD), 1366×768, 1600×900, 1920×1080 (FHD), 2560×1440 (QHD), 3840×2160 (4K), and 7680×4320 (8K).
Is there a relation between aspect ratio and display orientation?
The display orientation refers to how you look at a screen: the most common screen orientations are landscape and portrait. Landscape orientation means that the width of the screen is larger than its height, while portrait orientation means the opposite. Most large screens, such as the ones on our computers, laptops, or TVs, use landscape orientation. Smaller screens, such as the ones on our smartphones, are normally used in portrait mode but, because their size allows you to easily rotate them, they can also be used in landscape mode. The aspect ratio of a screen describes the ratio of its longer side to its shorter side. Consequently, the aspect ratio tells you the ratio of width to height when you look at the screen in landscape mode. The aspect ratio is not used to describe screens (or any rectangular shapes) in portrait mode.
In other words, you could say that an aspect ratio of 16×9 is the same as 9×16, but the latter is not an accepted form of referring to aspect ratio. However, you can refer to the screen resolution in both ways. For example, a resolution of 1920×1080 pixels is the same as 1080×1920 pixels; it is just that the orientation differs.
How does the size of the screen affect resolution?
A 4:3 TV can show a widescreen movie with black bars at the top and bottom of the screen, but this kind of adjustment does not make sense for a computer monitor, so Windows does not offer a letterboxed widescreen mode as a display choice. You can still watch movies with black bars, as if you were watching a TV screen, but that is handled by your media player, not by Windows.
The most important thing is not the monitor size, but its ability to display higher resolution images. The higher you set the resolution, the smaller the images on the screen are, and there comes a point at which the text on the screen becomes so small it is not readable. On a larger monitor it is possible to push the resolution very high indeed, but if that monitor’s pixel density is not up to par, you won’t get the maximum possible resolution before the image becomes unreadable. In many cases, the monitor does not display anything at all if you tell Windows to use a resolution that the monitor cannot handle. In other words, do not expect miracles out of a cheap monitor. When it comes to high-definition displays, you definitely get what you pay for.
Do you have any other questions about screen resolutions?
If you are not technically inclined, it is easy to be confused by so many technicalities about displays and resolutions. Hopefully, this article has helped you understand the most essential characteristics of a display: its aspect ratio, its resolution, and its type. If you have any questions on this subject, do not hesitate to ask in the comments section below.