Why the 1366x768 laptop resolution is so common
Differences in resolution can create noticeable differences in how fonts and icons are rendered. If you plan to hook a laptop up to an external monitor, or to use a desktop alongside it, this will come into play. The difference in resolution is often less noticeable while watching films or online videos. As we said, a higher resolution gives you more pixels to work with.
That matters when your laptop has one resolution and a connected monitor uses another. A higher-resolution monitor has more pixels, and thus greater visual detail, assuming the screen sizes are equal. We sit much closer to computer screens than to TV screens, which makes resolution differences less significant, though never unimportant. A higher resolution also produces smaller fonts and shows more of an open website at once.
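As a rough illustration of how pixel density drives that extra detail, here is a small Python sketch. The 15.6-inch panel size is an assumption for the example, not a figure from the article:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Two common resolutions on a hypothetical 15.6-inch panel:
print(round(ppi(1366, 768, 15.6), 1))   # ~100.5 PPI
print(round(ppi(1920, 1080, 15.6), 1))  # ~141.2 PPI
```

At the same physical size, the 1920x1080 panel packs about 40% more pixels into each inch, which is exactly the "greater visual detail" described above.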
If you find the text too small to read comfortably, you can use software scaling to enlarge it. A higher resolution also gives you the option of seeing far more information at any given time than would fit at a lower resolution. In a situation like this, you might rightly wonder which resolution is better suited to you, and how big the difference really is.
However, there are distinct disadvantages to pairing a higher resolution with lower-powered hardware. This is a significant factor, and many buyers do not weigh it carefully enough. Higher-resolution screens can be very hard on GPUs: if the GPU cannot keep up, you may have to lower graphics settings, which negatively impacts how a game looks.
Then comes the issue of frames per second, since many games, especially online ones, need a high FPS to play well. Moving from 1366x768 to 1920x1080 roughly doubles the number of pixels the GPU must render each frame, so you might expect, in theory, the framerate to suffer by half.
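To see why "suffer by half" is a reasonable first guess, compare the raw pixel counts. A minimal Python sketch, where the 60 FPS baseline is just an assumed example figure:

```python
# Naive FPS scaling with pixel count (illustrative only: real games
# are not purely fill-rate bound, so the real drop is usually smaller).
low = 1366 * 768         # 1,049,088 pixels per frame
high = 1920 * 1080       # 2,073,600 pixels per frame
factor = high / low      # ~1.98: nearly double the pixels
print(round(factor, 2))
print(round(60 / factor, 1))  # a 60 FPS game might drop toward ~30 FPS
```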
In some cases, you might not be able to run a game at all, because it becomes too much for your GPU to handle. This trade-off is a large part of why 1366x768 became a common screen resolution. The display aspect ratio of a device is the ratio of its width to its height, written as two numbers separated by a colon (:). Common aspect ratios for displays include 4:3, 16:10, and 16:9. Most laptop screens have a wide 16:9 display, which on budget laptops typically corresponds to 1366 x 768 pixels.
Strictly speaking, this resolution does not match the 16:9 aspect ratio exactly, but it is commonly marketed that way. The computer industry introduced 16:9 as a standard aspect ratio for laptops and computers; the reason for shifting away from the older 4:3 and 16:10 ratios was user demand for widescreen displays for movie viewing and game playing. Before long, almost all companies were producing devices with 16:9 displays.
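The mismatch is easy to check: reduce 1366x768 to its lowest integer ratio and compare the decimals. A quick Python sketch:

```python
from math import gcd

w, h = 1366, 768
g = gcd(w, h)                 # 2
print(f"{w // g}:{h // g}")   # 683:384, the exact integer ratio
print(round(w / h, 4))        # 1.7786
print(round(16 / 9, 4))       # 1.7778 -- close, but not identical
```

So 1366x768 is really 683:384, a hair wider than true 16:9.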
The manufacturing of devices with other aspect ratios progressively decreased. Many people prefer budget laptops over expensive ones because they use them for business work that demands performance, so resolution can be compromised in favor of performance; that is a big reason for the popularity of 1366x768. The wide display screen itself also pleases users. As for the number 1366 itself, the integer ratio is not significant in any way.
Expressed as a decimal, the ratio is roughly 1.7786, and plenty of other oddball ratios have shipped in real displays. Keeping the height at 768 also makes it easy to pillarbox applications designed to run well at 1024x768. But is 1366 really the closest they could get to a true 16:9 width?
@Kaiserludi: Odd numbers are really flaky to deal with, and in this case you would want to go slightly above the exact 16:9 width, not slightly below it. With a width just under it, you would have to cut off the left or right margin of a wide-screen movie, or scale it vertically to fewer than 768 pixels. @MarcksThomas: That neglects every aspect besides the frame buffer, which for common bit depths would round up anyway.
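The numbers behind that discussion can be reproduced directly; a small Python sketch:

```python
# Exact 16:9 width for a 768-pixel-tall panel, and the pillarbox
# margins when showing 1024x768 (4:3) content on a 1366-wide screen.
exact_w = 768 * 16 / 9
print(round(exact_w, 2))     # 1365.33 -- 1366 is the next even number up
margin = (1366 - 1024) // 2
print(margin)                # 171 black pixels on each side
```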
WXGA generally used 24-bit color stored in 32 bits, so you would need a 64 Mbit chip instead of a 32 Mbit chip, but the logic still applies. One answerer, Facundo Pedrazzini, had the same question because his computer did not support his TV's default resolution, and found the same explanation of why 1366x768 exists: it is all about costs. It's not just that people "like" powers of two; powers of 2 are genuinely convenient in the computer world, since a power of 2 is just a bit shift away, or an additional bit on the address bus.
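The chip-size arithmetic is easy to verify; a minimal Python sketch, treating WXGA here as 1366x768 at 32 bits per pixel (an assumption about the comment's figures):

```python
# Frame buffer size versus power-of-two memory chips (sizes in bits).
MBIT = 2**20

def framebuffer_bits(w, h, bpp):
    """Total bits needed to store one full frame."""
    return w * h * bpp

fb = framebuffer_bits(1366, 768, 32)
print(fb)                # 33570816 bits
print(fb <= 32 * MBIT)   # False: just over a 32 Mbit chip's capacity
print(fb <= 64 * MBIT)   # True: so the next power-of-two chip, 64 Mbit
```

A 32 Mbit chip holds 33,554,432 bits, and a 1366x768x32 frame needs 33,570,816, so the buffer spills into the next chip size up.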
To paraphrase Johnny, powers of 2 correspond to "how many bits." If you use a number that's not a power of 2, then you either need fractional bits (silly, of course) or you have hardware that's not fully usable.
For example, to address, say, 200 pixels you need 8 bits, but that wastes part of those 8 bits, because the same 8 bits can address 256 pixels.
So some people just don't like wasting those bits and move up to 256 pixels. @Johnny: Powers of two are convenient to work with, but in most of the graphics I have worked with, it is the horizontal resolution where it makes the most difference. That makes 1366 a very strange number, since it doesn't divide evenly by powers of two.
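The "how many bits" point can be made concrete; a short Python sketch:

```python
# Bits needed to address each pixel position along one axis.
def address_bits(n):
    """Smallest bit width that can index positions 0 .. n-1."""
    return (n - 1).bit_length()

print(address_bits(256))    # 8  -- a power of two uses its bits fully
print(address_bits(1024))   # 10 -- likewise
print(address_bits(1366))   # 11 -- but those 11 bits could address 2048
```

1366 needs 11 address bits yet uses only two thirds of their range, which is exactly the "hardware that's not fully usable" trade-off described above.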