How many people actually use 1080p?

First post, by m1so


The Internet is filled with people saying stuff like "today's norm is to game at 1080p", but I don't actually know anyone who uses a 1080p display on their PC, so why are all benchmarks done at 1080p? Because "troo gamers" (those who think 100 fps is "unplayably slow" and game at 1600p) think it is "teh minimum norm" around the world? I want to replace my faulty ATI card with a newer, better one, yet NOWHERE do I see any 1600x900 benchmarks. NONE of my friends has a 1080p computer monitor, and they are all young people aged 15-20.

Reply 1 of 85, by Mau1wurf1977


I have four 1080p screens in my place. A 55" LED TV, a 27" LED monitor and two 23.6" LED monitors 😀 It is actually hard NOT to purchase a 1080p screen these days.

My website with reviews, demos, drivers, tutorials and more...
My YouTube channel

Reply 3 of 85, by Sol_HSA


When I was shopping for a monitor a few months back, I realized that 1080p is pretty much the only option out there. So here I am, with my first widescreen monitor ever, and it's actually kinda nice.

We'll see if my next one will be a 4K monitor. At the pace I update my monitors, that's actually likely.

http://iki.fi/sol - my schtuphh

Reply 5 of 85, by jwt27


I don't understand this "YouTube resolution naming scheme" at all. Why is the horizontal resolution suddenly not important anymore? Why is there no mention of the refresh rate? And why is progressive scan an important enough feature to mention?

To answer your question: I use 800x600@160Hz in games, and 2048x1536@80Hz on the desktop. So that would be 600p and 1536p respectively.
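For what it's worth, a quick throwaway sketch of how that label is usually derived (nothing official, and the hdtv_label helper below is just made up for illustration): the name keeps only the vertical pixel count plus a "p" for progressive scan, and throws the horizontal resolution and refresh rate away entirely.

def hdtv_label(width, height, refresh_hz):
    # Illustrative only: the label is just the vertical pixel count plus "p";
    # width and refresh_hz are discarded completely.
    return f"{height}p"

for w, h, hz in [(800, 600, 160), (2048, 1536, 80), (1920, 1080, 60)]:
    print(f"{w}x{h}@{hz}Hz -> {hdtv_label(w, h, hz)}")
# 800x600@160Hz -> 600p
# 2048x1536@80Hz -> 1536p
# 1920x1080@60Hz -> 1080p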

Reply 6 of 85, by m1so

Mau1wurf1977 wrote:

I have four 1080p screens in my place. A 55" LED TV, a 27" LED monitor and two 23.6" LED monitors 😀 It is actually hard NOT to purchase a 1080p screen these days.

Yeah, it is hard NOT to purchase such a screen, IF you live in one of the few countries in the world where people buy a new PC every year. I bought my PC 3 years ago and it has a 1600x900 screen.

Reply 7 of 85, by keropi


1920x1200 LCD here... those extra 120px are nice to have 😀

🎵 🎧 PCMIDI MPU , OrpheusII , Action Rewind , Megacard and 🎶GoldLib soundcard website

Reply 8 of 85, by F2bnp


I used to have a 20" 1600x900 Samsung monitor; needless to say, it wasn't the best purchase I ever made. I kept it for 3 years and by the end of 2012 it was really starting to show its age. I got myself an old SyncMaster T240HD from my parents and I'm now enjoying 1920x1200 with a 16:10 aspect ratio 😀.
1080p nowadays is just as common as people having old 17" and 19" 4:3 LCDs with a max res of 1280x1024. At least that's how it is here in Greece.

On my retro machines I use an iiyama Vision Master Pro 413 with a max res of 1600x1200 although I mostly use 1024x768 and 1280x1024.

Reply 9 of 85, by jwt27

F2bnp wrote:

1080p nowadays is just as common as people having old 17" and 19" 4:3 LCDs with a max res of 1280x1024.

That would be 5:4. Which I still think is the weirdest aspect ratio ever.
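If anyone wants to check the arithmetic, here's a quick snippet (just an illustration; the aspect() helper is made up here): reduce width and height by their greatest common divisor.

from math import gcd

def aspect(width, height):
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect(1280, 1024))  # 5:4  -- the common 17"/19" LCD resolution
print(aspect(1280, 960))   # 4:3  -- what a true 4:3 mode at 1280 wide would be
print(aspect(1920, 1080))  # 16:9
print(aspect(1920, 1200))  # 8:5, i.e. 16:10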

Reply 10 of 85, by m1so

jwt27 wrote:

I don't understand this "YouTube resolution naming scheme" at all. Why is the horizontal resolution suddenly not important anymore? Why is there no mention of the refresh rate? And why is progressive scan an important enough feature to mention?

To answer your question: I use 800x600@160Hz in games, and 2048x1536@80Hz on the desktop. So that would be 600p and 1536p respectively.

That's one of my "pet hates" as well. I'd say it is because of the HDTV hype, not just YouTube. In my opinion, they should have brought the ability to stream uncompressed video at lower resolutions; it would have looked a lot better. Blu-ray looks wonderful, but YouTube "HD" is just a mess of artifacts. Crash Bandicoot (240p, or more precisely 512x240) on my PSX on a CRT TV looks crisper than YouTube "HD".

Reply 11 of 85, by m1so

F2bnp wrote:

I used to have a 20" 1600x900 Samsung monitor; needless to say, it wasn't the best purchase I ever made. I kept it for 3 years and by the end of 2012 it was really starting to show its age. I got myself an old SyncMaster T240HD from my parents and I'm now enjoying 1920x1200 with a 16:10 aspect ratio 😀.
1080p nowadays is just as common as people having old 17" and 19" 4:3 LCDs with a max res of 1280x1024. At least that's how it is here in Greece.

On my retro machines I use an iiyama Vision Master Pro 413 with a max res of 1600x1200 although I mostly use 1024x768 and 1280x1024.

I don't see anything bad about a res of 1600x900. I get far better FPS in games at my native resolution. I don't understand the need to "always go forward at all cost" and probably never will.

Reply 12 of 85, by bristlehog


I've got a Dell 2007FP 1600x1200 20" LCD; it does the job for work and for ancient games.

Hardware comparisons and game system requirements: https://technical.city

Reply 13 of 85, by jwt27

m1so wrote:
jwt27 wrote:

I don't understand this "YouTube resolution naming scheme" at all. Why is the horizontal resolution suddenly not important anymore? Why is there no mention of the refresh rate? And why is progressive scan an important enough feature to mention?

To answer your question: I use 800x600@160Hz in games, and 2048x1536@80Hz on the desktop. So that would be 600p and 1536p respectively.

That's one of my "pet hates" as well. I'd say it is because of the HDTV hype, not just YouTube. In my opinion, they should have brought the ability to stream uncompressed video at lower resolutions; it would have looked a lot better. Blu-ray looks wonderful, but YouTube "HD" is just a mess of artifacts. Crash Bandicoot (240p, or more precisely 512x240) on my PSX on a CRT TV looks crisper than YouTube "HD".

Yes, I should have said "HDTV resolution naming scheme". On YouTube it's just a number with a "p" suffix, and there it has even less to do with resolution or image quality. "1080p" does not denote a compression algorithm, so it doesn't say anything about the image quality. And you can watch "1080p" on a tiny screen or "240p" full screen (at your desktop resolution), so it doesn't say anything about the resolution either.

Reply 14 of 85, by gulikoza


I've been using 1920x1200 for years now, for gaming as well, because the image is much crisper at native res (1600x1200 works OK too for 4:3 content, as the pixel size is the same). Actually I have a hard time working on a 4:3 display; I feel confined.
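A rough sketch of why the pixel size stays the same, assuming a 1:1 (unscaled, centered) mapping; the 24" panel size below is just an example figure, not any specific monitor.

from math import hypot

# Illustration only: assume a 24" 1920x1200 panel (example figure).
# Pixel pitch is fixed by the panel itself, not by the mode being shown.
panel_w, panel_h, diag_in = 1920, 1200, 24.0

pitch_in = diag_in / hypot(panel_w, panel_h)     # inches per pixel
print(f"pixel pitch: {pitch_in * 25.4:.3f} mm")  # ~0.270 mm

# A 1600x1200 image shown 1:1 just leaves 160px black bars on each side;
# every pixel is still the same physical size, so 4:3 content stays crisp.
bar_px = (panel_w - 1600) // 2
print(f"black bars: {bar_px}px per side, pixel size unchanged")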

The problem with today's displays (as opposed to a couple of years back) is that most of the stuff is 6-bit + FRC (even when it claims 8-bit colors) and of course the horrible 1920x1080 and 16:9 AR.

http://www.si-gamer.net/gulikoza

Reply 15 of 85, by jwt27

gulikoza wrote:

The problem with today's displays (as opposed to a couple of years back) is that most of the stuff is 6-bit + FRC (even when it claims 8-bit colors) and of course the horrible 1920x1080 and 16:9 AR.

Not to mention the flickering as a result of that, high black levels, low overall contrast, low refresh rates, high response times, having only one native resolution, backlight bleeding... and I could go on for a while.

Reply 16 of 85, by fillosaurus


From the beginning of '09 until last summer I had a 1680x1050 22". Then I bought my full HD 24" LED, which is even better. And my video card is still adequate for my gaming needs. I would have loved a 6850 or 6870, but I had to settle for a 6790. And I am not sorry; it is a decent card.

Y2K box: AMD Athlon K75 (second generation slot A)@700, ASUS K7M motherboard, 256 MB SDRAM, ATI Radeon 7500+2xVoodoo2 in SLI, SB Live! 5.1, VIA USB 2.0 PCI card, 40 GB Seagate HDD.
WIP: external midi module based on NEC wavetable (Yamaha clone)

Reply 17 of 85, by F2bnp

m1so wrote:

I don't see anything bad about a res of 1600x900. I get far better FPS in games at my native resolution. I don't understand the need to "always go forward at all cost" and probably never will.

I found 1600x900 way too limiting for desktop use, for example when viewing photos; 1280x1024 was better for desktop use IMO. Games of course looked nicer at 1600x900. However, they had been looking kinda... murky as of late. I found this especially annoying in Dishonored, which is heralded for its art style, but wow, that looked insanely bad at 1600x900. On my current monitor it is much better, although the game's textures and graphics overall leave something to be desired.
1600x900 was my native resolution by the way.

But anyway, m1so, I do not know how old you are, but from some of your other posts here you come across as 40+, though you're probably in your 30s.
You give the impression of someone who considers anything he doesn't own completely useless. In some other thread you were arguing that 20 fps is completely playable and that there is no difference between 30 and 60 fps.
Now here you are on the monitor subject, saying you don't understand the "always go forward at all cost" mentality. Whelp, I think almost everything has its use. I would agree with you to some extent; 4-way GeForce Titan SLI in order to play at 5760x1200 with 8x MSAA is probably not worth it. 🤣

But damn, I love my monitor now. 1920x1200 is very very awesome.

Reply 18 of 85, by m1so


I'm actually just 19. I know I give off kind of a hateful vibe, but I simply cannot stand most "gamers" on the internet. Most of them seem like people who think everyone's mommy or daddy will buy them a $500 video card. I think it is also because I don't see anyone around me throwing money around like most "rig builders" do. Of the people I know, most are lucky to have a cheap laptop with a 1280x960 screen and horrible integrated video. My family earns way more than the average person in this country, yet I don't throw money away like even some "average" people in the West do. It is easy to say "500 dollars is cheap" when 90 percent of your friends' parents barely make more than that in a month.

Anyways, is there any rational, non-spoiled reason for me to buy an "HD" monitor? Or is "progress" defined as "you must become a spoiled fucking brat"?

Reply 19 of 85, by dirkmirk


I don't know what you're on about. A 23/24" widescreen 1080p monitor can be bought for like $140. I bought my first 17" 1280x1024 LCD in 2004 for $650; one year later it was half the price. Now in 2013 you can buy a far, far superior monitor for less than half that price again. Good monitors are so cheap now, why limit yourself to old technology?

Nothing spoiled about a cheap 23/24" monitor today; it's the norm. Perhaps it's because you're too young to remember how expensive hardware used to be, and people simply see the value in these types of things. I'm continuously shaking my head thinking about how cheap things are these days.

Now for something I don't get: a few months ago I had lunch with my brother-in-law and he was telling me how he had just ordered a new Nvidia graphics card. The price? A staggering $1500! I think it was a Titan. I don't know if I could ever stomach spending $1,500 on a video card, no matter how rich I was.