First post, by m1so
In the last few years, some people seem to have been so impressed by DirectX 9/11 level graphics that there is a growing trend to proclaim that we are "there" when it comes to graphics quality, that it is already "good enough", that it is "CGI quality". Are people blind? Look at actual prerendered CGI art: http://gizmodo.com/9-of-the-most-photorealist … e-web-823379988 http://www.artstation.com/artwork/cg-garden-a … ion-still-video http://www.youtube.com/watch?v=8LYpUw3Rx0c . Does anyone seriously think we've "reached the top" with our jaggies, blurry sprites standing in for grass and leaves, and angular balloons in Bioshock Infinite that wouldn't feel out of place in a 90s title?
When I play a game, I always look at the walls/ground/grass from up close for a second to see what it REALLY looks like. It always gives the same blurry, filtered look. No real detail. Many people who think games are CGI quality are really comparing them to something 20 years old and prerendered, like Myst. The same goes for claims that "prerendered graphics are obsolete, we can produce render-quality graphics in realtime now", which is completely wrong. Sure, no one wants FMV monstrosities in the style of Sewer Shark to come back, but prerendering was great for adventure games and could produce nearly photorealistic results today.
Bottom line, graphics still look crap. You might say they're good enough, but most people in the SNES era would probably have said the same thing.