Scali wrote:REYES is most definitely *NOT* raytracing though.
It's basically polygon rasterization. RenderMan promoted the use of things like shadowmapping and cubemaps for reflection/refraction effects.
At first, yes, and for this reason it suffered when doing proper GI and true reflections (you could tell from the results o.0)... basically anything that required a 'bounce'; rasterizers can't 'bounce'. I don't know what it does now, but even back then it incorporated a form of stochastic ray casting iirc (although you're probably right, that only came about with Cars, a lot later than I thought 🙁). But yes, fair enough, RenderMan traditionally isn't a raytracer... it always has been, and may always be, this weird hybrid thing (not real-time).
Scali wrote:There are a number of downsides to raytracing, which is why it never was very useful even for offline rendering.
o.0 ... Are we talking 'real-time'/60FPS here? Offline is the only useful...erm...use of it.
Scali wrote:
Some downsides include:
- Performance is awful. Acceleration structures only get you so far, because they don't really work for animated geometry, especially not with NURBS-based animation.
- Controlling quality is difficult as well. Since each ray is essentially 'isolated', you can't perform any kind of texture filtering based on gradient deltas (which are basically just partial derivatives). This makes things like trilinear or anisotropic filtering very troublesome. Usually it's just handled with brute force: apply a lot of supersampling to compensate.
waaaa.... why do you need anisotropic filtering though? Isn't it basically an interpolation to compensate for an under-sampled pixel? More samples, no problemo! High sample counts certainly aren't as taboo as they appear to be with rasterisers... rasterisers essentially sacrifice quality in favour of speed, and everything else they do is an attempt to regain that quality and photographic accuracy. Clutching at straws if you ask me. As performance increases, the desire to sacrifice this accuracy will diminish 😀.
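For what it's worth, the filtering point can be sketched in a few lines of Python (everything below is a toy illustration with made-up helper names, not anyone's actual renderer code): a rasterizer picks a mip level from screen-space UV derivatives, while a lone ray has no neighbours to difference against, so it falls back to averaging jittered samples over the pixel footprint.

```python
import math
import random

def mip_level(du_dx, dv_dx, du_dy, dv_dy, tex_size):
    """Rasterizer-style filtering: choose a mip level from screen-space
    UV derivatives (the 'gradient deltas' a lone ray doesn't have)."""
    rho = max(math.hypot(du_dx * tex_size, dv_dx * tex_size),
              math.hypot(du_dy * tex_size, dv_dy * tex_size))
    return max(0.0, math.log2(max(rho, 1e-8)))

def supersample(shade, u, v, footprint, n=16, seed=0):
    """Raytracer-style brute force: average many jittered shading samples
    over the pixel footprint instead of filtering the texture."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += shade(u + (rng.random() - 0.5) * footprint,
                       v + (rng.random() - 0.5) * footprint)
    return total / n

# One texel per pixel on a 256x256 texture -> mip 0 (no minification needed).
level = mip_level(1 / 256, 0.0, 0.0, 1 / 256, 256)
```

The catch, of course, is that `supersample` needs many `shade` evaluations per pixel to match what the derivative trick gets almost for free.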
Always the same arguments, usually centred around performance? Is there any other weapon in your arsenal against ray tracing? Blah, doubt it... mark my words... ray/path tracing and real photorealism will reign supreme. Probably not for a while, probably not in my lifetime... we will probably have the processing capability to do real-time, fully fledged 1024-sample path tracing and still use poncy rasterisers with oodles of shaders... but one day we'll turn around and say... OMG, this is like, so fake... what's that? I can have the real deal at 60 FPS rather than the faked one at 1000000 FPS? I want the real deal... coz I get all those realistic effects that *still* haven't been faked. All those things a rasterizer ultimately can't do.
Did I mention FAKE!
Scali wrote:For some reason, raytracing is the algorithm that people associate with photo-realistic rendering and CGI in movies. But in reality most movies are done with RenderMan or similar technology, not raytracing.
Erm.. that's because it *is* photo-realistic rendering! It models the propagation of light as photons. You know, light... that thing a rasterizer fakes, because it decides light is too hard to actually model, so it dumbs it down and makes it 'good enough'... then adds 'per-pixel' operations to do 'fakes', because it doesn't allow the user to manipulate anything outside built-in extensions... but it still can't bounce... dang! Here's a phrase that isn't in a rasterizer's vocab: 'Glooooballll Illlumina-shon'. And I don't want to hear about any sort of SSAO, because to be frank... it looks fake.
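To make the 'bounce' point concrete, here's a deliberately tiny 'furnace'-style sketch (a made-up toy scene, not any real renderer): a diffuse surface sitting inside a uniformly emitting environment. With zero bounces you only see direct light; letting light recurse brightens the result toward the analytic 1/(1-albedo) limit, and that extra energy is exactly the indirect contribution a plain rasterizer never computes.

```python
def radiance(albedo, env_emission, max_depth):
    """Toy 'furnace' scene: a diffuse surface inside a uniformly emitting
    environment. Each level of recursion is one light bounce."""
    if max_depth == 0:
        return 0.0
    # Direct light seen at this hit, plus attenuated light from the next bounce.
    return env_emission + albedo * radiance(albedo, env_emission, max_depth - 1)

direct_only = radiance(0.5, 1.0, 1)   # no bounces: just the direct 1.0
with_gi     = radiance(0.5, 1.0, 50)  # bounced light converges to 1 / (1 - 0.5) = 2.0
```

In this toy setup, global illumination literally doubles the brightness — which is the sort of thing you can only approximate with baked or screen-space fakes in a rasterizer.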
Scali wrote:The liquid metal robot in Terminator 2? RenderMan with environment-mapping. Not raytracing.
T2 eh? Looks like The Abyss to me o.0.. Back then, yes (turn of the 90's)... special FX houses were still very 'creative' with their technologies (leveraging what they could without resorting to the already well-known but very time-consuming/costly raytracing) and things were a lot different. However, come Jurassic Park (93-ish), ILM were rendering with ray/path tracers; for 'The Fifth Element', Digital Domain had their own tools (nuke3D); Jim Henson's Creature Shop had their own in-house renderer around 97/98 iirc (and there was that flying owl in the opening of Labyrinth back in the 80's. I'm pretty sure that wasn't rasterized... although you'll probably correct me on that one o.0); DreamWorks had their own (and now use Arnold). All of these perform ray casting (which is fundamentally what a ray/path tracer does), and now pretty much anything not cartoony/'pixarlated', i.e. non-RenderMan, is done this way... And even now, many frames are compositions of multiple layers (like AO, particles (from volumetric rendering), diffuse interreflection etc); a lot of these 'baked' layers are constructed by modelling light *as is*, not a rasterized approximation 🤣. It's not like everything to be rendered is put into the same scene, relying on the capabilities of the render engine to produce the final image.
Scali wrote:RenderMan did get raytracing as an optional 'effect', and the movie Cars was the first to feature this effect, but it was mainly used for close details of reflections, refractions and such (cars have a lot of chrome etc). For the most part they still used cubemaps.
There's a nice paper on that: http://graphics.pixar.com/library/RayTr ... /paper.pdf
Yes.. apart from the obvious Buzz Lightyear helmet reflection/refraction requirement in Toy Story (environment map fail)... the content of their movies didn't require this effect... they could get away without it. tbh they could probably have gotten away without it in Cars too, if you ask me. But then again the whole talking-car thing ruined the realism for me anyhow. o.0
Either way, it's good! Because now reflections are NOT fake.
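The difference is easy to see in code. A toy sketch (hypothetical helper names, nothing from RenderMan): both approaches start from the same mirror direction R = D - 2(D·N)N, but an environment map indexes a cube face by direction alone, ignoring where on the surface the reflection originates, whereas a traced reflection marches that ray from the actual hit point into the actual scene geometry.

```python
def reflect(d, n):
    """Mirror direction R = D - 2(D.N)N for incident direction D and unit normal N."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def cubemap_face(r):
    """Environment-map 'fake': pick a cube face purely by direction,
    ignoring where on the surface the reflection originates."""
    axis = max(range(3), key=lambda i: abs(r[i]))
    return ('+' if r[axis] >= 0 else '-') + 'xyz'[axis]

# A traced reflection would instead intersect the ray
# (hit_point, reflect(view_dir, normal)) against the real scene geometry,
# so nearby objects show up correctly instead of a baked panorama.
r = reflect((0.0, 0.0, -1.0), (0.0, 0.0, 1.0))  # head-on hit bounces straight back
```

That position-independence is exactly why cubemap reflections on chrome look 'off' up close — everything reflects as if it were infinitely far away.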
Scali wrote:
spiroyster wrote:
This processing power of the GPU may once have been used for 'shading', but they are no longer just that, so 'shader' is not only an irrelevant description now, but also a misleading one imo.
Using a *G*PU for anything non-graphics-related is misleading as well, so where do you draw the line? 😀
ah, touché! Dunno... the line would be drawn in a ray-traced/non-rasterized way anyhow, so it doesn't matter.
Scali wrote:
Sounds like something both MS and nVidia wished they came up with ... har har
(yeah I didn't realise the *khronology*)
Scali wrote:
Microsoft and nVidia developed HLSL together. Cg is basically NV's attempt to add 'HLSL' to OpenGL as well. GLSL didn't exist yet at that time.
NV's compiler could basically compile 'HLSL' shaders to OpenGL ARB shader extensions (which were assembly-like). So it was generic for any SM2.0 hardware. For NV there were of course special extensions to the language so you could optimize for NV hardware and make full use of their shader extensions.
For some reason, instead of adopting and standardizing Cg for OpenGL (which could have saved us a LOT of trouble), they decided to (poorly) re-invent the wheel and come up with GLSL as the 'official' OpenGL shading language.
That put Cg in the same position as AMD's Mantle: nobody is going to touch it.
Yeah, because it wasn't an open standard... t'was an nVidia standard. Khronos, the cartel formerly known as ARB, likes open, 'peer'-reviewed, Che Guevara-wearing standards. Decided on by comrades for the greater good of the industry... not capitalist 'where the fuck do you think I want to go today/the way it is hopefully played' pigs!
Of course it's going to be rejected o.0
Scali wrote:Except it wasn't AMD's idea to begin with.
Consoles had low-level APIs for ages, and Microsoft and Sony developed their own APIs for AMD hardware long before we heard anything of Mantle.
AMD basically 'borrowed' the ideas they got from MS and Sony's API and rehashed it into their own 'DX12-lite'. Then started to market this vapourware like crazy, since MS hadn't officially announced DX12 yet, even though they were already working on it before Xbox One (I believe that Xbox One was meant to be launched with DX12, but it wasn't ready in time. So instead they launched it with DX11 + an extra low-level API layer to do pretty much the same thing. Which is also why MS specifically said that DX12 wasn't going to bring any gains for Xbox One. The main advantage was that the API was brought to the PC as well, allowing easier cross-platform development between Xbox and PC).
If it really was AMD's idea, then MS and Sony would simply have used Mantle, instead of developing their own APIs.
But if you look at DX12, it supports various features that Mantle does not, yet both NV and Intel have support for these features in hardware (and AMD does not). How is that possible if it was AMD's idea?
Heck, even the feature of 'async compute' can be traced back to CUDA's 'Hyper-Q', long before Mantle, Vulkan or DX12 were around.
Yeah, I appreciate that.. I meant 'on the PC' platform; granted, these console systems are tightly integrated with the provided hardware APIs. A single vendor provides the interfaces/access to multiple areas of a bespoke system... a system which does more than draw stuff and 'compute', and the *G*PU has for a while been usable in this general-purpose fashion, just not through a single 'unified' API for compute and draw. Of course it was going to come along at some point, from someone (be it the DirectX 'dream team'), since all the architectures appeared to be pointing that way for a number of years prior (I think it was about 2006/7 when I started hearing about GPGPU? Again, I may have been late to the party???).
And tbh, you have certainly proved that this hardware has been pushed in more 'generic' ways for a number of years? Code flipping since 2003? 2006? Why has it taken so long for someone to bring this unified API to our table? What are these Sony and MS console APIs of which you speak? Not the PS3 one (RSX == GeForce, and not nVidia-specific afaik) o.0? How do they differ? Why wasn't Mantle used? What improvement do they offer?
But yeah, in all honesty, I've never used HLSL/DX, so I can't compare. For all I know it's some mystical instruction set that grants prosperity and wealth to all those that use it. And yes, I love ray tracing and have been living in hope since 2001-ish.
FAKE!