shevalier wrote on Today, 08:14:
lepidotós wrote on Today, 07:20:
only right now they use the CPU
Nobody wants to program and support hardware offloading of anything (AES encryption, network traffic, audio streams) except in the case of a critical shortage of general-purpose CPU power.
As soon as performance becomes sufficient, everything switches to software.
Sure, that's fair, but I can only imagine that scene complexity plays a part here. The software approach uses a relatively low-density voxel map of the scene (and so far it has mainly been demonstrated, link provided for the benefit of future readers, with simplistic scenes), which you'd have to recalculate each time the scene itself changes (e.g. a car moves or a box gets broken). Nothing crazy on its own, but if you're also simulating hundreds or thousands of other things at a time (say, NPCs to make the sounds? Not an unrealistic number for a Dynasty Warriors or city-simulation game; Dynasty Warriors: Origins in its later levels aims to hit over 10,000 individually simulated NPCs at a time), I can see it really starting to bog the CPU down, given how slowly CPU performance improves generation over generation, unless we start getting 12- or 16-core CPUs as the norm to offset the eventual stagnation. A rough sketch of the scaling is below.
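Just to make that scaling argument concrete, here's a toy back-of-the-envelope sketch (Python; every constant here is invented by me for illustration, nothing comes from the actual engine) of how incremental re-voxelization plus per-source tracing adds up:

```python
# Back-of-the-envelope sketch of the scaling argument above. All numbers are
# made up for illustration; nothing here comes from the linked engine.

GRID = 64                 # voxels per axis in a "low-density" scene map
FULL_REBUILD = GRID ** 3  # cells touched if the whole map is re-voxelized

def frame_cost(moving_objects: int, sound_sources: int,
               rays_per_source: int = 32, bounces: int = 4) -> int:
    """Rough operation count for one audio update.

    Assumes an incremental rebuild that only dirties cells near moved
    geometry; propagation cost grows linearly with active sources.
    """
    revoxel = min(moving_objects * 200, FULL_REBUILD)  # 200 cells/object: a guess
    trace = sound_sources * rays_per_source * bounces
    return revoxel + trace

# A quiet scene vs. a Dynasty Warriors-scale crowd:
print(frame_cost(moving_objects=20, sound_sources=30))      # 7,840 ops
print(frame_cost(moving_objects=2000, sound_sources=3000))  # 646,144 ops
```

The exact numbers are meaningless, but the shape of the curve is the point: the tracing term grows with every active source, so crowd-heavy scenes eat the budget fast.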
shevalier wrote on Today, 08:14:
Ray tracing in games promises to reduce development costs.
What the designer previously did statically is now calculated automatically at runtime.
This means the designer will be fired. And his salary will be redirected to the marketing budget. 😀
If a wave-tracing technology for audio is developed that automatically builds scenes (i.e., takes a game level, applies a bit of magic AI, and the output happens automatically), it will become widespread.
If, like in EAX, you have to manually specify a "water-metal-wood" preset for each surface, then everyone knows how that ends.
Yeah, it automatically makes a scene map. It does not, it seems, automatically create material profiles, according to a comment response from the guy who wrote the engine, so that would be a possible advantage for a well-designed hardware solution. A custom architecture designed specifically for that purpose could likely also access the scene geometry directly for the more simplistic RT implementations rather than asking for a voxelized representation of it, saving more CPU power than just the rays themselves.

Additionally, game development was just... less easy and less big-budget back then. I think for most games, like indies, the developers would be fine with simplistic RT (I should point out that a lot of developers currently put time into making each material sound as it should manually anyway; using that work as the basis for automatically assigned material presets would help the transition in the early days), and the super-big-budget ones would likely have someone on staff whose job it is to do that sort of thing. I mean, with budgets in the hundreds of millions, they can afford it. They already have a few people doing lighting even with RT, either for people running on machines without RT (I think the GTX 1660 is still around the second or third most popular graphics card at this point, and even those with 3060s and 4060s might still turn RT off for better performance) or just to split up the workload of scene composition, light placement, and color palette so it takes less time.

Plus, even if you have to define the materials, it still helps significantly with the interactions between materials, the same way ray tracing speeds up placing lights somewhat but really speeds up, or outright enables, how lights interact with other lights and the rest of the world, including the player. Instead of having to account for, say, wood, metal, metal hitting wood, and wood hitting metal, you could have the wavetracing core handle how those interact, as in the sketch below.
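Something like this toy sketch is what I mean (Python; the material names, coefficients, and descriptors are all made up by me, not any engine's real API): define each material once and derive every pairwise interaction, instead of hand-authoring all N×N combinations:

```python
# Hedged sketch of the "define materials once, derive the pairs" idea.
# The coefficients below are illustrative, not from any real engine.

MATERIALS = {
    #          (absorption, hardness) -- made-up acoustic descriptors
    "wood":  (0.30, 0.4),
    "metal": (0.05, 0.9),
    "water": (0.60, 0.1),
}

def impact_profile(a: str, b: str) -> dict:
    """Derive an impact sound profile for any material pair.

    Instead of hand-authoring wood-on-metal, metal-on-wood, etc.,
    combine the two single-material profiles, so N materials give
    you N*N interactions for free.
    """
    abs_a, hard_a = MATERIALS[a]
    abs_b, hard_b = MATERIALS[b]
    return {
        "brightness": (hard_a + hard_b) / 2,  # harder pair -> sharper transient
        "damping": max(abs_a, abs_b),         # the softer surface soaks the ring
    }

print(impact_profile("metal", "wood"))
```

With that, adding an eleventh material costs one profile, not twenty-one new pair recordings.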
The Serpent Rider wrote on Today, 07:33:
A wavetracing "accelerator" would be the final nail in Creative's coffin. It's definitely something beyond Kickstarter scale (for a slowly dying, long-forgotten hardware company, anyway) too.
And yeah, I think the thing I mentioned earlier, a card with a hardware synth on it, is significantly more likely for a company in Creative's current shoes. It would probably be interesting to musicians; I'd pick one up if it were made for that purpose. Of course, if it's just nostalgia-bait and they're doing it in software, there's basically no reason to go for it, but you know, that's the best-case realistic scenario I can see. I have a keyboard that sends out MIDI over USB (type C) that could easily be used as a controller for a hardware synth, something like the sketch below. Plus, they can justify a couple hundred dollars (maybe $229 would be reasonable?) for it given the prices of synths that have attached keyboards.
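For what it's worth, the host-side plumbing for that is trivial. Here's a rough sketch using the real mido Python library (pip install mido python-rtmidi); send_to_card() is a hypothetical stand-in for whatever driver call such a card would actually expose:

```python
# Sketch: forward note events from a USB MIDI keyboard to a hardware synth.
# mido is a real library; send_to_card() is a hypothetical placeholder.
import mido

def send_to_card(event: mido.Message) -> None:
    # Stand-in for the card's driver call; here we just log the event.
    print("to synth:", event)

with mido.open_input() as keyboard:  # opens the default MIDI input port
    for msg in keyboard:             # blocking iterator over incoming messages
        if msg.type in ("note_on", "note_off"):
            send_to_card(msg)
```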
But hey, regarding taking the risk, Creative is going to die if it keeps its current trajectory anyway, so what's the harm? That it shuts its doors in 2029 as opposed to 2034? It dies in two out of three scenarios, and the third isn't the one where they rest on their long-withered laurels. In terms of audience interest, here's a very simplistic implementation of wavetracing in Minecraft (based on the previously linked video; a lot easier to implement, and completely appropriate given it's, y'know, Minecraft) that's gotten 22,000 downloads in about three months.