VOGONS



Reply 60 of 100, by Hoping

Rank: Oldbie
TrashPanda wrote on 2022-01-29, 17:56:

Exactly and nVidia wants to push TDP even further for Hopper ..they are likely to push stock power draw well past 600 watts per GPU since they are using MCM, we might have to start using AIO cooled GPUs, I know an AIB is making a 1kw 3090ti for overclocking.

If that's true, what is wrong with nVidia? They want to make an ENIAC GPU with 174kW power consumption ;) I only hope that AMD doesn't follow nVidia like always, and that they realize the energy crisis we are living through, at least in Europe, and aim for energy-efficient GPUs, and cheaper ones if possible; they are expensive enough with a cooler for 300W.

Reply 61 of 100, by TrashPanda

Rank: l33t
Hoping wrote on 2022-01-29, 18:16:
TrashPanda wrote on 2022-01-29, 17:56:

Exactly and nVidia wants to push TDP even further for Hopper ..they are likely to push stock power draw well past 600 watts per GPU since they are using MCM, we might have to start using AIO cooled GPUs, I know an AIB is making a 1kw 3090ti for overclocking.

If that's true, what is wrong with nVidia? They want to make an ENIAC GPU with 174kW power consumption ;) I only hope that AMD doesn't follow nVidia like always, and that they realize the energy crisis we are living through, at least in Europe, and aim for energy-efficient GPUs, and cheaper ones if possible; they are expensive enough with a cooler for 300W.

They have hit the same issue AMD and Intel hit with CPUs: going faster with higher TDPs wasn't providing more performance, but they didn't have an alternative at the time, so they kept pumping up speed and power to brute-force performance. GPUs are now facing the same issue, and they don't have a way out of it yet, so they are going to brute-force performance until they do.

MCM is coming, but the first generation of it will still be using the previous generation's cores, just in a power-hungry MCM format. Hopefully they can start lowering TDP again as MCM starts scaling better with GPU cores designed to take advantage of it, much like how both AMD and Intel have switched to more efficient, high-IPC multi-core CPUs that they don't need to brute-force. (They can still brute-force them, and you get the 5950X and 12900K, which are honestly processing monsters with far more CPU power than any one user needs.)

Reply 62 of 100, by The Serpent Rider

Rank: l33t++
Hoping wrote:

Don't say it too loud, let them continue to think

Whatever Terascale was in the DX10 era and early DX11, it hasn't been that for a long time now. Radeon 4xxx support was dropped rather quickly, unlike Nvidia's Tesla family of chips. 5xxx/6xxx are only decent if you live in the past; otherwise, GCN 1.0+ cards are magnitudes better when it comes to driver support. GCN 1.0 had official driver support for almost a decade.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 63 of 100, by TrashPanda

Rank: l33t
The Serpent Rider wrote on 2022-01-29, 19:23:
Hoping wrote:

Don't say it too loud, let them continue to think

Whatever Terascale was in the DX10 era and early DX11, it hasn't been that for a long time now. Radeon 4xxx support was dropped rather quickly, unlike Nvidia's Tesla family of chips. 5xxx/6xxx are only decent if you live in the past; otherwise, GCN 1.0+ cards are magnitudes better when it comes to driver support.

Everything below the RX 400 series has been dropped from driver support, so they are all pretty much in the same legacy boat now. nVidia did much the same recently, with everything before the GTX 900 series becoming legacy-only.

https://www.tomshardware.com/news/amd-retires … pre-rx-400-gpus

Reply 64 of 100, by RandomStranger

Rank: Oldbie
TrashPanda wrote on 2022-01-29, 17:56:

Exactly and nVidia wants to push TDP even further for Hopper ..they are likely to push stock power draw well past 600 watts per GPU since they are using MCM, we might have to start using AIO cooled GPUs, I know an AIB is making a 1kw 3090ti for overclocking.

Not the first time they are pushing it. The first time I was aware there was a TDP limit (OEM), it was 300W, around 10 years ago. Now high-end GPUs officially have a 350W TDP. I've already reached the point where I don't know if slightly better graphics are worth all this. I mean, if I check the difference between games from 10 years ago and from today, I don't see a generation's worth of difference. I beat Ryse: Son of Rome with a 130W-TDP HD7850 and the game looked breathtaking. And it's not like there has been no development: an RX470 was twice as fast with a 120W TDP, and an RX6600 (non-XT) is 5x faster with the same TDP. But the games don't look 5x more breathtaking.

The GPUs are plenty powerful; the developers should do graphics smarter. My only hope is that the graphics card shortage forces them to.
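The efficiency point above can be sanity-checked with quick arithmetic. A rough sketch in Python, using only the approximate figures quoted in the post (performance normalized to the HD7850, TDPs as rated board power, not measured draw):

```python
# Approximate figures from the post: performance relative to an HD7850,
# and rated TDP in watts. These are rough forum numbers, not benchmarks.
cards = {
    "HD7850": (1.0, 130),
    "RX470":  (2.0, 120),
    "RX6600": (5.0, 120),
}

# Performance per watt, relative to the HD7850 baseline.
base_perf, base_tdp = cards["HD7850"]
baseline = base_perf / base_tdp

for name, (perf, tdp) in cards.items():
    ratio = (perf / tdp) / baseline
    print(f"{name}: {ratio:.1f}x the perf-per-watt of the HD7850")
```

By these numbers the RX6600 delivers roughly 5.4x the performance per watt of the HD7850, which is the point being made: efficiency improved enormously even while flagship TDPs climbed.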


Reply 65 of 100, by Hoping

Rank: Oldbie
The Serpent Rider wrote on 2022-01-29, 19:23:

Whatever Terascale was at the time of DX10 era and early DX11, it doesn't for very long time now. Radeon 4xxx support was dropped rather quickly, unlike Nvidia Tesla family of chips. 5xxx/6xxx are only decent if you live in the past, otherwise GCN 1.0+ cards are magnitudes better when it comes to driver support. GCN 1.0 had official driver support for almost a decade.

Maybe, but we can't have everything we want, since money is limited.

Reply 66 of 100, by cyclone3d

Rank: l33t++
TrashPanda wrote on 2022-01-29, 16:16:
Hoping wrote on 2022-01-29, 16:00:
appiah4 wrote on 2022-01-29, 11:31:

Terascale was fucking great.. HD4850/5850/6850 - you can't go wrong with any of these. Amazing workhorse cards for their time..

Don't say it too loud, let them continue to think they were bad so that everyone wants an Nvidia, and so they are cheaper 😉

Well, because of this thread it occurred to me to try some relatively new games on the HD 6970 (6950 BIOS-modded).
On that computer I am limited by the processor, a Phenom II X6 1100T; the lack of SSE4 above all is the problem, so I searched my collection for games that did not require SSE4.
Tales of Arise (2021): not a very demanding game, and it works in 1080p with everything at maximum without issues.
Resident Evil 2 (2019): runs in 1080p without problems.
Resident Evil 3 (2020): also very playable in 1080p, and in DX11 of course.
It must be said that I am not a fan of antialiasing, and I deactivate it whenever I can; I do not like what it usually does to the image.
Not many games, like I said, because of the processor, and I'm not going to blindly waste a lot of time installing only to find that a game doesn't work.
If a game supports DX11, a 6970 should be able to run it better than a 750 Ti, and better than the 1030 that is so expensive today.
However, a 750 Ti or a 1030 also don't have enough power to run new games at average quality, so supporting DX12 isn't the answer to everything either.

The HD 4000 series was truly amazing for its generation; the HD4890 was a true powerhouse, and the 4870x2 was a fire-breathing monster GPU that required a PSU upgrade to run the damn thing. Running two of them was like having the gateway to hell open in your PC. A pair of 4890s in Xfire was an amazing setup; I still have a pair in my collection, and went out of my way to grab a few spare Sapphire Vapor-X 4890s, they were that good.

HD 5000 was simply a refinement of 4000, faster and not quite as hot, but still really good. I own a 5970, which is the dual-GPU card, and it's a great GPU to play games with; it does run hot, but nothing like the 4870x2. The best setup was a pair of HD5850s in Xfire, since they were cheaper than 5870s and not as power hungry, so you didn't need a new PSU to run two of them. Lots of people had Xfire HD5850 setups, it was that popular; you could even grab a pair of 5850x2s and run quadfire on the cheap.

HD 6000... was garbage, more of a reactionary rebadge of HD5000 than anything new from AMD. It ran stupidly hot because it was HD5000 with a tweaked TDP, pushing that GPU core to its limits. It wasn't till the HD 7000 series that AMD truly had a competitor to nVidia.

The HD7000 series went on to become the early RX200 series, which became the RX300 series, which got a fourth rebadge as the RX400 series... IIRC the RX500 series was part rebadge, part new tech with Polaris; I never got in at the start of the RX500 series, so I can't remember. Just checked, and yeah, the RX500 series is a refresh of RX400. AMD just loves their refresh GPUs.

The HD6870 cards I have run extremely cool and are way less power hungry than the 5000 series.

A quick look-up shows that a 5870 is about 10% faster than a 6870, but uses about 30% more power.
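That 10%-faster / 30%-more-power comparison implies the 6870 is actually the more efficient card. A quick back-of-the-envelope check (the 10% and 30% figures are the rough numbers from the post, not measurements):

```python
# Relative to the 6870 (= 1.0 for both performance and power):
# the 5870 is ~10% faster but draws ~30% more power.
perf_5870, power_5870 = 1.10, 1.30

eff_5870 = perf_5870 / power_5870  # 5870 perf per unit power, 6870-relative
print(f"5870 efficiency vs 6870: {eff_5870:.2f}")
print(f"6870 perf-per-watt advantage: {(1 / eff_5870 - 1) * 100:.0f}%")
```

So by these figures the 6870 delivers roughly 18% more performance per watt, which matches the "runs extremely cool and way less power hungry" experience above.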

Yamaha modified setups and drivers
Yamaha XG repository
YMF7x4 Guide
Aopen AW744L II SB-LINK

Reply 67 of 100, by TrashPanda

Rank: l33t
RandomStranger wrote on 2022-01-29, 19:32:
TrashPanda wrote on 2022-01-29, 17:56:

Exactly and nVidia wants to push TDP even further for Hopper ..they are likely to push stock power draw well past 600 watts per GPU since they are using MCM, we might have to start using AIO cooled GPUs, I know an AIB is making a 1kw 3090ti for overclocking.

Not the first time they are pushing it. The first time I was aware there was a TDP limit (OEM), it was 300W, around 10 years ago. Now high-end GPUs officially have a 350W TDP. I've already reached the point where I don't know if slightly better graphics are worth all this. I mean, if I check the difference between games from 10 years ago and from today, I don't see a generation's worth of difference. I beat Ryse: Son of Rome with a 130W-TDP HD7850 and the game looked breathtaking. And it's not like there has been no development: an RX470 was twice as fast with a 120W TDP, and an RX6600 (non-XT) is 5x faster with the same TDP. But the games don't look 5x more breathtaking.

The GPUs are plenty powerful; the developers should do graphics smarter. My only hope is that the graphics card shortage forces them to.

RT and AI are doing the heavy lifting now, or at least that's what AMD and Nvidia are hoping. AI is still in its infancy, but it has huge benefits for graphics in general. RTX... the jury is still out on this one: while it looks downright amazing when done correctly (Metro Exodus Redux), it's a massive resource hog, and it might not be worth the power and cost. But you won't be able to get a modern GPU without RT hardware on it soon.

Eventually nVidia wants to go full path tracing and slowly phase out traditional raster, but that's many years off, with hardware that will be 1000x more efficient than what we have now.

Reply 68 of 100, by 386DX40

Rank: Newbie

Chuckle if you want, but I'm finding the GTX 745 4GB GDDR3 I paid US $40 for a while back surprisingly capable. I game on a 1366x768 32" TV (I love the low resolution on the big screen with big text, as my eyes are terrible), and on a 1st-gen Core i7 870, 12GB DDR3, Win10 LTSB 2015 system I am currently playing Serious Sam 4 at a combo of medium/high settings, Metro Exodus at medium settings, Wolfenstein II: The New Colossus at medium/high settings, and Doom 2016 at medium/high settings. I also play Far Cry New Dawn multiplayer with a friend at low/medium settings. All these games will fill the 4GB of memory, but at my low resolution I don't feel the GDDR3 is limiting the card's performance.

Keep in mind I don't care for AA, DOF, or motion blur, so those always get disabled. I also turn shadow quality down, and texture quality and AF up, and I limit my FPS to 30 to keep heat down. I have BIOS-modded the card to 1124MHz on the GPU and 1048MHz on the memory, altered the fan curve for more fan speed at higher temps, and redid the thermal paste with Arctic MX-5.

This is a Maxwell card, so you can use new drivers too, though I prefer version 442.01 as a stable and solid driver for everything I've tried. I see these cards all over eBay for $50 to $60 shipped too, so one may be worth trying out as a budget GPU that isn't hopeless.

Asus A7V8X-LA - Athlon XP 1800+ - 512MB - Geforce FX5200 128MB - SoundBlaster Live - 80GB HDD - Win98SE
DTK PKM-3331Y - Evergreen 5x86 133 - 16MB - WD90C31A 1MB ISA - ESS 1869 ISA - 2.5GB HDD - MS-DOS 6.22

Reply 69 of 100, by TrashPanda

Rank: l33t
386DX40 wrote on 2022-01-30, 03:37:

Chuckle if you want, but I'm finding the GTX 745 4GB GDDR3 I paid US $40 for a while back surprisingly capable. I game on a 1366x768 32" TV (I love the low resolution on the big screen with big text, as my eyes are terrible), and on a 1st-gen Core i7 870, 12GB DDR3, Win10 LTSB 2015 system I am currently playing Serious Sam 4 at a combo of medium/high settings, Metro Exodus at medium settings, Wolfenstein II: The New Colossus at medium/high settings, and Doom 2016 at medium/high settings. I also play Far Cry New Dawn multiplayer with a friend at low/medium settings. All these games will fill the 4GB of memory, but at my low resolution I don't feel the GDDR3 is limiting the card's performance. Keep in mind I don't care for AA, DOF, or motion blur, so those always get disabled. I also turn shadow quality down, and texture quality and AF up, and I limit my FPS to 30 to keep heat down. I have BIOS-modded the card to 1124MHz on the GPU and 1048MHz on the memory, altered the fan curve for more fan speed at higher temps, and redid the thermal paste with Arctic MX-5. This is a Maxwell card, so you can use new drivers too, though I prefer version 442.01 as a stable and solid driver for everything I've tried. I see these cards all over eBay for $50 to $60 shipped too, so one may be worth trying out as a budget GPU that isn't hopeless.

Anything based on Maxwell is worth grabbing if you can find it at a decent price; it was a very capable uarch. Both the GTX 745 and GTX 750 Ti are great little GPUs for budget gaming, though the 4GB 750 Ti does fetch a higher price right now; even at 150 AUD it's worth grabbing. (The 4GB 750 Ti is the Chinese-market variant and is worth getting over the 2GB 750 Ti.)

Reply 70 of 100, by BitWrangler

Rank: l33t++

SLI 960s in a 6th-gen i7 for under $500, does that sound alright? There might be an off-the-wall way to do that: Xeon blade servers with cores similar to the i7's, and mezzanine Quadro M3000 SE cards... lots of booby traps in getting the right config together, but the general idea seems plausible. Quad SLI might even be doable, with a dozen CPU cores or better, for a bit more than $500 but possibly scoring higher bang per buck; the mezzanine GPUs might go a bit cheaper in lots.

The big disadvantage (or reverse-psych advantage, for geek bragging rights) is that nothing you can buy at Best Buy would plug into it. Internally, keyboards would work, I hope.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 71 of 100, by 386DX40

Rank: Newbie

Also don't forget the Maxwell-based Quadro K2200 4GB GDDR5, though they go for $90 and above. The Quadro K620 2GB DDR3 is also Maxwell, and those can be had for $30-$50. I've messed with the K620, and again, with some overclocking it's surprisingly useful!

Asus A7V8X-LA - Athlon XP 1800+ - 512MB - Geforce FX5200 128MB - SoundBlaster Live - 80GB HDD - Win98SE
DTK PKM-3331Y - Evergreen 5x86 133 - 16MB - WD90C31A 1MB ISA - ESS 1869 ISA - 2.5GB HDD - MS-DOS 6.22

Reply 72 of 100, by BitWrangler

Rank: l33t++

OOooooo, the K620 might be the lo-pro and single slot card I've been looking for to slap into SFF rigs... thanks.

edit: where'd that $40 thinkcenter go now? 🤣

edit2: Those K2200 are nice too, basically going for what you can find a new 1010 for if you're lucky, but they blow past a 1030 and get halfway to a 1050, while also being single-slot.

edit3: The ThinkCentre was AWOL off Marketplace all weekend (while I had the opportunity to go see it) and just reappeared now... dunno if that's FB being a complete POS, or the seller suspending it while out of town or something. Anyhoooo... I guess I was dreaming that it was an i3; it's only a Core 2 model, and I'm stacked out with Core 2 stuff. But I'm seeing some i5 SFFs going in large quantities... man, if I were younger and more energetic I'd bang out some Minecraft/Fortnite boxes for the locals.

Last edited by BitWrangler on 2022-01-31, 01:03. Edited 1 time in total.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 73 of 100, by zyzzle

Rank: Member
386DX40 wrote on 2022-01-30, 03:37:

Chuckle if you want, but I'm finding the GTX 745 4GB GDDR3 I paid US $40 for a while back surprisingly capable. I game on a 1366x768 32" TV (I love the low resolution on the big screen with big text, as my eyes are terrible), and on a 1st-gen Core i7 870, 12GB DDR3, Win10 LTSB 2015 system I am currently playing Serious Sam 4 at a combo of medium/high settings, Metro Exodus at medium settings, Wolfenstein II: The New Colossus at medium/high settings, and Doom 2016 at medium/high settings. I also play Far Cry New Dawn multiplayer with a friend at low/medium settings. All these games will fill the 4GB of memory, but at my low resolution I don't feel the GDDR3 is limiting the card's performance. Keep in mind I don't care for AA, DOF, or motion blur, so those always get disabled. I also turn shadow quality down, and texture quality and AF up, and I limit my FPS to 30 to keep heat down. I have BIOS-modded the card to 1124MHz on the GPU and 1048MHz on the memory, altered the fan curve for more fan speed at higher temps, and redid the thermal paste with Arctic MX-5. This is a Maxwell card, so you can use new drivers too, though I prefer version 442.01 as a stable and solid driver for everything I've tried. I see these cards all over eBay for $50 to $60 shipped too, so one may be worth trying out as a budget GPU that isn't hopeless.

See, now that's being *smart*. I'm not chuckling; I'm in admiration of your efforts. Especially the overclocking and the sensible game settings (30 fps, eliminating all the "effects" crap, and going for highest texture quality instead). Pushing existing, reasonably-priced hardware to the limit and adjusting expectations is most commendable. You've saved A TON of money by doing so (perhaps thousands of dollars!), and you've no doubt also realized that there are so many games already released of *excellent* quality that even if you miss 5 or 10 of the most bleeding-edge games, you've missed nothing at all...

Reply 74 of 100, by TrashPanda

Rank: l33t
zyzzle wrote on 2022-01-30, 08:18:
386DX40 wrote on 2022-01-30, 03:37:

Chuckle if you want, but I'm finding the GTX 745 4GB GDDR3 I paid US $40 for a while back surprisingly capable. I game on a 1366x768 32" TV (I love the low resolution on the big screen with big text, as my eyes are terrible), and on a 1st-gen Core i7 870, 12GB DDR3, Win10 LTSB 2015 system I am currently playing Serious Sam 4 at a combo of medium/high settings, Metro Exodus at medium settings, Wolfenstein II: The New Colossus at medium/high settings, and Doom 2016 at medium/high settings. I also play Far Cry New Dawn multiplayer with a friend at low/medium settings. All these games will fill the 4GB of memory, but at my low resolution I don't feel the GDDR3 is limiting the card's performance. Keep in mind I don't care for AA, DOF, or motion blur, so those always get disabled. I also turn shadow quality down, and texture quality and AF up, and I limit my FPS to 30 to keep heat down. I have BIOS-modded the card to 1124MHz on the GPU and 1048MHz on the memory, altered the fan curve for more fan speed at higher temps, and redid the thermal paste with Arctic MX-5. This is a Maxwell card, so you can use new drivers too, though I prefer version 442.01 as a stable and solid driver for everything I've tried. I see these cards all over eBay for $50 to $60 shipped too, so one may be worth trying out as a budget GPU that isn't hopeless.

See, now that's being *smart*. I'm not chuckling; I'm in admiration of your efforts. Especially the overclocking and the sensible game settings (30 fps, eliminating all the "effects" crap, and going for highest texture quality instead). Pushing existing, reasonably-priced hardware to the limit and adjusting expectations is most commendable. You've saved A TON of money by doing so (perhaps thousands of dollars!), and you've no doubt also realized that there are so many games already released of *excellent* quality that even if you miss 5 or 10 of the most bleeding-edge games, you've missed nothing at all...

Quite a few of the most recent bleeding-edge games have been bleeding-edge bug fests, or only partially complete, so I would agree that you are missing nothing by skipping them. If anything, give games like CP2077 a few years and it'll be better than it ever was at release. I skipped The Witcher 3 till it had all its DLC and patches, and I feel I got the best experience I could for the 15 bucks it cost me to buy from GOG on sale; if anything, I feel it has too much to do, but hey, that's what they think people want.

I still haven't touched any of the latest AC games, and from all reports they are huge games with hundreds of hours of content (busy work) in each one; perhaps I'll get to them when Valhalla has all its content.

I might even build an SFF lounge-room gaming rig around a 4GB 750 Ti or a 1050 Ti; not sure yet which, but it'll be whichever one I can get cheapest.

Reply 75 of 100, by 386DX40

Rank: Newbie
zyzzle wrote on 2022-01-30, 08:18:

See, now that's being *smart*. I'm not chuckling, I'm in admiration of your efforts. Especially the overclocking, and the sensible game settings (3o fps, eliminating all the "effects" crap, and going for higest texture quality instead). Pushing existing, reasonably-priced hardware to the limit and adjusting expectations is most commendable. You've saved A TON of money by doing so (perhaps thousands of dollars!) and you've no doubt also realized that there are so many games already released of *excellent* quality that even if you miss 5 or 10 of the most-bleeding edge games, you've missed nothing at all...

Thanks! I've always been a 'low-end gamer'; even back in the day I pushed cards like the GeForce 2 MX, GeForce 6600, and GeForce 9600 GSO to their limits, and then tweaked game settings, either in the game or through config files, to make them run on hardware below recommended.

As far as I'm concerned, we are now in the glory days of used hardware. Unlike the 90s, where a new machine was needed at least every few years, now, if you know what you are doing, even an ancient Core 2 Quad can be completely usable for a majority of modern tasks. My current system in my signature runs fantastic, and I have about $220 in the whole system. I save money in some places: the power supply is a 280W Lite-On from an old Lenovo ThinkCentre that I paid $15 shipped for, yet take it apart and it's all Japanese capacitors. Also, 1st-gen and 2nd-gen Intel parts have basically bottomed out. I paid $25 shipped for my Xeon X3470 (same as an i7 870) and $30 shipped for my SuperMicro C7SIM-Q motherboard, new in box 🤣. An $11-shipped Dell CPU cooler keeps the CPU below 60 degrees C under load. DDR3 memory is practically free these days too.

[Attachment: myPC.jpg, 194.76 KiB, public domain]

The funny thing is, Serious Sam 4, which all I read is people complaining about stutters and frame drops on new high-end hardware (Ryzen/3080 level), runs smooth as glass on my ancient relic.

Back to the main subject: Quadro cards, like Xeons, are a way to get things cheaper, as many people don't think to look at them. For newer hardware, check out the Nvidia T400 2GB and T600 4GB, which are Turing-based and somewhat reasonably priced. There's good stuff on YouTube about them too.

Asus A7V8X-LA - Athlon XP 1800+ - 512MB - Geforce FX5200 128MB - SoundBlaster Live - 80GB HDD - Win98SE
DTK PKM-3331Y - Evergreen 5x86 133 - 16MB - WD90C31A 1MB ISA - ESS 1869 ISA - 2.5GB HDD - MS-DOS 6.22

Reply 76 of 100, by Almoststew1990

Rank: Oldbie

I'm currently running a Core 2 Quad Q6600 (2.4GHz) at 3GHz with an Nvidia 560 Ti, playing through my backlog on Windows 7. This PC is entirely adequate for my day-to-day browsing (I'm not one to have 80 tabs autoload when I open my browser), but it can also run all the Xbox 360 / early PS4-era games I missed. It can run GTA V and Dragon Age: Inquisition at 30fps.

My main PC is a 3700X with a 6800XT, but for some reason I keep firing up the C2Q. I just 'like' it, I guess!

But for the actual topic: some cards seemingly never lost their value; the 7970 has always been weirdly expensive, as has the 700 series. I'd say the 500 series or the AMD 6000 series is where the prices start dropping quite a lot.

Ryzen 3700X | 16GB 3600MHz RAM | AMD 6800XT | 2Tb NVME SSD | Windows 10
AMD DX2-80 | 16MB RAM | STB Lightspeed 128 | AWE32 CT3910
I have a vacancy for a main Windows 98 PC

Reply 77 of 100, by RandomStranger

Rank: Oldbie
Almoststew1990 wrote on 2022-01-31, 08:48:

But for the actual topic: some cards seemingly never lost their value; the 7970 has always been weirdly expensive, as has the 700 series. I'd say the 500 series or the AMD 6000 series is where the prices start dropping quite a lot.

GCN cards were always good for mining, so the HD7900 series held its value. When the time came for them to drop, mining had a bump, which kept prices high.


Reply 78 of 100, by BitWrangler

Rank: l33t++

By the way, I'll drop a note that Google Lens is becoming disturbingly* good at recognizing (clear) pictures of random GPUs, which means you can use it to snipe bargains, but this will not last long once everyone has caught on.

* Disturbingly because the mirror test, an animal recognizing itself in the mirror, is supposed to be a gold standard for self-awareness... so if AI runs on neural nets on GPUs, and it's getting really good at recognizing GPUs...

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 79 of 100, by RandomStranger

Rank: Oldbie

An AI recognizing a GPU is more similar to an animal recognizing part of a brain as part of a brain. I don't think this standard for self-awareness is applicable to AI.
