VOGONS


Bought these (retro) hardware today


Reply 11120 of 52881, by nforce4max

User metadata
Rank l33t

Two more boards for my collection came in today: an Acorp 6A815EPD that I forgot I had bought, and a new-in-box Intel C440GX+ (absolutely massive box). Hopefully sometime soon I will get around to my first Pentium III Xeon build with that shiny new C440GX+ 😎

On a far away planet reading your posts in the year 10,191.

Reply 11121 of 52881, by BloodyCactus

User metadata
Rank Oldbie

MT-32, old-old style with PGA. Serial number in the 82 range... I _think_ it has the 1.05 ROMs, if the stickers are correct, and a reverb of 2.00. Need to wait for the PSU to arrive so I can power it up. Strange that two of the ROMs are socketed and the rest are soldered down to the PCB.

I think the reverb ROM is a 7E1 and the gate array is a 7G1, based on some info I found on the web.

So yeah, it's all very old/early revisions. I need to see if I can find some blanks to burn 1.07 and upgrade it.

Album with more pics:
http://imgur.com/a/9WZBu

[image: jSuL7Dol.jpg]
[image: TZBAgMyl.jpg]

--/\-[ Stu : Bloody Cactus :: [ https://bloodycactus.com :: http://kråketær.com ]-/\--

Reply 11122 of 52881, by kanecvr

User metadata
Rank Oldbie

Those 7950GX2 cards are awesome. I'm also looking for one, but as with the 8800 and 2900 cards, finding one that works is not easy. I've tested a couple by now, both with the same defect: one of the GPUs is cooked (the one sandwiched between the two PCBs).

Now if I find a working 7950GX2 and 8800GTX my retro-modern collection will be complete.

Reply 11123 of 52881, by clueless1

User metadata
Rank l33t
kanecvr wrote:

Now if I find a working 7950GX2 and 8800GTX my retro-modern collection will be complete.

There are a few 8800 GTX cards on eBay at the moment in the $40-$50 range that are listed as functional.

The more I learn, the more I realize how much I don't know.
OPL3 FM vs. Roland MT-32 vs. General MIDI DOS Game Comparison
Let's benchmark our systems with cache disabled
DOS PCI Graphics Card Benchmarks

Reply 11124 of 52881, by PhilsComputerLab

User metadata
Rank l33t++

There are SO many alternatives to the 8800 GTX. 8800 GT, 9800 GT, 9600 GT, 9800 GTX, 9800 GTS...

I really like the 8800 GT, it's single slot and packs quite a punch. The 9600 GT is surprisingly fast too.

And the HD4850, great little single slot card.

YouTube, Facebook, Website

Reply 11125 of 52881, by gdjacobs

User metadata
Rank l33t++
BloodyCactus wrote:

MT-32, old-old style with PGA. Serial number in the 82 range... I _think_ it has the 1.05 ROMs, if the stickers are correct, and a reverb of 2.00. Need to wait for the PSU to arrive so I can power it up. Strange that two of the ROMs are socketed and the rest are soldered down to the PCB.

I think the reverb ROM is a 7E1 and the gate array is a 7G1, based on some info I found on the web.

So yeah, it's all very old/early revisions. I need to see if I can find some blanks to burn 1.07 and upgrade it.

Album with more pics:
http://imgur.com/a/9WZBu

[image: jSuL7Dol.jpg]
[image: TZBAgMyl.jpg]

Nice! An original model.

All hail the Great Capacitor Brand Finder

Reply 11126 of 52881, by Kodai

User metadata
Rank Member
PhilsComputerLab wrote:

There are SO many alternatives to the 8800 GTX. 8800 GT, 9800 GT, 9600 GT, 9800 GTX, 9800 GTS...

I really like the 8800 GT, it's single slot and packs quite a punch. The 9600 GT is surprisingly fast too.

And the HD4850, great little single slot card.

The 9800 GTX is a little confusing to many because there was a hardware revision that would not work in SLI with the original. I was running triple SLI and one card died, so I sent it back to XFX (back when XFX still sold Nvidia-based cards) for warranty. They said they couldn't fix it, so they sent me a new one. I went to put the new one in and SLI refused to work on the third card. I spent the next several days trying to get it up and going and reinstalled the OS multiple times. The new card worked, but it just would not SLI with my other cards. XFX couldn't figure it out and sent me another replacement card. Still wouldn't work. This time I went to Nvidia and started asking, only to find out that the revised GPU required a change to the card's BIOS, and that the new cards would not SLI with the older cards. They told me it wasn't a big problem, as very few people would run into it. I told them that I was running three of them and that my replacement broke that setup. They told me they were sorry and asked if there was anything else they could help with.

That fiasco really ticked me off. Neither XFX nor Nvidia would replace the remaining two so I could regain my triple SLI. Over $1300 in video cards, and a revision kills full functionality, with no way to get back what was an important advertised feature. The third card would work, but only as a PhysX adaptor. So I had to buy two more new 9800 GTX cards to get my triple SLI back. Talk about not being a happy camper. I still have four of those cards and they still work just fine. I doubt I'll ever use them again, but because I spent so much time and money on them, I just can't toss them. 🤣

Reply 11127 of 52881, by PhilsComputerLab

User metadata
Rank l33t++

That's unfortunate 😢

The 8800 GTX was out of my budget, so when the 8800 GT came out, it was MY card, so to speak. Similar performance for a great price. The card is right up there with the Ti 4200 in terms of gaming value.

The re-branding did get out of hand though. I think I have two Gigabyte 9800 GTX+ cards for some SLI action in the future. I also have that dual-GPU card, the 9800 GX2 - quite a cool product.

YouTube, Facebook, Website

Reply 11128 of 52881, by Kodai

User metadata
Rank Member

Ah, that's it - the GTX+ (I had forgotten the "+" part) is the revision. The GTX will not SLI with a GTX+. You need to get one more for triple SLI (keep in mind that it's about 800-900 watts when pushing them, though), and a second GX2 for quad SLI (also over 800 watts when pegged out). The power I listed does NOT include the power for the rest of the system, so factor in a few hundred more watts for that. For either setup, you will want a good 1200 watt or higher PSU. Oh, and they run REALLY hot. You will need lots of directed airflow for those two cards in SLI. If you use an nForce mobo, make sure you put extra cooling on the northbridge, because SLI pushes its temps to their max quite often.
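For anyone tallying it up, here's a rough back-of-envelope sketch of that power budget in Python. The ~900 W for three pegged-out cards is the estimate from the post above, and the ~300 W stands in for the "few hundred more watts" for the rest of the system; both are assumptions, not measurements, and the helper function is purely illustrative.

# Back-of-envelope PSU sizing for the triple-SLI setup described above.
# Wattage figures are the rough estimates from the post, not measurements.

def psu_floor_watts(gpu_peak_w, rest_of_system_w):
    """Minimum PSU rating to consider: peak GPU draw plus the rest of the system."""
    return gpu_peak_w + rest_of_system_w

# Three 9800 GTX cards pegged out (~900 W) plus an assumed ~300 W for
# CPU, board, drives and fans:
print(psu_floor_watts(900, 300))  # 1200 -> hence the "1200 watt or higher" advice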

Reply 11129 of 52881, by kithylin

User metadata
Rank l33t
Kodai wrote:

Ah, that's it - the GTX+ (I had forgotten the "+" part) is the revision. The GTX will not SLI with a GTX+. You need to get one more for triple SLI (keep in mind that it's about 800-900 watts when pushing them, though), and a second GX2 for quad SLI (also over 800 watts when pegged out). The power I listed does NOT include the power for the rest of the system, so factor in a few hundred more watts for that. For either setup, you will want a good 1200 watt or higher PSU. Oh, and they run REALLY hot. You will need lots of directed airflow for those two cards in SLI. If you use an nForce mobo, make sure you put extra cooling on the northbridge, because SLI pushes its temps to their max quite often.

It's not just a BIOS change. The physical GPU core was revised for the 9800 GTX+ and actually shrank in size: the 9800 GTX was 65 nm with the G92 core, while the 9800 GTX+ used a whole new core, the 55 nm G92b. It wouldn't SLI with the older ones because, physically, it was not even remotely the same card.

See here: https://en.wikipedia.org/wiki/List_of_Nvidia_ … 89xxx.29_Series

Reply 11130 of 52881, by kanecvr

User metadata
Rank Oldbie
PhilsComputerLab wrote:

There are SO many alternatives to the 8800 GTX. 8800 GT, 9800 GT, 9600 GT, 9800 GTX, 9800 GTS...

I really like the 8800 GT, it's single slot and packs quite a punch. The 9600 GT is surprisingly fast too.

And the HD4850, great little single slot card.

I was never a fan of the 8800 GT with the single-slot cooler - hot and noisy little things with no overclocking headroom and poor reliability. In fact, the only working 8800 GT I have is a Palit that came with a big aftermarket Zalman cooler.

As far as I remember, all 8800 and 9800 cards suffer from the "RoHS alloy plague", as we liked to call it. The solder alloy is poor quality and the cards frequently run at 80°C or more, causing cracks in the solder balls, which leads to the cards artifacting and then failing completely. The dual-slot and aftermarket-cooled cards seem to have fared better over the years.

clueless1 wrote:
kanecvr wrote:

Now if I find a working 7950GX2 and 8800GTX my retro-modern collection will be complete.

There are a few 8800 GTX cards on eBay at the moment in the $40-$50 range that are listed as functional.

$40-50 is more than I'm willing to spend on an 8800 GTX unless it comes boxed with all accessories and I have a guarantee it works and has not been reflowed/reballed. I paid 24 euro for the rarer 2900 XT as it is, and in my opinion I overpaid a little, considering that for that money one can buy a 3870 or a 4850.

There's also no way to tell if the card works properly until it gets here, and I don't want to go through the return process if it's defective or has been repaired. I've been burned this way with local cards.

PhilsComputerLab wrote:

That's unfortunate 😢

The 8800 GTX was out of my budget, so when the 8800 GT came out, it was MY card, so to speak. Similar performance for a great price. The card is right up there with the Ti 4200 in terms of gaming value.

It was too rich for me as well, but back in the day I got a really, really cheap press sample from a friend; otherwise I would have kept the X1950 XTX I had before. Yes, the 8800 GT was cheap, but the 2900 XT was a little cheaper and performed slightly better in what I was playing at the time, so I went with that. I also loved how performance increased by 20-25% in Catalyst 8.x over the reference driver (Catalyst 7.2?). I remember getting well over 170 fps in the Doom 3 timedemo at 1280x1024 / Ultra.

Driver fragmentation for the ATI cards was really annoying. For example, I remember Catalyst 7.8 or 7.9 gave the best Doom 3 and Quake 4 experience, while newer games like Crysis ran a lot better with Catalyst 9.2.

Reply 11131 of 52881, by totalizator

User metadata
Rank Newbie

I have bought a Sparkle Nvidia TNT2 M64 32MB PCI (~10 euro) for my Compaq DeskPro EN 6333 SFF running Windows 98, and guess what - I'm more than happy with it. Definitely much better than the integrated Rage PRO 4MB. I have given up searching for a Voodoo2 at a decent price, so the performance of the M64 in this situation is superb for me.

I also got a VIA USB 2.0 controller that works on this setup too. I was quite surprised, as I had no luck with a Belkin PCI WLAN card previously - the PC wouldn't even boot.

Question here: the seller had two versions of the USB controller - red and black. I took the red one as it has more contacts on the connector, and I assumed it would be more likely to work on older machines. Can you tell what the difference is between these two cards?

Attachments

  • Sparkle.jpg
  • 6047917264_2.jpg
  • 6047917264.jpg
  • IMAG0776.jpg

Reply 11132 of 52881, by kaputnik

User metadata
Rank Oldbie
totalizator wrote:

I have bought a Sparkle Nvidia TNT2 M64 32MB PCI (~10 euro) for my Compaq DeskPro EN 6333 SFF running Windows 98, and guess what - I'm more than happy with it. Definitely much better than the integrated Rage PRO 4MB. I have given up searching for a Voodoo2 at a decent price, so the performance of the M64 in this situation is superb for me.

I also got a VIA USB 2.0 controller that works on this setup too. I was quite surprised, as I had no luck with a Belkin PCI WLAN card previously - the PC wouldn't even boot.

Question here: the seller had two versions of the USB controller - red and black. I took the red one as it has more contacts on the connector, and I assumed it would be more likely to work on older machines. Can you tell what the difference is between these two cards?

My guess is that there's at least no functional difference. Are there any manufacturing dates printed on the cards? The one with fewer contacts might be a later revision, where they've pinched a few pennies by removing unused contacts to save a few milligrams of gold in the plating process, or something like that. Or it might just be a question of different OEMs using the same chipset and reference PCB layout from VIA, hence the close similarity.

Reply 11133 of 52881, by shamino

User metadata
Rank l33t

Re the 8800GTX/etc mentioned above:
I generally loathe buying used high-end graphics cards. I like them, but I don't trust them.
They live a hard life, and very few sellers put any effort into testing them properly. At most they might turn them on, let them POST for 5 seconds, and then say "Our highly paid Technician has Expertly Tested this card as working."
The high-powered cards of any generation are expensive, and it's a coin flip whether you'll end up with beaten-up junk.
If you're lucky, you can find somebody who knows the card's history and has taken the time to stress it in a looping 3D benchmark.

====

Anyway, not a high-end card, but I bought a sealed GeForce GT240 that somebody is selling a bunch of on eBay. It's the 512MB GDDR5 version, which I think is the ideal one, and I'm a sucker for sealed cards.
The GT240 appeals to me for how well it hits the sweet spot of performance with low idle power consumption. Articles show it has the same idle consumption as a GT220 and only a few watts higher than a G210. The next card up, the GTS250, jumps into a different world and isn't a low power card at all.

At the time of release, lots of people said the GT240 was pointless, but I think it's actually one of the most interesting cards of that generation. It was designed to be nVidia's top performing card that would fit within the power budget of a common mainstream PC. I think this is why it was their first card to use GDDR5 memory and a 40nm GPU.

I've accumulated a bit of a collection of nVidia 200-series cards:
G210 512MB DDR2 passive
GT240 512MB GDDR5
GTX260 core-216
GTX275

I don't go beyond DirectX 9, and I like the idea of being able to swap these cards without needing to mess with drivers.

My intention is to use the GT240 with an Athlon64 to build an everyday PC with moderate power consumption. I'm hoping the build will run on a 300W PSU, which will help the efficiency. If all goes well, I might use that as my general purpose PC during the summer months, when power consumption and the resulting heat dissipation are an issue. Then I can leave my power hungry "main" PC (with the GTX275) turned off unless I actually need it.

Reply 11134 of 52881, by havli

User metadata
Rank Oldbie

Round two. 😀

GeForce4 Ti 4200 128MB AGP 8x
[image: gf4_ti4200_128_8x_ft8uvu.jpg]

GeForce GTX 260 Core 216 - using the 65nm GT200 chip and the original Volterra-equipped PCB.
[image: gtx_260_f6eup9.jpg]
[image: gtx_260_bqzu6d.jpg]
[image: gtx260o3u8y.png]

HW museum.cz - my collection of PC hardware

Reply 11136 of 52881, by brostenen

User metadata
Rank l33t++

That GF4 card somehow reminds me of:

[image: sddefault.jpg]

Don't know why. 😁 😉

Don't eat stuff off a 15 year old never cleaned cpu cooler.
Those cakes make you sick....

My blog: http://to9xct.blogspot.dk
My YouTube: https://www.youtube.com/user/brostenen

001100 010010 011110 100001 101101 110011

Reply 11137 of 52881, by kithylin

User metadata
Rank l33t
shamino wrote:

The GT240 appeals to me for how well it hits the sweet spot of performance with low idle power consumption.

At the time of release, lots of people said the GT240 was pointless, but I think it's actually one of the most interesting cards of that generation. It was designed to be nVidia's top performing card that would fit within the power budget of a common mainstream PC. I think this is why it was their first card to use GDDR5 memory and a 40nm GPU.

My intention is to use the GT240 with an Athlon64 to build an everyday PC with moderate power consumption. I'm hoping the build will run on a 300W PSU, which will help the efficiency. If all goes well, I might use that as my general purpose PC during the summer months, when power consumption and the resulting heat dissipation are an issue. Then I can leave my power hungry "main" PC (with the GTX275) turned off unless I actually need it.

I've actually used the GT240 (GDDR5 version) with an Athlon64 chip @ 2.5 GHz before and found the performance in early 32-bit Windows XP to be about on par with an Nvidia 6800 Ultra, and of course at a lot less power usage. They're actually surprisingly decent cards.

Reply 11138 of 52881, by PhilsComputerLab

User metadata
Rank l33t++
shamino wrote:

At the time of release, lots of people said the GT240 was pointless, but I think it's actually one of the most interesting cards of that generation. It was designed to be nVidia's top performing card that would fit within the power budget of a common mainstream PC. I think this is why it was their first card to use GDDR5 memory and a 40nm GPU.

I really like that sort of thinking. Seeing the good in parts that are otherwise easily dismissed. I'll keep that card in mind - sounds very useful!

shamino wrote:

My intention is to use the GT240 with an Athlon64 to build an everyday PC with moderate power consumption.

This I don't get though, as any recent motherboard with CPU and integrated graphics will run circles around that system in terms of performance per watt.

Got this in yesterday. It's HAMMER time!

[image: m8kvAzz.png]

YouTube, Facebook, Website

Reply 11139 of 52881, by nforce4max

User metadata
Rank l33t
shamino wrote:

Re the 8800GTX/etc mentioned above:

At the time of release, lots of people said the GT240 was pointless, but I think it's actually one of the most interesting cards of that generation. It was designed to be nVidia's top performing card that would fit within the power budget of a common mainstream PC. I think this is why it was their first card to use GDDR5 memory and a 40nm GPU.

Those who completely dismissed this card were likely idiots anyway, but there were some who recognized that it had potential as a dedicated PhysX card, and it was occasionally paired with two GTX 295s. Some kept the card in use into the Fermi era, as PhysX would still bugger up the fps in SLI rigs, and the last thing anyone wanted was dodgy fps in PhysX titles.

On a far away planet reading your posts in the year 10,191.