VOGONS


Reply 20 of 40, by swaaye

User metadata
Rank l33t++

If I remember right, the Laguna3D has a "filtering quality" slider that adjusts the bilinear filtering quality between bad and terrible. I thought that was fun. The textures float and wobble around somewhat like on a PS1. Subpixel accuracy and perspective correction problems, among other things? The slider seemed to control how much precision it tried to have, but at a speed cost. Broken no matter what though. Vintage3D (aka Putas) did an in-depth look at it.

These and the early Rage cards couldn't touch 3dfx because they weren't of the same hardware capability; it's not just a matter of drivers. I think the Rage 128 finally matched all of the Voodoo1's features (and added some new things too, of course). The Rage Pro still had some blending problems.

Reply 21 of 40, by The Serpent Rider

User metadata
Rank l33t++
swaaye wrote:

The slider seemed to control how much precision it tried to have, but at a speed cost. Broken no matter what though. Vintage3D (aka Putas) did an in-depth look at it.

Yes, but just disabling texture filtering (where possible) has much more impact than setting filtering to the lowest possible quality in the driver. I suspect the Laguna 3D AGP is capable of properly filtering textures with perfect perspective correction, but the performance falloff is so horrible that Cirrus Logic had to disable that quality option completely. They crippled the 3DO M2 chip design too much.

Without filtering it easily beats the Matrox Mystique, ATi Rage II, SiS 6326 and any S3 ViRGE card.

Last edited by The Serpent Rider on 2022-08-23, 02:16. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 22 of 40, by Putas

User metadata
Rank Oldbie
The Serpent Rider wrote on 2022-01-12, 21:30:

I suspect the Laguna 3D AGP is capable of properly filtering textures with perfect perspective correction, but the performance falloff is so horrible that Cirrus Logic had to disable that quality option completely. They crippled the 3DO M2 chip design too much.

But do we know the M2 well enough to be able to say this or that was crippled?

Reply 23 of 40, by 386SX

User metadata
Rank l33t

@swaaye Interesting, thanks. 😀

I've found a Laguna card at a good price that I'd possibly buy, but I'm still deciding. Cirrus Logic is my favorite brand of card in my 386 machine, so no matter how broken those cards might be, I respect these alternative brands' history and how much they tried with so many different solutions. At least Trident and SiS also went on to make good things after the Blade3D (itself included, imho, all things considered) and the 6326. Maybe too late, but it looks like their early-2000s solutions weren't bad at all. On the marketing side, looking back at those times, the extreme race for the fastest cards was such a useless thing from the consumer's point of view: many much cheaper cards were more than enough for most 3D games, instead of going for the usual reviewed cards of the time.
After all, I did buy the Kyro2 back in the day for a similar reason (Trident and SiS solutions weren't easily available in local stores), and it was a great card; no matter what they said about compatibility, it had good driver support for a long time.

Last edited by 386SX on 2022-01-13, 10:15. Edited 3 times in total.

Reply 24 of 40, by The Serpent Rider

User metadata
Rank l33t++

Well yes, there's a lot of footage of games running on the cancelled Panasonic M2.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 25 of 40, by acl

User metadata
Rank Oldbie

I'm probably late, but I found some time to run some tests with my Laguna3D.

It's a Cirrus Logic 5465 with 4 MB.
My test machine is a 700 MHz PIII running at 7x133 = 931 MHz.
For the Laguna3D I had to revert to a 100 MHz FSB (133 MHz caused artifacts).
The driver does not support OpenGL, so I tried mostly D3D games.
Generally speaking, the output is not very good, with missing transparencies and bad texture filtering.

I had the following results (screenshots below).
Final Reality

  • No transparency for clouds

Quake II

  • No OpenGL. Can't run in hardware accelerated mode

Midtown Madness (recommended settings)

  • 640*480 10-15fps

POD (D3D)

  • 640*480 (?) 40-50fps

Outlaws (D3D HW accelerated)

  • 400*300 21fps
  • 640*480 15fps
  • 800*600 9fps

Half-Life (D3D)

  • No textures

3DMark2000

  • Can't run (DX support)
Attachments:
screenshotsGames.zip (4.45 MiB, 36 downloads, CC-BY-4.0)
screenshotsBench.zip (1.1 MiB, 32 downloads, CC-BY-4.0)
Last edited by acl on 2022-02-09, 13:29. Edited 1 time in total.

"Hello, my friend. Stay awhile and listen..."
My collection (not up to date)

Reply 26 of 40, by Putas

User metadata
Rank Oldbie
acl wrote on 2022-02-09, 11:08:

Outlaws (D3D HW accelerated)

  • 400*300 9fps
  • 640*480 15fps
  • 800*600 21fps

Shouldn't it be the opposite?

Reply 27 of 40, by acl

User metadata
Rank Oldbie
Putas wrote on 2022-02-09, 12:41:
acl wrote on 2022-02-09, 11:08:

Outlaws (D3D HW accelerated)

  • 400*300 9fps
  • 640*480 15fps
  • 800*600 21fps

Shouldn't it be the opposite?

Of course it is 👍
Thank you. I'm correcting the post.

"Hello, my friend. Stay awhile and listen..."
My collection (not up to date)

Reply 29 of 40, by Geri

User metadata
Rank Member

I have extensively tested the Laguna3D last year. The AGP version. No idea which chip revision, as it has a heatsink on it. I haven't checked the clocks.

-I haven't noticed any bugs related to perspective correction or filtering. It's just as good (bad) as the competing cards.
-The performance is around the SiS 6326 / S3 Trio3D, but usually with fewer bugs. (About 50% slower than Riva128/Permedia2/Voodoo Rush.)
-No severe compatibility issues; almost every era-correct game works on it.
-I haven't tested anything besides 16-bit. I don't know if it can do more in 3D, but with 4 MB of video memory it wouldn't make sense anyway.
-It's a good card to have in a collection, but nothing special.
-It's not bad enough to justify its demise.
-But it's not good enough either to justify the expensive Rambus memory; EDO memory should have been enough for this performance. Probably an unbalanced design.
-The drivers are good.
-No stability issues.

TitaniumGL the OpenGL to D3D wrapper:
http://users.atw.hu/titaniumgl/index.html

Reply 30 of 40, by pentiumspeed

User metadata
Rank l33t
The Serpent Rider wrote on 2022-01-12, 15:31:

The most hot component on these cards is RDRAM memory. Early revisions had additional placement on PCB for 2Mb RAM for future more powerful pin compatible chips, but it was quickly scrapped.

The datasheet for the GD5464 and 65 mentions another two RDRAM ICs, and some PCBs have the tracks and pads for this, but that was for a future chip design which did not happen. Fully populated, using two RDRAM modules, is eight chips in total: four ICs on the card plus two modules of two ICs each. That didn't happen either.

Cheers,

Great Northern aka Canada.

Reply 31 of 40, by 386SX

User metadata
Rank l33t
Geri wrote on 2022-08-22, 19:54:

I have extensively tested the Laguna3D last year. The AGP version. No idea which chip revision, as it has a heatsink on it. I haven't checked the clocks.

-I haven't noticed any bugs related to perspective correction or filtering. It's just as good (bad) as the competing cards.
-The performance is around the SiS 6326 / S3 Trio3D, but usually with fewer bugs. (About 50% slower than Riva128/Permedia2/Voodoo Rush.)
-No severe compatibility issues; almost every era-correct game works on it.
-I haven't tested anything besides 16-bit. I don't know if it can do more in 3D, but with 4 MB of video memory it wouldn't make sense anyway.
-It's a good card to have in a collection, but nothing special.
-It's not bad enough to justify its demise.
-But it's not good enough either to justify the expensive Rambus memory; EDO memory should have been enough for this performance. Probably an unbalanced design.
-The drivers are good.
-No stability issues.

Thanks for the information. I still don't have the card, but I might get it in the future.

Reply 32 of 40, by 386SX

User metadata
Rank l33t
pentiumspeed wrote on 2022-08-22, 20:53:
The Serpent Rider wrote on 2022-01-12, 15:31:

The most hot component on these cards is RDRAM memory. Early revisions had additional placement on PCB for 2Mb RAM for future more powerful pin compatible chips, but it was quickly scrapped.

The datasheet for the GD5464 and 65 mentions another two RDRAM ICs, and some PCBs have the tracks and pads for this, but that was for a future chip design which did not happen. Fully populated, using two RDRAM modules, is eight chips in total: four ICs on the card plus two modules of two ICs each. That didn't happen either.

Cheers,

Am I correct that the future possibility of two more modules would have doubled the RAM bandwidth in that design?

Reply 33 of 40, by pentiumspeed

User metadata
Rank l33t

No. If it had happened: the base four RDRAM chips, when used, occupy the two RDRAM ports, which means a wider datapath even without modules. The modules simply hook into those same two ports, in parallel on each port. Next time, look at the empty pads on an RDRAM GD5464 or 65 video card.

This is a partially populated version of a board fully designed for eight RDRAM ICs in total, including the two modules.

Here's one from an eBay seller:

Cheers,

Attachments

Great Northern aka Canada.

Reply 34 of 40, by 386SX

User metadata
Rank l33t

But since the GD5464 used only a single-channel bus for 600 MB/s, while "future products" would have used the second channel for a theoretical maximum of 1.2 GB/s of bandwidth, I was wondering whether the early GD5465 PCB was already designed to use both channels soon. The card layout would suggest it, as the second set of empty pads connects to different, otherwise-free pins of the chip. I wonder if for cost or other reasons they went with the single channel only, since maybe the chip wouldn't have been faster anyway given its engine/clock, and the drivers too.

EDIT: as has already been said, the GD5465 was indeed using the first Rambus channel, and it probably still wasn't one of those "future products". But the second channel's pins are the empty ones visible on the early PCBs, even if not used (?) by this chip's memory controller (who knows: limited internally, or simply disabled). So I wonder how this chip would have seen modules in those pads, since they are already connected to what was supposed to be the second-channel (disabled?) chip pins; maybe not reading them at all, or maybe enabled with some hypothetical BIOS upgrade 😁, but more realistically with a new pin-compatible chip. With the early reference PCB already designed to support the second channel of a future chip version, that chip was probably close to release, and maybe the GD5465 already had something disabled for it.
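The 600 MB/s vs 1.2 GB/s figures above can be sanity-checked with a quick sketch, assuming the original byte-wide RDRAM channel at a 600 MHz effective transfer rate (round numbers for illustration; the actual clocks varied by card):

```python
# Back-of-the-envelope check of the RDRAM bandwidth figures quoted above.
# Assumption: a byte-wide (8-bit) Rambus channel at 600 MHz effective rate,
# as on the original Base RDRAM used by these cards.
def rdram_bandwidth_mb_s(channels: int, width_bytes: int = 1,
                         effective_mhz: int = 600) -> int:
    """Peak bandwidth in MB/s for the given number of RDRAM channels."""
    return channels * width_bytes * effective_mhz

single = rdram_bandwidth_mb_s(1)  # GD5464/65 as shipped: one channel
dual = rdram_bandwidth_mb_s(2)    # the never-released dual-channel part
print(single, dual)               # 600 vs 1200 MB/s: doubling, as suspected
```

So yes, under these assumptions a second populated channel would have exactly doubled the peak bandwidth.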

Reply 35 of 40, by Geri

User metadata
Rank Member

If the chip could have used the extra pins, then there would have been cards built using those pins.
There are two possible explanations: the chip can't use them (that's probably it, as this chip probably doesn't support more than 6 MB despite the advertising), or the performance of this chip wouldn't increase (or would even decrease) if they were used; since the card is on par with cheap EDO- and SDRAM-based solutions, it couldn't benefit from more bandwidth anyhow.

Using similar PCBs and pin-compatible chips across multiple generations wasn't a new idea in this era of video cards; ATi and S3 also did it.

TitaniumGL the OpenGL to D3D wrapper:
http://users.atw.hu/titaniumgl/index.html

Reply 36 of 40, by 386SX

User metadata
Rank l33t

I also think that, in the end, since such an old-generation card probably wasn't very RAM-limited, there was no reason in its market sector, and with this 3D engine, to enable the second channel at higher cost for what I imagine would have been little real difference in game speed. But I wonder whether the single-channel limitation was in the memory controller design not supporting both channels, so that the extra pins/pads were simply not used at all, or whether they were internally switched to be shared with the first channel's bus in this chip version. Considering that the empty PCB pads seem not connected to the first channel, a RAM size upgrade wasn't the reason for that layout (maybe only swapping the right two modules for new ones, but not the left RAM pads). Considering the market sector, the reviews and the speed results, I imagine not many PCB manufacturers would have been interested in such an added cost. 😉
Edit: it's also interesting that on the later, cheaper version that removed the left RAM pads, the second-channel-ready pins seem not connected to much, at least on the first PCB layer, even though it still uses the same revision "C" chip.

Last edited by 386SX on 2022-08-23, 10:48. Edited 2 times in total.

Reply 37 of 40, by Geri

User metadata
Rank Member

Oh, before I forget: the Laguna3D drivers, similarly to the nvidia TNT1, require at least a 300 MHz CPU to unleash their full potential, so I don't recommend anyone use it in a weaker Socket 5/7 computer. An Intel P2/Celeron/K6-2 will do.

TitaniumGL the OpenGL to D3D wrapper:
http://users.atw.hu/titaniumgl/index.html

Reply 38 of 40, by pentiumspeed

User metadata
Rank l33t

Please note, the datasheet explicitly said the 64 and 65 chips do not support dual channel; even though the dotted lines for a future dual-channel setup are shown, the datasheet still says no.

Cheers,

Great Northern aka Canada.

Reply 39 of 40, by 386SX

User metadata
l33t

Indeed, but the interesting question is whether, considering that these specifications already explain the pin layout with the second channel, and that these early PCBs come without the left RAM soldered (pads which are connected to that second, future channel, with the traces even already soldered), the 5465 might have had the second-channel memory controller internally or firmware-disabled, or might simply have been unable to drive two channels (while, as said, having those package traces connected, and we don't know where they ended up inside the package). It would be interesting to see an image of the package internals to check whether those lines were actually connected. It would not be the first time that a video chip needing maybe one more revision to reach its full design, or by deliberate choice, ended up artificially limited to a lower-end market sector.