VOGONS


Rendition Verite Thread


Reply 141 of 652, by noshutdown

Rank: Oldbie

My Rendition V1000 is slow as hell; what has gone wrong?
The card is a Creative 3D Blaster, the CPU is a Pentium III-S 1.4 GHz, the mainboard is an AOpen 815EPB, and I'm using the V1000 reference driver and MiniGL from the first page of this thread.
GLQuake gets 8~9 fps at 640x480x16; according to the posts here it should be more than 20...

BTW, it gets ~510 points in 3DMark99 at 640x480x16. Is that normal?

Reply 142 of 652, by Concupiscence

Rank: Newbie

I can't speak for 3DMark99, but that GLQuake benchmark is entirely within expected ranges. The V1000 wasn't really designed to withstand the rigors of z-buffering; you'd be much better off running VQuake, which was designed using an alternate rendering style much better-suited to the Verite's strengths. If you just *have* to run GLQuake, I wouldn't set a resolution any higher than about 400x300, and by that point you might as well be running Winquake... especially with that (relative) beast of a CPU feeding the old card.
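
If you do want to force a lower resolution, GLQuake takes it on the command line, and the fps figures people quote are usually from the built-in timedemo. From memory of the stock Win32 build (so double-check the flags against your readme):

    glquake.exe -width 400 -height 300

Then drop the console and run "timedemo demo1" to get a number you can compare against the posts here.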

Reply 143 of 652, by swaaye

Rank: l33t++

V1000 is really only well suited for Rendition's Speedy3D and RRedline APIs. VQuake uses Speedy3D in addition to being designed to render in a way that works well with V1000. V1000 performs poorly with both D3D and OpenGL for various reasons. D3D runs much better than OpenGL though.

If you'd like to know more, read this link.

Reply 144 of 652, by Concupiscence

Rank: Newbie

Yep. That article by Stefan Podell made a huge impression on me when I was in high school. Damned shame Rendition's support evaporated after Micron's acquisition not long afterward.

Reply 145 of 652, by swaaye

Rank: l33t++
Concupiscence wrote:

Damned shame Rendition's support evaporated after Micron's acquisition not long afterward.

My impression has been that V3300 and V4400 were both steamrolled by the competition before they even came out.

Reply 146 of 652, by bushwack

Rank: Oldbie
swaaye wrote:
Concupiscence wrote:

Damned shame Rendition's support evaporated after Micron's acquisition not long afterward.

My impression has been that V3300 and V4400 were both steamrolled by the competition before they even came out.

I think the V2000 was a decent product line. It may have been "steamrolled" by advertising and a lack of investment funds, but not necessarily by speed or visual quality. I would have really liked to see the V3300. Of course, 3dfx would have had to change their Voodoo 3000 moniker. 🤣

Reply 147 of 652, by swaaye

Rank: l33t++

Rendition couldn't keep up. They also couldn't write drivers. 😁 V2200 is roughly as fast as Voodoo Graphics but only had a few months before it was flattened by Voodoo2.

V3300 was obsoleted by NV TNT before it even came out.

V4400 was supposed to use eDRAM, and this pushed its transistor count up to something similar to a Radeon 9700's! This was 2000, not 2002. How were they ever going to build that?! And was the GPU itself capable of competing with a GeForce 256 DDR? I'm gonna guess a negative on that, because nobody else could.

Reply 148 of 652, by bushwack

Rank: Oldbie

I ran a generic V2100/2200 (can't remember which) for a short time with my Voodoo1 before the V2 came out. I actually preferred the Rendition over 3dfx in many games.

Were any numbers thrown out there to say how fast the V3300 actually was, say compared to the Riva TNT or maybe the Voodoo 2?

Reply 149 of 652, by swaaye

Rank: l33t++

V3300 pre-release info (still up after over a decade 🤣). He's actually wrong about TNT though, because it didn't hit its planned clock speeds and instead ended up at around 180 megapixels/s fillrate (90 MHz core × 2 pixel pipelines), and who knows what [meaningless] poly rate.
http://web.singnet.com.sg/~duane/v3300.htm

Yeah I had a V2100 back then too, alongside my Voodoo1. The Verite chip has sharper texture filtering and better default gamma. The problem with V2000 is the drivers. They are so bad. Unreal and Half Life, THE games of 1998, look horrible and run at a crawl because of driver issues.

I got the V2100 because Diamond had a fire sale on the Stealth II S220 in the latter half of '98. I got one brand new for $50 and it replaced my Matrox Mystique 220. I didn't keep the Stealth for long because it was blurrier in 2D than the Mystique and its 3D was a letdown.

Reply 150 of 652, by Tetrium

Rank: l33t++

The V2100 is interesting to me, mostly because a friend of mine also had one in combination with a Voodoo 1.
That, and I think the card looks plain cool! 😁


Reply 151 of 652, by swaaye

Rank: l33t++

To fully appreciate a Verite you need to play a game that uses one of their APIs and enable anti-aliasing. Even the V1000 can do it. The image quality is excellent.

I suggest Grand Prix Legends.

Reply 153 of 652, by sliderider

Rank: l33t++
swaaye wrote:

To fully appreciate a Verite you need to play a game that uses one of their APIs and enable anti aliasing. Even V1000 can do it. The image quality is excellent.

I suggest Grand Prix Legends.

Are programming details for those early, proprietary APIs still available anywhere? It would be interesting to see if someone could create a new game for those old video cards.

Reply 154 of 652, by Concupiscence

Rank: Newbie
sliderider wrote:

Are programming details for those early, proprietary APIs still available anywhere? It would be interesting to see if someone could create a new game for those old video cards.

You can find the Glide SDK without much issue, and the Rendition RRedline SDK was linked earlier in this thread. I can say without much fear of contradiction that the amount of trouble you'd undergo to write something in those APIs wouldn't really be worthwhile.
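
For the curious, a minimal Glide 2.x program really is only a handful of calls. The sketch below is reconstructed from memory of the old SDK samples, so treat the exact names and signatures as approximate; the RRedline side follows the same open-device / submit-triangles / swap-buffers pattern, but I won't quote its entry points from memory.

    /* Minimal Glide 2.x sketch (from memory): open the Voodoo, draw one
       flat-shaded triangle in screen coordinates, swap buffers. */
    #include <glide.h>

    int main(void)
    {
        GrHwConfiguration hw;
        GrVertex a, b, c;

        grGlideInit();
        if (!grSstQueryHardware(&hw))      /* no 3dfx board found */
            return 1;
        grSstSelect(0);
        grSstWinOpen(0, GR_RESOLUTION_640x480, GR_REFRESH_60Hz,
                     GR_COLORFORMAT_ARGB, GR_ORIGIN_UPPER_LEFT, 2, 1);

        /* draw with a constant color instead of iterated vertex colors */
        grColorCombine(GR_COMBINE_FUNCTION_LOCAL, GR_COMBINE_FACTOR_NONE,
                       GR_COMBINE_LOCAL_CONSTANT, GR_COMBINE_OTHER_NONE, FXFALSE);
        grConstantColorValue(0x00FF8000);

        a.x = 160.0f; a.y = 120.0f;
        b.x = 480.0f; b.y = 180.0f;
        c.x = 320.0f; c.y = 360.0f;
        a.oow = b.oow = c.oow = 1.0f;      /* 1/w for a screen-space triangle */

        grBufferClear(0x00000000, 0, GR_WDEPTHVALUE_FARTHEST);
        grDrawTriangle(&a, &b, &c);
        grBufferSwap(1);

        grGlideShutdown();
        return 0;
    }

That's about the scale of it: no window management, no state machine beyond the combine units, just push triangles at the hardware. Which is also why writing a whole game this way is more trouble than it's worth today.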

Reply 155 of 652, by swaaye

Rank: l33t++
Putas wrote:

The V2k were good; no point in comparing them to expensive cards like the Voodoo 2. Glide was the only hope for Unreal back then.

That's because the game was designed for Glide, essentially. I don't think Direct3D 5/6 had much chance of equaling Glide's capabilities.

There's a new renderer out there now called UTGLR that runs Unreal Engine 1 better than Glide by using D3D8/9 or OpenGL. It runs quite well even on a GeForce 3 or Radeon 8500.

It was interesting that there was never a RRedline renderer for Unreal. I suppose that Rendition underestimated the importance of Unreal and the company stopped V2000 development by the end of '98.

Reply 156 of 652, by HunterZ

Rank: l33t++

Yeah, the Unreal engine was designed around Glide from the ground up, and it was one of the last games/engines to do so. Epic quickly realized it was a mistake, as many viable non-3dfx cards were flooding the market with Direct3D and/or OpenGL support. They apparently contracted another company to write an OpenGL renderer for Unreal, but that company dropped the ball. Epic then started writing their own Direct3D renderer for Unreal, but it lagged behind the Glide renderer until UT2k3, by which point 3dfx didn't exist anymore anyway.

UTGLR is great, I've been using it for Deus Ex for years. Makes the game feel a few years newer than it is.
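
For anyone who wants to try it: in Unreal Engine 1 games the renderer is just an .ini setting, so after dropping the UTGLR files into the System folder the switch in DeusEx.ini (UnrealTournament.ini for UT) looks roughly like this. I'm going from memory, and the exact device class name depends on which UTGLR build you grab:

    [Engine.Engine]
    GameRenderDevice=D3D9Drv.D3D9RenderDevice

Use GameRenderDevice=OpenGLDrv.OpenGLRenderDevice instead if you take the OpenGL build, then tweak the renderer's own section in the same file to taste.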

Reply 157 of 652, by swaaye

Rank: l33t++

My guess would be that the OpenGL renderer was built by that Loki Games company for the Linux port of UT99. Loki died fairly quickly, probably because Linux users didn't buy their products. Shocking.

Reply 158 of 652, by HunterZ

Rank: l33t++

There really wasn't anyone running Linux back then. That was back when installing Linux meant watching a near-endless parade of package names flash up on the screen with no description - just an "install this?" prompt.

Reply 159 of 652, by 5u3

Rank: Oldbie

Linux hardware support was much worse back then, especially for sound and video cards. And getting OpenGL to work on a contemporary 3D card meant you had to know how to compile a kernel and find your way through XF86Config.