VOGONS


Foreshadowing the value of P4 hardware


Reply 80 of 106, by Skyscraper

Rank: l33t
alexanrs wrote:

Are there any AGP cards capable of decoding youtube content?!

The AGP versions of Radeon HD 4650 and 4670 are the ones I know of.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 81 of 106, by calvin

Rank: Member

I'm very sure the AGP NVidia cards can't decode video on Linux. Even then, if you must go P4/C2, go on a PCIe chipset/board. Modern GPUs are really good nowadays, despite them wanting two brackets. (and they're getting a lot better about that)

2xP2 450, 512 MB SDR, GeForce DDR, Asus P2B-D, Windows 2000
P3 866, 512 MB RDRAM, Radeon X1650, Dell Dimension XPS B866, Windows 7
M2 @ 250 MHz, 64 MB SDE, SiS5598, Compaq Presario 2286, Windows 98

Reply 82 of 106, by Unknown_K

Rank: Oldbie

Did anyone make a dual P4 motherboard? I love my old Dual Opteron boards, they are very solid and RAM is cheap.

Collector of old computers, hardware, and software

Reply 83 of 106, by obobskivich

Rank: l33t
candle_86 wrote:

Yes, I'm aware of the older CrossFire, but 16+4 was not a good method for CrossFire. It might have been fine for the X850, but by the time CrossFire was readily available the X1800 XT was almost ready to go, so it was largely ignored because the 7800 GTX was out and offered more performance. To continue: the X850 could saturate an AGP 4x bus and showed improved speed on AGP 8x, and considering PCIe 1.0/1.1 x4 operates at 1 GB/s, which is the same speed as AGP 4x, that's sufficient to rule out any 16+4 solutions, meaning no real viable CrossFire solution existed for Pentium 4 besides ATI's own, and the Xpress 1150 (aka Xpress 200) was even more buggy than nForce 4 on Intel, which really pushed anyone wanting to play at the high end to AMD. As for the argument against SLI/CrossFire, I've run it quite a few times on current-gen hardware.

2007: HD 3870 + HD 3850 CrossFire (performed terribly, I lost performance in most games, and the GPUs were paired with an Athlon 64 X2 6000)

2011: GTX 560 Ti 448 SLI (ran wonderfully, no issues, paired with an i5 2500K)

2012: GTX 670 triple SLI (ran wonderfully, paired with an i7 3930K {had to sell this machine when I was out of work for 4 months})

Each one but the CrossFire setup worked wonderfully, and if you bought multi-GPU when it was current it made a lot of sense. To run at 1200p in 2004 you needed SLI, to run at 1600p in 2006 you had to have SLI/CrossFire, and today nothing can drive a game at 4K outside of SLI.

Again, not entirely accurate. AGP 4x and x4 PCIe cannot be directly compared 1:1. As for non-16+16 configurations, they aren't always detrimental. Tom's has tested "PCIe scaling" a variety of times over the years, including with CrossFire:
http://www.tomshardware.com/reviews/crossfire … ess,2095-5.html

It also isn't the GPU that "saturates" a given bus - it's the application's demand. In other words, saying "X850 can saturate AGP 4x" makes no sense - instead, there are games that can see benefit from higher bandwidth to the GPU, and others that will not (which you can also see in the CF review, where P965 CF can outperform PCIe 2.0 single-card with some applications, but not with others).

Xpress 200CF was also never released for Intel - that's an AMD exclusive platform. ATi did not release CF for Intel until the Xpress 3200CF. The 200CF for AMD is a fine chipset IME.

Finally, "1200p in 2004" wasn't really a big consideration - Steam h/w survey data from back then generally showed 1024x768 as the most common resolution, but depending on the game 1920x1200 (or more likely 1600x1200) is entirely functional with older cards, no SLI/CF is required (even GeForce 2 Ultra can accomplish that, depending on the game). The same reasoning applies to 2560x1600 and 4K gaming - it depends on the game as much as it does on the hardware. Sure, you can (and I suspect probably will) link me to reviews that show cards like 290X or GTX 690 having trouble with a game like Watch_dogs on maximum settings not getting 100 FPS at 4K, but that doesn't mean they can't run any game at 4K. It's also worth pointing out that just as 1920x1200 wasn't very common in 2004, it's not like every and their grandmother has 4K displays in 2015.

It's unfortunate to hear you've had a bad experience with CrossFire, but it isn't fair to say that your experience is a representation of all configurations, systems, etc (just as there is no "all gamers did..." or "all enthusiasts wanted..." kind of statement that can be made as a blanket). 😊

Skyscraper wrote:

All HT-capable Pentium 4s should be able to run YouTube 480p with CPU decoding alone.
The top Prescotts (3.6 and 3.8) should handle YouTube 720p.

This is using Flash; HTML5 seems to perform worse so far. If the board has a PCI-E x16 slot then, like other members suggested, get a low-power card like the AMD HD 5450; they usually sell for ~$5.

Dual Prestonias with HT can handle 720p and 1080p, albeit very inefficiently (power consumption is massive).

calvin wrote:

I'm very sure the AGP NVidia cards can't decode video on Linux. Even then, if you must go P4/C2, go on a PCIe chipset/board. Modern GPUs are really good nowadays, despite them wanting two brackets. (and they're getting a lot better about that)

What kind of video? They should support MPEG-2/h.264 (depending on the card's actual capabilities ofc), but none of that means "will decode Flash."
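
If you want to check what the driver actually exposes on Linux for a given NVIDIA card, one option is to dump the VDPAU decoder capabilities. A minimal sketch, assuming the vdpauinfo utility is installed and using Python only to wrap the call and trim the output:

# Rough sketch: print the "Decoder capabilities" section of vdpauinfo, which
# lists the codec profiles (MPEG1/2, H264, VC1, ...) the card and driver can
# actually accelerate. Assumes vdpauinfo is on the PATH.
import subprocess

report = subprocess.run(["vdpauinfo"], capture_output=True, text=True).stdout
show = False
for line in report.splitlines():
    if line.strip().startswith("Decoder capabilities"):
        show = True                    # start of the codec/profile table
    elif show and line.strip().endswith(":"):
        break                          # next section reached, stop printing
    if show:
        print(line)

Whether Flash actually uses any of the profiles listed there is, as noted, a separate question.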

Unknown_K wrote:

Did anyone make a dual P4 motherboard? I love my old Dual Opteron boards, they are very solid and RAM is cheap.

No, nothing that will take a pair of Socket 423 or 478 chips - just like there aren't boards that can take a pair of Socket 754 or 939 chips. There are, however, multi-socket NetBurst boards that use Socket 603 or Socket 604, as well as a variety of CPUs available (many of them have more cache than Pentium 4 as well). AGP and PCIe (including SLI) equipped variants exist on dual-socket platforms, such as the Asus PC-DL Deluxe, NCCH-DL Deluxe, and Iwill DN800SLI.

Reply 85 of 106, by calvin

Rank: Member

I doubt they can do VP8 or H.265. MPEG4/H.264, probably. There is likely no way to coax a crusty P4 to render the former at an acceptable framerate without massive quality drops.

2xP2 450, 512 MB SDR, GeForce DDR, Asus P2B-D, Windows 2000
P3 866, 512 MB RDRAM, Radeon X1650, Dell Dimension XPS B866, Windows 7
M2 @ 250 MHz, 64 MB SDE, SiS5598, Compaq Presario 2286, Windows 98

Reply 86 of 106, by Standard Def Steve

Rank: Oldbie
calvin wrote:

I doubt they can do VP8 or H.265. MPEG4/H.264, probably. There is likely no way to coax a crusty P4 to render the former at an acceptable framerate without massive quality drops.

I believe the GTX 960 and Titan X are currently the only GPUs that have full hardware support for H.265. On the GTX 970/980, H.265 is only partially accelerated, and this partial acceleration is performed by the shader cores, not by the fixed function video decoder.

Even though AMD and NVIDIA don't mention VP9 support, I believe modern cards can decode YouTube's VP9 encoded video in hardware. On my system, 1080p/60fps HTML5 video uses like 1% of the CPU, so the video card's gotta be performing at least partial VP9 acceleration.
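
A rough way to sanity-check that, beyond watching Task Manager: decode the same clip with and without hardware acceleration and compare the runs. This is only a sketch - it assumes an ffmpeg build with DXVA2 support is on the PATH, "clip.webm" is a placeholder for a locally saved file, and whether VP9 in particular gets offloaded still depends on the GPU and driver:

# Decode the same file twice, software-only and with DXVA2, and compare.
# "clip.webm" is a placeholder name; any local VP9 or H.264 file will do.
import subprocess, time

def decode_seconds(hwaccel_args):
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-v", "error", *hwaccel_args,
                    "-i", "clip.webm", "-f", "null", "-"], check=True)
    return time.perf_counter() - start

print("software decode:", decode_seconds([]))
print("dxva2 decode:   ", decode_seconds(["-hwaccel", "dxva2"]))
# If the DXVA2 run finishes much faster (or the CPU stays nearly idle while it
# runs), decoding is being offloaded; if the two runs look the same, it
# probably isn't.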

94 MHz NEC VR4300 | SGI Reality CoPro | 8MB RDRAM | Each game gets its own SSD - nooice!

Reply 87 of 106, by PhilsComputerLab

Rank: l33t++

Yes, H.265 is very new. I can render in this format, and VLC plays it back in software, but I don't have any hardware that decodes it in hardware. The next round of cards will likely all have this feature together with HDMI 2.

YouTube, Facebook, Website

Reply 88 of 106, by TandySensation

Rank: Newbie
alexanrs wrote:

Are there any AGP cards capable of decoding youtube content?!

An HD 3200 or better, or the 8000 series or better in the NVIDIA camp, should work. My HTPC has a 780G chipset with an onboard HD 3200 and it does hardware acceleration of streaming videos; 1080p has very low CPU usage on a 2 GHz Phenom.

Looks like the HD 3650 is available in AGP; it should do the job.

Reply 89 of 106, by Skyscraper

Rank: l33t
TandySensation wrote:
alexanrs wrote:

Are there any AGP cards capable of decoding youtube content?!

An HD 3200 or better, or the 8000 series or better in the NVIDIA camp, should work. My HTPC has a 780G chipset with an onboard HD 3200 and it does hardware acceleration of streaming videos; 1080p has very low CPU usage on a 2 GHz Phenom.

Looks like the HD 3650 is available in AGP; it should do the job.

Sadly the HD 3650 does not support H.264 Flash decoding any more.
The integrated version of the HD 3450 does support Flash decoding, strangely enough, but the standalone HD 3450 does not.

H.264 decoding with, for example, Media Player Classic works even with the HD 2x00 😀

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 90 of 106, by swaaye

Rank: l33t++

Officially only HD3xxx IGPs support Flash. I read it has to do with the IGP having a more powerful UVD unit. Something similar to HD4000.

If you run Catalyst 9.11, though, the HD3xxx GPUs will accelerate Flash, but it is half-baked and often drops back to software decoding. It's a mystery as to why only this release works. Maybe the problems were insurmountable.

Reply 91 of 106, by obobskivich

Rank: l33t
TandySensation wrote:

An HD 3200 or better, or the 8000 series or better in the NVIDIA camp, should work. My HTPC has a 780G chipset with an onboard HD 3200 and it does hardware acceleration of streaming videos; 1080p has very low CPU usage on a 2 GHz Phenom.

Looks like the HD 3650 is available in AGP; it should do the job.

Officially Flash HW support is only available on Radeon HD 4000 series and GeForce 8000 series with PureVideo HD (this means no G80, but the 8400/8600 and G92-based 8800 all work). ATi had beta support for GPU decoding of Flash content on the HD 3000 series, but it was removed for the final release - I forget which driver enables it (and I also don't remember if you have to use a specific version of Flash either).

Skyscraper wrote:

H.264 decoding with, for example, Media Player Classic works even with the HD 2x00 😀

h.264 is supported from Radeon X1k/GeForce 7. G80 GeForce has the same decoder as the GeForce 7, but the 8400/8600/G92-based cards have full h.264 support. Radeon X1k does h.264-decode on its shaders, and lower-spec cards support lower resolutions as a result. Radeon HD 2900 implements AVIVO (which won't do Flash h.264 among other things), but the 2600 and below feature UVD. Radeon HD 3000 series and GeForce 9 series and above have full h.264 support across the entire product line; Radeon HD 4000 series and GeForce 9 series and above have full h.264 Flash support across the entire product line.

ATi's IGP naming schemes don't 1:1 correlate to their discrete offerings though - some of the parts branded as 3000-series have the UVD2 decoder found in the R700-based discrete cards, and will support h.264 and Flash as a result.

swaaye wrote:

Officially only HD3xxx IGPs support Flash. I read it has to do with the IGP having a more powerful UVD unit. Something similar to HD4000.

The HD 3300 and HD3200 were released after the R700, and feature UVD2 (which is what R700 has), and will therefore support Flash h.264 and other functionality. The HD 3000 and HD 3100 do not feature UVD2, but instead rely on AVIVO, which does not support Flash h.264.

If you run Catalyst 9.11, though, the HD3xxx GPUs will accelerate Flash, but it is half-baked and often drops back to software decoding. It's a mystery as to why only this release works. Maybe the problems were insurmountable.

It was a beta support feature in Flash (it pre-dates official Flash GPU support by Adobe), and my understanding is that it was removed due to performance and technical limitations on the R600-based GPUs, as well as Adobe's final specs (remember that R600/R700 were really abysmal when it came to GPGPU compliance). UVD2 resolved this issue for the R700-based chips.

Reply 92 of 106, by swaaye

Rank: l33t++

I had Catalyst 9.11 accelerating Flash on a notebook with HD 3410 fairly recently. It's a 12" subnotebook with a weak Athlon Neo X2 and even that flaky Flash acceleration is nice to have.

Reply 93 of 106, by feipoa

Rank: l33t++

I did not have the best success with the HD4350 in my dual PIII-1.4 Tualatin computer for accelerating Flash or other online content. I recall that it only accelerated certain offline HD videos. Used Win7 for that test.

I had singled out the i865G, i875P, and i915GL as the 3 boards which would be good candidates for my 2 black cases.

The i915GL board, the ASUS P5GL-MX, would not show more than ~2.5 GB of RAM when I had a PCIe graphics card (X550) installed. When the graphics card was removed and I used the onboard video, the system still only showed 3 GB. I did not see any BIOS setting which allowed for manipulation of the memory size. According to my internet search, the i915GL supports 4 GB. Is this a known problem? I also tested the Gigabyte/Acer 8I915AE (i915GL), on which, for some reason, MemTest reports the memory stuck in DDR266 mode even though the BIOS says it is DDR400. That board only has 2 DIMM slots though, so it is not a real consideration at this point.

Feeling discouraged with the i915GL boards, I moved onto the i865G board, or the Gigabyte GA-8IG1000MK. When I went to power the system on, three 1000 uF capacitors burst in sequence, spraying electrolytic juice over my face. There are now 5 bulging caps. I've never had caps burst in my face before. Is this board worth the effort to replace the caps? I think I have new 6.3V, 1000uF caps in a bin. I had such great luck testing the 4 rack mount servers, I did not expect such troubles with the desktop boards.

I then powered up the i875P board, the Intel S875WP1-E, entry server board. No real problems as of yet; it is running MemTest.

I noticed that some P4 boards came with a 0.09 µm P4 CPU with 1 MB of L2 cache, others with 2 MB of L2 cache. Was there any obvious improvement with these various cache sizes?

Plan your life wisely, you'll be dead before you know it.

Reply 94 of 106, by obobskivich

Rank: l33t
feipoa wrote:

I did not have the best success with the HD4350 in my dual PIII-1.4 Tualatin computer for accelerating Flash or other online content. I recall that it only accelerated certain offline HD videos. Used Win7 for that test.

Interesting about the AGP 4350 - I've often wondered how the AGP-bridged ATi cards handle AVIVO/UVD; I know on the nVidia AGP-bridged cards a number of features are usually disabled vs the PCIe variant (no idea why they do this).

I had singled out the i865G, i875P, and i915GL as the 3 boards which would be good candidates for my 2 black cases.

The i915GL board, the ASUS P5GL-MX, would not show more than ~2.5 GB of RAM when I had a PCIe graphics card (X550) installed. When the graphics card was removed and I used the onboard video, the system still only showed 3 GB. I did not see any BIOS setting which allowed for manipulation of the memory size. According to my internet search, the i915GL supports 4 GB. Is this a known problem? I also tested the Gigabyte/Acer 8I915AE (i915GL), on which, for some reason, MemTest reports the memory stuck in DDR266 mode even though the BIOS says it is DDR400. That board only has 2 DIMM slots though, so it is not a real consideration at this point.

If it boots with 4GB installed, it "works" but the BIOS may not be able to actually address all of that memory. I have an 875P board that will boot with 4GB installed, but can only address like 3300MB of it (and Windows will see even less than that, around 3GB). There may or may not be an update to fix that - I honestly don't know if this is a chipset limit on earlier Intel MCHs, or just a BIOS oversight (remember that back in 2002-2004, even 2GB of memory was fairly substantial for a desktop machine; lots of people were using 512MB or 1GB, and many boards would only support like 1.5GB-2GB at max).
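
For what it's worth, the back-of-the-envelope arithmetic on a board limited to 32-bit physical addressing (no remapping above 4 GiB) looks like this - the reserved figure below is a made-up example to illustrate the idea, not a measured value for any particular chipset:

# Illustrative only: everything mapped for devices below 4 GiB (AGP/PCI
# apertures, BIOS, chipset registers) comes straight out of the visible RAM
# when the platform cannot remap memory above the 4 GiB boundary.
installed_mib = 4096   # 4 GiB of RAM fitted
reserved_mib = 768     # hypothetical space claimed by devices below 4 GiB
print("visible RAM ~", installed_mib - reserved_mib, "MiB")   # ~3328 MiB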

Feeling discouraged with the i915GL boards, I moved onto the i865G board, or the Gigabyte GA-8IG1000MK. When I went to power the system on, three 1000 uF capacitors burst in sequence, spraying electrolytic juice over my face. There are now 5 bulging caps. I've never had caps burst in my face before. Is this board worth the effort to replace the caps? I think I have new 6.3V, 1000uF caps in a bin. I had such great luck testing the 4 rack mount servers, I did not expect such troubles with the desktop boards.

I hope you're okay! 😲

I don't know much about that specific Gigabyte, but 865-based boards can be very nice. I have a Shuttle SFF that is 865-based, and features-wise and performance-wise it feels very similar to my 875P Asus (the biggest limitations are that it's much smaller, and has many fewer expansion slots, ports, etc. as a result). Both have nice BIOS options, good support for PATA and SATA, good memory performance, stable drivers, etc.

I noticed that some P4 boards came with a 0.09 µm P4 CPU with 1 MB of L2 cache, others with 2 MB of L2 cache. Was there any obvious improvement with these various cache sizes?

Generally, more cache means better performance for a Pentium 4. The 2M chips should be at worst dead-even with the 1M chips, and at best faster; not all applications seem to show the same benefit from the extra cache, though (games tend to like the higher-cache variants).

Reply 95 of 106, by ODwilly

Rank: l33t

I still say you should give that SiS-based LGA775 Gigabyte board a go 😀

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 96 of 106, by NJRoadfan

Rank: Oldbie

Despite being able to take 64-bit chips, Intel's chipsets didn't actually support 64-bit memory addressing until around the time the 955/965/975 came out. For example, I have a Core 2 Duo laptop with the 945G that only sees 3.25GB of RAM under 64-bit OSes.

Reply 97 of 106, by alexanrs

Rank: l33t
NJRoadfan wrote:

Despite being able to take 64-bit chips, Intel's chipsets didn't actually support 64-bit memory addressing until around the time the 955/965/975 came out. For example, I have a Core 2 Duo laptop with the 945G that only sees 3.25GB of RAM under 64-bit OSes.

AFAIK they still don't. Most chipsets I've seen are restricted to 36-bit addressing. And I'd not be surprised if consumer-grade Core iX processors are restricted to 36-bit addressing too.

Reply 98 of 106, by NJRoadfan

Rank: Oldbie

36-bit addressing gives you 64GB of RAM. Most consumer boards can't even go that high. Intel limits many of them to 32GB since they want you to buy the "E" platforms.
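
The arithmetic, for reference (plain Python, nothing platform-specific):

# Physical address width vs. maximum addressable memory.
for bits in (32, 36):
    print(bits, "bit physical addressing ->", 2**bits // 2**30, "GiB")
# 32 bit -> 4 GiB, 36 bit -> 64 GiB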

Reply 99 of 106, by feipoa

Rank: l33t++

I will ultimately test all the motherboards to determine which boards I can simply put into the recycle bin and which CPUs are functional; however, I still only plan on keeping 4 of the 16 Intel 845 boards. I will keep all the RAM and CPUs though, as these require little storage volume. I will recap that i865G board since it is the only 865 board in the lot.

I think even my PIII BIOS could see 3.3 - 3.6 GB, depending on which graphics card was installed and how many PCI ROM options were set. If all PCI ROM options were disabled and the onboard video was utilised, I recall the BIOS seeing 3.9x GB. I don't understand why these P4 boards are seeing less than a PIII board.

Plan your life wisely, you'll be dead before you know it.