VOGONS


First post, by kainiakaria

User metadata
Rank Newbie
Rank
Newbie

Intel Graphics have been a meme since the year 2000. We used Intel integrated graphics and we knew how crappy they were, which is why they were the butt of every joke when it came to graphics. Expecting integrated graphics to run games is like expecting a Honda Civic to win Le Mans. I remember trying to run Star Wars Jedi Knight II: Jedi Outcast on a Compaq Deskpro EN based on the Intel i810, and it wouldn't even start. The Dell Optiplex GX1 I had been using at the time could run that game, and that computer had an ATI 3D Rage PRO as its integrated chipset, which by the way says a lot about 1998-era tech. 1998-1999 models of the Compaq Deskpro EN also have the ATI 3D Rage PRO, which I believe was a DirectX 5-class chipset. I used to run Half-Life on the Compaq Deskpro EN with the ATI 3D Rage PRO, on the i810-based Deskpro EN, and on the Dell Optiplex GX1, and it ran on all of them.

I will always have some amount of respect for Intel integrated graphics, at least on laptops from 2006 to the present. Laptops with Intel integrated graphics from 2003 were, as far as I could tell, garbage. The fact that I could run Warcraft III on a Dell Inspiron B130 was mind-blowing, and the fact that I can run Windows 2000 on the B130 makes it good for backwards-compatibility purposes. Not even my Gateway M320, which also had an Intel IGP, could run that game, or really any game for that matter. I am a huge fan of the ATI Mobility Radeon Xpress series when it comes to integrated laptop graphics, mainly because they are DirectX 9.0b compatible. Because the Dell Vostro 1000 has an ATI Mobility Radeon Xpress 1150, the retro-gaming potential of that laptop is almost limitless.

Before ATI got into making mobile video chipsets, you had chips like the Cirrus Logic GD7543, Cirrus Logic GD7548 or Trident Cyber9397, which were standard fare for integrated video on desktop computers from 1996-1997. These chipsets usually had 1 MB to 2 MB of VRAM available. Socket 7 HP Vectra machines with Pentium and Pentium MMX CPUs had Cirrus Logic and S3 integrated video chipsets. The ATI 3D Rage LT PRO was one of ATI's greatest contributions to integrated video back in 1997-1998. Wikipedia says that the 3D Rage LT (aka Mach64 LT) was often implemented on motherboards and in mobile applications like notebook computers. This late-1996 chip was very similar to the 3D Rage II and supported the same application coding. It integrated a low-voltage differential signaling (LVDS) transmitter for notebook LCDs and advanced power management (block-by-block power control). The 3D RAGE LT PRO, based on the 3D RAGE PRO, was the very first mobile GPU to use AGP.

The 3D Rage LT Pro offered Filtered Ratiometric Expansion, which automatically adjusted images to full-screen size. ATI's ImpacTV2+ is integrated with the 3D RAGE LT PRO chip to support multi-screen viewing; i.e., simultaneous outputs to TV, CRT and LCD. In addition, the RAGE LT PRO can drive two displays with different images and/or refresh rates with the use of integrated dual, independent CRT controllers. The 3D Rage LT Pro was often used in desktop video cards that had a VESA Digital Flat Panel port to drive some desktop LCD monitors digitally.

Reply 1 of 45, by SirNickity

User metadata
Rank Oldbie
Rank
Oldbie
kainiakaria wrote on 2020-03-05, 17:09:

Intel Graphics have been a meme since the year 2000. We used Intel Integrated Graphics and we knew how crappy they were, that was why it was the butt of every joke when it came to graphics. Expecting integrated graphics to run games is like expecting a Honda Civic to win Le Mans.

I have always liked Intel IGP, but I've always had realistic expectations. It was a great, stable, power-efficient design that provided functional graphics for business PCs, laptops, servers, and home PCs that weren't designed for gaming. It wasn't a Radeon, but it wasn't meant to be. And if you didn't need a Radeon, just having fast-ish, essentially free, well-supported on-board graphics meant not having to deal with add-in video solutions.

I've played a few games on laptops, and they worked well enough, but TBH, it's just not something I'm interested in. If you look inside a gaming PC, the graphics card is the star of the show. It's often a two-slot monster with its own power cable and a turbo prop to cool it. What, in all we know about physics, makes anyone think that beast would run from a battery-powered device that can fit in a backpack? And if it did, the battery life would be 2 minutes and some seconds, and if you had it in your lap, it would roast chestnuts on an open fire like it was the holidays. No thanks. Even if the technology problem didn't exist, I would still have to find some way to reconcile the ergonomics. Nothing about this scenario makes any sense.

The game console world has had a long-standing tradition of separating on-the-go gaming from home gaming. If you're playing at home on a GameCube, you're playing on the go with the equivalent of an SNES. The games are scaled-down and simplified, only certain genres make it. The Switch upset this apple cart, and I didn't think it would work at all, but Nintendo has done a heck of a job pulling it off as well as they have. It's not a powerhouse at home, and it makes some compromises beyond that for on-the-go, but it manages to do a lot with a little.

Reply 4 of 45, by Shagittarius

User metadata
Rank Oldbie
Rank
Oldbie
rmay635703 wrote on 2020-03-06, 00:26:
Shagittarius wrote on 2020-03-06, 00:19:

Honda powered cars have won le mans multiple times

=)

Honda Insight

https://m.youtube.com/watch?v=McJJeukIWSA

That's like what happens when most integrated chips try to run current games.

Reply 5 of 45, by SPBHM

User metadata
Rank Oldbie
Rank
Oldbie

I thought at the time that the Intel 810 was pretty decent, to be honest. As an entry point to 3D acceleration in 1999/2000 it got the job done; I remember it playing some games better than a Voodoo 2 (mostly due to the RAM configuration, I think), which by then was old news but not that old, considering it had been the absolute best 2-3 years earlier, and the 810 was a "free" feature of your motherboard.

I felt more disappointed with the slow evolution later. They seemed to stagnate in gaming performance for a few years during the LGA 775 days: the GMA 950 and GMA 3100 still lacked hardware TnL (falling back to software TnL) and had no PS 3.0 well into the late 2000s, and the GMA 4500 had the hardware features but was often actually slower than the GMA 950/3100.

Reply 6 of 45, by The Serpent Rider

User metadata
Rank l33t
Rank
l33t

The 810 chipset (an upgraded i740) should be more or less on par with a Riva 128ZX with SDRAM. Maybe even faster.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 7 of 45, by dionb

User metadata
Rank l33t
Rank
l33t

Why the focus on Intel? They were pretty late to the game. Integrated video for the PC started with the 1996-era SiS 5511+6202 chipset. Maybe "integrated" isn't quite the term, as it was still a discrete chip, but it shared system memory, which is the defining feature of integrated VGA. A year later, SiS followed with the 5596, the first actually integrated solution, combining the 6205 video core with the 5571 northbridge. That was two years before Intel's 810.

Of course performance was awful, particularly with the 5511+6202 and 5596, as shared bandwidth in an EDO system left the CPU (and the VGA core, for that matter) completely starved. Intel's i810 had a significantly better core, but it suffered just as much from having to share bandwidth with the CPU, all the more so when the i810 was paired with a 133MHz FSB but only allowed 100MHz memory - and then halved that.
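To put rough numbers on that starvation, here's a quick back-of-the-envelope sketch. The figures are theoretical peaks, not measurements, and the scanout example is just one plausible desktop mode of the era:

```python
# Rough shared-bandwidth arithmetic for late-'90s integrated VGA.
# All numbers are theoretical peaks; real-world throughput is lower.

def bandwidth_mb_s(clock_mhz, bus_bytes, transfers_per_clock=1):
    """Peak memory bandwidth in MB/s."""
    return clock_mhz * bus_bytes * transfers_per_clock

# PC100 SDR SDRAM: 100 MHz x 64-bit (8-byte) bus, one transfer per clock.
pc100 = bandwidth_mb_s(100, 8)              # 800 MB/s total

# Just refreshing a 1024x768, 16-bit desktop at 75 Hz consumes:
scanout = 1024 * 768 * 2 * 75 / 1e6         # ~118 MB/s, forever

print(f"PC100 peak:          {pc100:.0f} MB/s")
print(f"Display scanout:     {scanout:.0f} MB/s")
print(f"Left for CPU + 3D:   {pc100 - scanout:.0f} MB/s")
```

So even before the 3D core touches a texture, the display controller has skimmed roughly a seventh of the bus off the top, and every texture fetch and framebuffer write competes with the CPU for the rest.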

A lot of the big innovations in integrated VGA were also done by others. The first dual-channel chipset to actually ship was ALi's Aladdin 7, which used dual-channel SDR SDRAM to give both the CPU and the integrated ArtX VGA core (ArtX was later bought by ATi and turned into the Radeon) decent memory performance and bandwidth. A good discrete card still performed better, but it blew away all other 2000-era integrated solutions. ATi was also first to let integrated and discrete VGA cooperate, delivering maximum performance when needed and maximum power efficiency when not.

Intel has just chugged along with the 'good enough' mantra - for non-gaming purposes (gamers may be noisy, but are and were a small minority), anything that reliably gets pictures onto a screen is good enough, and by integrating that into entry-level chipsets and later CPUs, they've pushed all but the most high-end competitors out of the market. Great business acumen, but hardly technical innovation.

Reply 8 of 45, by SPBHM

User metadata
Rank Oldbie
Rank
Oldbie

From what I tested, SiS 530/630-style IGPs and the i810 are very far apart in performance; the Trident video included on the MVP4 is also quite terrible, with a high performance penalty for the CPU/RAM.

If we are talking a few years later, ATI and Nvidia had some interesting things.

Another interesting aspect is that the i810 allowed for dedicated RAM (most cheap boards didn't use this option) to boost performance compared to just using system RAM. The only other IGP I remember with this was the AMD 780G and its variations with "sideport" memory (128MB of 32-bit DDR2/3 for the sideport, versus 4MB of 32-bit SDR for the i810).

Reply 9 of 45, by RacoonRider

User metadata
Rank Oldbie
Rank
Oldbie

My biggest detour into integrated graphics gaming happened in 2009. I was very enthusiastic about netbooks back then (and I still really love the idea), and the first "big" thing I bought myself with my salary was a 10" Atom N270 laptop. Then bad things happened: my father ended up in the hospital while I had to look after my granny. There was no space for a normal PC in their apartment, and the N270 was the only laptop in our family.

It sounds strange now, in the era of Android devices, but I had that laptop with me all the time. I used it at the university for various assignments, at work, at home - everywhere and for everything, including gaming. And in fact, gaming was not so bad. Sure, I wanted to run Heroes of Might and Magic V and could not, but there were countless earlier titles that ran beautifully. I played a lot of Max Payne, Heroes III, Re-Volt, Fallout 1 and 2. The experience in these games was perfect - especially Heroes III, whose native 800x600 resolution fits the 1024x600 screen very well.

You may call the Intel GPUs of the time lacking, but you can't argue with the price: $350 for a complete system including a monitor. A decent video card of the time, for example an HD 4870, cost $170 on its own.

Reply 10 of 45, by ragefury32

User metadata
Rank Member
Rank
Member
kainiakaria wrote on 2020-03-05, 17:09:
The Dell Optiplex GX1 that I had been using at the time could run that game and that computer had ATI 3D Rage PRO as its integra […]
Show full quote

The Dell Optiplex GX1 that I had been using at the time could run that game and that computer had ATI 3D Rage PRO as its integrated chipset and that by the way says a lot about 1998 era tech. 1998-1999 models of the Compaq Deskpro EN also have the ATI 3D Rage PRO. I think the ATI 3D Rage PRO was a DirectX 5 based chipset. I used to run Half-Life on the Compaq Deskpro EN that not just had the ATI 3D Rage PRO but I also ran it on the Compaq Deskpro EN that was based on the Intel i810 video chipset and it ran. I also ran Half-Life on the Dell Optiplex GX1.

I am a huge fan of the ATI Mobility Radeon IGP Xpress Series when it comes to integrated laptop graphics mainly because they are DirectX 9.0B compatible. Because the Dell Vostro 1000 has an ATI Mobility Radeon IGP Xpress 1150 makes the retro gaming aspect of this laptop almost limitless.

Before ATI got into making mobile laptop video chipsets you had video chipsets like the Cirrus Logic GD7543, Cirrus Logic GD7548 or Trident Cyber9397 which was standard fair for integrated video chipsets on desktop computers from 1996-1997. These chipset usually had 1 MB to 2 MB of VRAM available. Socket 7 based HP Vectra Intel Pentium-Pentium MMX based machines had Cirus Logic and S3 integrated video chipsets. ATI 3D Rage LT PRO was one of ATI's greatest contributions to integrated video chipsets back in 1997-1998. Wikipedia says that the 3D Rage LT (aka Mach64 LT) was often implemented on motherboards and in mobile applications like notebook computers. This late 1996 chip was very similar to the 3D Rage II and supported the same application coding. It integrated a low-voltage differential signaling (LVDS) transmitter for notebook LCDs and advanced power management (block-by-block power control). The 3D RAGE LT PRO, based on the 3D RAGE PRO, was the very first mobile GPU to use AGP.

The 3D Rage LT Pro offered Filtered Ratiometric Expansion, which automatically adjusted images to full-screen size. ATI's ImpacTV2+ is integrated with the 3D RAGE LT PRO chip to support multi-screen viewing; i.e., simultaneous outputs to TV, CRT and LCD. In addition, the RAGE LT PRO can drive two displays with different images and/or refresh rates with the use of integrated dual, independent CRT controllers. The 3D Rage LT Pro was often used in desktop video cards that had a VESA Digital Flat Panel port to drive some desktop LCD monitors digitally.

First of all, the ATi Rage LT series is NOT an integrated GPU. It's a discrete GPU. It's not integrated into the northbridge or the CPU die (like an AMD APU, or any Intel Core series GPU starting with Arrandale), nor does it "borrow" main memory - because that's the definition of an integrated GPU: it's hidden inside something else and takes away from your main memory. The rule of thumb is that if you can find a GPU chip on the motherboard (or on a separate card), it's discrete - if it's hidden inside something else, it's integrated. There were definitely discrete GPUs embedded onto motherboards by OEMs - the S3 Trio was commonly embedded onto the motherboards of IBM Aptiva desktops in the mid-to-late '90s, and the ThinkPad T21 on my shelf has an S3 SavageIX GPU on the motherboard as well. In contrast, the Intel X3100 integrated GPU is hidden inside the i965 series northbridge chip, as is the X300 GPU die in the RS400 northbridges ATi made for Intel CPUs. Now, just because they were integrated doesn't make them bad - some implementations were decent. The SGI O2 workstation uses an integrated GPU, and it works just fine. The nVidia MCP89 in the 2010 MacBook Air/Pro 13s was competitive against the GeForce Go 7600, and Bioshock Infinite was definitely playable via Haswell/Broadwell GT3 graphics. The Rage LT might be the first GPU intended for laptops (announced in late 1996, but not in volume production until mid-1997), but being PCI-only, it made it into just a single laptop design... the Wallstreet PowerBook G3. The Rage LT PRO was also not the first mobile GPU to use AGP - that chip came out in November '97, and both the S3 ViRGE/MX and the NeoMagic 256AV came out earlier than that.

There are weird little integrated/discrete hybrids throughout history, like the i810s with the AIMM AGP memory module (which allowed the i810 to talk to a RAM module in the AGP port instead of borrowing main RAM - in theory a performance win, but the entire i740 Starfighter architecture relying on AGP for fast VRAM access was a massive conceptual blunder), the Intel Iris Pros with their massive L4 cache (128MB on my Haswell 4770R) acting like embedded VRAM, AMD with that zero-copy RDMA2 architecture on their Kabini APUs, or the recent Kaby Lake-G, where Intel embedded a Vega-M GPU and 4GB of VRAM directly onto the CPU package. Out of the four, AIMM was a non-starter, Kaby Lake-G is a dead end, and I am not sure if Intel is still doing Iris Pro, but the zero-copy approach is the linchpin behind the PlayStation 4/Xbox One sharing 8GB between their integrated GPUs and their CPUs.

Reply 11 of 45, by ragefury32

User metadata
Rank Member
Rank
Member
RacoonRider wrote on 2020-03-08, 06:38:

My biggest detour into integrated graphics gaming happened in 2009. I was very enthusiastic about netbooks back then (and I still really love the idea) and the first "big" thing I bought myself, with money I got as salary, was a 10" Atom N270 laptop. Then bad things happened and my father got into the hospital, while I had to look after my granny. There was no space for a normal PC in their apartment and the N270 was the only laptop in our family.

It sounds strange now, in the era of Android devices, but I had that laptop with me all the time. I used it at the university for various assignments, at work, at home - everywhere and for everything, including computer gaming. And in fact, gaming was not so bad. Sure, I wanted to run Heroes of Might and Magic V and could not, but there were countless earlier titles that ran beautifully. I played a lot of Max Payne, Heroes III, Re-Volt, Fallout 1 and 2. The experience in these games was perfect. Especially Heroes III that have native 800x600 resolution that matches the 1024x600 screen very well.

You may call Intel GPUs of the time lacking, but you can't argue with the price. $350 for a complete system including a monitor. A decent videocard of the time, for example, HD4870, would cost $170 alone.

Heh. Could you imagine how much more fun you could've had if you'd bought an Atom netbook with the nVidia MCP79 chipset (AKA an nVidia Ion machine)? The i945 chipset actually used more power than the Atom CPU itself.

Also, around 2009 was when I bought myself one of those CULV machines Intel was pushing back then... think of them as the granddaddy of the modern Ultrabook crop. The Intel SU2300-powered Acer Aspire 1410 was quite a good machine, and not that much more expensive than a netbook of the time... it's still on my rack with Debian 10 installed.

Reply 12 of 45, by kjliew

User metadata
Rank Oldbie
Rank
Oldbie

The Intel GMA series were lame ducks. They were always seriously lacking in real-game performance compared to the NVIDIA/ATI IGP chipsets of the same era. They might have looked good in benchmarks, but when it came down to actual game compatibility, driver support and maturity, they always looked bad next to NVIDIA/ATI IGPs. Some may argue that no one was playing games on an IGP, so why would anyone care. Well, it depends: NVIDIA/ATI IGPs were usually good at games from the past two generations, but Intel was plagued with tons of compatibility issues. Intel HD Graphics were so much better, and that was when Intel got serious about catching up in the graphics department. When the graphics vendors were reduced to a two-horse race, and one of them was your main competitor while the other was too arrogant to team up as a partner, Intel had no choice but to invest in its own. It will be interesting to see what Intel has to offer later this year when they introduce their in-house dGPU to compete with AMD and NVIDIA.

Reply 13 of 45, by RacoonRider

User metadata
Rank Oldbie
Rank
Oldbie
ragefury32 wrote on 2020-03-08, 07:42:

Heh. Could you imagine how much more fun you could've had if you bought an Atom Netbook with the nVidia MCP79 chipset (AKA an nVidia Ion machine)?

Who knows? Newer games aren't always better, as the existence of this forum proves. However, I would certainly have noticed the higher weight and/or lower battery life, and by the time I got the N270 I had already been saving money for more than half a year, so I couldn't have afforded that machine until later that year. Students don't get paid much.

Last edited by RacoonRider on 2020-03-08, 19:05. Edited 1 time in total.

Reply 14 of 45, by dionb

User metadata
Rank l33t
Rank
l33t
SPBHM wrote on 2020-03-08, 01:59:

from what I tested SiS 530/630 style IGP and i810 are very far apart in performance, the Trident video included on the MVP4 is also quite terrible and with high performance penalty for the CPU/ram

The performance penalty for CPU/RAM was exactly the same (except when the i810 ran async - with a 66MHz FSB Celeron the impact was limited; with a 133MHz P3 it worsened). As for the cores, they varied. The fastest for the P3 (by far) was ALi's Aladdin TNT2, for fairly obvious reasons 😉

In terms of raw 3D performance, SiS' 630 was slightly inferior to the i810 (but you wouldn't enjoy Q3A on either), but it did offer motion compensation for MPEG2 playback and independent overlay for TV-out while you worked on the desktop.

[...]

another interesting aspect is that the i810 allowed for dedicated ram (most cheap boards didn't use this option) to boost performance compared to just using system ram, the only other IGP I remember with this was the AMD 780g and variations with "sideport" (128MB 32bit ddr2/3 for the sideport and 4MB 32bit sdr for the i810)

It was a technically great idea, but commercially less so - the same reason ALi's Aladdin 7 (with a second memory channel) and Aladdin TNT2 (with a very powerful core) failed. Integrated VGA caters to the majority of the market, which basically doesn't care about performance. Adding more RAM bandwidth and/or a hefty core increases the price to the point that someone would have to consciously choose the 'special' solution, and anyone prepared to do that would go for a discrete GPU instead.

RacoonRider wrote on 2020-03-08, 06:38:

[...]

You may call Intel GPUs of the time lacking, but you can't argue with the price.

And that is why Intel is now the biggest GPU vendor (and AMD is on its platform), and there is no market anymore for low-end to mid-range GPUs.

Reply 15 of 45, by ragefury32

User metadata
Rank Member
Rank
Member
dionb wrote on 2020-03-08, 16:12:
The performance penalty for CPU/RAM was exactly the same (except when the i810 ran asynch - with 66MHz FSB Celeron, the impact w […]
Show full quote
SPBHM wrote on 2020-03-08, 01:59:

from what I tested SiS 530/630 style IGP and i810 are very far apart in performance, the Trident video included on the MVP4 is also quite terrible and with high performance penalty for the CPU/ram

The performance penalty for CPU/RAM was exactly the same (except when the i810 ran asynch - with 66MHz FSB Celeron, the impact was limited, with 133MHz P3 worsened), as for the cores, they varied. The fastest for P3 (by far) was ALi's AladdinTNT2, for fairly obvious reasons 😉

In terms of raw 3D performance SiS' 630 was slightly inferior to i810 (but you wouldn't enjoy Q3A on that either), but it did offer motion compensation for MPEG2 playback and independent overlay for TV-out while you worked on the desktop.

[...]

another interesting aspect is that the i810 allowed for dedicated ram (most cheap boards didn't use this option) to boost performance compared to just using system ram, the only other IGP I remember with this was the AMD 780g and variations with "sideport" (128MB 32bit ddr2/3 for the sideport and 4MB 32bit sdr for the i810)

It was a technically great idea, but commercially less so- the same reason ALI's Aladdin 7 (with a second memory channel) and AladdinTNT2 (with a very powerful core) failed. Integrated VGA cateres to the majority of the market who basically doesn't care about performance. Adding more RAM bandwidth and/or a hefty core increases the price to the point that someone would have to consciously choose for the 'special' solution, and anyone prepared to do that would go for a discrete GPU instead.

RacoonRider wrote on 2020-03-08, 06:38:

[...]

You may call Intel GPUs of the time lacking, but you can't argue with the price.

And that is why Intel is now the biggest GPU vendor (and AMD is on its platform), and there is no market anymore for low-end to mid-range GPUs.

Eh, the ALi Aladdin TNT2 wasn't always the top of the Socket 370 game. There's also the VIA PM133 with the ProSavage (based on a cleaned-up S3 Savage4), which was just as good as, and sometimes better than, the TNT2.

The i81x dedicated RAM thing (AIMM) is a bit... special. It's RAM (PC66/100 if I remember correctly) attached to the AGP port.
Normally in discrete GPUs you have VRAM directly attached via ring buffers or crossbar switches, and that'll get you gigabytes of transfer per second (a Rage 128 has around 3GB/sec, and a Radeon 7500 around 6GB/sec). If that local VRAM runs out, the driver swaps texture and/or geometry data into/out of main system memory using DMA through the AGP port at 1x-8x speeds (266MB/sec to 2128MB/sec, and at the beginning it was more like 2x). Which is quite a shortfall. If your GPU has no local VRAM and depends only on AGP, it is constantly starved waiting for data, and even worse, the latency goes through the roof (instead of talking to VRAM directly attached to the GPU, you must send DMA commands over the AGP port, wait for the northbridge to respond, and then wait for system RAM to deliver the goods).
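Those AGP transfer figures are easy to sanity-check - they're just the conventional ~266 MB/s of AGP 1x (66 MHz base clock, 32-bit bus) scaled by the signalling mode, and they fall an order of magnitude short of the local-VRAM numbers above:

```python
# AGP peak transfer rates: the conventional ~266 MB/s of AGP 1x
# (66 MHz base clock x 4-byte bus), scaled by signalling mode.
AGP_1X_MB_S = 266

for mult in (1, 2, 4, 8):
    print(f"AGP {mult}x: ~{AGP_1X_MB_S * mult} MB/s")

# Versus the local-VRAM figures mentioned above (theoretical peaks):
print("Rage 128 local VRAM:    ~3000 MB/s")
print("Radeon 7500 local VRAM: ~6000 MB/s")
```

So even AGP 8x (which didn't exist yet when the i740 shipped) tops out well below a Rage 128's local memory - and that's bandwidth only, before the round-trip latency through the northbridge is counted.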

The idea of depending purely on the AGP port for operations was what crippled the i740, and why the i810 was only kinda *meh* even with an AIMM installed. The i740 Starfighter was originally designed to complement a Pentium 166 MMX (or so I remember from old Intel developer docs) and to run optimally at 640x480, so you could say that for a card that came out in late '97/early '98, it was aiming for Voodoo1 levels of performance and capability. Too bad it hit the mainstream when the nVidia TNT and Voodoo2/3 were already out.

Of course, later on you got stuff like TurboCache and HyperMemory, which are essentially a small local RAM combined with PCIe x16, which could in theory throw down 8GB/sec of bandwidth - about the same as some of the lowlier internal GPU buses (like the X300 on the Xpress 1200). Still not great for latency, and you'd still want a card with more local VRAM rather than depending on the PCIe bus, but you can get by.
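For what it's worth, that ~8GB/sec figure works out as the bidirectional aggregate of a PCIe 1.x x16 link:

```python
# PCIe 1.x link math: 2.5 GT/s per lane with 8b/10b encoding
# leaves 2.0 Gbit/s of payload, i.e. 250 MB/s per lane, per direction.
LANE_MB_S = 2500 * 8 // 10 // 8        # 250 MB/s
x16_one_way = 16 * LANE_MB_S           # 4000 MB/s each direction
x16_both = 2 * x16_one_way             # 8000 MB/s aggregate

print(f"PCIe 1.x x16: {x16_one_way} MB/s per direction, "
      f"{x16_both} MB/s aggregate")
```

That's the best case, with traffic flowing both ways at once; a one-way texture stream sees half of it.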

Last edited by ragefury32 on 2020-03-10, 13:02. Edited 1 time in total.

Reply 17 of 45, by Standard Def Steve

User metadata
Rank Oldbie
Rank
Oldbie

Out of sheer curiosity, does anyone know when (and how) IGPs stopped robbing CPUs of half the available memory bandwidth? Right now I only have two machines with onboard graphics: a Core 2 E8600 with dual-channel DDR3-1333 and GMA X4500HD, and an i7-4790 with dual-channel DDR3-1600 and HD 4600. Both machines have dedicated GPUs installed. However, I ran an AIDA64 memory benchmark on both machines before and after installing the video cards. On both, the read/write/copy bandwidth didn't change at all, but latency improved slightly, dropping a few nanoseconds after the video card install.

Memtest86 bandwidth numbers were exactly the same. I remember back in the P4/Athlon XP days, installing an AGP graphics card resulted in significantly higher RAM bandwidth in memtest and AIDA64.
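If anyone wants to repeat that before/after comparison without AIDA64, here's a crude stdlib-only sketch of the same kind of copy-bandwidth test (real tools use tuned SSE/AVX assembly, so treat this as a ballpark, not a substitute):

```python
import time

def copy_bandwidth_mb_s(size_mb=64, runs=5):
    """Crude memory copy bandwidth estimate via bytearray copies.

    Takes the best of several runs to reduce scheduler noise.
    """
    src = bytearray(size_mb * 1024 * 1024)
    best = float("inf")
    for _ in range(runs):
        t0 = time.perf_counter()
        dst = src[:]                    # one full copy of the buffer
        t1 = time.perf_counter()
        best = min(best, t1 - t0)
        del dst
    # Each copy reads size_mb and writes size_mb -> 2x traffic.
    return 2 * size_mb / best

print(f"~{copy_bandwidth_mb_s():.0f} MB/s copy bandwidth")
```

Run it with the IGP enabled and again with a discrete card installed; on modern memory controllers you'd expect the numbers to match, as observed above.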

On a side note, I remember seeing an i815 based machine with the 4MB display cache built right into the motherboard. Think it was an SFF Deskpro.

Standard Def Rigs
Super P3: PIII-S @ 1.63 GHz/FSB155 | 2GB DDR-310 | 6800GT AGP | 500GB 7200 RPM
Super G4: 2x PowerPC 7455 @ 1.5 GHz | 2GB DDR-333 | 7800GS AGP | 300GB 10k RPM
Super G5: 4x PowerPC 970 @ 2.5 GHz | 16GB DDR2-533 | x1950XT PCIe | 512GB SSD

Reply 18 of 45, by kjliew

User metadata
Rank Oldbie
Rank
Oldbie
Standard Def Steve wrote on 2020-03-09, 16:39:

Memtest86 bandwidth numbers were exactly the same. I remember back in the P4/Athlon XP days, installing an AGP graphics card resulted in significantly higher RAM bandwidth in memtest and AIDA64.

Chipset and memory controller designs could have improved over time, especially Intel's. An idle IGP could clock-gate itself and consume no memory bandwidth. You can also completely disable the IGP, if the BIOS provides such an option, to free up the stolen memory. High-bandwidth memory benefits the IGP more than the CPU, given their typical workload patterns.

Reply 19 of 45, by SirNickity

User metadata
Rank Oldbie
Rank
Oldbie
dionb wrote on 2020-03-08, 16:12:
RacoonRider wrote on 2020-03-08, 06:38:

You may call Intel GPUs of the time lacking, but you can't argue with the price.

And that is why Intel is now the biggest GPU vendor (and AMD is on its platform), and there is no market anymore for low-end to mid-range GPUs.

Nor audio cards, network cards, I/O cards... Integrating video was the next obvious target in building turn-key reference desktop designs for the business market, and laptops for that matter. When that was covered, they took the northbridge too! 😁