VOGONS

Radeon iGPU in 2017 Intel CPUs...


First post, by kanecvr

User metadata
Rank Oldbie

So I came across this today:

https://www.techpowerup.com/230360/first-inte … ics-within-2017

It says that Intel has licensed AMD Radeon GPU technology, and that said tech will be used for on-board graphics on future Intel processors. Any thoughts on this?

Reply 1 of 24, by Scali

User metadata
Rank l33t

To me it sounds highly unlikely, since Intel has spent a lot of resources in the past few years on closing the gap with AMD, and in fact is now ahead of AMD (and nVidia) in terms of features. Intel currently offers the most complete DX12-GPU on the market, supporting conservative rasterization tier 3 and raster ordered views. nVidia only supports CR tier 2 in Pascal (and CR tier 1 in Maxwell v2). AMD supports neither technology.
Their performance is close as well, and the comparison is somewhat arbitrary, since Intel generally spends fewer transistors on the iGPU than AMD does. So Intel should be able to just scale up the iGPU a bit if they want the same performance as AMD. Having 10 nm production can compensate somewhat for the extra transistors.
Apparently Intel doesn't think the market demands such powerful iGPUs, otherwise they would already have been offering such configurations. Instead, Intel focuses on more powerful CPUs.
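
These capability tiers are the ones Direct3D 12 reports to applications through its feature-support query. A minimal sketch of reading them back in C++, assuming a D3D12 device has already been created elsewhere (not a complete program):

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Query the conservative rasterization tier and ROV support of the adapter
// the given device was created on.
void PrintRasterFeatures(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                              &options, sizeof(options))))
    {
        // 0 = not supported; the tier numbers discussed above map directly onto this value.
        printf("Conservative rasterization tier: %d\n",
               (int)options.ConservativeRasterizationTier);
        // Raster ordered views are a simple yes/no capability bit.
        printf("Raster ordered views: %s\n", options.ROVsSupported ? "yes" : "no");
    }
}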

I think these stories are a misinterpretation of a renewed cross-licensing of GPU technology between AMD and Intel (the 'big three' all hold a number of crucial patents on GPU technology, without which a modern, efficient, DX12-compatible GPU would be impossible to build).
But we will see.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 2 of 24, by kanecvr

User metadata
Rank Oldbie

Intel iGPUs are rubbish - with the exception of Iris, which only made it to expensive CPUs and is a pretty expensive piece of silicon itself. As for "closing the gap", the features you describe bring no real-world performance to the table. What people want to see is good framerates and image quality, not tech-speak and theoretical performance.

I for one think it would be interesting to see Radeon graphics in Intel chips - it could be useful on cheap-ish laptops.

Reply 3 of 24, by Scali

User metadata
Rank l33t

I'm just telling it like it is. AMD is getting behind on the technical level. Intel is at the forefront.
If anything, AMD needs Intel's GPU technology to stay relevant at this point, not the other way around.

What you're saying sounds like excuses and emotions. "It's rubbish", "no real-world performance" etc. Heard it all too many times.
Let's face facts: AMD hasn't done anything on the GPU front in years. They've just been rehashing GCN and gluing HBM on there (which wasn't even enough to outperform the competition that still uses GDDR, because their architecture is getting dated and lacks both features and efficiency).

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 4 of 24, by mr_bigmouth_502

User metadata
Rank Oldbie

I'd question why Intel would willingly work together with one of its biggest competitors on a product, but I did once own a laptop that had an AMD GPU and an Intel CPU under the hood. It was an Alienware M17x R2, a 2010 model, made years after the ATI/AMD merger. I mean, the CPU and GPU were separate, unlike what the OP's article is proposing, but even so, it was a post-2006 laptop from a large manufacturer.

Reply 5 of 24, by kanecvr

User metadata
Rank Oldbie
Scali wrote:

I'm just telling it like it is. AMD is getting behind on the technical level. Intel is at the forefront.
If anything, AMD needs Intel's GPU technology to stay relevant at this point, not the other way around.

What you're saying sounds like excuses and emotions. "It's rubbish", "no real-world performance" etc. Heard it all too many times.
Let's face facts: AMD hasn't done anything on the GPU front in years. They've just been rehashing GCN and gluing HBM on there (which wasn't even enough to outperform the competition that still uses GDDR, because their architecture is getting dated and lacks both features and efficiency).

Right. Has Intel released a GPU capable of 1080p / ultra? Until they do, they are not relevant in that market. As a gamer, the only relevant chip manufacturers are nVidia and AMD - Intel has a LONG way to go in that field and has proven time and time again that they are unwilling to invest in it - as such, they are licensing GPU tech from AMD.

Scali wrote:

What you're saying sounds like excuses and emotions. "It's rubbish", "no real-world performance" etc. Heard it all too many times.

Because it's the truth. Gaming on an Intel GPU is masochism at best, impossible at worst. I don't care about video encoding and QuickSync - fast nVidia cards do it better. Intel's iGPUs might be more advanced, but they are still rubbish for gaming and 3D modelling.

Reply 6 of 24, by clueless1

User metadata
Rank l33t

High framerates sell. Look at the Nvidia FX5200. Its DX9 support was very advanced, but its real-world performance was behind the competition. After gamers read the reviews, did they spend their money on it, or on a Ti4200 or Radeon 9500 Pro?

The more I learn, the more I realize how much I don't know.
OPL3 FM vs. Roland MT-32 vs. General MIDI DOS Game Comparison
Let's benchmark our systems with cache disabled
DOS PCI Graphics Card Benchmarks

Reply 7 of 24, by Scali

User metadata
Rank l33t
mr_bigmouth_502 wrote:

I'd question why Intel would willingly work together with one of its biggest competitors on a product, but I did once own a laptop that had an AMD GPU and an Intel CPU under the hood. It was an Alienware M17x R2, a 2010 model, made years after the ATI/AMD merger. I mean, the CPU and GPU were separate, unlike what the OP's article is proposing, but even so, it was a post-2006 laptop from a large manufacturer.

But then it was Alienware's choice, not Intel's.
I even have an old Intel Celeron Northwood laptop, which has an ATi Radeon chipset (yes, entire chipset, not just the GPU).
If you're using a separate GPU anyway, then it doesn't matter whether you choose AMD or nVidia, does it?
But having Intel integrate a third-party GPU on their own CPUs seems unlikely. I mean, yes, they once did that with PowerVR GPUs on the early Atoms... but Imgtech is not an x86 manufacturer, and in fact does not produce chips of its own at all. So it is not a competitor.
For Intel there are many good non-technical reasons for not wanting AMD GPUs (clearly they don't need faster GPUs to maintain their marketshare, or else they would have teamed up with another GPU-maker years ago).
If anything, they wouldn't want AMD to get the additional marketshare.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 8 of 24, by Scali

User metadata
Rank l33t
kanecvr wrote:

Right. Has Intel released a GPU capable of 1080p / ultra?

What does that even mean? Intel GPUs have DisplayPort and HDMI 2.0 with 4K 60 Hz support: https://arstechnica.com/gadgets/2015/08/skyla … dedicated-gpus/
In fact, they had that before AMD did, 🤣.

Also, you make it sound as if AMD iGPUs are somehow better at gaming. That couldn't be further from the truth. In fact, in some games they perform even worse than Intel's, because they are too CPU-bottlenecked.
No iGPU gives you a proper gaming experience relative to a discrete card, for the simple reason that it uses system memory, which is a huge performance bottleneck. (I hope you're not talking about 4K gaming - my main computer has a 4K monitor and a GeForce 970 OC, and even then I play many games in 1080p, because I can't find a balance between detail and framerate that I like at 4K. You're not going to tell me there's any iGPU out there that's anywhere near the performance of a 970 OC.)
Nevertheless, some reviewers think Intel has closed that gap: http://www.pcgamer.com/gaming-with-integrated … ly-good-enough/
http://in.ign.com/pc/85334/feature/gaming-on- … grated-graphics

Last edited by Scali on 2017-02-04, 22:50. Edited 1 time in total.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 9 of 24, by PhilsComputerLab

User metadata
Rank l33t++

I just finished testing a MeeGoPad (Intel Compute Stick) and was very surprised by the graphics chip. It had zero compatibility issues, although the performance of such a device is limited.

I've been using Intel HD Graphics in quite a few machines, simply because I do not play games on those.

YouTube, Facebook, Website

Reply 11 of 24, by PhilsComputerLab

User metadata
Rank l33t++
gdjacobs wrote:

How does it score on your benchmark ensemble running under Dosbox?

Around 70 fps in 3D Bench 1.0c, so it's limited to 386- and 486-era games. But it did run Munt fine while playing Wing Commander 2.

YouTube, Facebook, Website

Reply 12 of 24, by Oldskoolmaniac

User metadata
Rank Oldbie

Of course Intel graphics aren't for gaming - they are only meant for watching movies, browsing the web and business work on low-powered laptops.

I doubt Intel will have AMD video on its chips. Intel probably bought licensing from AMD so they can make iGPUs without infringing patents - AMD did that in the past with Intel, back in the 386 days.

Motherboard Reviews The Motherboard Thread
Plastic parts looking nasty and yellow try this Deyellowing Plastic

Reply 13 of 24, by vladstamate

User metadata
Rank Oldbie
Oldskoolmaniac wrote:

Of course Intel graphics aren't for gaming - they are only meant for watching movies, browsing the web and business work on low-powered laptops.

Yes, but like Scali was saying, this is not for lack of features but rather because it is an integrated GPU. It therefore has to use the same system bus as the CPU and is also usually clocked at a lower speed. If you were to theoretically put an Intel GPU on a dedicated card with 1 GB of dedicated GDDR and clock it at 1 GHz, it would do as well as (if not better than) a similarly specced AMD part.

So saying that Intel graphics is not for gaming, while correct in practice, is not entirely accurate.
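
To put rough numbers on that shared-bus argument, here is a back-of-the-envelope sketch in C++. The configurations (dual-channel DDR3-1600 for the shared system memory, a 256-bit GDDR5 card at 6 GT/s for the dedicated one) are illustrative assumptions, not measurements of any specific product:

#include <cstdio>

int main()
{
    // Shared system memory: 2 channels * 8 bytes/transfer * 1600 MT/s,
    // and the CPU competes for the same bus.
    double shared_ddr3 = 2 * 8 * 1.6;      // ~25.6 GB/s

    // Dedicated card: 256-bit bus = 32 bytes/transfer * 6 GT/s effective,
    // all of it available to the GPU.
    double dedicated_gddr5 = 32 * 6.0;     // ~192 GB/s

    printf("Shared dual-channel DDR3-1600: %.1f GB/s\n", shared_ddr3);
    printf("Dedicated 256-bit GDDR5:       %.1f GB/s\n", dedicated_gddr5);
    return 0;
}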

YouTube channel: https://www.youtube.com/channel/UC7HbC_nq8t1S9l7qGYL0mTA
Collection: http://www.digiloguemuseum.com/index.html
Emulator: https://sites.google.com/site/capex86/
Raytracer: https://sites.google.com/site/opaqueraytracer/

Reply 14 of 24, by Scali

User metadata
Rank l33t
vladstamate wrote:
Oldskoolmaniac wrote:

Of course Intel graphics aren't for gaming - they are only meant for watching movies, browsing the web and business work on low-powered laptops.

Yes, but like Scali was saying, this is not for lack of features but rather because it is an integrated GPU. It therefore has to use the same system bus as the CPU and is also usually clocked at a lower speed. If you were to theoretically put an Intel GPU on a dedicated card with 1 GB of dedicated GDDR and clock it at 1 GHz, it would do as well as (if not better than) a similarly specced AMD part.

So saying that Intel graphics is not for gaming, while correct in practice, is not entirely accurate.

Indeed, which is why Intel started putting the eDRAM on the high-end models. As you can read in some of the linked articles above, this can greatly improve the performance of an otherwise identical iGPU, because the eDRAM acts as a cache to reduce the bandwidth bottleneck.
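
A toy illustration of that caching effect, with made-up traffic and hit-rate figures chosen only to show the mechanism, not to describe any real workload:

#include <cstdio>

int main()
{
    // Assumed figures: GPU memory traffic per frame and the fraction of it
    // that the eDRAM can serve instead of system RAM.
    double traffic_per_frame_gb = 0.5;
    double fps = 60.0;
    double edram_hit_rate = 0.6;

    double demanded = traffic_per_frame_gb * fps;            // GB/s the iGPU wants
    double ddr_traffic = demanded * (1.0 - edram_hit_rate);  // GB/s still hitting the shared bus

    printf("GPU bandwidth demand:        %.0f GB/s\n", demanded);
    printf("Remaining on the shared bus: %.0f GB/s\n", ddr_traffic);
    return 0;
}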

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 16 of 24, by swaaye

User metadata
Rank l33t++

I have a Surface Pro 4 with the i7 and Iris 540 w/ EDRAM. I have tried lots of games on it without issue, including old games. Intel's drivers seem really solid these days. The only problem is the device is limited to about 18W TDP sustained with the APU. It just can't dissipate more power than that.

Reply 17 of 24, by ODwilly

User metadata
Rank l33t

Back when Ivy Bridge first came out, my friend built his 3570K machine. He could not afford a GPU at first, so he used the Intel graphics. We were both impressed when it managed to play Skyrim on medium settings, averaging between 30 and 60 fps.

Main pc: Asus ROG 17. R9 5900HX, RTX 3070m, 16gb ddr4 3200, 1tb NVME.
Retro PC: Soyo P4S Dragon, 3gb ddr 266, 120gb Maxtor, Geforce Fx 5950 Ultra, SB Live! 5.1

Reply 18 of 24, by vladstamate

User metadata
Rank Oldbie
swaaye wrote:

I have a Surface Pro 4 with the i7 and Iris 540 w/ EDRAM. I have tried lots of games on it without issue, including old games. Intel's drivers seem really solid these days. The only problem is the device is limited to about 18W TDP sustained with the APU. It just can't dissipate more power than that.

Oh oh oh, a Surface Pro 4! I would be interested in buying one of those! Are your experiences mostly positive? I would be using it for programming (Visual Studio) and DOSBox games.

YouTube channel: https://www.youtube.com/channel/UC7HbC_nq8t1S9l7qGYL0mTA
Collection: http://www.digiloguemuseum.com/index.html
Emulator: https://sites.google.com/site/capex86/
Raytracer: https://sites.google.com/site/opaqueraytracer/

Reply 19 of 24, by gerwin

User metadata
Rank l33t

The main deal-breaker for me is the Intel HD settings panel - it sucks and they won't improve it:
- No scaling options unless your desktop is in a video mode that is not the native mode of the screen.
- No forcing of anti-aliasing.
- The CAD software that I expected to natively use the Intel HD FSAA capability (in the DirectX 10.1 spec) does not enable FSAA either. It works fine with nVidia.
- Requires a .NET Framework installation.

Some games, like the new 'Combat Mission' series, cannot run properly on Sandy Bridge Intel HD - that is an OpenGL game. Most other games I tried do run well, though with the above-mentioned limitations.

--> ISA Soundcard Overview // Doom MBF 2.04 // SetMul