VOGONS



Radeon iGPU in 2017 intel CPU's...


Reply 20 of 24, by swaaye

Rank: l33t++
vladstamate wrote:
swaaye wrote:

I have a Surface Pro 4 with the i7 and Iris 540 w/ EDRAM. I have tried lots of games on it without issue, including old games. Intel's drivers seem really solid these days. The only problem is the device is limited to about 18W TDP sustained with the APU. It just can't dissipate more power than that.

Oh Oh oh Surface Pro 4! I would be interested in buying one of those! Are your experiences mostly positive? I would be using it for programming (Visual Studio) and DosBox games.

It's very fast. I suggest the i5 or i7 models. The SSD is extremely fast too. The screen is top quality and I love the 3:2 ratio. Amazing as a tablet too, though a bit heavy I guess. I like it in tablet form for reading old mags. The type cover is very functional and durable.

Battery life is adequate. If it's mostly idle it will probably run around 7 hours. I can get about 2 hours 3D gaming on it.

I read about occasional QA defects but I've also read MS support is top notch. MS has released many firmware and driver updates over the last year trying to perfect it. They all come in thru Windows Update and it's painless. Impressive support efforts.

I think it's the ultimate netbook / subnotebook and an extremely powerful tablet option. I wish physics would stop being a problem though so it wouldn't need to throttle or get hot. 😉

Reply 21 of 24, by swaaye

Rank: l33t++
gerwin wrote:

The main deal-breaker for me is the intel HD settings panel, it sucks and they won't improve it:
- No scaling options unless your desktop is in a video mode that is not the native mode of the screen.
- No forcing of Anti-Aliasing.
- The CAD software that I expected to natively use the intel HD FSAA capability (in DirectX 10.1 Spec), does not enable the FSAA either. Works fine with NVidia.
- Requires Dot Net FX installation.

Some games, like the new 'Combat Mission' series (an OpenGL title), cannot run properly on Sandy Bridge Intel HD. Most other games I tried do run well, though with the above-mentioned limitations.

I haven't used Sandy Bridge or Ivy Bridge graphics much. Intel has abandoned them though. I think Haswell is basically done too now. I've read that Sandy and Ivy have terrible MSAA performance. That might be why they never bothered with options to force it.

On the topic of scaling, Intel actually worked around a Windows 10 issue there. There's a big thread on their forum about scaling not working with games. Their engineers came in and explained it. I was impressed by that.

Reply 22 of 24, by m1so

Rank: Member
Scali wrote:
kanecvr wrote:

Right. Has intel released a GPU capable of 1080p / ultra?

What does that even mean? Intel GPUs have displayport and HDMI2 with 4k 60 Hz support: https://arstechnica.com/gadgets/2015/08/skyla … dedicated-gpus/
In fact, they had that before AMD did, 🤣.

Also, you make it sound as if AMD iGPUs are somehow better at gaming. Couldn't be further from the truth. In fact, in some games they perform even worse than Intel, because they are too CPU-bottlenecked.
No iGPU gives you a proper gaming experience relative to a discrete card, for the simple reason that it uses system memory, which is a huge performance bottleneck. (I hope you're not talking about 4k gaming; my main computer has a 4k monitor and a GeForce 970 OC, but even then I play many games in 1080p, because I can't find a balance between detail and framerate that I like at 4k. Now you're not going to tell me there's any iGPU out there that's anywhere near the performance of a 970 OC.)
Nevertheless, some reviewers think Intel has closed that gap: http://www.pcgamer.com/gaming-with-integrated … ly-good-enough/
http://in.ign.com/pc/85334/feature/gaming-on- … grated-graphics
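The bandwidth gap behind that argument is easy to put numbers on. A rough back-of-the-envelope sketch in Python, assuming typical figures (dual-channel DDR4-2400 feeding the iGPU, versus a GTX 970's 256-bit GDDR5 at 7 Gbps per pin; the exact rates vary by system):

```python
# Peak memory bandwidth, back-of-the-envelope.
# Assumed figures: dual-channel DDR4-2400 for a typical iGPU system,
# versus a GeForce GTX 970's 256-bit GDDR5 at 7 Gbps per pin.

def bandwidth_gbs(transfers_per_sec: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s = transfer rate x bus width in bytes."""
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

# Dual-channel DDR4-2400: two 64-bit channels at 2400 MT/s.
igpu = bandwidth_gbs(2400e6, 64 * 2)

# GTX 970: 256-bit GDDR5 at 7 Gbps effective per pin.
gtx970 = bandwidth_gbs(7e9, 256)

print(f"iGPU (dual-channel DDR4-2400): {igpu:.1f} GB/s")
print(f"GTX 970 (256-bit GDDR5):       {gtx970:.1f} GB/s")
print(f"ratio: {gtx970 / igpu:.1f}x")
```

And the iGPU's ~38 GB/s is shared with the CPU, so the real-world gap is even wider than the ~6x the peak numbers suggest.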

It means none of their GPUs can play games at 1080p/Ultra. Neither can AMD IGPs, but from experience most of their IGPs play them far better than Intel ones. You can play Fallout 4 on an AMD APU, not on Intel HD Graphics. DirectX 12 support? When most games get zero visual or performance benefit from it, and in many cases run worse? No DX12 game would run at over 10 fps on a non-Iris Intel GPU anyway.

But then, you believe AMD fans are idiots and wrote articles with titles like that on your blog. I actually respected you for your retro stuff until I found out just how Intel-biased you are. And I say this as someone who has an Intel CPU and an nVidia video card. Has Intel EVER made a decent GPU?

Reply 23 of 24, by dr_st

Rank: l33t

This whole discussion needs to be put in the right context.

Of course, an integrated GPU will never be able to compete with a same-generation high-end (or even mid-range) discrete GPU in raw power. GPU power and game graphical complexity tend to go hand in hand, so there are always games that challenge the high-end modern GPUs (if nothing challenged them, who would pay for the top-of-the-line solutions?! Masochists who like huge, heavy, power-sucking, failure-prone ovens in their desktops? 😜)

But go a couple of generations back, and you will see that the integrated video solutions can suddenly match and even surpass the mid-range+ cards of older generations. Thus, the claim "none of their GPUs can play games on 1080p/Ultra" is patently false. Such a claim is only true when it comes to graphics-intensive modern-generation games (which are the only games tested in GPU reviews, because they're the only games that show the differences).

Integrated GPUs can play tons of older games on 1080p/Ultra and more. It depends on how far back you go with the GPU and the games. They can also play tons of modern, not graphically intensive games very well.

So yes, it's true that they will never be able to play modern heavy games well, and they are not intended to. But this is VOGONS. Here we like old games and old machines, and you see lots of threads about people building systems with top-of-the-line nVidia/AMD solutions of 5-10 years ago. A modern system based on an integrated GPU will often be leaps and bounds ahead of such solutions.

https://cloakedthargoid.wordpress.com/ - Random content on hardware, software, games and toys

Reply 24 of 24, by Scali

Rank: l33t
m1so wrote:

But then, you believe AMD fans are idiots and wrote articles with titles like that on your blog. I actually respected you for your retro stuff until I found out just how Intel-biased you are.

Only AMD fanboys can't see beyond the title and don't understand what the article is about, and how I'm not Intel-biased at all (it's not even about Intel or AMD, it's about fanboys). Not to mention that blog is not exactly recent, it's from 2009. This says a lot about you.
(If anything, the fact that I said back in 2009 that Intel and AMD don't normally 'leapfrog' each other with every generation, and that this still hasn't happened at least up to today, 2017, shows just how right I was when I wrote that piece.)

I mean, 🤣? If you read about my retro stuff, you'd know that my roots lie in the non-Intel world of Commodore 64 and Amiga. If anything, Intel was the enemy. Trying to call me Intel-biased is ludicrously missing the point.

m1so wrote:

Has intel EVER made a decent GPU?

Depends on your definition of 'decent'.
AMD fanboys such as yourself look only at one criterion: absolute performance (and then cherry-pick, because as you already said, 'non-IRIS only').
There are different design goals you might have for GPUs than just (absolute) performance. You can also optimize a design for cost, transistor count, or things like that.
That's where Intel's true strength lies: OEMs consider Intel GPUs 'decent' enough to make them the only GPU in cheaper systems. This means that AMD, nVidia, or any other vendor can't sell GPUs into that market at all. Only in the more high-end market will you get AMD or nVidia GPUs.
Intel can make GPUs so cheap that it's not even worth it to make a version of a CPU that doesn't include one. What's the cheapest Intel with a GPU? A Celeron of $40? Mission accomplished.
The fact that Intel can cram in all those DX12_1 features that AMD can't (conservative raster and ROV require a total redesign/revalidation of the graphics pipeline, which is why AMD has been dragging their feet... they can't just add it to an existing GPU), also shows that Intel's GPU department knows what they're doing.

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/