
Intel HD/AMD APU and B&S running on the wrong GPU


zgrssd


The short version is that nobody told NVIDIA (and possibly AMD) that Client.exe should be assigned the proper GPU. And so it runs on the likes of an Intel HD 2600 by default.

I know that you cannot properly choose the right GPU in-game (because your game never gets told about the other one). But you are still responsible for telling NVIDIA and AMD what card your program should run on, so they can update their lists.

 

The long version of why this is an issue (and how everyone can solve it themselves) is this:

There has been a serious change in how integrated GPUs work over the last decade or so. In particular, the fact that adding a dedicated GPU no longer disables the integrated one can cause performance issues with new games, old games and console ports alike. Games in general have issues with this, and there is not much the programmer can do about it on his own (it is largely out of his hands).
I think this knowledge has reached the point where it is required "common knowledge" for gamers and game developers alike. So I decided to write it down once, so I can copy & paste it from now on.


The old school integrated GPU - on the Motherboard
Back in the days when AGP was still a common GPU connection (and PCI Express was the new thing), the integrated GPU was put onto the motherboard. It was soldered directly onto the motherboard, like the network card, USB controllers, sound card and whatnot had been for a decade or more.

What that meant:
- They had no processing power. They were intended to be just enough to display a desktop. They were never designed for gaming, only to remove the need for a dedicated card. If a GPU's name included "Mobility" or the "Integrated" prefix, it usually meant "this card is bad, no matter what it says otherwise".
- They used (part of) the system RAM as video RAM. They simply did not have any RAM of their own (or even room for it). So they plainly reserved a part of the system RAM for their own use at BIOS level.
- They still ate a lot of energy and produced a lot of waste heat. Less than a dedicated card, but still annoying.
This was especially a problem for notebooks, where they were often the number one cause of heat problems. If you have an old notebook you can probably feel the heat the onboard GPU produces within minutes of turning it on.
- No driver updates. Each motherboard was its own, non-standardised piece of circuitry art. The builder of the motherboard (or laptop) had to provide all driver updates. Because the GPU was integrated directly into the chipset, no generic driver from the chip's maker could work for it!
Which usually meant no updates at all.
If you wanted a newer OS you had to hope the mainboard/laptop manufacturer would deliver, or install a dedicated card (even a cheap one) you knew had support.
- Adding a dedicated GPU (in the case of a PC) resulted in the integrated GPU simply being disabled. In the case of AGP the integrated GPU could literally be disconnected from the rest of the hardware (as AGP only allows one GPU per slot/set of circuits).
This was a mixed blessing. It did avoid the issue we have nowadays.
But it made it impossible to use the (comparatively energy- and heat-saving) integrated GPU to run stuff a proper GPU is not needed for (like Windows, web browsers, word processors, PDF readers and all the other everyday stuff).


The new school - integrated GPU on the CPU:
With progress in multicore architecture and Intel/AMD acquiring GPU manufacturers, it suddenly became very feasible to put the GPU into the CPU.
AMD calls the idea APU (earlier: Fusion).
Intel just put it into every i3, i5 and i7. First as Intel HD Graphics, later also as Iris (the better sibling of HD).

What that meant:
- They still have no real processing power. After all, they were only designed for simple stuff (desktop, word processors, browsers). Of course, being of a later GPU generation, they still beat the old Mobility cards easily.
But there is no way they can beat a dedicated card of the same generation.
- Better access to RAM. These integrated cards still use the system RAM as video RAM. But they can use the much faster CPU connection to the RAM, without having to go through something like the AGP connection to get to it.
- Energy and heat management is so much easier. After all, the CPU is already a pretty big heat source and must already be able to modulate its speed depending on load. Since the onboard GPU is now on the same piece of hardware (one that already has pretty good cooling), this meant instant solutions to the notebook heat and energy problems. Notebooks and other non-desktop systems are the big winners of this new design, all across the board.
- Driver updates. Since we now have a few standardised GPUs - rather than every mainboard's/laptop's home-brew circuitry - Intel and AMD can easily provide new drivers. If you have a newer integrated GPU you can quickly find out whether it supports a newer OS. And you can almost bet it does.
- Adding a dedicated GPU does not turn off the integrated one. Again a mixed blessing:
It does allow non-intensive stuff to run on the much more energy- and heat-efficient hardware.
It unfortunately results in games running on the weaker, integrated GPU by default. The average programmer never considered providing even a selection for the GPU, and for console ports this was never a consideration at all. In the end the decision was made to just do this work in the control program of the dedicated GPU and keep every program equally in the dark about what is there.


That last part is where the issues come from:
The control program of the better GPU must know a game to assign it the dedicated GPU. For the first 1-2 driver updates after a game comes out it usually does not know, and it needs to be told before it does. So your wonderful new game on your wonderful new hardware will run on a crummy Intel HD level GPU, no matter what else you have in there.
The game can't choose the GPU anymore, if it ever even had an option for that. It never even gets TOLD of the other GPU. That way braindead programs that just pick "the first GPU" or "the only GPU the programmer expected to be there back in 1990" still get the one they need.
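To make that concrete, here is a minimal sketch (assuming Windows and DXGI - this is not code from the B&S client) of how a typical game asks what GPUs exist. The switchable-graphics driver decides what this enumeration returns and in what order, so a program that just takes adapter 0 (or passes NULL for "default adapter" when creating its device) lands on whatever the control program picked for it:

// Minimal sketch, assuming Windows + DXGI. Lists the adapters the driver
// chooses to expose to this process, in the order it chooses.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // The "braindead" approach: take adapter 0 and never look further.
        wprintf(L"Adapter %u: %ls (%llu MB dedicated VRAM)\n",
                i, desc.Description,
                (unsigned long long)desc.DedicatedVideoMemory / (1024 * 1024));
        adapter->Release();
    }
    factory->Release();
    return 0;
}

Depending on the driver and OS version, the dedicated GPU may not appear in that list at all, or may appear without any attached display outputs - either way the naive "first adapter" path ends up on the integrated GPU.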

NVIDIA uses a simple name/metadata or full path/string comparison approach to identify and whitelist programs for the dedicated GPU. In the end that means a list of strings that must be updated, and that can be manually overridden. It can even cope with very, very braindead designs like a .dat file being executed (I am looking at you, lead programmer of C&C 3 Kane's Wrath [http://steamcommunity.com/app/24810/discussions/0/523897023727034819/]).
I have no idea how AMD does this. I heard something about a metadata "tag" on the program itself.

In either case they do it eventually (if somebody tells them), but until then you have to manually override the assignment, as appropriate for your dedicated GPU's manufacturer.
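For developers reading along: as far as I know, both vendors also document an opt-in the program itself can ship, so it does not have to wait for the whitelist. This is a sketch based on the NVIDIA Optimus / AMD PowerXpress documentation, not something I have seen in the B&S client:

// Exporting these variables from the game's .exe asks the driver for the
// high-performance GPU even if the executable is not on the whitelist yet.
extern "C"
{
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Until a game ships a hint like that, or the driver whitelist catches up, the manual override in the driver's control panel is the only fix on the user side.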

 
