
Researchers use integrated GPU to boost CPU speed

Written by Rich Fiscus @ 10 Feb 2012 4:06

Researchers at North Carolina State University have found a way to improve CPU performance by more than 20 percent using a GPU built on the same processor die.
"Chip manufacturers are now creating processors that have a ?fused architecture,? meaning that they include CPUs and GPUs on a single chip," said Dr. Huiyang Zhou, who co-authored a new paper based on the research. He explained, "Our approach is to allow the GPU cores to execute computational functions, and have CPU cores pre-fetch the data the GPUs will need from off-chip main memory."

The research was performed in conjunction with AMD, which talked about plans to increase CPU/GPU integration in a presentation to analysts last week. Based on that presentation, the techniques identified in this research could be used in AMD processors within the next two years.



Although this research appears to be focused on current PC technology, most likely AMD's Fusion APU, it also has obvious applications for improving ARM processor performance. ARM's SoC (System on a Chip) design emphasizes power efficiency over speed, making it the standard choice for smartphones, tablets, and other mobile devices.

Along with a plan to transition into SoC processor production, possibly including ARM chips, AMD is promoting standardization across different processor architectures. Its HSA, or Heterogeneous Systems Architecture, initiative is intended to standardize the way the various components integrated on a single processor interact with each other.

Tags: AMD CPU GPGPU SOC HSA

13 user comments

#1 @ 11.2.2012 01:30

Interesting, however it's not ideal, as their general purpose is in laptops or other low-profile machines that lack the ability to expand graphics power due to either cost-cutting measures or design issues.

#2 @ 11.2.2012 13:39

OK, not being a design engineer with all the math degrees & such for any of this... purely coming from the background of implementation & end user dynamics:

So much of this particular technology has started to rely heavily on the cooperation of the GPU in doing a lot of number crunching. Granted, I've seen a bunch of it being driven by the 3D graphical interfaces & those 'physics' environments, but if memory serves, some universities have been looking into doing some of the protein-folding math on them as well.

So I offer this... if graphics processors are pounding out numbers at comparatively/drastically higher rates than regular CPUs, why aren't manufacturers using these processors as the basis of their designs (as of recent)?

Seems to me this would be the next step in the evolution. But I have missed things along the way too.

#3 @ 11.2.2012 14:47

Originally posted by LordRuss:
OK, not being a design engineer with all the math degrees & such for any of this... purely coming from the background of implementation & end user dynamics:

So much of this particular technology has started to rely heavily on the cooperation of the GPU in doing a lot of number crunching. Granted, I've seen a bunch of it being driven by the 3D graphical interfaces & those 'physics' environments, but if memory serves, some universities have been looking into doing some of the protein-folding math on them as well.

So I offer this... if graphics processors are pounding out numbers at comparatively/drastically higher rates than regular CPUs, why aren't manufacturers using these processors as the basis of their designs (as of recent)?

Seems to me this would be the next step in the evolution. But I have missed things along the way too.


Most do. They're called RISC processors (SPARC, PPC, ARM, MIPS), the masters of their designated task, used for things such as radars, laser guidance systems, and many scientific purposes, including protein folding and branch prediction.

Your general population uses CISC (x86, x64, Intel, AMD, etc.), the jack of all trades but master of none, often found in environments where each piece of hardware does not have its own hardwired chip.




#4 @ 11.2.2012 16:51

Originally posted by DXR88:
Most do. They're called RISC processors (SPARC, PPC, ARM, MIPS), the masters of their designated task, used for things such as radars, laser guidance systems, and many scientific purposes, including protein folding and branch prediction.

Your general population uses CISC (x86, x64, Intel, AMD, etc.), the jack of all trades but master of none, often found in environments where each piece of hardware does not have its own hardwired chip.


You're not making much sense. RISC hasn't been made/used for about 12-15 years, & your CISC analogy doesn't hold water, as it washes back into itself as the argument for today's current computing technology.

I'm talking about completely incorporating the different architecture of Tegra or Fusion(?) into just that, rather than the complaints of a stalemate of 'no further gains' in current CPU technology.

Which, by the way, GPUs are currently being used for some scientific purposes, just in a limited way. So I'm saying: why hasn't every bit of this been pushed over the hump yet?

Either I wasn't clear or you clarified what I already said with dead & redundant equipment I also knew about.
This message has been edited since its posting. Latest edit was made on 11 Feb 2012 @ 4:56

#5 @ 12.2.2012 01:04

I think the main problem is that developers just don't always take advantage of everything that is available simply because it isn't always available. For instance, many apps use nVidia CUDA yet refuse to use ATI cards that can do essentially the same thing simply because they are not specifically designed for it (slower workstation cards from ATI ignored). Other apps use neither, as CUDA isn't always there...or simply because they were too lazy to add CUDA support. ATI has something like CUDA on their workstation cards too (I forget the name at the moment)...but it is virtually unused simply because it is not on the run-of-the-mill desktop cards.

As an i5 owner with a dedicated video card, I would love to see apps using the integrated GPU that I have no use for currently...but I honestly don't know how much of a boost I would see considering that most apps don't even bother to use CUDA.
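
A minimal, hypothetical sketch of the pattern KillerBug describes, assuming CUDA's runtime API and not drawn from any particular app: probe for a capable device at startup and fall back to the CPU when none is found. The doubling "work" is purely illustrative.

    // Hypothetical availability fallback: check for a CUDA device at
    // runtime, do the same math on the CPU when none is present.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void work_gpu(float *data, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= 2.0f;
    }

    void work_cpu(float *data, int n) {
        for (int i = 0; i < n; ++i) data[i] *= 2.0f;  // same math, host side
    }

    int main() {
        const int N = 1024;
        float buf[N];
        for (int i = 0; i < N; ++i) buf[i] = (float)i;

        int devices = 0;
        bool haveCuda = (cudaGetDeviceCount(&devices) == cudaSuccess) && devices > 0;

        if (haveCuda) {
            float *d;
            cudaMalloc(&d, sizeof(buf));
            cudaMemcpy(d, buf, sizeof(buf), cudaMemcpyHostToDevice);
            work_gpu<<<(N + 255) / 256, 256>>>(d, N);
            cudaMemcpy(buf, d, sizeof(buf), cudaMemcpyDeviceToHost);  // implicit sync
            cudaFree(d);
        } else {
            work_cpu(buf, N);  // no CUDA-capable card found
        }
        printf("buf[3] = %.1f (%s)\n", buf[3], haveCuda ? "GPU" : "CPU");
        return 0;
    }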

#6 @ 12.2.2012 11:15

Originally posted by KillerBug:
As an i5 owner with a dedicated video card, I would love to see apps using the integrated GPU that I have no use for currently...but I honestly don't know how much of a boost I would see considering that most apps don't even bother to use CUDA.
I agree. But I think Cuda & Stream (the ATI equivalent you were looking for) are more of a software 'switch' (if you will) to get the GPU involved in helping with CPU processing.

Don't get me wrong, like the cereal, "it's great!", but they only seem to want to use it for video processing in the consumer market. And not that Cuda or Stream are the only viable options, they just seem to be the only two out there at the moment.

So my blathering is about some Tucker or Tesla upstart taking (say) CUDA & building a quad-core CPU layered off its foundation. If engineers are already writing code for GPUs to fold/unfold protein DNA, my feeble brain doesn't see the reason why it can't direct a little traffic on a motherboard.

Thus killing this supposed stagnation of CPU processing speeds for a while. Granted, I'm leaving myself open to ridicule in that these GPUs use a multi-processor approach to doing their 'thing', thus giving the illusion of a higher MHz rating, but then, equally, shouldn't we be able to have similar computers with similar CPUs?

Thus the reason for the question to keep coming back on itself & the risk of me sounding like a crack smoker.

#7 @ 12.2.2012 15:28

LordRuss, RISC is still in use & production. http://en.wikipedia.org/wiki/RISC

#8 @ 12.2.2012 15:54

Originally posted by ddp:
LordRuss, RISC is still in use & production. http://en.wikipedia.org/wiki/RISC
OOookay... I wasn't right. But Motorola was the biggest manufacturer of the processor. Now Qualcomm is in the game. But your link is saying that "for all intents & purposes" the RISC-CISC lines are all but blurred. And if they're still in production, why aren't they calling them RISC processors? In your article they want to refer to them as ARM.

I don't mean it necessarily like AMD's FX chip being renamed the AM(whatever). I mean the RISC seems to have died somewhere along the way, obviously not in the server market, but it somehow lived under a highway for a long time & now wants to live in the middle of Beverly Hills again.

Besides, even 'they' adopted engineering rules of the x86 architecture, just like everyone else did. It's the micronization elements where we're starting to see a resurgence of all this.

Still doesn't change the fact all these guys need to do whatever the video card guys are doing in processing technology & twist their nuts on a bit tighter.

#9 @ 12.2.2012 15:59

was using them on Sun Microsystems boards at Celestica back in 1998-2000 before being transferred to another site to build Nokia cell phones & later Cisco boards.

#10 @ 13.2.2012 12:38

I forgot about Sun... I knew they were still using them after CrApple dropped them like rats on the proverbial sinking ship... I just lost track after the whole Motorola thing. Then started mumbling to myself when I heard voices around the processors in Blackberries & such (or so I thought).

I figured it was a bad drug induced flashback. Who knew?

#11 @ 13.2.2012 13:21

the processors on the Sun boards were made by TI & did not have pins but pads, like socket 775 & up.

#12 @ 13.2.2012 13:41

Could I assume they were the trendsetter for the x86 market going pinless, then? Or were they just the first to naturally migrate?

#13 @ 13.2.2012 14:07

they were the 1st, as Intel didn't go pinless till socket 775.
