AfterDawn: Tech news

Nvidia: Being part of PlayStation 4 was not worth cost

Written by Andre Yoskowitz (Google+) @ 17 Mar 2013 1:18 User comments (23)

Nvidia's Senior VP of content and technology Tony Tamasi said this week that the company skipped involvement with the PlayStation 4 because it was not worth the cost.
The upcoming console has a custom APU built by Sony and AMD, and the Wii U has a similar chip, as does the Microsoft Xbox 8.

"I'm sure there was a negotiation that went on, and we came to the conclusion that we didn't want to do the business at the price those guys were willing to pay," Tamasi added. "Having been through the original Xbox and PS3, we understand the economics of [console development] and the tradeoffs."

With the original Xbox, Nvidia built the 233 MHz "NV2A" application-specific integrated circuit (ASIC) and the company also co-developed PlayStation 3's RSX "Reality Synthesizer" GPU (550 MHz).

"We're building a whole bunch of stuff, and we had to look at console business as an opportunity cost," he said. "If we, say, did a console, what other piece of our business would we put on hold to chase after that? In the end, you only have so many engineers and so much capability, and if you're going to go off and do chips for Sony or Microsoft, then that's probably a chip that you're not doing for some other portion of your business. And at least in the case of Sony and Nvidia, in terms of PS4, AMD has the business and Nvidia doesn't. We'll see how that plays out from a business perspective I guess. It's clearly not a technology thing."


23 user comments

#1 · 17.3.2013 4:56
chang1
Unverified new user

i need this membership

#2 · 17.3.2013 7:45

I back up Nvidia's decision. The console industry is not looking too good. With games $70 a piece, this isn't going to be good.

People were very hesitant to buy the PS3 only because of the price. I'm sure a lot of people would have bought it if it was $300 MSRP when it first released!

#3 · 17.3.2013 12:13

Originally posted by PraisesToAllah:
I back up Nvidia's decision. The console industry is not looking too good. With games $70 a piece, this isn't going to be good.

People were very hesitant to buy the PS3 only because of the price. I'm sure a lot of people would have bought it if it was $300 MSRP when it first released!
yeah true

#4 · 17.3.2013 12:49

So what does that say about what's in the new xbox? Are they jumping into bed with AMD again or someone else?

Another thought, nVidia have pursued the mobile market cheerfully and have helped OUYA (they say that's a big deal) and are interested in pursuing their own Shield device with the Tegra4.

The Shield may not take off despite the curiosity in it. It seems to be a bit idiosyncratic. But the OUYA 2 will probably use the T4 too.

I'd like to know if anyone is seriously considering buying the nVidia Shield? If so, why?



-------------------------------------------------------------

This message has been edited since its posting. Latest edit was made on 17 Mar 2013 @ 12:54

It's a lot easier being righteous than right.


DSE VZ300-
Zilog Z80 CPU, 32KB RAM (16K+16K cartridge), video processor 6847, 2KB video RAM, 16 colours (text mode), 5.25" FDD

#5 · 17.3.2013 12:57

In my opinion it just goes to show that the next Xbox & PS4 will be very similar.

#6 · 17.3.2013 15:16

Sony is afraid to make a good video game console after their last experience with the PS3.
Corporate thinking: $$$....

This message has been edited since its posting. Latest edit was made on 17 Mar 2013 @ 15:16

Live Free or Die.
The rule above all the rules is: Survive !
Capitalism: Funnel most of the $$$ to the already rich.

#7 · 17.3.2013 15:33

I hope the guy with the "Smash My PS3" site is alive and well to continue his good deeds...:)

#8 · 17.3.2013 19:14

Microsoft has already confirmed AMD/ATI as their chipset of choice for next gen. This will be good in a way, because it means cross-platform games should not be AS buggy as the PS3/360 generation of games. Personally, I think Nvidia took a gamble; they are basically saying they don't believe AMD has the manpower to take on all markets, so Nvidia will focus on the mobile and PC markets and watch their competitor struggle to meet supply and demand on all fronts. If AMD/ATI succeeds tho, they would take a substantial gain while Nvidia shot themselves in the foot by closing the door to such an idea.

However, it frustrates me on the PC side of things because Nvidia has a point; ATI has delayed their Radeon HD 8xxx series damn near another year, been hush about any news regarding these cards, and likely won't even release them prior to 2014 as they prep for this year's next-gen console wars.

#9 · 17.3.2013 20:42

Sony had already decided on a Radeon-based GPU from AMD for the PS4 back in 2008, but was still trying to work w/ IBM for the CPU, so I kinda wonder if things soured a little between Sony and NVIDIA back then. And I don't know what happened w/ IBM (couldn't provide quite what Sony wanted performance-wise for a ~$70/unit cost?), but AMD's Fusion (combined CPU-GPU "APUs") may have solidified Sony's choice since they were already planning on using their GPU. While it's seemingly "standard PC hardware," it also depends on what customizations they make: they may be able to remove and bypass some circuitry that normal 64-bit x86 CPUs need for backwards compatibility, since it doesn't need to run PC software (and, thank goodness, WINDOWS!), and so may be able to do more specialized things a lot quicker than seemingly equivalent hardware in a PC.

#10 · 17.3.2013 20:51

I don't think of it as a strict "manpower" issue Mysttic. Just one of what's more profitable for them. They could easily hire more people if they felt it was worth it. I think their attitude is, "a lot of work for f**k all in return". And if what you say about getting the drop on AMD/ATI as far as PC tech is correct then that would make sense too. (That would include their Tesla processors too.)

So, no-one going for the nVidia Shield?



#11 · 17.3.2013 23:43

Yes, they can hire more engineers...but there are serious issues and delays with scaling an engineering department, and so much classified stuff goes into a modern GPU that even a very skilled engineer needs to hit the books before getting their feet wet in a project like this if they are new to the company. Plus, when the goal is to create one chip that will be used for 4-6 years with little modification...what do you do with the extra staff once it is in production? Now you either need to start more new projects that are not as promising, or you need to lay off staff who you just spent a fortune to give your secrets to, and tell them to go to your competition.

Personally, I don't see it as a good or bad business move on the part of nVidia. I do see a mistake tho...from Sony. If they didn't insist on reinventing an already great wheel, they could get the console off the ground faster and cheaper, they could make coding games easier, and they could reduce their costs down the line. Why exactly does the PS4 need a proprietary GPU when the specs we have seen so far could easily be met by a slew of outdated PC GPUs? They finally figured this out from the CPU standpoint...and there is actually something to be gained by going with a PPC instead of X86, while there is nothing to be gained by going with an oddball, partially crippled GPU instead of one with the same architecture that is already in existence. Personally, the only hint of a part of a reason I can think of is that they might be afraid of something like hackintoshes being created if ALL the hardware in the PS4 was available for the PC...but that is a major stretch.

#12 · 18.3.2013 11:06

see what disturbs me as a gamer is the fact that they went to the same thing the wii u has in it a amd processor and a amd graphics card no thank you

#13 · 18.3.2013 11:15

there was a story I read where Nvidia was stealing AMD secrets to figure out why Sony left them for AMD & why MS/Nintendo chose them.
they would make a huge amount of money by supplying parts, so it sounds odd.
this
http://arstechnica.com/tech-policy/2013...0000-documents/

This message has been edited since its posting. Latest edit was made on 18 Mar 2013 @ 11:16

#14 · 18.3.2013 11:18

Originally posted by megadunderhead:
see what disturbs me as a gamer is the fact that they went to the same thing the wii u has in it a amd processor and a amd graphics card no thank you
that is Nintendo that chose them weak parts PS4 is a beast compared to wiiu.

#15 · 18.3.2013 18:00

Originally posted by brockie:
Originally posted by megadunderhead:
see what disturbs me as a gamer is the fact that they went to the same thing the wii u has in it a amd processor and a amd graphics card no thank you
that is Nintendo that chose them weak parts PS4 is a beast compared to wiiu.
Nintendo's cpu is from IBM and their GPU is still unknown. Now the RAM is more powerful on the PS4, but it comes down to way much more than that with DRAM, architecture, and so forth. Don't be so quick to haste.
This message has been edited since its posting. Latest edit was made on 18 Mar 2013 @ 18:08

#16 · 18.3.2013 19:10

Reports from other circles have the wiiu as being less powerful than the PS180 or the new xbox... but sure, it remains to be seen.

AMD/ATI parts don't bother me.



#17 · 18.3.2013 19:21

Originally posted by Jemborg:
Reports from other circles have the wiiu as being less powerful than the PS180 or the new xbox... but sure, it remains to be seen.


Exactly.

Then there is the whole matter of whether the differences are sufficient to be very noticeable.
With res capped at HD TV's 1080p I doubt there's going to be a whole heap of difference.

That may change (slightly) as games in the next gen get more complex, but this time around I expect a 3rd party game, say one of the Call of Duty types, to be much harder to distinguish on each console.


#18 · 19.3.2013 2:15

Originally posted by deucezulu22:
Originally posted by brockie:
Originally posted by megadunderhead:
see what disturbs me as a gamer is the fact that they went to the same thing the wii u has in it a amd processor and a amd graphics card no thank you
that is Nintendo that chose them weak parts PS4 is a beast compared to wiiu.
Nintendo's cpu is from IBM and their GPU is still unknown. Now the RAM is more powerful on the PS4, but it comes down to way much more than that with DRAM, architecture, and so forth. Don't be so quick to haste.
The GPU is an AMD chip. The chip was based off the 4000 series cards. News has been out for a couple of months.

#19 · 19.3.2013 2:25

Originally posted by Rebel11:
Originally posted by deucezulu22:
Originally posted by brockie:
Originally posted by megadunderhead:
see what disturbs me as a gamer is the fact that they went to the same thing the wii u has in it a amd processor and a amd graphics card no thank you
that is Nintendo that chose them weak parts PS4 is a beast compared to wiiu.
Nintendo's cpu is from IBM and their GPU is still unknown. Now the RAM is more powerful on the PS4, but it comes down to way much more than that with DRAM, architecture, and so forth. Don't be so quick to haste.
The GPU is an AMD chip. The chip was based off the 4000 series cards. News has been out for a couple of months.

Alright, thanks for the update.

#20 · 19.3.2013 8:41

Does the wiiu have passive cooling?



#21 · 20.3.2013 21:08

So does this mean that Nvidia might power the next Xbox? I hope so, because if the rumours about the next xbox having an AMD gpu that's 1 generation behind the PS4's AMD gpu are true, then that sucks... Also, if we have to pay again this time for Xbox Live, I'm seriously considering switching over to the PS4.



#22 · 29.3.2013 11:58

I'm glad they've switched to AMD. I've been building PCs for a while now and I've had fewer issues with Radeon cards. Granted, neither one of them is perfect. I tend to lean more towards AMD due to price/performance; you get a lot for less. Sony has realized this as well. I hate to say it, but when you buy NVidia, you're buying the name only.

#23 · 31.3.2013 19:07

Originally posted by kutulu1:
I'm glad they've switched to AMD. I've been building PCs for a while now and I've had fewer issues with Radeon cards. Granted, neither one of them is perfect. I tend to lean more towards AMD due to price/performance; you get a lot for less. Sony has realized this as well. I hate to say it, but when you buy NVidia, you're buying the name only.
Hardly.

Around the 5xxx generation and before, Radeons were terrible: huge driver issues, lackluster performance, etc. Nvidia, until recently, has been synonymous with high performance. They've always marketed toward the enthusiast crowd.

With recent generations, however, AMD has stepped up their game and now out-does Nvidia with some of their chips. This is by and large a new development in the market.

Comments have been disabled for this article.
