Sony just announced the PlayStation 4, along with some high level system specifications. The high level specs are what we've heard for quite some time:

  • 8-core x86-64 CPU using AMD Jaguar cores (built by AMD)
  • High-end PC GPU (also built by AMD), delivering 1.84TFLOPS of performance
  • Unified 8GB of GDDR5 memory for use by both the CPU and GPU with 176GB/s of memory bandwidth
  • Large local hard drive

Details of the CPU weren't given at the announcement (eight cores could imply a Piledriver-derived architecture or eight smaller Jaguar cores, the latter being more likely), but either way this will be a big step forward over the PowerPC-based general purpose cores on Cell from the previous generation. I wouldn't be too put off by the lack of Intel silicon here; it's still a lot faster than what we had before, and at this level price matters more than peak performance. The Intel performance advantage would have to be much larger to dramatically impact console performance. If we're talking about Jaguar cores, the bigger long-term concern is single-threaded performance.

Update: I've confirmed that there are 8 Jaguar based AMD CPU cores inside the PS4's APU. The CPU + GPU are on a single die. Jaguar will still likely have better performance than the PS3/Xbox 360's PowerPC cores, and it should be faster than anything ARM based out today, but there's not huge headroom going forward. While I'm happier with Sony's (and MS') CPU selection this time around, I always hoped someone would take CPU performance in a console a bit more seriously. Given the choice between spending transistors on the CPU vs. GPU, I understand that the GPU wins every time in a console—I'm just always an advocate for wanting more of both. I realize I never wrote up a piece on AMD's Jaguar architecture, so I'll likely be doing that in the not too distant future. Update: I did.

The choice of eight cores is somewhat unusual. Jaguar's default compute unit is a quad-core cluster with a large shared L2 cache, so it's likely that AMD placed two of these clusters together for the PlayStation 4. The last generation of consoles saw a march toward heavily threaded machines, so it's no surprise that AMD/Sony want to continue the trend here. Clock speed is unknown, but Jaguar was good for a mild frequency increase over its predecessor, Bobcat. Given the large monolithic die, AMD and Sony may not have wanted to push frequency as high as possible, in order to keep yields up and power down. While I still expect CPU performance to move forward in this generation of consoles, it's worth remembering that the PowerPC cores in the previous generation ran at very high frequencies. Jaguar's IPC gains have to be significant in order to make up for what will likely be a lower clock speed.
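Assuming two standard four-core Jaguar compute units, the CPU topology would look something like the sketch below. This is speculation based on Jaguar's published design (including its up-to-2MB shared L2 per cluster), not something Sony has confirmed:

```python
# Hypothetical PS4 CPU topology: two standard Jaguar compute units,
# each with 4 cores sharing one L2 cache. The 2MB L2 figure is
# Jaguar's default per-cluster size, not a confirmed PS4 spec.
clusters = {
    f"cluster{c}": {
        "cores": [c * 4 + i for i in range(4)],  # core IDs 0-3, 4-7
        "shared_l2": "2MB",
    }
    for c in range(2)
}

total_cores = sum(len(v["cores"]) for v in clusters.values())
print(total_cores)  # 8
```

One practical consequence of this layout: communication between cores in different clusters has to go through a slower interconnect rather than the shared L2, so thread placement could matter for game engines.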

We don't know all the specifics of the GPU, but at close to 2 TFLOPS we're looking at a level of performance somewhere between a Radeon HD 7850 and 7870. Update: Sony has confirmed the actual performance of the PlayStation 4's GPU as 1.84 TFLOPS. Sony claims the GPU features 18 compute units; if it's GCN based, that works out to 1152 SPs and 72 texture units. It's unclear how custom the GPU is, however, so we'll have to wait for additional information to know for sure. The highest-end PC GPUs are already faster than this, but the PS4's GPU is a lot faster than the PS3's RSX, which was derived from NVIDIA's G70 architecture (used in the GeForce 7800 GTX, for example). I'm quite pleased with the promised level of GPU performance. There are obvious power and cost constraints that would keep AMD/Sony from going even higher here, but this should be a good leap forward from current gen consoles.
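The 1.84 TFLOPS figure is consistent with a standard GCN layout running at roughly 800MHz. A quick back-of-the-envelope check, assuming GCN's usual 64 SPs per compute unit and 2 FLOPs per SP per clock (the clock speed is inferred here, not announced):

```python
# Sanity check of the PS4 GPU numbers, assuming a standard GCN
# layout: 64 shader processors (SPs) per compute unit, each capable
# of 2 FLOPs per clock via fused multiply-add.
compute_units = 18
sps = compute_units * 64        # total shader processors
flops_per_clock = sps * 2       # FLOPs delivered per GPU clock
target_flops = 1.84e12          # Sony's confirmed 1.84 TFLOPS

implied_clock_hz = target_flops / flops_per_clock
print(sps)                            # 1152
print(round(implied_clock_hz / 1e6))  # ~799 (MHz)
```

That ~800MHz result sits comfortably between the Radeon HD 7850 (860MHz) and what you'd expect from a power-constrained console part.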

Outfitting the PS4 with 8GB of RAM will be great for developers, and using high-speed GDDR5 will help ensure the GPU isn't bandwidth starved; Sony promised around 176GB/s of memory bandwidth. The lack of solid state storage isn't surprising, as hard drives still offer a dramatic advantage in cost per GB over SSDs. If the drive is user replaceable with an SSD, that would be a nice compromise.
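Sony's 176GB/s figure lines up with a 256-bit GDDR5 interface running at a 5.5GT/s effective data rate. Note the bus width is an assumption on my part, not something Sony has stated:

```python
# Sanity check of the quoted 176GB/s memory bandwidth, assuming a
# 256-bit GDDR5 interface (bus width not officially confirmed).
bus_width_bits = 256
transfer_rate = 5.5e9                      # 5.5 GT/s effective data rate
bytes_per_transfer = bus_width_bits // 8   # 32 bytes moved per transfer

bandwidth_gb_s = bytes_per_transfer * transfer_rate / 1e9
print(bandwidth_gb_s)  # 176.0
```

For comparison, a Radeon HD 7850 delivers about 154GB/s over the same 256-bit width at a lower data rate, so the PS4's unified pool is respectable even by discrete GPU standards.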

Leveraging Gaikai's cloud gaming technology, the PS4 will be able to act as a game server and stream the video output to a PS Vita, wirelessly. This sounds a lot like what NVIDIA is doing with Project Shield and your NVIDIA powered gaming PC. Sony referenced dedicated video encode/decode hardware that allows you to instantaneously record and share screenshots/video of gameplay. I suspect this same hardware is used in streaming your game to a PS Vita.

Backwards compatibility with PS3 games isn't guaranteed and instead will leverage cloud gaming to stream older content to the box. There's some sort of a dedicated background processor that handles uploads and downloads, and even handles updates in the background while the system is off. The PS4 also supports instant suspend/resume.

The new box heavily leverages PC hardware, which is something we're expecting from the next Xbox as well. It's interesting that this is effectively how Microsoft entered the console space back in 2001 with the original Xbox, and now both Sony and MS have returned to that philosophy with their next gen consoles in 2013. The PlayStation 4 will be available this holiday season.

I'm trying to get more details on the CPU and GPU architectures and will update as soon as I have more info.

Source: Ustream

Comments

  • HisDivineOrder - Thursday, February 21, 2013

    ... "The PS4 also supports instant suspend/resume."

    That is going to be huge for home consoles. Bigger than anyone can even imagine. Who the hell needs saves when you can just suspend the console at a point and then start up right where you left off?

    No logos, no menus, no copyrights, no waiting. You sit down. You pick up your controller. Your PS4 with its camera detects you and the game just starts up. Before you have even sat down, the game is sitting staring at you paused. Waiting for you to hit the Options button to continue.

    You need to go to the potty, then the grocery store to pick up more Red Bull, nachos, and a card for apologizing to your girlfriend for forgetting her birthday while playing the new Killzone? You come back an hour later, you sit down. It's on, waiting for you again.

    Especially if it can leverage HDMI-CEC to manage your HDTV and receiver on and off, too.

    That would feel like the future, wouldn't it?
  • wedouglas - Saturday, February 23, 2013

    Huh? I don't see why this is a big deal. You can just pause the game and turn the TV off or do something else. Computers stay on all day and night, why not the PS3 too? I've left it on for days and never had any issues.

    I think the point of this was that it's a low power mode.
  • singhjeet - Tuesday, February 26, 2013

    It's not so much that the PS3 can't stay on all day and night; it can, easily, I've left mine on Folding@Home for weeks at a time. It's just that every time I think to pause the game and turn off the TV, I have a little mental battle: How long til I'm back at the TV? How much electricity is the system going to be drawing needlessly? How much time is going to be wasted waiting for the game to reload?

    All of that is now gone, cause it just suspends. It's not so much that it wasn't possible, just that it's made to do it now. You can turn it off mindlessly even if you're going to be coming back in 15 minutes or 5 hours, which will be nice for a console, especially if the game only allows you to save it at particular points in the game.
  • J-M - Thursday, February 21, 2013

    What I'm missing in most "previews" I've seen so far is the fact that the Jaguar cores are the first generation of cores using HSA. That will surely have an effect on performance when comparing against current-generation CPUs, making comparisons really hard to do. The PS4 will pack so much more punch than the specs suggest when compared to current PCs.
  • Mathos - Thursday, February 21, 2013

    People aren't thinking about some things when they comment about the CPU being an AMD CPU.

    1: These are low power Jaguar Cores, not Piledriver/Bulldozer. That means 5-25W range for power draw. Using TSMC 28nm HKMG Process.

    2: You can't expect the performance of a standard Jaguar APU to be equal to this APU, due to the difference in memory controller. A standard Jaguar uses DDR3, whereas this one will have access to much higher speed GDDR5. It should be interesting to see if this gets around the memory controller bottleneck present since the Phenom.

    3: This should change the landscape for gaming. Since all new games will be able to run on multithreaded PC hardware. And in this coming gens case, it should benefit AMD with most of them being pre-optimized for AMD hardware.
  • Mathos - Thursday, February 21, 2013

    Had to add another one, what I'd give for an edit function on this.

    4: These Jaguar CPU cores are all full-function x86-64 CPU cores, which means all new games for these consoles will be programmed to run in native x86-64 mode. Most PC ports should therefore come able to run in native x86-64 mode as well, which should add much better visual quality and give programmers access to more RAM.
  • tipoo - Thursday, February 21, 2013

    GDDR5 actually has a higher latency than DDR3, much higher actually. I wonder how that will play out. With GPUs the latency doesn't matter as much as bandwidth is the most important thing, but with CPUs after a point bandwidth isn't as important and a super high latency in comparison to DDR3 could hurt.
  • silverblue - Friday, February 22, 2013

    Yes, but how much would the latency matter when we're talking an effective 5.5GHz memory speed (1,375MHz GDDR5)?
  • powerarmour - Friday, February 22, 2013

    I wouldn't worry too much about latency affecting CPU performance; it certainly never harmed most PC desktop CPUs that much when we jumped from DDR1>2>3.

    Bandwidth is still king for graphics, and that's mainly what this machine is built for.
  • tipoo - Saturday, February 23, 2013

    I'm aware; maybe it won't be a big deal, that's why I said I was just wondering about it. But I think the latency difference from DDR 1-2-3 is smaller than the jump from DDR3 to GDDR5. Maybe the clock rate cancels some or most of that out. Again, I'm unsure, just speculating.
