Introducing IMGIC - A better frame-buffer compression

Besides multi-GPU scalability, the other big feature introduction in the B-Series is a completely new image compression algorithm, simply dubbed IMGIC, or Imagination Image Compression.

Compression is an integral part of modern GPUs, as without it the designs would simply be memory-bandwidth starved. To date, Imagination has been using PVRIC for this purpose. The problem with PVRIC was that it was a relatively uncompetitive compression format, falling behind in compression ratio compared to competing techniques such as Arm's AFBC (Arm Frame-Buffer Compression). As a result, IMG GPUs used up more bandwidth than comparable Arm GPUs.

IMGIC is a completely redesigned compression algorithm that replaces PVRIC. Imagination touts it as the most advanced image compression technology available, offering extreme bandwidth savings and far more flexibility than previous PVRIC designs. On the flexibility side, IMGIC can now operate on individual pixels rather than just smaller tiles or pixel groups.

Furthermore, the new algorithm is said to be 8x simpler than PVRIC, meaning the hardware implementation is also much simplified and achieves a significant area reduction.

The new implementation gives vendors more scaling options, adding compression ratios down to a lossy 25% for extreme bandwidth savings. SoC vendors can use this to alleviate bandwidth-starved scenarios, or QoS scenarios where other IP blocks on the SoC should take priority.
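The value of a guaranteed lossy ratio is that it bounds worst-case bandwidth, which best-effort lossless compression cannot do. A quick back-of-the-envelope calculation (the framebuffer size and frame rate here are illustrative assumptions, not figures from Imagination):

```python
# Illustrative arithmetic only: shows why a guaranteed (lossy) compression
# ratio bounds worst-case framebuffer write bandwidth.

WIDTH, HEIGHT = 1920, 1080      # assumed 1080p framebuffer
BYTES_PER_PIXEL = 4             # assumed RGBA8
FPS = 60

uncompressed = WIDTH * HEIGHT * BYTES_PER_PIXEL   # bytes per frame

def bandwidth_mb_s(ratio):
    """Worst-case write bandwidth in MB/s at a fixed compression ratio."""
    return uncompressed * ratio * FPS / 1e6

print(f"no compression:        {bandwidth_mb_s(1.00):.0f} MB/s")
print(f"lossless, worst case:  {bandwidth_mb_s(1.00):.0f} MB/s (incompressible frame)")
print(f"guaranteed lossy 50%:  {bandwidth_mb_s(0.50):.0f} MB/s")
print(f"guaranteed lossy 25%:  {bandwidth_mb_s(0.25):.0f} MB/s")
```

On these assumptions, the lossy 25% mode caps framebuffer writes at roughly a quarter of the uncompressed figure regardless of content, which is exactly what a QoS-constrained SoC integrator needs to budget against.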

Overall, the B-Series now offers a 35% reduction in bandwidth compared to the A-Series and previous-generation Imagination GPU architectures, which is a rather large improvement given that memory bandwidth is costly, both in terms of actual silicon cost and energy usage.

Comments

  • Obi-Wan_ - Tuesday, October 13, 2020 - link

    Is this something that could transition to the desktop consumer space at some point? Either the Imagination IP or just the pull/push change.
  • yeeeeman - Tuesday, October 13, 2020 - link

    I am also wondering about this.
    To me it seems a bit strange why they don't take this step given that their revenues from mobile are getting slimmer and slimmer.
  • Arsenica - Tuesday, October 13, 2020 - link

Imagination Technologies may be British, but it's owned via shell companies by the Chinese government. Finding viable revenue streams no longer matters to them.

This product may eventually transition to China-only PCs, but in the meantime (as implied in Innosilicon's PR) it's targeted at Chinese data centers.
  • Threska - Wednesday, October 14, 2020 - link

    Getting the Apple contract is indeed a big deal.
  • Otritus - Tuesday, October 13, 2020 - link

I would imagine the consideration is money and competition. Imagination has no mindshare in the consumer market, and lacks many gaming-centric features (and good drivers), which would essentially eliminate adoption. MGPU seems like it will be the future of GPU computing, but right now putting multiple slabs of silicon together and efficiently connecting them is expensive and power hungry, which is undesirable.
  • myownfriend - Tuesday, October 13, 2020 - link

    What gaming centric features are they missing and how do you know they have shitty drivers?
  • Otritus - Wednesday, October 14, 2020 - link

To my knowledge, their GPUs do not currently support APIs such as DX12, and they lack a FreeSync/G-Sync competitor, reduced input latency, game streaming, etc. In terms of drivers, they have no public drivers for Windows or Linux, meaning their drivers are unoptimized for desktop usage.
  • myownfriend - Wednesday, October 14, 2020 - link

You're conflating a few things. Adaptive sync or FreeSync is a feature of the display controller, not the GPU. As has been stated, Apple used Imagination's GPU IP for years, and that includes the iPad Pro, which supports dynamic refresh rates.

Game streaming is a way that software uses a GPU's hardware decoder. All it would require is something like Moonlight supporting it.

    I don't know what reduced input latency is supposed to refer to here since that has to do with a bunch of things that aren't the GPU.

I can't say for certain whether they support DirectX 12, but they have supported OpenGL, Vulkan, and DirectX 11 for a long time. Considering many DirectX 11 GPUs gained DX12 support via a driver update, it's very likely they could do the same. Even if they can't, they would still be an option for Linux users.

Lastly, Imagination makes GPU IP, meaning they don't manufacture actual chips. They license out designs, so you're not gonna go to Imagination's website and see a download list for their drivers. You can, however, look at Imagination's YouTube channel and see their cards running on Linux and Windows desktops, so those drivers exist.
  • Kangal - Friday, October 16, 2020 - link

He's "technically wrong," but from the consumer's viewpoint he's entirely right. There's a lot of work Imagination Technologies has to do to get their PowerVR GPUs to the desktop space, then catch up and compete against the likes of Nvidia's Ampere GPUs (the industry standard).

    Normally, I would say that's easy. PowerVR has actually been an innovator and sometimes a leader in the GPU field, albeit from the shadows. But now? I doubt it.

Firstly, with 2020 being what it is. More importantly, the company is shifting massively, with their CEO ousted, board members changing, and their senior engineers leaving after the aggressive Chinese buyout. The British seem to be angry about the whole thing, but it was so predictable; they were too naive. I'm not sure whether what's left of PowerVR will have the leadership and talent anymore to get the job done. Perhaps the A-Series and B-Series are the end of their hard work, and in a few years' time they will hit a wall (kinda like Intel did after Skylake). Being short-handed won't affect Apple much, since Apple actually has its own GPU designers and doesn't rely on PowerVR much anymore. So I can see these valuable staff being poached by the Apple SoC, Arm Mali, or even Qualcomm Adreno teams. Hopefully they don't add to the monopoly with the Nvidia/AMD teams.
  • myownfriend - Friday, October 16, 2020 - link

    I can agree with all that. It would be very difficult for them to break into a field that's been team red and team green for some people's entire lives. If they do fall apart, I'd say Nvidia would be the worst case scenario as their recent purchase of ARM is allowing them to wield a gross amount of power.
