While they've long since given up developing their own GPUs, Matrox has remained a notable player in the video card industry for over four decades. These days, the company has settled into a modest role, providing graphics cards based on other vendors' GPUs for niche and boutique applications, where Matrox can differentiate on software and support. And while their overall sales volume is limited, there's a certain degree of validation that comes from Matrox tapping a vendor's GPUs for their latest video cards.

To that end, Intel this week has finally earned a tip of the hat from the oldest of the video card vendors, with the announcement of a new series of multi-monitor display cards built around Intel's Alchemist architecture GPUs. Matrox's new Luma series graphics cards are based on Intel's Arc A310 and A380 graphics hardware, with the niche video card maker looking to tap into Alchemist's class-leading video decoding and encoding capabilities, as well as the display output features and flexibility that are critical for a multi-display card.

Matrox Video's Luma family of graphics boards includes three products: the full-sized, single-slot Luma A380, based on the Arc A380 (ACM-G11 with 1024 stream processors) and equipped with 6GB of memory; the low-profile, single-slot, fanless Luma A310, based on the Arc A310 (ACM-G11 with 768 stream processors) and equipped with 4GB of memory; and the low-profile, single-slot Luma A310F, which is equipped with an active cooling system.

All three Matrox Luma graphics cards have four DisplayPort 2.1 UHBR10 (40Gbps) outputs, and thus can drive four 4Kp144/5Kp60 monitors (uncompressed, 4:4:4), or two 8Kp60 or 5Kp120 displays using two ports per display. As for features, they offer the same capabilities as other Intel Arc A310/A380-based offerings, including support for the DirectX 12 Ultimate, OpenGL 4.6, Vulkan 1.3, and OpenCL 3.0 APIs, as well as encoding and decoding of H.264, H.265, VP9, and AV1 video streams. As an added bonus, they retain support for Intel's oneAPI for compute tasks, as well as the Intel Distribution of OpenVINO toolkit for AI development.
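For those curious how those display figures pencil out, here's a quick back-of-the-envelope sketch. To be clear, this is our own arithmetic under assumed parameters (8 bits per color, uncompressed 4:4:4, and a rough ~5% blanking overhead), not anything from Matrox's spec sheet – but it illustrates why a single 40Gbps UHBR10 connection covers 4Kp144 and 5Kp60, while 8Kp60 and 5Kp120 each need two ports:

```python
# Rough bandwidth math for the Luma cards' quoted display modes.
# Assumptions (ours, not Matrox's): 24bpp (8 bits per color, 4:4:4)
# and ~5% extra for blanking intervals; real timings vary by monitor.

def pixel_rate_gbps(width, height, refresh_hz, bpp=24, blanking=1.05):
    """Approximate uncompressed data rate of a display mode, in Gbps."""
    return width * height * refresh_hz * bpp * blanking / 1e9

# One DP 2.1 UHBR10 connection: 4 lanes x 10 Gbps raw, with 128b/132b
# encoding leaving ~97% of that as usable payload.
UHBR10_USABLE_GBPS = 40 * (128 / 132)  # ~38.8 Gbps

modes = {
    "4Kp144": (3840, 2160, 144),
    "5Kp60":  (5120, 2880, 60),
    "8Kp60":  (7680, 4320, 60),
    "5Kp120": (5120, 2880, 120),
}

for name, (w, h, hz) in modes.items():
    rate = pixel_rate_gbps(w, h, hz)
    ports = 1 if rate <= UHBR10_USABLE_GBPS else 2
    print(f"{name}: ~{rate:.1f} Gbps -> {ports} port(s)")
```

Running this gives roughly 30.1 Gbps for 4Kp144 and 22.3 Gbps for 5Kp60 (one port each), versus about 50.2 Gbps for 8Kp60 and 44.6 Gbps for 5Kp120 – which is why the latter two modes tie up two of the card's four outputs per display.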

All of the Luma graphics cards consume no more than 75W and can be powered entirely via a PCIe slot without any auxiliary power connectors, and the A310-based offerings are intended to fit into even the most compact PCs. Furthermore, their single-slot design means that several cards can be installed into a single desktop PC for systems that need to drive eight, 12, or more monitors.

Meanwhile, it's interesting to note that while Matrox has not announced the discontinuation of their previous-generation cards for these product segments – the NVIDIA-based D-series and AMD-based M-series – in terms of specifications, these new Intel cards should supplant the older cards in every way. Intel's DisplayPort 2.1 capabilities are likely the driving factor given Matrox's intended niche, with NVIDIA in particular being boxed out by not including DisplayPort 2.1 functionality in their Ada Lovelace generation GPU architecture.

Matrox Video's Luma boards are aimed primarily at the medical, digital signage, control room, video wall, and industrial markets. The cards come with a base three-year warranty, which can be further extended, as well as a guaranteed seven-year lifecycle, which is important for some of the markets they are intended for.

Other advantages of Matrox's Luma boards include support for Matrox's PowerDesk software, which is developed to handle exotic multi-display configurations.

Matrox did not announce pricing for the boards, though given their orientation toward commercial, professional, and industrial applications, they will be priced accordingly.

Source: Matrox (via SH SOTN)

26 Comments

  • tipoo - Friday, April 28, 2023

    Matrox! Intel is SAVED!
  • Exotica - Friday, April 28, 2023

    So is Voodoo coming back too?
  • blppt - Saturday, April 29, 2023

    I'm actually surprised that Nvidia hasn't tried to resurrect the name "3dfx" or "Voodoo" for its products since nostalgia always sells.
  • sheh - Saturday, April 29, 2023

    I don't think it's the right market for that. Nostalgia might sell where it's more analog/vintagey in nature, like cars and guitars.

    But 3D chips? Buying a new card, people are looking for cutting-edge tech, not Glide support and good analog output quality (well... maybe that was Matrox more than 3dfx). Dual-releasing Nvidia chips in 3dfx-branded cards will at best dilute the brand. A better strategy might be to add retro-support in new drivers: official Glide support. :)

    BTW, it's surprising to realize that 3dfx's heyday, or actually the whole of its life in the market, was only 3-4 years.
  • wumpus - Sunday, April 30, 2023

    They had a couple of years as "arcade only" video card (chip?) providers (they were seen in Atari's SF Rush). Then a year or two as extremely niche consumer video cards (4-5 times the cost of typical video cards), followed by more or less "mainstream-enthusiast" for 3-4 years. Basically from the time 2MB of low latency dram was relatively cheap until the original "SST" design became obsolete.

    If they ever had a successor to the "SST" design (that multi-rendering thing?), it didn't come in time.
  • Flying Aardvark - Monday, May 1, 2023

    3dfx lost its audience a long time ago. Some of us still revere 3dfx because we owned them and lived through it, but we're few and far between. You can't lose your audience and have any value. The most Nvidia could do with it at this point is a "3dfx Edition" or "Voodoo Edition" card. Nvidia Geforce 4070 Voodoo. Instead of Super or Ti. Something like that. I think Voodoo was actually a better name than Geforce. It's possible the 5000 series was an ode to 3dfx with the FX moniker.
  • Flunk - Monday, May 1, 2023

    I don't think there is any chance of Nvidia using the Voodoo branding as long as Geforce remains synonymous with top-level gaming performance. They'll only change branding if they do something to tank the cachet of the Geforce brand name.
  • wumpus - Sunday, April 30, 2023

    Did nvidia ever re-allow the "glide wrapper"? It looks like you can download it, even though 3dfx took it down originally. I'd have thought I'd have noticed it (although I was rocking a Voodoo 3, and when I switched to the Radeon, Glide was dead).

    Once they bought 3dfx, the glide wrapper could only help sell nvidia cards.
  • Scipio Africanus - Saturday, April 29, 2023

    Never had a Matrox card; my first PC card was an ATI Mach32. Funny how they've survived all the way till now, though technically it's AMD now.
  • Samus - Saturday, April 29, 2023

    They have always been very solid products. Obviously nothing GPU-competitive in decades, the last realistically being the G400 MAX, but you saw Matrox GPU cores a lot in servers for decades up until recently, as dedicated 2D graphics output is dying even in SMB servers with the integration of iGPUs in Xeons or reliance on IPMI for remote management. Occasionally there are still dedicated graphics chips, but these days they seem to mostly be Aspeed.

    Anyway, the one thing you get, or got, was incredible 2D quality. Throughout the 90's, Matrox was regarded as having the cleanest analog output available thanks to high quality chokes, circuit designs, filters, and class-leading RAMDACs. As everything went digital this all became less of a feature and more of a given, but it was one thing that really put them on the map.
