News Editor, Anton Shilov

There are several things that people want from their personal computers these days: mobility, a high-resolution display, high performance across a wide range of applications (including demanding PC games), a sleek design and, in some cases, a small form factor. While it is possible to get an ultra-thin laptop, a powerful desktop and a high-resolution monitor separately, it is pretty hard to get everything in one package. Apparently, companies like ASUS, MSI and Razer know how to at least partly solve this problem, and we saw their solutions at CES.

Modern microprocessors and solid-state drives can offer desktop-class performance in an ultra-thin notebook. However, when it comes to performance in graphics-intensive applications, it is simply impossible to build a leading-edge GPU, which is required to play the latest games at ultra-HD (4K) resolution, into an ultra-slim form-factor system. Modern high-end GPUs can dissipate 300 W of heat, which is impossible to cool in a laptop. Moreover, even a 100 W GPU will completely ruin battery life and will require a sophisticated cooling solution, which means a thicker design. The only way to enable proper graphics performance on a small form-factor PC is to use an external graphics adapter. While some may argue that external graphics solutions are only useful for a fraction of the market, this is not entirely true. In addition to notebooks, external graphics adapters can be plugged into small form-factor PCs like Intel's NUC, or into all-in-one systems with decent displays, categories that are expanding in their utility, particularly in enterprise and other global markets.

External graphics adapters are not something new. Back in 2007/2008, AMD introduced its external graphics port (XGP) technology, code-named Lasso. AMD’s XGP allowed a graphics card to be connected to a PC over a PCIe 2.0 x8 or x16 interface, which guaranteed a sufficient amount of bandwidth for the time. Unfortunately, XGP relied on a sophisticated proprietary connector that was made by only one company and was rather expensive. As a result, it never took off.

ASUS, Alienware, MSI and some other companies have also introduced external graphics solutions for their mobile PCs over the past decade. However, their GPU boxes were either proprietary or had performance limitations. For example, ASUS’ first-generation XG Station contained an NVIDIA GeForce 8600 GT GPU and used the ExpressCard interface, providing rather low performance even for 2007. More recently, Alienware and MSI introduced proprietary external graphics solutions compatible only with their own laptops. These boxes relied on an external PCIe x4 interface and were compatible with rather powerful video cards (thanks to the fact that they featured their own PSUs), and hence could bring serious performance improvements. Unfortunately, both Alienware’s Graphics Amplifier and MSI’s Gaming Dock were only compatible with select laptops made by these two companies.

At CES 2016, several companies introduced new external GPU boxes that can accommodate high-end graphics adapters. At least one of the solutions uses the Thunderbolt 3 interface, with a transfer rate of up to 40 Gb/s, and is compatible with a wide variety of PCs. Others continue to be proprietary, but use more modern connectors such as USB Type-C. Since all of them can offer desktop-class graphics performance, they can help build gaming systems around ultra-thin notebooks, AIOs or SFF desktops.

ASUS was among the first hardware makers to offer external graphics solutions for laptops in the mid-2000s, and at CES 2016 it demonstrated its all-new XG Station 2, which is designed for the company’s upcoming notebooks. The XG Station 2 is compatible with any ASUS video card based on an AMD Radeon or NVIDIA GeForce GPU that consumes up to 350 W of power, which means that you can install almost any board into this dock. The XG Station 2 uses two USB Type-C cables that support up to 32 Gb/s of transfer rate (equivalent to PCIe 3.0 x4), but relies on a proprietary architecture. The external GPU kit from ASUS seems to be a powerful solution, and it even allows games rendered on the external video card to be displayed on the laptop’s own screen. However, since it is a proprietary technology, it will not be compatible with non-ASUS systems.
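As a rough sanity check on that bandwidth equivalence, here is a minimal back-of-the-envelope sketch; it only accounts for PCIe 3.0's 128b/130b line encoding and ignores other protocol overhead, so the figures are approximate:

```python
# Rough, illustrative comparison of the external GPU link rates mentioned above.
# PCIe 3.0 runs each lane at 8 GT/s with 128b/130b encoding; protocol overhead
# beyond the encoding is ignored, so these are ballpark figures only.

LANE_RATE_GTPS = 8.0
ENCODING_EFFICIENCY = 128 / 130

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Approximate usable PCIe 3.0 bandwidth in Gb/s for a given lane count."""
    return lanes * LANE_RATE_GTPS * ENCODING_EFFICIENCY

print(f"PCIe 3.0 x4  : {pcie3_bandwidth_gbps(4):.1f} Gb/s")   # ~31.5 Gb/s, i.e. the ~32 Gb/s ASUS quotes
print(f"PCIe 3.0 x16 : {pcie3_bandwidth_gbps(16):.1f} Gb/s")  # ~126 Gb/s, a full desktop slot for comparison
print("Thunderbolt 3: 40.0 Gb/s (raw link rate used elsewhere in this roundup)")
```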

The manufacturer did not reveal a lot of information about its plans for the XG Station 2 and compatible laptops. Hence, it is unknown how competitive ASUS’ external graphics solutions will be. Nonetheless, it is good to know that the world’s largest supplier of gaming laptops intends to offer external GPUs as an option.

MSI already offers Gaming Docks for select laptops. At CES 2016, the company demonstrated an external graphics solution for its all-in-one gaming PCs. The external GPU dock for AIO systems is compatible only with the company’s 27XT 6QE and only with NVIDIA graphics cards, but it uses a PCI Express interconnection and can house almost any contemporary GeForce (with the exception of dual-GPU cards and some non-reference boards). The solution looks close to commercial release, and it will no doubt become a key selling point of MSI’s gaming AIOs this year.

The form factor of the dock is tailored for all-in-one PCs; hence, it cannot easily be connected to laptops or SFF systems. Moreover, since the implementation of PCI Express and GPU power delivery is clearly proprietary, MSI’s external graphics boxes will only be compatible with its own AIOs. Keeping in mind that MSI needs to sell systems to gamers, even a proprietary external GPU box makes a great deal of sense for the company, as it allows MSI to offer almost any video card with its AIOs, unlike other PC makers.

Razer, which is mostly known for its peripherals and gaming laptops, decided not to use any proprietary technologies with its new Razer Stealth ultrabook and Razer Core external GPU dock. Everything is based on industry-standard components and hence the Core can be connected to almost any system with Thunderbolt 3.

Unlike ASUS and MSI, which are still finalizing their new external GPU technologies, Razer is already taking pre-orders for the Stealth notebook. The Stealth laptop is just 0.52 inches thick, but it features a 12.5-inch IGZO display with a 2560x1440 or even 3840x2160 resolution, as well as an Intel Core i7-6500U processor. The system can be equipped with up to 8 GB of LPDDR3-1866 memory, up to 512 GB of PCIe SSD storage, 802.11ac Wi-Fi, a built-in webcam and so on. The laptop starts at $999, which is comparable to other ultrabooks.

The Razer Core is an external GPU box that connects to personal computers using the Thunderbolt 3 interface, with a transfer rate of up to 40 Gb/s (the appropriate cable is included). The enclosure features its own 500 W PSU and is compatible with all graphics cards with up to a 375 W TDP. The GPU box also features four additional USB 3.0 ports as well as a Gigabit Ethernet controller.

Razer does not sell its Core GPU box just yet; hence, pricing is unknown. Nonetheless, its reliance on Thunderbolt 3 technology and its compatibility with a variety of laptops and SFF PCs make it a very interesting product. If the company decides not to artificially limit the Core’s compatibility with third-party PCs, the external GPU box could become rather popular among owners of notebooks and SFF PCs with Thunderbolt 3 connectivity.

General industry trends show that modern PCs are becoming smaller and sleeker, while high-end graphics adapters remain rather large and power hungry. As a result, external graphics solutions for mobile and small form-factor personal computers are just what the doctor ordered. However, proprietary solutions are not always good, especially when it comes to systems from smaller suppliers or the desire to be truly universal; that said, locking a user into a particular ecosystem might guarantee future sales. Thunderbolt 3-based external GPU boxes look very promising because they combine relatively high transfer rates with simplicity and industry-standard cables (which means relatively affordable pricing).

In fact, after seeing Razer’s Core, it seems pretty clear that after a decade in development, external graphics is finally on its way to being done right.

Comments

  • JonnyDough - Wednesday, January 27, 2016 - link

    "With these things in mind, it does make sense that Samsung is pushing in a different direction. When looking at the TV market, I don’t see OLED as becoming a complete successor to LCD, while I do expect it to do so in the mobile space. TVs often have static parts of the interface, and issues like burn in and emitter aging will be difficult to control."

    Wouldn't that be the opposite? Phones and tablets are often used in uncontrolled environments, and have lock screens and apps that create static impressions on a display as much as any TV in my opinion. I think OLEDs could definitely penetrate the television market, and I think as a result they will trickle over into other markets as costs come down. Unless a truly viable alternative to OLEDs can overtake these spaces, I think that continual refinements in OLED help it prove to be a constantly used and somewhat static technology. Robots are moving more and more towards organics as well - so it would make sense that in the future we borrow more and more from nature as we come to understand it.
  • Brandon Chester - Wednesday, January 27, 2016 - link

    Relative to TVs you keep your phone screen on for a comparatively short period of time. Aging is actually less of an issue in the mobile market. Aging is the bigger issue with TV adoption, with burn in being a secondary thing which could become a larger problem with the adoption of TV boxes that get left on with a very static UI.
  • JonnyDough - Thursday, January 28, 2016 - link

    You brought up some good points. I wonder though how many people have a phablet and watch Netflix or HBO now when on the road in a hotel bed.
  • Kristian Vättö - Thursday, January 28, 2016 - link

    I would say the even bigger factor is the fact that TV upgrade cycles are much longer than smartphones. While the average smartphone upgrade cycle is now 2-2.5 years, most people keep their TVs for much longer than that, and expect them to function properly.
  • Mangemongen - Tuesday, February 2, 2016 - link

    I'm writing this on my 2008, possibly 2010 Panasonic plasma TV which shows static images for hours every day, and I have seen no permanent burn in. There is merely some slight temporary burn in. Is OLED worse than modern plasmas?
  • JonnyDough - Wednesday, January 27, 2016 - link

    What we need are monitors that have a built in GPU slot, since AMD is already helping them to enable other technologies, why not that? Swappable GPUs on a monitor, the monitors already have a PSU built in so why not? Put a more powerful swappable PSU with the monitor, a mobile like GPU, and voila. External plug and play graphics.
  • Klug4Pres - Wednesday, January 27, 2016 - link

    "The quality of laptops released at CES were clearly a step ahead of what they have been in the past. In the past quality was secondary to quantity, but with the drop in volume, everyone has had to step up their game."

    I don't really agree with this. Yes, we have seen some better screens at the premium end, but still in the sub-optimal 16:9 aspect ratio, a format that arrived in laptops mainly just to shave a few bucks off cost.

    Everywhere we are seeing quality control issues, poor driver quality, woeful thermal dissipation, a pointless pursuit of ever thinner designs at the expense of keyboard quality, battery life, speaker volume etc., and a move to unmaintainable soldered CPUs and RAM.

    Prices are low, quality is low, volumes are getting lower. Of course, technology advances in some areas have led to improvements, e.g. Intel's focus on idle power consumption that culminated in Haswell battery-life gains.
  • rabidpeach - Wednesday, January 27, 2016 - link

    yea? 16k per eye? is that real, or did he make up numbers to give Radeon something to shoot for in the future?
  • boeush - Wednesday, January 27, 2016 - link

    There are ~6 million cones (color photoreceptors) per human eye. Each cone perceives only the R, G, or B portion (roughly speaking), making for roughly 2 megapixels per eye. Well, there's much lower resolution in R, so let's say 4 megapixels to be generous.

    That means 4k, spanning the visual field, already exceeds human specs by a factor of 2, at first blush. Going from 4k to 16k boosts pixel count by a factor of 16, so we end up exceeding the human photoreceptor count by a factor of 32 (see the quick sketch after this comment)!

    But there's a catch. First, human acuity exceeds the limit of color vision, because we have 20x more rods (monochromatic receptors) than cones, which provide very fine edge and texture information over which the color data from the cones is kind of smeared or interpolated by the brain. Secondly, most photoreceptors are clustered around the fovea, giving very high angular resolution over a small portion of the visual field - but we are able to rapidly move our eyeballs around (saccades), integrating and interpolating the data to stitch and synthesize together a more detailed view than would be expected from a static analysis of the optics.

    In light of all of which, perhaps 16k uniformly covering the entire visual field isn't such overkill after all if the goal is the absolute maximum possible visual fidelity.

    Of course, running 16k for each eye at 90+ Hz (never even mind higher framerates) would take a hell of a lot of hardware and power, even by 2020 standards. Not to mention, absolute best visual fidelity would require more detailed geometry, and more accurate physics of light, up to full-blown real-time ray-tracing with detailed materials, caustics, global illumination, and many bounces per ray - something that would require a genuine supercomputer to pull off at the necessary framerates, even given today's state of the art.

    So ultimately, it's all about diminishing returns, low-hanging fruit, good-enough designs, and balancing costs against benefits. In light of which, 16k VR is probably impractical for the foreseeable future (meaning, the next couple of decades)... Personally, I'd just be happy with a 4k virtual screen, spanning let's say 80% of my visual field, and kept static in real space via accelerometer-based head-tracking (to address motion sickness) with an option to intentionally reposition it when desired - then I wouldn't need any monitors any longer, and would be able to carry my high-res screen with/on me everywhere I go...
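For reference, the pixel arithmetic in the comment above can be checked with a quick sketch; it assumes standard 3840x2160 "4k" and 15360x8640 "16k" frame sizes and the commenter's generous ~4 MP effective cone count per eye:

```python
# Back-of-the-envelope check of the pixel counts discussed in the comment above.
cones_per_eye = 4_000_000        # generous effective count assumed by the commenter

pixels_4k = 3840 * 2160          # ~8.3 MP, assuming a standard UHD "4k" frame
pixels_16k = 15360 * 8640        # ~132.7 MP, i.e. 4x the linear resolution of 4k

print(f"4k vs cones  : {pixels_4k / cones_per_eye:.1f}x")   # ~2.1x -> "exceeds human specs by a factor of 2"
print(f"16k vs 4k    : {pixels_16k / pixels_4k:.0f}x")      # 16x   -> "a factor of 16" more pixels
print(f"16k vs cones : {pixels_16k / cones_per_eye:.0f}x")  # ~33x  -> roughly the "factor of 32" cited
```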
  • BMNify - Wednesday, January 27, 2016 - link

    "AMD's Raja Koduri stating that true VR requires 16K per eye at 240 Hz."

    well, according to BBC R&D, scientific investigations found the optimum to be close to the 300 fps we were recommending back in 2008, and they prove that higher frame rates dramatically reduce motion blur, which can be particularly disturbing on large modern displays.

    it seems optimal to just use the official UHD2 (8k) spec with multi surround sound and the higher real frame rates of 100/150/200/250 fps for high-action content, as per the existing BBC/NHK papers... no real need to define UHD3 (16k) for near-eye/direct retina displays
