NVIDIA CES 2015 Press Conference Liveblog
by Ryan Smith & Joshua Ho on January 4, 2015 10:57 PM EST
12:36AM EST - And we're done here, more later
12:36AM EST - Wrap-up summary: Tegra X1, Drive CX, Drive PX
12:35AM EST - Tegra X1 for in-car graphics (Drive CX), Tegra X1 for in-car compute (Drive PX)
12:34AM EST - This Drive PX unit has never been trained against the NV parking garage. It did all of this on the first shot
12:33AM EST - Now parking
12:33AM EST - Auto valet finally found a spot
12:30AM EST - Using the camera data to model the environment and then do path finding in it
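NVIDIA didn't detail the planner, but "model the environment, then do path finding in it" amounts to building an occupancy grid from the camera data and searching that grid for a route. A minimal, hypothetical sketch of the idea in Python (a hand-made toy grid and plain breadth-first search, not NVIDIA's actual algorithm):

```python
from collections import deque

# Hypothetical occupancy grid built from camera data: 0 = free, 1 = obstacle.
grid = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

def find_path(grid, start, goal):
    """Breadth-first search over the grid; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

print(find_path(grid, (0, 0), (4, 4)))    # route around the obstacles
```

A real planner would work on a far finer grid, account for the car's geometry, and use something smarter than BFS, but the shape of the problem is the same.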
12:28AM EST - Running Drive PX against the simulation
12:25AM EST - Using sim to show how Drive PX auto-valet works
12:24AM EST - NVIDIA created a simulation of their parking garage
12:22AM EST - More on Drive PX: Surround Vision
12:22AM EST - And NVIDIA wants to supply the hardware and parts of the software
12:21AM EST - Audi bullish on self-driving cars
12:19AM EST - Audi now going on a multi-state tour with their self-driving car
12:18AM EST - Now discussing Audi's self-driving concept car
12:16AM EST - Audi Prologue: Audi wants to go all digital in the cockpit
12:13AM EST - Discussing Audi's pioneering use of Tegra in their cars
12:10AM EST - Audi is a repeat partner of NVIDIA. Have been at previous NVIDIA events
12:10AM EST - Now on stage Ricky Hudi of Audi. Exec VP of Electronics Development
12:07AM EST - Tegra X1 neural net classification performance is more than doubled over TK1
12:07AM EST - Closing the circle: send classification results back to the supercomputer to correct improperly identified objects
12:06AM EST - Drive PX receives the finished neural net and uses it for classification
12:05AM EST - Training is GPU-time intensive, so it occurs on Tesla supercomputers
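Putting the last few entries together, the loop being described is: train the network on the Tesla supercomputers, ship the finished network to Drive PX for in-car classification, and feed the misclassifications back for the next round of training. A hypothetical Python sketch of that cycle (the classes and data here are stand-ins, not NVIDIA's pipeline):

```python
class Datacenter:
    """Stand-in for the Tesla GPU cluster: 'training' here is just memorizing labels."""
    def train(self, labeled_data):
        return dict(labeled_data)          # real training is the GPU-time-intensive part

class DrivePX:
    """Stand-in for the in-car box: runs the finished net, never trains it."""
    def deploy(self, model):
        self.model = model
    def classify(self, obj):
        return self.model.get(obj, "unknown")

datacenter, car = Datacenter(), DrivePX()
labeled_data = [("sedan", "car"), ("pickup", "truck")]      # curated training set
road_objects = ["sedan", "ambulance", "speed camera"]       # what the car encounters

for round_num in range(2):
    car.deploy(datacenter.train(labeled_data))               # train, then ship to the car
    mistakes = [o for o in road_objects if car.classify(o) == "unknown"]
    print(f"round {round_num}: unrecognized {mistakes}")
    labeled_data += [(o, o) for o in mistakes]               # corrections close the circle
```

The point of the split is that the car only ever runs inference; the expensive training stays in the datacenter.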
12:03AM EST - (Outside NV HQ, so it was staged)
12:02AM EST - Cameras at least properly identified the police car behind them
12:02AM EST - NVIDIA car got pulled over by the police
12:01AM EST - How to Train Your Computer
11:59PM EST - Identifying cars, trucks, vans, etc
11:58PM EST - Still on Drive PX computer vision demo. New scene: Vegas
11:56PM EST - Drive PX can also identify speed cameras
11:55PM EST - Processing is in monochrome, though the video is color for human benefit
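For the curious, collapsing a color frame to the single luminance channel a vision pipeline typically works on is a one-liner. This sketch uses the standard Rec. 601 weights; NVIDIA hasn't said exactly what Drive PX does internally:

```python
import numpy as np

def to_luma(frame_rgb):
    """Collapse an RGB frame to a single luminance channel (Rec. 601 weights)."""
    return frame_rgb @ np.array([0.299, 0.587, 0.114])

frame = np.random.randint(0, 256, (1080, 1920, 3))   # one fake 1080p color frame
print(to_luma(frame).shape)                           # (1080, 1920) - monochrome
```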
11:54PM EST - Drive PX is IDing signs, pedestrians, traffic lights. Can even pick out partially occluded pedestrians
11:53PM EST - Clarification: video is recorded, Drive PX processing is being done live
11:51PM EST - Neural networks in a nutshell: throw a ton of data at a network and let it figure out how to organize it to recognize it in the future
11:51PM EST - Now showing a demo of how a recently trained Drive PX sees the world
11:50PM EST - Neural networks to power car image recognition
11:48PM EST - Neural networks, continued
11:45PM EST - Neural network tech is still fairly new, but it's getting better
11:42PM EST - Now a brief overview of how neural networks work and how they can be trained on GPUs and then executed on GPUs
11:41PM EST - Neural networks and computer vision tend to be good fits for GPUs, so for NVIDIA this is a logical use for their GPU technology
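To make the "nutshell" above concrete, here's a toy version of train-then-classify in plain NumPy: a single-layer logistic "network" on synthetic 2D points, running on the CPU. It's nothing like the deep nets NVIDIA trains on GPUs, but it shows the same idea of fitting to labeled data and then recognizing new inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "dataset": 2D points, class 0 clustered near (0,0), class 1 near (2,2).
X = np.vstack([rng.normal(0, 0.5, (100, 2)), rng.normal(2, 0.5, (100, 2))])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Single-layer "network": weights + bias, trained by gradient descent.
w, b = np.zeros(2), 0.0
for step in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)          # gradient of the cross-entropy loss
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w                        # learning rate 0.5
    b -= 0.5 * grad_b

# "Inference": classify a point the network has never seen.
test_point = np.array([1.8, 2.1])
prob = 1.0 / (1.0 + np.exp(-(test_point @ w + b)))
print(f"P(class 1) = {prob:.3f}")            # should be close to 1.0
```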
11:40PM EST - Analysis taps all the major Tegra X1 components: CPUs, GPUs, and ISPs
11:40PM EST - Use camera data + Drive PX plus software based on deep neural nets to begin understanding the world and build an internal model of it
11:37PM EST - Cameras: 1080p60, x12
11:37PM EST - Based on 2 Tegra X1s, 12 camera inputs, process 1.3GPix/sec
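Quick sanity check on those numbers: twelve 1080p60 feeds work out to roughly 1.5 gigapixels per second of raw input, in the same ballpark as the 1.3 GPix/sec processing figure NVIDIA quotes (NVIDIA hasn't broken down where the difference goes):

```python
cameras = 12
width, height, fps = 1920, 1080, 60          # 1080p60 per camera
raw_rate = cameras * width * height * fps    # pixels per second
print(f"{raw_rate / 1e9:.2f} GPix/sec")      # ~1.49 GPix/sec of raw camera input
```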
11:37PM EST - "Auto-pilot car computer"
11:37PM EST - Second new car platform: Drive PX
11:36PM EST - Self piloting cars? NVIDIA wants to build the in-car computer to enable that
11:34PM EST - Now how to replace radar and ultrasound with vision cameras (in some circumstances)
11:33PM EST - Quick description of how ADAS works: radar, ultrasound, and vision
11:32PM EST - Parking assist, lane change assist, adaptive cruise control, etc
11:31PM EST - More cars, now discussing ADAS - Advanced Driver Assistance Systems
11:30PM EST - Wrapping up Drive CX. NVIDIA sells the whole platform
11:28PM EST - Change the simulated material used for your gauges on the fly
11:28PM EST - More on physically based rendering and how important NVIDIA feels it is
11:24PM EST - Running physically based rendering on the cockpit, just to show that they can
11:23PM EST - Android-powered, so it can be used with Google Maps and other Android apps
11:22PM EST - Now showcasing navigation mode
11:21PM EST - The 3D cockpit looks more flashy than functional, but the concept seems sound
11:19PM EST - Drive Studio is meant to be a complete off-the-shelf digital cockpit solution. NVIDIA is including almost everything one would need
11:18PM EST - Live demo of Drive CX running a virtual cockpit and infotainment center
11:17PM EST - Drive CX uses: navigation, cockpit displays, etc
11:17PM EST - There are connection issues on site, so images are slow to upload
11:16PM EST - Also comes with an NVIDIA software suite called DRIVE Studio
11:16PM EST - Powered by Tegra X1, is a complete digital cockpit computing kit
11:15PM EST - New NVIDIA platform: Drive CX
11:14PM EST - Talking about how "rich displays" in cars mean more displays at higher resolutions; more powerful GPUs are needed to drive them
11:13PM EST - NVIDIA's Tegra automotive business has been a small success amid the greater challenges that have faced Tegra
11:13PM EST - Now for a subject that's a favorite of Jen-Hsun: cars
11:12PM EST - Paper napkin math says that the GPU clockspeed needs to be 1GHz for NVIDIA's GPU performance numbers
11:12PM EST - Confirmed that it's Erista
11:12PM EST - NVIDIA is proclaiming it a 1 TFLOPS GPU, though this is at FP16 as opposed to the more normal FP32 metric for TFLOPS
11:11PM EST - TX1 adds native-ish FP16 support
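Showing the work on that napkin math: with the 256 CUDA cores announced a few entries down, one fused multiply-add (2 FLOPs) per core per clock, and FP16 operations packed two-wide per FP32 lane, a roughly 1GHz GPU clock is what it takes to hit 1 TFLOPS at FP16. The two-wide packing factor is our assumption based on how NVIDIA describes the FP16 support:

```python
cuda_cores = 256           # Maxwell GPU in Tegra X1
flops_per_fma = 2          # a fused multiply-add counts as 2 FLOPs
fp16_per_lane = 2          # assumed: FP16 ops packed two per FP32 CUDA core
target_flops = 1.0e12      # NVIDIA's headline 1 TFLOPS FP16 number

clock_ghz = target_flops / (cuda_cores * flops_per_fma * fp16_per_lane) / 1e9
print(f"required clock ~{clock_ghz:.2f} GHz")   # ~0.98 GHz, hence the 1GHz estimate
```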
11:10PM EST - Clearly not as high quality as the desktop GPU demos, but it still looks impressive
11:09PM EST - This of course already ran on Maxwell desktop GPUs, so it looks like NVIDIA has ported the necessary bits to ARM
11:08PM EST - Yep. Elemental
11:07PM EST - Sounds like we're going to be seeing Unreal Engine Elemental running on TX1
11:07PM EST - Tegra X1 demo, running at roughly 10W
11:07PM EST - Maxwell's energy efficiency means that NVIDIA can alleviate some of the TDP throttling that's unavoidable in a mobile chip
11:05PM EST - Promising much better GPU performance than Tegra K1 at the same power
11:05PM EST - GPU-heavy introduction. The focus is all on the GPU
11:05PM EST - This should be Erista, first added to the NV roadmap at GTC 2014: http://www.anandtech.com/show/7905/nvidia-announces-jetson-tk1-dev-board-adds-erista-to-tegra-roadmap
11:04PM EST - 8 core CPU
11:04PM EST - 256 core Maxwell GPU
11:03PM EST - Announcing Tegra X1
11:03PM EST - Starting things off with Maxwell
11:02PM EST - Starting with a recap of past achievements; Tegra K1, Maxwell, etc
11:02PM EST - Jen-Hsun is now on stage
11:01PM EST - NVIDIA has started promptly at 8pm local time
11:01PM EST - Ryan is on the keys, Josh is on the photos
11:01PM EST - Okay, we're seated and connected
18 Comments
kron123456789 - Monday, January 5, 2015 - link
CES ain't over yet))

vred - Monday, January 5, 2015 - link
but... but... I wanted everything now... :D

boe - Monday, January 5, 2015 - link
Yeah - All I really ever care about from NVidia is what they have for a new flagship video card.

jwcalla - Monday, January 5, 2015 - link
meh

at80eighty - Monday, January 5, 2015 - link
the deep learning is super-exciting, but a blow-by-blow tracking of where you are and what's around - sent to a cloud; privacy is my immediate concern here
Ryan, has anyone fielded a similar question to Nvidia?
Ryan Smith - Monday, January 5, 2015 - link
The talk so far about sending data back has been about neural net identification failures. Position data has not been discussed (and should not be necessary to train the neural nets). However this is still all very high concept; vehicles with this technology are years off, which leaves a lot of time for change.

at80eighty - Monday, January 5, 2015 - link
Ah, thanks Ryan. Given all they were analysing, I figured it was a data subset being covered.

FullHiSpeed - Monday, January 5, 2015 - link
"11:45PM EST - Neural network tech is still fairly new, but its getting better"I got a job at JPL in 1986 to work on Neural-Network-specific massively parallel hardware.
Is 29 year old technology "new" ?
"Neural networks in a nutshell: throw a ton of data at a network and let it figure out how to organize it to recognize it in the future"
Same nutshell as in 1986. Not new.