We haven't covered every significant departure from AMD here, but there have been many. Carrell Killebrew and Eric Demers were among the earliest high-profile AMDers to leave. More recently, John Bruno, the AMD system architect responsible for the Trinity platform, left the company and joined Apple.

The talent exodus from AMD isn't entirely unexpected. The new executive team is responsible for cutting costs and implementing the company's new direction. Significant actions don't always go over smoothly, reorganizations can create unpopular hierarchies, and there's bound to be disagreement on the right way to do things. Nothing ever stays the same; a situation either gets better or worse. For AMD to improve its standing in the market, change is necessary.

Today, for the first time in quite a while, a bit of good employment news comes out of AMD. Jim Keller, lead architect on AMD's K8 processor (the original Athlon 64), system engineer on K7 (the original Athlon), and co-author of the x86-64 spec, has returned to AMD. Jim's official title is Corporate Vice President and Chief Architect for CPU Cores, and he will report directly to Mark Papermaster.

Keller first joined AMD in 1998 after leaving DEC, where he had worked on the Alpha 21164 and 21264. Upon his arrival at AMD, Keller immediately went to work on launching the K7. His next tasks were co-authoring the HyperTransport and 64-bit x86 specifications, in addition to working on the K8 architecture. It was the K8 architecture that gave AMD a significant performance (and power) advantage over Intel, which was shipping the Pentium 4 at the time. After his work on K8, Keller left AMD in 1999 and went to work at SiByte (and later Broadcom, after it acquired SiByte).

The stint at SiByte/Broadcom was longer than his time at AMD; in 2004, Keller joined PA Semi as its Vice President of Engineering. Apple acquired PA Semi in 2008, and Keller went with the team.

32nm A5 in iPad 2,4 (Source: Chipworks)

While at Apple, Keller worked on the design teams for the A4 and A5 SoCs. He was also in charge of defining the specifications for two generations of the MacBook Air. The late Steve Jobs felt it prudent to have the best graphics and chip architects around, as they could define performance needs and come up with system requirements better than anyone else. Raja Koduri, former AMD graphics CTO, was Keller's graphics counterpart at Apple, where Koduri remains today. Indeed, the impact of Keller, Koduri, and others like them was best felt in how Apple always seemed to integrate the right silicon at the right time.

In his new job at AMD, Keller will likely face the difficult task of getting AMD's x86 performance back on track. Many are already working on this problem, but Jim's addition should be a welcome one. At a high level, Keller's product track record looks excellent. Let's hope he can pull another Athlon out of his hat the second time around at AMD.

Comments

  • lbeyak - Wednesday, August 1, 2012 - link

    I don't know what you mean by "affects the future or not", but I don't see what graduate studies has to do with being a successful/driven person.

    I did Graduate studies myself because I thought it would be fun and interesting.
    Sure, maybe it means they are interested in things, and so will be more willing to put in effort to do something worthwhile.

    Of course, there are several high school / university drop-outs that became highly successful.
  • pavlindrom - Wednesday, August 1, 2012 - link

    What I meant was whether he became a high-profile person with some aid from graduate school; as in, an employer placed him in a challenging position because of the degree, and he succeeded from there.
    What I'm trying to understand is whether work ethic and intelligence are all that's necessary to succeed. I'm wondering whether there is anything to be gained for me from a graduate degree.
  • name99 - Wednesday, August 1, 2012 - link

    If a field is VERY NEW then you can get away with no graduate studies --- eg Bill Gates, Steve Jobs, probably some of the early CPU architects.

    Once a field has been around for ten years or more, there is an accumulated body of knowledge about how best to do things. Once that arises, you'd be foolish to assume that you can avoid grad school --- at that point you're essentially saying that you are so smart you have nothing to learn from the accumulated experiences of thousands of individuals in a wide variety of circumstances. This may even be true --- but good luck proving it to anyone.

    There MAY be a few fields like this today (for example I suspect if you can come up with something truly innovative in Augmented Reality you could get away without grad school), but the number of fields like this is always very limited, and fields of this nature fifteen years ago (eg a lot of network coding, coding for huge clusters, coding for phones, video game coding) have long become professionalized.
  • Oxford Guy - Thursday, August 2, 2012 - link

    Jobs got into the industry thanks to the technical ability of Wozniak. Without Woz, Jobs' business skills would have had to look elsewhere.

    Gates is a similar case. He didn't succeed because of his technical ability. He was a businessman.

    Businessmen don't necessarily need graduate degrees.
  • medi01 - Monday, August 6, 2012 - link

    You're right on Jobs, wrong on Gates.

    Jobs was a businessman only, but Gates could (and did) code (a lot in the early days).
  • KoolAidMan1 - Monday, August 6, 2012 - link

    DOS was bought by Gates from another engineer. His coding was minimal, and one of the few things with his name on it was this game, notorious for being bad: http://en.wikipedia.org/wiki/DONKEY.BAS

    Gates was a businessman first and foremost. He wasn't much of an engineer and ironically he wasn't half the technologist that the non-engineer Jobs was.
  • mevans336 - Friday, August 3, 2012 - link

    And neither do technical people.

    If you want to become a true engineer, you need grad school.

    If you want to specialize in Cisco, Windows, VMware, or databases, your best option is to avoid grad school (and often college altogether, if you start early enough) ... because experience trumps all.
  • Coilaman - Wednesday, August 1, 2012 - link

    I was recently building a budget PC for graphic design, web design, and video editing, and I have to tell you, I decided on a Llano-era socket FM1 processor, the Athlon II X4 631. It was released in August 2011. Why did I buy it?

    I got this processor for $81, and it's a very powerful quad-core beast. It easily beats any Intel Core 2 Quad or even first-generation Core i3 processor, and it grinds through any video editing or big Photoshop files with great ease. Combined with a $100 graphics card, you'll be able to work very efficiently and even game quite well for some time to come.

    Like I said, it's a beast, and I highly recommend AMD processors because of their incredible value for the dollar, especially in the low-end market.
  • KitsuneKnight - Thursday, August 2, 2012 - link

    The low end is really the only segment where AMD is competitive at all with Intel, and even then, things aren't exactly going well for them. AMD really does need to step up its game... at this point, ARM is quickly growing into a bigger direct competitor to Intel than AMD is.
  • Belard - Friday, August 3, 2012 - link

    AMD's socket standards are messed up. The FX series is sometimes slower than the previous generation... the "8-core" AMD is usually slower than Intel's mid-range quads.

    Socket FM1 is dead. Once the inventory is gone, you'll see FM2 for general users. Meanwhile, companies like HP are already making and shipping FM2 systems. No new CPU models are coming for FM1.

    The current socket AM3+ is outdated; the latest chipset is mostly a renamed version of the previous one, certified to run the new FX CPUs. The FM1/FM2 chipset has native USB 3.0 support. The FM1 and FM2 sockets are incompatible, even though they have the same number of pins and the same socket style: no FM2 CPU will fit an FM1 motherboard. But the other kicker? It's the same exact northbridge used on both!

    There is no motherboard support for PCIe 3.0 on AMD systems any time soon... we're talking 2013-14?! Never mind that AMD makes PCIe 3.0 graphics cards.

    For a low-end, low-cost, entry-level computer, AMD is fine... I guess. But it's not something I would use for myself, and I would not sell it to my clients. During the P4 era, all I built were AMD systems. 2012 is the first year I haven't built any AMD-powered systems.

    AMD has serious issues, and making Pentium 4 2.0 was a stupid, stupid thing to do.
    They really should have just die-shrunk what they already had; it would have been cheaper.
