24 Comments
lecaf - Wednesday, August 1, 2012 - link
The K8 was released in 2003, as far as I remember (I used to own one). From the article:
"Keller first joined AMD in 1998" and "Keller left AMD in 1999"
How can he be the father of the K8? Something seems wrong, but I can't say what.
DanNeely - Wednesday, August 1, 2012 - link
There are several years between when a chip is designed and when it finally gets to market.
Zoomer - Thursday, August 2, 2012 - link
Also remember that he works on the architecture. That's a little higher level, and a ton of work goes on after that towards custom implementations.
morfinx - Wednesday, August 1, 2012 - link
It takes about 5 years from conception to market for a new CPU architecture.
romesh - Wednesday, August 1, 2012 - link
It takes about 1 year to complete the design of a processor architecture like the K8.
name99 - Wednesday, August 1, 2012 - link
I have no idea what "complete the design" means, but morfinx is essentially correct. The number is substantially larger for x86 designs than other designs because they are so complex.
From initial conception to ship, Nehalem took 7 years. This number is not secret. There is a talk at Stanford on iTunesU about the design process and how long it took.
romesh - Thursday, August 2, 2012 - link
2003 - 1998 = 5 years
ChipAnalyst - Wednesday, August 1, 2012 - link
lecaf is right. Jim left AMD before the K8 microarchitecture was finalized. Fred Weber took over the K8 program and led the final design.
JNo - Thursday, August 2, 2012 - link
Maybe, but although the article praises the K8 in particular, the K7 original Athlon architecture kicked Intel's butt for many years, as I seem to remember. Both MHz for MHz and $ for $.
pavlindrom - Wednesday, August 1, 2012 - link
One thing that makes me wonder: did he go to graduate studies or not? I am interested to see whether that affects the future much.
lbeyak - Wednesday, August 1, 2012 - link
I don't know what you mean by "affects the future or not", but I don't see what graduate studies has to do with being a successful/driven person.
I did graduate studies myself because I thought it would be fun and interesting.
Sure, maybe it means they are interested in things, and so will be more willing to put in effort to do something worthwhile.
Of course, there are several high school / university drop-outs that became highly successful.
pavlindrom - Wednesday, August 1, 2012 - link
What I meant was whether he became a high-profile person with some aid from graduate school; as in, an employer placed him in a challenging position because he has the degree, and he succeeded from there.
What I am trying to understand is whether work ethic and intelligence are all that is necessary to succeed. I am wondering if there is anything to be gained for me with a graduate degree.
name99 - Wednesday, August 1, 2012 - link
If a field is VERY NEW then you can get away with no graduate studies --- e.g. Bill Gates, Steve Jobs, probably some of the early CPU architects.
Once a field has been around for ten years or more, there is an accumulated body of knowledge about how best to do things. Once that arises, you'd be foolish to assume that you can avoid grad school --- at that point you're essentially saying that you are so smart you have nothing to learn from the accumulated experiences of thousands of individuals in a wide variety of circumstances. That may even be true --- but good luck proving it to anyone.
There MAY be a few fields like this today (for example, I suspect that if you can come up with something truly innovative in Augmented Reality you could get away without grad school), but the number of fields like this is always very limited, and fields of this nature fifteen years ago (e.g. a lot of network coding, coding for huge clusters, coding for phones, video game coding) have long since become professionalized.
Oxford Guy - Thursday, August 2, 2012 - link
Jobs got into the industry thanks to the technical ability of Wozniak. Without Woz, Jobs' business skills would have had to look elsewhere.
Gates is a similar case. He didn't succeed because of his technical ability. He was a businessman.
Businessmen don't necessarily need graduate degrees.
medi01 - Monday, August 6, 2012 - link
You're right on Jobs, wrong on Gates.
Jobs was a businessman only, but Gates could (and did) code (a lot in the early days).
KoolAidMan1 - Monday, August 6, 2012 - link
DOS was bought by Gates from another engineer. His coding was minimal, and one of the few things he had his name on was this game notorious for being bad: http://en.wikipedia.org/wiki/DONKEY.BAS
Gates was a businessman first and foremost. He wasn't much of an engineer, and ironically he wasn't half the technologist that the non-engineer Jobs was.
mevans336 - Friday, August 3, 2012 - link
And neither do technical people.
If you want to become a true engineer, you need grad school.
If you want to specialize in Cisco, Windows, VMware, or databases, your best option is to avoid grad school (and often college altogether, if you start early enough)... because experience trumps all.
Coilaman - Wednesday, August 1, 2012 - link
I was recently building a budget PC for graphic design, web design, and video editing, and I have to tell you, I decided on a Llano-based Socket FM1 Athlon II X4 631 processor. It was released in August 2011. Why did I buy it?
I got this processor for $81 and it's a very powerful quad-core beast. It easily beats any Intel Core 2 Quad or even first-generation Core i3 processors, and it just grinds through any video editing or big Photoshop files with great ease. Combined with a $100 graphics card, you'll be able to work very efficiently and even game quite well for some time to come.
Like I said, it's a beast, and I highly recommend AMD processors because of their incredible value for the dollar, especially in the low-end market.
KitsuneKnight - Thursday, August 2, 2012 - link
The low end is really the only segment where AMD is competitive at all with Intel, and even then, things aren't exactly going well for them. AMD really does need to step up their game... at this point, ARM is quickly growing to be a bigger direct competitor to Intel than AMD is.
Belard - Friday, August 3, 2012 - link
AMD's socket standards are messed up. The FX series is sometimes slower than the previous tech... the "8-core" AMD is usually slower than Intel's mid-range quads.
Socket FM1 is dead. Once the inventory is gone, you'll see FM2 for general users. Meanwhile, companies like HP are already making and shipping FM2 systems. No new CPU models for FM1.
The current Socket AM3+ is outdated; the latest chipset is mostly a renamed previous chipset that is certified to use the new FX CPUs. The FM1/FM2 chipset has native USB 3.0 support. The FM1 and FM2 sockets are incompatible, even though they have the same number of pins and socket style. NO FM2 CPU will fit an FM1 motherboard. But the other kicker? It's the exact same northbridge used on both!
There is no motherboard support for PCIe 3.0 on AMD systems any time soon... we're talking 2013-14?! Never mind that AMD makes PCIe 3.0 graphics cards.
For a low-end, low-cost, entry-level computer, AMD is fine... I guess. But it's not something I would use myself, and I would not sell it to my clients. During the P4 era, all I built were AMD systems. 2012 is the first year I didn't build any AMD-powered systems.
AMD has serious issues, and making a Pentium 4 2.0 was a stupid, stupid thing to do.
They really should have just die-shrunk what they already had; it would have been cheaper.
OBLAMA2009 - Friday, August 3, 2012 - link
Even if they came out with a good chip, the desktop market is rapidly disappearing.
Belard - Friday, August 3, 2012 - link
The funny or sad thing about PA Semi is that no more products will likely come from them; nothing new, anyway. What they make/made were low-powered, PowerPC-compatible CPUs, military/space grade.
What is sad/funny is that the Amiga community WENT with THAT as their CPU platform... it took them years to come up with a custom board with PCIe, PCI, and Amiga slots. The system board is unique... for sure. And they consider the 1.8GHz low-watt CPU to be a powerhouse? The computer costs about $1500+. It's been a hobby OS for 10+ years, running on hardware with no future, again.
It's amazing that what is left of the Amiga community hasn't just gone Linux. It's the closest thing to Amiga they could use (Amiga is still a *nix-based OS).
iwod - Monday, September 10, 2012 - link
Well, maybe a question for Jim: did Apple *really* design the A4 or A5? Or, as the internet rumours have it, are those chips just Samsung Exynos? I think that is not true, since Samsung doesn't use PowerVR.
If so, what exactly has PA Semi contributed to Apple? And Intrinsity? After all, there don't seem to be groundbreaking things in Apple's SoCs.
name99 - Friday, February 21, 2020 - link
Well, this ^^^ comment certainly aged well :-)