So there seems to be a 6 GHz upper limit on processor clock speed that we haven't been able to break through since clear back in 2004. How did I miss that? As a result, we've moved to doubling down, so to speak, with dual- or quad-core machines, but only a handful of applications actually take advantage of that by offloading their work to multiple cores.
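For what it's worth, here's a minimal sketch of what "offloading to multiple cores" looks like in practice, using Python's standard multiprocessing module; the work() function and its inputs are just hypothetical placeholders, not anything from a real application:

```python
# A minimal sketch (not from the original post) of farming CPU-bound work
# out to multiple cores with Python's standard multiprocessing module.
# The work() function and its inputs are hypothetical placeholders.
from multiprocessing import Pool, cpu_count

def work(n):
    # Stand-in for one CPU-bound chunk, e.g. one bootstrap replicate.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8                 # eight independent chunks of work
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(work, chunks)     # chunks run on separate cores
    print(f"{len(results)} chunks finished using {cpu_count()} cores")
```

The point is that the programmer has to split the work up explicitly; a single-threaded program gets no benefit from the extra cores.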
This seems like it presages a day of the designer, in which software engineers will have to be even more lean and mean than usual. As a side note, Stata (my favorite stats package ever) offers licenses that take advantage of multiple cores, but they want to charge extra for it. Who wouldn't, right? It also presages a day of computing really starting to suck. We're supposed to double capacity every 18 months. I'm looking at buying a new MacBook Pro with a dual-core processor that's pretty similar to the one I have now, which is almost 3 years old. Am I missing some nuance here?
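To put numbers on that 18-month expectation, here's a quick back-of-the-envelope check (the 36-month figure is just the rough age of my current machine, nothing official):

```python
# Rough check of the "double capacity every 18 months" expectation
# against a machine that's about 3 years (36 months) old.
months_owned = 36
doublings = months_owned / 18            # one doubling per 18 months
expected_factor = 2 ** doublings
print(f"Expected capacity factor after {months_owned} months: {expected_factor:.1f}x")
# -> 4.0x, versus a new machine that looks roughly comparable to the old one
```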
1 comment:
Part of the issue is economics. A 6 GHz part requires more precise fabrication than a 2 GHz part, and as I understand it, fabrication technologies start to run up against physical limitations. As it is, several defective processors come off the line for every functional one that ends up in a computer.
Economics comes into play in other ways as well. Many computer manufacturers are more interested in the consumer sector than in the "big iron" kinds of data processing, so the focus of the last few years seems to have been on startup times and power consumption (especially for tablets, laptops, and cell phones).
I think over time we'll see more programs take advantage of multiple cores, but they may be niche compared to where computing as a whole will be in the future.