I’ve known for some time that the semiconductor (computer chip) business has not been the most exciting place, but it still surprised me and bummed me out to see that Intel was laying off 11% of its workforce. There are lots of theories about what is happening in non-cloudy-software-appy tech, but I think fundamentally, the money is being drained out of “physical” tech businesses. The volumes are there, of course. Every gadget has a processor — but that processor doesn’t command as much margin as it once did.
A lot of people suggest that the decline in semiconductors is a result of coming to the end of the Moore’s Law epoch. The processors aren’t getting better as fast as they used to, and some argue (incorrectly) that they’re hardly getting better at all. This explains the decline, because without anything new and compelling on the horizon, people do not upgrade.
But in order for that theory to work, you also have to assume that the demand for computation has leveled off. This, I think, is almost as monumental a shift as Moore’s Law ending. Where are the demanding new applications? In the past we always seemed to want more computer (better graphics, snappier performance, etc.), and now we somewhat suddenly don’t. It’s like computers became amply adequate right about the same time that they stopped getting much better.
Does anybody else find that a little puzzling? Is it coincidence? Did one cause the other, and if so, which way does the causality go?
The situation reminds me a bit of “peak oil,” the early-2000s fear that global oil production would peak and there would be massive economic collapse as a result. Well, we did hit a peak in oil production in the 2008–9 time frame, but it wasn’t from scarcity; it was from low demand in a faltering economy. Since then, production has been climbing again. But with the possibility of electrified transportation tantalizingly close, we may see true peak oil in the years ahead, driven by diminished demand rather than diminished supply.
I am not saying that we have reached “peak computer.” That would be absurd. We are all using more CPU instructions than ever, be it on our phones, in the cloud, or in our soon-to-be-internet-of-thinged everything. But the ever-present pent-up demand for more and better CPU performance from a single device seems to be behind us. Which, for just about anyone alive but the littlest infants, is new and weird.
If someone invents a new activity to do on a computing device that is both popular and ludicrously difficult, that might put us back into the old regime. And given that Moore’s Law is sort of over, that could make for exciting times in Silicon Valley (or maybe Zhongguancun), as future performance will require the application of sweat and creativity, rather than just riding a constant wave. (NB: there was a lot of sweat and creativity required to keep the wave going.)
Should I hold my breath?
I thought games were supposed to be the very activity that was both popular and ludicrously difficult, although maybe it wasn’t really the latter. But I am such a late adopter that you probably should just disregard this comment. But my streak continues!
I appreciate your streak.
I’m not a gamer and know almost nothing about it. I believe that virtual- and augmented-reality are expected to drive a lot of demand in hardware capability. Whether that will just appeal to “gamers” or will become more widespread, I dunno. Gamers also demand very high performance video, and there are video cards available to meet that demand. But though everyone has benefited from that demand, most of us still don’t buy high-end cards. It might be that way for VR, or maybe the pr0n will get everyone hooked.