It’s been a couple of years since I last did a year-end look-back, so if some of the observations to follow extend beyond the January-to-December endpoints, straying back into 2020, I hope you’ll forgive the indulgence.
My initial plan was to brainstorm a top-five list. But then I couldn’t come up with a priority-ranking order that I was happy with; the ordering that follows pretty much just reflects the order in which topics originally streamed out of my noggin. And then I came up with a few more than five. And then I floated the overall write-up topic to my wife, and she came up with a few more. So…
COVID-19’s ongoing supply and demand effects
2020 was the year that we all sheltered in place and saved money, restricting purchases to personally prepared (as well as delivered) food, toilet paper reserves and home office equipment. 2021 was the year that we cautiously began to re-emerge into the world at large…and compensated for the previous year’s thriftiness with material consumption abundance. The consequences of this rebound are already well known to anyone who’s tried to shop for end-of-year holiday presents (for self and/or others), for example.
On the one hand, you’ve got increasing demand for increasingly semiconductor-flush products. On the other, the complex global supply chain from semiconductor fab all the way through to finished systems on retailer shelves is being clobbered by manpower (I mean the term in a non-gender-specific manner) and other constraints, coupled with ongoing worldwide measures intended to tamp down COVID-19 flare-ups. As I write this, for example, Germany is seriously considering following Austria in a return to full lockdown, and various Asian countries key to the semiconductor supply chain are also struggling with infection and hospitalization “spikes.” Toss in abundant new-market demand, specifically from cryptocurrency “miners” hungry for products such as the graphics card shown below, and suffice it to say that being a product planner is about the last job I’d want right now.
A lithography-shrink pause
Were the historical lithography improvement pace to have remained steady this past year-plus, it would have, to at least some degree, bolstered the semiconductor wafer fabrication link in the supply chain mentioned previously, since it’d be possible to squeeze an ever-increasing number of transistors onto each wafer (therefore also die built of those transistors, counterbalanced to a degree by ever-increasing per-die design complexity). But while TSMC’s 5 nm process—notably to date used by Apple and Huawei—is an impressive achievement, it’s been in production for well over a year, with the 4 nm successor still only “coming soon.” As a result, the die sizes of Apple’s latest SoCs are quite hefty. Samsung’s equivalent 5 nm process has only more recently entered production. And Intel is only now meaningfully transitioning to its 10 nm process (which it claims is equivalent to TSMC’s 7 nm…still…) with latest-generation Alder Lake CPUs.
Governments (and customers) get serious about semiconductor investments
If you can’t squeeze more transistors (and the die they in-aggregate assemble) out of a given fab in a given amount of time, but you still need to boost cumulative semiconductor output, your only alternative is to build more fabs…and facilities for assembly…and test…and warehousing…and…etc. Having worked in tech for more than 35 years now, covering it as a journalist for nearly 25, I’ve observed numerous boom-and-bust cycles, especially for DRAM, flash memory and other commodities. So, I understand semiconductor manufacturers’ and foundries’ reluctance to spend billions of dollars on land, buildings to put on that land, and equipment to fill those buildings, only to end up in an inevitable oversupply situation in a year or a few, both because it turns out that their customers had been double-ordering during the prior constraint cycle and because their competitors had built new facilities in response, too.
That said, I of course also empathize with systems companies unable to meet end customer demand because they can’t source historically inexpensive, plentiful display driver ICs (for example), even at gray-market markups (and assuming they’re not counterfeits). Just in the last few days, I’ve read about evaporating smartphone and new-car supply, along with consequent price increases for used-vehicle alternatives. And right now, Tesla is even reportedly shipping customers cars containing unpopulated USB ports! This all explains, for example, Ford’s recent partnership with GlobalFoundries. And, looking beyond individual companies, it explains the tax incentives that the state of Texas is offering Samsung and Texas Instruments to build new fabs there; the same goes for the government of Japan and TSMC, and the U.S. government is making similar investment “noise” (albeit with little accompanying action to date).
The importance of architecture
As growth in the number of transistors that it’s possible to cost-effectively squeeze onto a sliver of silicon continues to slow, what you build out of those transistors becomes increasingly critical. Two years ago, I commended AMD on turning the tables on Intel with the revolutionary Zen architecture, after many years of playing “second fiddle” to its bigger competitor. Events since I wrote those words have borne out that assessment; right now, from what I can tell, AMD is selling everything its foundry partner can churn out, and notably it’s doing so in the high-margin markets of its choosing, leaving Intel to service high-volume but lower unit profit segments.
Some of AMD’s success is due to the company’s “chiplet” packaging innovations, which have enabled it to cost-effectively stitch together multiple die on a unified package substrate to achieve a given aggregate core count, cache amount and (in some cases) embedded graphics capability, versus squeezing everything onto one much larger, lower-yielding sliver of silicon. But the Zen microarchitecture’s fundamental combination of high performance and low power consumption, now in its third-generation implementation, also gets notable credit.
Intel has recently responded with Alder Lake, the microarchitecture underlying its 12th generation Core microprocessor products, which are architecturally innovative in their own right. They mark the first high-volume (albeit not the first outright) x86 blending of high-speed and high-efficiency CPU cores on the same die, delivering power-plus-performance potential that partner Microsoft also needed to invest notable development effort to unlock in software. Still, although Alder Lake has enabled Intel to retake the raw performance crown in all-important market segments such as gaming, at least temporarily, it consumes notably more power than do AMD counterparts in the process. And plenty of other examples of architectural over- and under-achievement exist in market segments beyond the PC, of course.
Emulation + native evolution = CPU independence
As I’ve already described in detail in my Apple coverage over the past year-plus, the company’s migration plans from Intel x86 CPUs to its own Arm-based SoCs don’t solely encompass hardware; the software side of the story is equally if not more important. Apple’s approach here is multi-pronged, ironically echoing the plan it already successfully executed in its prior transition from PowerPC to Intel x86 CPUs:
- An emulation layer called Rosetta (v2 in this case), built into the operating system, which enables continued use of x86-compiled applications albeit at lower performance and power efficiency than native alternatives
- Immediate availability of “Universal” applications, which run natively on both x86 and Arm, from its own internal development teams (as before, Universal support will inevitably be phased out in favor of Arm-only code), and
- Development support and other encouragement for its third-party application partners to follow in its footsteps.
As far as I can tell as an outside observer, the migration strategy has been highly successful so far. That said, although I’m a mostly-Mac guy, I haven’t yet taken the Arm-based Apple system purchase plunge. However, I do have a Windows analogy that confirms the broader validity of the approach. As I alluded to last month, earlier this year I picked up an Arm-based Microsoft Surface Pro X, which I’m typing on at the moment. Its hardware potential is substantial, but as usual, any software shortcomings would constrain the overall experience. Windows 10’s x86-compiled code emulation support was largely restricted to increasingly uncommon 32-bit x86 applications. After migrating the computer to Windows 11 a couple of months ago, however, I’m now able to install and robustly run nearly every 64-bit Windows app I try.
Streaming service proliferation and legacy consumption evaporation
I’ve long felt that if you really want to observe the action at a sports event, especially one in which the field of play is larger than you can see in its entirety from a single vantage point (such as long-distance running, or vehicle or bicycle racing), it’s better to stay home and watch it on TV. I don’t fundamentally go to a Colorado Rockies baseball game to see the game (or with any notable confidence that the Rockies will win, either, but that’s a different story); instead, I’m there for the hot dogs, cold beer, and camaraderie with my fellow fans.
I’m sadly beginning to feel the same way about movie theaters, concerts and the like. Long-time readers likely already realize, for example, how much I historically enjoy seeing and listening to live music performances. But I instead spent the last year-plus watching archive videos, discovering many new artists and bands in the process. With the reopening of venues such as Red Rocks Amphitheater this year, I returned to the audience for several concerts, initially excitedly but with expectations quickly dashed. Here, for example, was my “view” from the Fiddler’s Green Amphitheater lawn section at Dead and Company’s October 22 show:
Conversely, here are the set 1 and 2 opener experiences that the folks watching the live feed from home enjoyed (unsurprisingly, many bands have taken notice of their incremental COVID-cultivated online audiences and now also offer pay-per-view livestreams for those who couldn’t score tickets or are unable or unwilling to make the trip to the venue):
Maybe I’ve just gotten old, but the crowds (including plenty of taller-than-me folks who think nothing of plopping directly in front of me), the copious, pungent pot smoke, and other downsides of the on-site live music experience have gotten old, too. Same goes for the lousy sound, talkative “neighbors,” and outrageously priced popcorn at movie theaters. And now that my wife and I have outfitted our home exercise room, I’ve got little motivation to return to the gym. If I want “community” I can alternatively gain a sufficient semblance of it via the Peloton app, after all (to that latter point, by the way, I do realize my stance is somewhat antisocial; those with more extroverted tendencies might feel differently about the value of face-to-face interaction).
Enhanced quality for free
One notable upside of the increased amount of entertainment content now being streamed is that the quality of that content has also increased, thanks to competitive differentiation pressure and often at no extra cost to the consumer. Take music, for example. Amazon, Apple, Spotify and a host of other providers are all basically offering slight (if that) variations on the same subscription service, leveraging the exact same content coming from the same artists and labels. In striving to differentiate itself, Apple earlier this year therefore upgraded its library to high-def and (in some cases) surround sound at no extra charge; Amazon quickly followed. Tidal, traditionally the deep-pockets audiophile’s choice, now even offers a free service tier.
Cloud gaming gets credible
Beefed up “cloud” servers and the services running on them, along with beefed up “pipes” linking them to consumers, have also enhanced the viability not only of online interactive gaming but more fundamentally of cloud-served gaming, as an alternative to traditional console- or computer-installed and -run game content. Don’t want to spend the money on, or deal with the hassle of maintaining and upgrading, a leading-edge PC or game console? And/or want to run leading-edge games on your smartphone or other resource-deficient network endpoint? Google Stadia, which I wrote about in detail in mid-2020, is one compelling option in such a situation, but certainly not the only one. There’s also NVIDIA’s GeForce Now (ironically pretty much your only option for experiencing gaming on a GeForce RTX 3080 right now), Steam Link, Microsoft’s Cloud Gaming…Amazon is even getting in the game (pun intended).
New entrants challenge broadband monopolies
Speaking of broadband pipes, regular readers with good memories are already aware of my longstanding disquiet at Comcast’s de facto monopoly in my community. An enduring desire for meaningful competition not only where I live but elsewhere, among other reasons to increase the likelihood of equal access to diverse content types and sources, i.e., to encourage network neutrality, was the fundamental motivation for my recent deep dive into Starlink’s burgeoning satellite Internet service. Equally intriguing are the telecom operators, such as embryonic DISH Network just down the highway from me, who plan to offer broadband service over 5G wireless cellular data links. While each of them might have exhibited monopolistic aspirations in their respective historical market segments, the game changes when those markets overlap.
Wireless charging becomes plentiful, more robust
Close-proximity wireless charging is pretty darn convenient, much as the bleeds-green environmentalist in me hates to admit it due to the technology’s inherent inefficiency versus an alternative wired charger connection. And in the few short years that Qi and its semi- to fully-proprietary counterparts and predecessors have existed in the market, they’ve made notable strides both in consumer adoption (therefore unit shipments of both chargers and their constituent IC and other building blocks) and power output. Take the first-generation Pixel Stand charger I tore down earlier this year. Although its 10W support was restricted to Google’s own phones (non-“a” Pixel flavors, at that), that metric marks a notable upgrade over the 5W horizontal pad that preceded it at the dissection table just one year earlier. Apple’s latest iPhones charge at 15W speeds, and Google’s new Pixel 6 family claims up-to-23W capabilities.
Renewable energy ditches subsidies
Conversely, something that this bleeds-green environmentalist loves to see is the steadily growing adoption of solar, wind and other renewable energy sources, even in the face of phase-outs of government subsidies versus legacy coal, natural gas and the like. Couple renewables with ever more robust and cost-effective batteries (which I’ll have more to talk about in next month’s look-ahead companion piece), and renewables’ inherent variability (the sun doesn’t always shine, the wind doesn’t always blow…) neatly gets buffered, as well. My to-date enthusiasm for electric cars has historically been somewhat muted, since in a sense they just relocate the source of greenhouse gases from the tail pipe to the power plant. But make that power plant “green,” too, and you’ll see me get really enthusiastic. The environmental future of this planet may look increasingly grim, but we might yet engineer our way out of it…
Thus concludes my thoughts about what I felt were the tech highlights of the past year. That said, I strongly suspect your “top” lists diverge from mine to at least some degree. I look forward to reading all about it in the comments!
—Brian Dipert is Editor-in-Chief of the Edge AI and Vision Alliance, and a Senior Analyst at BDTI and Editor-in-Chief of InsideDSP, the company’s online newsletter.