Geoffrey Moore: Back to the future
Now that the technology bubble has burst, what does the future hold for the financing of technological innovation? What will investors look for now? Which types of technology will benefit? And which will suffer? Geoffrey Moore, a Silicon Valley-based technology entrepreneur and investor, offers some insights.
During the late 1990s it seemed as if there was an unlimited supply of capital, in both the private and public markets, to support technology innovation of virtually any sort. Today, with the collapse of sales and profits in the technology sector, it seems just the opposite. Entrepreneurs can be forgiven for wondering if anyone will step up to fund the next wave.
How can they, and we, tell how the capital markets will behave over the coming decade? And what can we expect from the technology industry in this changed environment?
To start with, there is no lack of venture capital in the market today. Nor is there any reluctance on the part of venture capitalists to invest. But there are several issues that are causing confusion.
The first is that the valuation of early-stage companies has plummeted from the levels that were normal a few years ago. The reason is simple. The public markets have drastically devalued equities in the technology sector, meaning that venture exit valuations – be they at IPO or from acquisition – are dramatically lower.
Since venture capital’s willingness to put capital at risk is a direct function of expected return on investment, it must in turn lower its going-in valuations.
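To see that arithmetic in rough terms, consider the sketch below. The figures, the 10x target multiple and the dilution assumption are all hypothetical, introduced purely for illustration; they are not Moore's numbers.

```python
# Hypothetical sketch: if the target return multiple is fixed, a lower expected
# exit valuation pulls the affordable entry (going-in) valuation down with it.

def max_entry_valuation(expected_exit, target_multiple, dilution=0.5):
    """Highest valuation an investor can pay today and still hit the target.

    expected_exit    -- anticipated value at IPO or acquisition, in dollars
    target_multiple  -- return the fund needs on a winning deal
    dilution         -- fraction of the stake assumed to survive later rounds
    """
    return expected_exit * dilution / target_multiple

# Bubble-era expectation: a $1bn exit supports a $50m entry at a 10x target.
print(max_entry_valuation(1_000_000_000, 10))  # 50,000,000
# Post-bubble expectation: the same deal exiting at $200m supports only $10m.
print(max_entry_valuation(200_000_000, 10))    # 10,000,000
```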
Pulling in the punters
Venture capital, in short, has not become scarce but it has become expensive, and entrepreneurs have to rethink their planning assumptions accordingly. We are back to the era of the early 1990s and before, when the key entrepreneurial talents were thrifty creativity boosted by a good dose of boot-strapping – a willingness to substitute time or “sweat equity” for money. It has become harder and harder to impress venture capitalists with just a great business plan: now they want to see some customer traction as well.
This ties into a second issue that has created confusion. The devaluation in the public financial markets is sending a message to the technology sector as a whole: there has been too much emphasis on discontinuous, “disruptive” innovation and too little on innovation that builds continuously and organically onto existing platforms.
In particular, the market would like to see more of the following:
- application innovation (which seeks out new uses for existing infrastructure);
- performance innovation (which enhances the effectiveness of existing applications);
- process innovation (which improves the efficiency of existing applications and of infrastructure);
- experiential innovation (which increases the value experienced by users and consumers of existing applications and infrastructure).
All forms of innovation create economic value. In an economic downturn, however, the less capital required to create the differentiation and the faster it can be translated into increased sales and earnings, the better.
In this context, innovation that leverages existing institutions will far outperform disruptive innovation. And so proportionately more and more capital will move out of the venture sector to fund continuous innovation until equilibrium is restored.
The world, in short, has overallocated capital to the asset class of disruptive innovation and it is now undergoing a portfolio-rebalancing exercise.
But that leads to a third area of confusion. How can venture capital be treated as a scarce resource when so many venture funds are awash with committed capital? The answer, I believe, is that current commitments are illusory. I base this view on the hypothesis that there is only so much disruptive innovation that can be economically absorbed in any given community over any given period. That in turn implies that there are only so many winning venture returns to be had.
Most venture capitalists I talk to believe that the total of winning ventures is far fewer than the number that could be funded by the capital currently allocated. Investing money beyond this limit tends simply to raise the denominator of the ratio, total venture spend, without raising its numerator, total venture returns. The odds get worse and worse as the plays become more and more marginal.
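A stylised calculation, using purely illustrative figures, shows why the extra capital hurts the ratio rather than helping it:

```python
# Purely illustrative: if a community can absorb only so many winning ventures,
# committing more capital inflates the denominator (spend) but leaves the
# numerator (winning returns) unchanged.

def pool_multiple(winning_exit_values, capital_deployed):
    """Aggregate return multiple for a pool of venture capital."""
    return sum(winning_exit_values) / capital_deployed

winners = [500e6, 300e6, 200e6]  # the only winning exits the market will yield

print(pool_multiple(winners, 250e6))   # disciplined pool of $250m: 4.0x
print(pool_multiple(winners, 1000e6))  # over-committed pool of $1bn: 1.0x
```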
It is critical, therefore, for venture capitalists to maintain strict deal discipline. Even more so in an era when disruptive innovations are getting a cold reception from the end customers they are intended to help. That is why the capital from top-tier venture funds will remain expensive and hard to get.
So what will happen to all the rest of the money? Much will be returned to the limited partners for allocation to other asset classes, and some will be put into non-venture investments (which will dilute overall returns, fudging the asset class and potentially corrupting the relationship with the limited partners).
The rest will be unwisely invested by indiscreet investors in unqualified start-ups (often, ironically, at inflated prices, since they want to put all their money “to work”), leading to significant losses and “turnover” both in the venture capital ranks and among their investing professional counterparts in the limited partner arena.
In sum, the venture industry has more than a little house-cleaning to do, and not all of it will be pretty. That said, what will the long-term impact be on technology innovation itself?
Over the next decade I predict that we will see a period of industrialization in the technology sector, an assimilation into the ranks of the mainstream economy, with technology assuming the status of a normal industrial sector, no longer a special case.
To be sure, there will be future “tornadoes” of growth around specific adoption life cycles, but they will not be the norm. Instead, organic growth on top of established bases will be the dominant source of economic returns.
I base this prediction on two observations. The first is that the disruptive innovation of the internet has only just begun to be absorbed, particularly in the business sector where it has the potential to recast completely the economics of supply chains and free up billions of dollars that are today trapped in non-value-adding work.
It will take this decade and more for companies to re-engineer their processes to take advantage of this new and marvellous global “work highway”. And until they have, there is little for them to gain by piling any more disruptive innovation on top.
The second observation is that we may be seeing the end of Moore’s Law – not through reaching any engineering limit, but rather an economic one.
[First published in 1965 by Gordon Moore, a co-founder of Intel Corporation, Moore’s Law states that the transistor density on integrated circuits doubles every couple of years. The argument that follows was developed by Charles DiLisio of D-Side Advisors in San Jose, California.]
For the past two decades, the doubling every two to three years of the price/performance ratio of computing has meant that the very architecture of computing has had to be recast every decade or so in order to take advantage of the new capabilities. That created massive waves of computing infrastructure deployment, which in turn drove equally massive waves of application deployment and created near-infinite demand for skilled consultants to knit all this together in a timely fashion.
The ephemeral high-tech company
The fundamental driver of the doubling phenomenon was the ever finer geometries of semiconductor processing, which permitted more and more transistors to be packed into the same area of a chip.
Since the cost of a chip is a function of its area, performance per unit of cost automatically increases at the rate that Moore first observed several decades ago. For the most part the industry has passed these savings on to its customers.
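The mechanism can be sketched with assumed figures: for a die of fixed area and roughly constant cost, each doubling of transistor density halves the cost per transistor.

```python
# Illustrative only: finer geometries pack more transistors into the same die
# area, so even at a constant die cost, the cost per transistor keeps falling.

DIE_COST = 20.0                 # assumed constant cost per die, in dollars
BASE_TRANSISTORS = 10_000_000   # assumed count at the baseline process node

for node, density in enumerate([1, 2, 4, 8]):  # density doubles at each node
    count = BASE_TRANSISTORS * density
    cost_per_million = DIE_COST / count * 1_000_000
    print(f"node {node}: {count:,} transistors, "
          f"${cost_per_million:.2f} per million transistors")
```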
So what is changing? Not engineering capability – current thinking is that the industry will still be able to produce at finer and finer geometries for the foreseeable future. The problem is that the capital required to create the masks needed to use those geometries has escalated.
An ASIC chip design at 0.35-micron geometry, the state of the art a few years ago, cost about $2 million. A design at 0.13 microns, the current state of the art, costs more than $10 million. Going forward, the cost per design escalates further.
There are precious few markets that can return that kind of going-in cost, and so – even though the technical capability to do it will be available – fewer and fewer designers will avail themselves of it.
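A hypothetical break-even sketch makes the point. The design costs are the figures quoted above; the per-chip gross margin is an assumption chosen only for illustration.

```python
# How many chips a design must sell before it recovers its up-front design and
# mask cost, at an assumed gross margin per chip.

def break_even_units(design_cost, gross_margin_per_chip):
    """Units needed to pay back the non-recurring design cost."""
    return design_cost / gross_margin_per_chip

print(break_even_units(2_000_000, 5.0))    # 0.35-micron design: 400,000 chips
print(break_even_units(10_000_000, 5.0))   # 0.13-micron design: 2,000,000 chips
```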
In this outcome, Moore’s price/performance escalator slows dramatically, meaning that there are longer and longer intervals between disruptive infrastructure swap-outs. That in turn means more and more emphasis on the forms of continuous innovation mentioned earlier. And that is what will make technology look more and more like other industrial sectors.
Of course, this is all quite speculative, but let us suppose for a moment that it is accurate. What then?
In the first stage of slowing we can expect the kind of consolidation that went with post-bubble developments in railways, telephony, automobiles and airlines.
In this phase operational excellence will become the value discipline of choice, and bigger – at least for a while – really will be better. Hundreds, if not thousands, of companies will go out of business. It seems at first glance like a nightmare prospect.
But look more closely. The company has always been the most ephemeral institution in hi-tech. Only one or two companies in today’s top 25 were in existence 25 years ago, and the top 25 of that era, but for IBM, are long gone.
By comparison, the underlying technologies still exist. (We still use the elements that made up artificial intelligence in the 1980s, although we have abandoned the term.)
So do the products (although they now live in the portfolios of the consolidating companies), the jobs (growing even more prevalent as customer industries themselves become more technology-enabled and -dependent), the customers (as more and more of the world economy becomes information-based), and the people (perhaps because no other industry will have us). So all is not as woeful as it might seem.
To conclude, the meteoric growth, the astounding infrastructure swap-outs and the resultant skyrocketing volatility in equity valuations should all subside. This, in turn, will allow public equity capital to re-enter the sector at reasonable cost in light of the lower risk profile, thereby stabilizing valuations on Nasdaq and other public equity markets.
There will still be a role for venture capital during this period but it will be modest rather than centre-stage, and in sectors adjacent to computing rather than at its core.
Computing itself can be expected to settle down to prolonged refinement, increasingly playing a supporting role in providing better collaboration, communication and content.
All in all, it should make for a “time-out” that both the industry and its customers desperately need. After that, of course, it is anybody’s guess.
Geoffrey Moore
Geoffrey Moore is chairman of The Chasm Group, a technology company based in San Mateo, California, and a venture partner in Mohr Davidow Ventures, a Menlo Park, California-based venture capital firm.