8: on silicon valley's contribution to the economy

december 20, 2019



This Atlantic article argues that Silicon Valley's contribution to the American economy and pace of innovation has been disappointing.



I think this article bases its arguments on an oversimplified view of how technology development occurs and how economies change over longer time scales. I also think there is a general lack of education about how venture capital actually works as an asset class, and why it's not just pools of money that people can spend at their discretion on whatever they want.



I say this because I didn't understand this dynamic until fairly recently, and was similarly frustrated by why certain companies get funded. While I do agree that a lot of people are working on problems that are not real and don't change infrastructure or make life better for most people, I think the cause is a combination of cultural norms and a lack of alternative funding options.



What I agree with from this article:

  1. Money is not being allocated to the hardest problems society is facing, and there is a very strong opportunity to do so

What I disagree with:

  1. Its understanding of how economic change happens as a function of technology on longer time scales, and its pinning of wealth inequality on tech

  2. Its understanding of the structure and function of venture capital

  3. Its fundamental conclusion that because bytes innovation has made a marginal contribution thus far, it will never change anything



The author starts with a brief history of technological development and argues that tech has left its promises unfulfilled: it has not visibly changed quality of life for the average American, it hasn't offset lost manufacturing jobs, and innovation is at a low.



My counterargument is that digital technology still requires far too much specialization for the average user to utilize effectively beyond pre-built consumer applications, but that we're just getting to the cusp of unskilled users being able to build new tools and infrastructure for the legacy industries that tech is only beginning to touch.



On the time scale of human technological development, we are still quite early in the computer revolution. This technology is so young that it still requires high specialization and education to wield it and build anything new out of it. Until we build infrastructure that makes building things on a computer more abstracted and accessible (e.g. software engineering no longer requires actually knowing how a computer works!), we won't see the use cases expand to middle-class jobs.



How are accessibility and abstraction developed? 1) A growth of specialists (it used to be computer engineers; now it's software engineers), and 2) enough demand for these skills and efficiencies in adjacent industries (which is only really starting now). Legacy industries are finally coming around to tech tools, tech skills, and the expectation that things should be digitized and information centralized, because the ease of the technology in people's personal lives contrasts heavily with using fax machines at work. Now that the tech has been consumerized, people with a lot of specialty knowledge are realizing how this technology can be applied in their industries, and actually have the tools and ability to build new systems for them. There was just no way a few hundred thousand software engineers could rebuild the technology stack of every major GDP-driving industry in the United States.



And there's proof that this type of abstraction and ease of use is starting to expand to the industries that truly drive our economy: look at Plangrid or Flexport, which knew that the state of tech five years ago was not user-friendly enough, easy enough to utilize, or consumer-y enough to actually be deployed effectively in the industries they're now rewriting.

My current thesis on the model for innovation is that existing technology is first used to drive demand (the lowest lift to see whether a new experience makes sense and is viable), and then, as demand exceeds what the current infrastructure can support, there is both the money and the proven demand to actually rebuild the rails that industry runs on.

One example is in healthcare: many of the initial digital health companies, and especially the early smash successes, strung together technologies that already existed to serve a hungry consumer need. However, as expectations around speed of service have grown, and as frustration with not having answers (why WON'T I know whether I'm eligible for this service right now? This tech has to exist) has mounted because the current infrastructure cannot meet those expectations, companies are realizing there's a need to actually rebuild the infrastructure (an example might be Noyo rebuilding EDIs for healthcare). So we're on the cusp of exactly what the author is complaining hasn't happened yet.



The next point I disagree with is the author's extrapolation from the distribution of venture capital (mostly software, not pharma) to the claim that THAT explains the lack of "real world change".



Venture capital is a specific asset class that is high risk, high reward, and must be returned within a roughly ten-year fund life at a certain multiple. Software is a safer bet in that model because of its near-zero marginal cost, meaning it can satisfy both the growth timeline and the return multiple. Pharma often takes ten years just to get a drug to market, never mind returning money. In addition, venture capital investment amounts to only about 0.2% of GDP, yet VC-backed companies generate revenue equal to roughly 21% of GDP; pharma- or hardware-type companies need far larger investment dollars and do not make returns at that scale. So blaming the lack of physical-world innovation on venture capital is a weak argument, because venture funds are not structured to support physical-world companies even if they want to.
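To make that asymmetry concrete, here's a toy sketch of fund-level arithmetic (every outcome number below is invented purely for illustration, not data): with near-zero marginal cost, a single outlier can carry an entire portfolio, while capital-intensive companies whose exits are capped at modest multiples cannot.

```python
# Toy portfolio arithmetic (all numbers invented for illustration).
# A fund makes 20 equal bets; the fund-level multiple is the mean of outcomes.
def fund_multiple(outcomes):
    return sum(outcomes) / len(outcomes)

# Software-style power law: most bets go to zero, but one near-zero-marginal-cost
# winner compounds to 50x and carries the whole fund.
software = [0] * 13 + [1] * 6 + [50]

# Capital-intensive style: fewer total wipeouts, but exits capped around 3x
# because growth is bounded by physical costs and timelines.
capital_intensive = [0] * 10 + [1] * 8 + [3] * 2

print(fund_multiple(software))           # 2.8x: one outlier returns the fund
print(fund_multiple(capital_intensive))  # 0.7x: loses money despite fewer failures
```

The point of the sketch is structural, not empirical: without unbounded upside somewhere in the portfolio, the venture model's math simply doesn't close.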



The next point the author makes, and that I agree with, is that innovation HAS concentrated in the bytes world, and that the primary beneficiaries of innovation thus far have been yuppies. It is incredibly frustrating to see smart people I know working on frankly stupid shit.



However, I disagree with the reasons the author presents.

Part of this, IMO, is a lack of exposure in higher education to real-world problems, primarily for those who go on to be the main workforce producing technological innovation (software engineers). Higher education focuses on preparing students for academia and on rewarding points on game-able, arbitrary exams (which are far more easily exploited by those with privilege), then weirdly twists that into qualification for jobs, as opposed to expanding world views or rewarding breadth of experience and interaction with the real world. While I agree that there are plenty of avenues to explore this learning in college (taking extra classes outside your major, doing internships), it is not rewarded and can often hurt your primary academic discipline by taking time away from it; holistic learning is never encouraged at the expense of the GPA number.



Second, what's actually missing for the development of companies that could grow fast and have a massive market opportunity, but do not have zero-marginal-cost products, is alternative financing vehicles that are as accessible as venture money (relatively, of course; the caveat is always WHO it is accessible to, and there are a million problems with that). These vehicles need a social system of protocols for access and relationships similar to the one that exists between VCs and traditional tech companies, while being structured to allow longer timelines and lower multiples on returns. That trade might be made in exchange for objectively larger market sizes, lower risk, and a larger number of bets placed (and so less of the investor's time concentrated in a few breakout portfolio winners, since that would not be the measure of success for companies with these business models).



The issue therefore becomes: what will incentivize LPs to put their money into even more illiquid investments that might return lower multiples? I don't have a great answer to this question yet. I also wonder whether more physical-asset investments would require larger funds, adding another challenge.
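The LPs' side of the trade can be quantified with simple compounding arithmetic. A gross multiple realized over N years implies an annualized return of multiple^(1/N) - 1; the profiles below (3x over 10 years for a conventional venture fund, 2x over 15 for a hypothetical longer-horizon vehicle) are illustrative assumptions, not any actual fund's targets.

```python
# Annualized return implied by a gross multiple over a holding period.
# Illustrative arithmetic only; the profiles are assumed, not real fund data.
def annualized(multiple, years):
    return multiple ** (1 / years) - 1

# Conventional venture profile: ~3x over a 10-year fund life.
vc = annualized(3, 10)
# Hypothetical longer-horizon, lower-multiple infrastructure vehicle.
alt = annualized(2, 15)

print(f"{vc:.1%}")   # 11.6% per year
print(f"{alt:.1%}")  # 4.7% per year
```

Seen this way, the LP question is stark: the hypothetical vehicle asks for five extra years of illiquidity at roughly a third of the annualized return, so it needs some offsetting appeal (lower risk, larger markets, more diversified bets) to clear the bar.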

Given the incentive structure of SV and the structures by which money is allocated, it makes sense that the state of the world is what it is today. However, I think we're starting to see the consumerization of tech infrastructure, meaning people across industries can finally build infrastructure without software specialists. I think we are a few years from the cusp of a massive explosion of infrastructure building, as people realize that even though it's a tough challenge, there is massive money to be made in becoming the new infrastructure layer for different industries, at which point we WILL see the huge changes that this author is disappointed at not having seen yet.



I think people are not thinking big enough, for sure: Silicon Valley stalwarts and culture have yet to think concretely about how we should be experimenting with business and financing models in the industries that are truly behind. But to extrapolate from a mere 20 years of history to the claim that technology has contributed and will continue to contribute very little, and that in fact all tech has done is increase wealth inequality while not making life better for anybody, is a severely oversimplified argument.
