Invention

I've never met an inventor. Have you? I've heard the story of Edison, and I've seen TV commercials aimed at armchair inventors. Still, I've never met a real-life inventor, and I don't know anyone whose profession can genuinely be called "inventing".

There are several professions I'd like to fit into the slot labeled inventor: engineer, industrial designer, product designer, etc. If I stick to the understood definition of inventor — someone whose job is to conceive and build original products — none of these professions quite fit. They are all too specific; the term 'inventor' is too broad to hold any of them.

If you work hard, and have some luck, you might be able to make useful contributions to one field. That's what the most talented people hope for: to make recognized progress in one field. They don't dream of success in multiple fields.

Today it is vanishingly rare for anyone to make large and diverse contributions. Historically, though, it happened more often. Da Vinci is well known for remaking several fields in the 1400s, Einstein for one field circa 1900. Now our best are lucky to remake a subfield. We might say that science has matured, and that progress has stabilized. The gains we make are smaller and come from larger effort.

Terry Tao does amazing work, primarily in one subfield of mathematics. I expect he will solve several more major open problems, all of them closely related to analytic number theory. For other subfields he still has to consult the local experts, and it would surprise us if he solved major open problems in an unrelated one. His are specialized problems; we have localized expectations for his work. When we think of classical inventors, we think of advances as large and varied as the phonograph and the microphone.

It seems strange to have to say, because once stated it's obvious, yet we don't seem to recognize it otherwise: there are only so many things you can do with everyday objects. The useful ways to combine them run out after a while. There is a real limit on what you can do with everyday items, and we've been asymptotically approaching that limit for some time.

The limit on new combinations is one of the things that makes it especially hard to be an inventor these days. Everything you can do with common materials has been done. The hundreds of millions of Americans who live in houses recapitulated most of those possibilities this week. How could you find something anybody would call new and useful? If we instead ask for new and universally useful, the prospects become bleaker. Could anyone still combine household materials in a way that's new and useful to everybody? Probably not. Yet that process is not far off from describing the phonograph.

Further evidence is in infomercials that sell us seemingly useful products we don't actually need. If there were new and useful products we needed, they would hawk those, because the business would be better. Those products would be easier to sell. It wouldn't take much to convince us. Imagine how much more effective infomercials would be if they were selling brooms for the first time. Instead pitchmen have to resort to useless items that only sound useful. That's where we are on the curve: the simple, effective items have been used up, and not much is left for unsophisticated manufacturing. This doesn't mean we won't see improvement in our domestic lives over the next century — of course we will. It just won't come from items a skilled craftsman could create in his tool shed.

The exhaustion of simple combinations extends to chemistry as well. There is nothing under your sink or in your medicine cabinet that hasn't been combined and studied. In your house you won't cause any new reactions, and even in a college chemistry lab you would probably be out of luck. If you had been alive in the 1700s or 1800s, however, many basic elements had yet to be isolated; the periodic table we used in high school had not yet been completed. By any standard chemistry was immature.

It's not possible to pinpoint the hour and minute when science became mature, just as it's not possible to pinpoint the exact moment when an adolescent becomes an adult. For classification purposes it helps to settle on some definition. We can arrive at a passable legal age for adulthood (18), and we can give an approximate timeframe for when science matured: c. 1945-1953, at the start of the Atomic Age. The argument justifying this timeframe is understandably informal: it bounds the timeframe from both sides.

Benjamin Franklin made fundamental advances in several areas in the mid-1700s, so that's too early. In the late 1800s commercial research labs appeared, and Göttingen and other German universities became the center of major academic research programs. Science had begun to organize, yet it still looked immature: Einstein's Annus Mirabilis came in 1905, while he worked in isolation in Switzerland. The physicists at Göttingen would go on to lay the groundwork for quantum mechanics throughout the 1920s and 30s, and the work of that period looks like the lead-in to an eventual maturation.

From the other direction, we can bound the process from above: by the time Texas Instruments developed the integrated circuit in 1958, the maturation process had ended. Big corporations knew where to go and what to look for when they built computers. They put money behind engineers and waited for the technology to appear. The developments that resulted were both valuable and incremental. Moore's Law is wonderful in its implications, and, to date, it has also been depressingly dependable in predicting achievement.
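
To make that dependability concrete, here is a minimal sketch of the projection Moore's Law encodes. The two-year doubling period and the 1971 Intel 4004 baseline are standard outside figures, not taken from this essay:

    # A minimal sketch of the projection encoded by Moore's Law,
    # assuming the commonly cited doubling period of ~2 years and
    # Intel's 4004 (1971, ~2,300 transistors) as a baseline.

    BASE_YEAR = 1971
    BASE_TRANSISTORS = 2_300   # Intel 4004
    DOUBLING_PERIOD = 2.0      # years

    def projected_transistors(year: int) -> float:
        """Projected transistors per chip under Moore's Law."""
        return BASE_TRANSISTORS * 2 ** ((year - BASE_YEAR) / DOUBLING_PERIOD)

    for year in (1981, 1991, 2001, 2011):
        print(year, f"{projected_transistors(year):,.0f}")
    # Each decade multiplies the count by 2**5 = 32. The growth is
    # enormous, but it arrives on schedule; the curve holds few surprises.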

The earlier date of 1945 denotes the atomic test at Trinity, and the later date of 1953 denotes the discovery of the structure of DNA by Watson and Crick. Many landmark results occurred in the intervening time: in 1948 Shannon published his paper on communication theory, and in 1952 Jonas Salk produced a vaccine for polio. What distinguishes the results of this period from the results of earlier times is the narrow focus of the discoverers. What distinguishes them from the results of later times is the magnitude of achievement.

Since then scientific progress has been remarkably stable, and stable is often less than we wanted. Since the 1950s the volume of scientific output has grown at some figure less than 6% a year, a growth rate similar to that of mature corporations. Qualitatively, we feel let down; we made some cultural bets that things would be more interesting. Our most visible progress has been in computing, and the progress of computers has been wonderful, albeit anticipated. Everyone could see the internet coming. There have been fewer surprises.
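
For a feel of what sub-6% growth means in practice, here is a quick sketch of the compounding arithmetic. The 6% ceiling is the essay's figure; the doubling-time formula is the standard one:

    # What steady sub-6% growth looks like: the standard doubling-time
    # formula, applied to the essay's ceiling figure. Illustrative only.
    import math

    def doubling_time_years(annual_rate: float) -> float:
        """Years to double at a constant annual growth rate."""
        return math.log(2) / math.log(1 + annual_rate)

    print(f"{doubling_time_years(0.06):.1f} years")  # ~11.9
    # At best, scientific output doubles roughly once a decade: steady,
    # compounding, and predictable enough that it fails to surprise us.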

Because we didn't have a good picture of the progress of science until recently, some of what we used to believe doesn't make sense anymore. The inventions in the past — huge, unprecedented advances — we now see more clearly as the pitching stage of a sculpture. The first strokes knock off large blocks of stone. At this stage it's not clear what the sculpture will depict. The strokes that follow remove less stone and require more and more attention to detail. In our sculpture we seem to have roughed out the biggest pieces of dead stone.

Earlier I said that domestic improvements won't come from items you can create in your tool shed. That's an unsafe claim to stand behind, because what's in your tool shed has always been changing. So maybe we should qualify that statement: domestic improvements won't come from items you can create in your tool shed today.

Software isn't an item, but it's tough to ignore that many existing companies started as two laptops and two people working from an apartment. That's a lot like building something in your shed. You aren't manipulating physical objects, but you are still creating something. Fifty years ago there were no laptops in apartments, and such an opportunity was inconceivable.

Inconceivable doesn't always mean from the future; the past can be inconceivable as well. To me it's almost inconceivable that Jobs and Wozniak bootstrapped a hardware business as recently as 1976. They built and sold Apple Is from their garage, using mail-order parts, and then they parlayed that success into dedicated manufacturing for the Apple II. That is amazing. In 2012 it almost defies belief.

What happened with the Apple I and II couldn't have happened with the iPhone. Though it's still possible to make money selling hobbyist hardware, today nobody would pay to own a pre-iPhone; the expectations for quality are too high. Even if we pretended people would buy pre-iPhones, the plan would break down again at the next step. No investor would give two twenty-year-olds the run of a smartphone plant. The founders would have to come from a proven manufacturing background. In hardware, at least, the trend has been to move manufacturing out of the garage.

Going forward, it looks like manufacturing will be moving back into the garage. Right now manufacturing is out of reach because it isn't feasible for one person to manage physical stuff. Home 3D printing looks like it's going to change that: the crafting step will happen on the home computer. If so, producing physical objects is going to look a lot more like programming or design, which individuals tend to be better at. Individuals are better at managing thought stuff, as long as the task is small enough to fit in one person's head. So maybe a future trend in entrepreneurship will be devices, designed better by individuals at home.

A few of these trends appear cyclical. How do we know the trend in scientific progress won't turn out the same way? In other words, how do we know we're converging on the real maximum rate of progress, and not some local maximum on the curve?

We can't prove that we aren't at a local maximum, of course. The real, much larger picture could always be hidden behind some missing discovery. The best we can do is make inferences from the evidence we currently have. Much of the science we've recently discovered imposes bounds on the science we intend to discover; we've uncovered limits. To draw a parallel, it takes between roughly three and twenty-two minutes for data to reach the Earth from Mars, depending on where the planets sit in their orbits. That channel already operates near the limit set by c, so we don't think we can improve it. The only way we could send messages faster would be to discover a method that upends our current understanding of physics.
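
The bound here is nothing more than distance divided by the speed of light. A minimal sketch of that arithmetic, using approximate published figures for the Earth-Mars distance at closest approach and maximum separation (my numbers, not the essay's):

    # The hard limit on an Earth-Mars channel: delay = distance / c.
    # Distances are approximate published figures, supplied here for
    # illustration: closest approach and maximum separation.

    C_KM_PER_S = 299_792   # speed of light in vacuum

    def one_way_delay_minutes(distance_km: float) -> float:
        """One-way light travel time in minutes."""
        return distance_km / C_KM_PER_S / 60

    print(f"{one_way_delay_minutes(54.6e6):.1f} min")   # ~3.0, closest approach
    print(f"{one_way_delay_minutes(401e6):.1f} min")    # ~22.3, maximum separation
    # No amount of engineering improves on these numbers; only new
    # physics could.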

Ever since Gödel we've had to constantly rein in our expectations of what's possible. Things weren't like that before: we used to live in a scientific world marked by optimism. Now we live in a scientific world constrained by limits. We've learned to demarcate what we can know. And we seem to have gotten better at anticipating scientific progress.

If we really are better at anticipating the progress of science, then why? Why don't our advances take us by surprise anymore? It probably isn't that we've developed supernatural insight. If we're unsurprised by what comes out of science these days, one explanation could be that we've learned to interpret a map.

When advances are as large and as varied as the classical inventions, it's easy to believe they came from enormously creative inspiration, or that the intuition that produced them was somehow unique. Neither is likely to be true. One, it seems more accurate to label the process of scientific advancement discovery rather than creation. That's why the common idea of an inventor is so ill-fitting: inventors do not create, they discover, and the closer you are to math and physics, the more you discover and the less you create. The next steps are already laid out. Math and physics give a diagram for how to go forward.

Two, people frequently share the same intuitions. Because researchers work along a preexisting path, they tend to be in the same place at the same time. This is especially true if the path ahead has been blocked for a long while. When the techniques necessary to remove the obstruction filter back from other paths, the way forward becomes clear to everyone at once, and the researchers advance together.

Innovation

Steve Jobs gets credit for a lot of things he didn't do. Jobs himself said it best: "People like symbols, so I'm the symbol of certain things." Sometimes that means using Jobs as a stand-in for the many designers who work at Apple. Jobs usually makes for a good story. We like narratives, and we can build several entertaining ones around Jobs. Telling stories lets us gloss over other people by attributing their work to one person.

It's easier that way, to think of a situation in terms of one person. It makes what happened easier to visualize. Most of us can picture one person; it's harder to picture a crowd, and harder still to picture how that crowd will behave. Some authors write with one reader in mind. Others write just for themselves. These tricks help writers stand in for an audience they can't picture all at once: if you can't picture everybody correctly, picture one person, and hope that does well enough. It at least keeps things consistent.

The difficulty writers have picturing crowds reflects a fundamental human constraint: imagining crowds is hard. We reward the people who can do it. If you can imagine crowds you can go far in a lot of different fields. You might try fiction, but I would think about politics or investing first.

Imagining crowds is hard because it's the combinatorial version of a problem that's already hard: putting yourself in someone else's shoes. It's hard enough to do your own thinking; to simulate someone else, you need to do all of your thinking and all of theirs, too. Here, "putting yourself in someone else's shoes" doesn't mean recognizing how someone else feels, which comes naturally to most people. It means pausing your own thought to recreate someone else's thought process, which is harder. Most of us need some direction to recreate what other people are thinking; the default state of thinking is to focus on yourself.

One reason conspiracy theories persist is an underdeveloped sense of how crowds work and an exaggerated sense of the power of central authorities. Against many crowds, central authorities are comparatively powerless. They have little control over the crowd, and they usually know it, even when others do not. Canute was a medieval king who ordered his throne carried down to the beach. There he made his courtiers watch as he issued decrees to the tide. Unsurprisingly, the tide ignored the commands of the king.

Like Canute and the tide, business founders can't create waves that aren't already there. They have to capitalize on existing waves. This is what people mean when they say to identify a market. Any business will come with the standard difficulty of building something new. Without the convenience of an existing market, there is the added difficulty of winning the attention of the customers. True, some lucky businesses will spring from new technology sold to an unaware market. Mostly, businesses have to respond to pre-existing market needs with existing technology.

Setting out to invent something to build a business around is a bad idea. Realistically, you couldn't count on it. You'd have to commit yourself to something else instead, like becoming an academic. Then, if you were lucky, you might discover something that comes with the option of forming a business around it.

A more realistic goal is to start a business using known technology. The risk in this case drops from ridiculously high to merely high. It's mistaken to think that businesses based on known technology will be unoriginal. To those worried about being original, the thought of using existing methods is distasteful, but this thinking ignores a basic truth of entrepreneurship: the technology a business builds on will start off known and, provided the business is any good, later expand beyond it. Good businesses always come up with new ways to do things. Doing something new falls out of solving an existing problem well.

Part of choosing a problem well is choosing a problem whose solution is self-sustaining. It's not enough to identify something people want. Many things people want don't exist because there isn't a place for them; mostly, it's because people won't pay for the product, ever. If you notice something that would obviously be useful but doesn't exist, see if other companies have already tried the idea and found that it wouldn't support itself. For an idea to work, the product has to be both desirable and self-supporting.

A nimble team can start a business without a detailed plan; they only need a general idea in mind. Almost always, the "brilliant" idea a business starts with has to be modified before it becomes sustainable. Predicting what customers want in advance is hard because customers are hard to predict. The best way to get to something they want is to revise what you're building based on their input. If ideas mostly change, it makes more sense not to invest in an unchanging plan. You're better off starting with a flexible plan and iterating from there, rather than committing to false precision.

You still need some kind of idea, and coming up with a good one, even a general one, is not easy. Twenty-year-olds who are perfectly qualified to build and run companies will struggle to come up with a suitable idea. The experience necessary to generate business ideas differs greatly from the experience gained from programming and design. Generating ideas is a hard-won skill; the ability comes from failed businesses, or from scrutinizing a variety of new businesses as they expand. Acquiring that experience can take years. It is unlikely to be found in a new college grad, particularly one who devoted their time to mastering programming or design.

Ranking business ideas demands a comparative understanding of markets. In contrast, building a business demands understanding of only one market. Sharp designers and programmers are fully capable of that. While it helps to have the broader perspective, it isn't a requirement.

Investors must rank business ideas every day. Strictly speaking, a successful entrepreneur needs this skill once. After choosing which business to start, future problems are always less general. To get a sense for the added difficulty of generality, consider choosing which business to run: Kraft, Disney, Home Depot, or any other Dow 30 company. Which is going to be biggest in ten years? Starting a single business is like choosing "Kraft" and focusing only on foodstuffs. At that point it doesn't matter how much you know about theme parks or home construction.

It would make sense for veterans to guide new entrepreneurs through the idea phase. That this doesn't happen more often is unfortunate: many failed businesses could be avoided, and solid teams with bad ideas could be helped. Investors can evaluate ideas better than new entrepreneurs can. When the Y Combinator seed fund acknowledged this by letting entrepreneurs apply without ideas, they were criticized, unreasonably. People expected entrepreneurs to have their own brilliant ideas.

Lately entrepreneurs have been criticized for chasing ideas that are too small. We all want the ideas new companies work on to be big, most of all the entrepreneur. Circumstance dictates what's possible, not the imagination of the entrepreneur. If ideas are small, then it's because they have to be. The broken logic is that if more entrepreneurs tried huge ideas, more of those huge ideas would come to pass. Huge ideas don't work like the lottery, and entrepreneurs don't work like lottery tickets. A lottery ticket has some non-zero probability of succeeding, however small. For many huge ideas the probability of success is, right now, zero. Throwing more entrepreneurs at them is throwing confetti. What's worse, those entrepreneurs could've worked on smaller ideas that would build to the huge ones. Entrepreneurs don't set out to be menial. They set out to do the biggest thing they think they can succeed at.

We've always moved forward by doing what's possible at the moment. What's possible has just changed over time. Right now what's possible seems less glamorous. Computing, after all, is a physically introspective field. It's not outward looking like many other forms of engineering.

On the one hand, I have a hard time trivializing any product used by millions of people. On the other, I can sympathize with the impatience people feel over the aims of new companies. Right now technology is looking inward. I tend to think of it as a planning phase.