Back to the Future in Silicon Valley

Silicon Valley came by its name honestly enough: way back in the 60s, 70s and 80s, it was the place where venture capitalists and entrepreneurs turned silicon into computer chips in high volumes and at low prices that astonished the world. I have a chart on my office wall that shows how hundreds of silicon-based chip companies emerged in the Valley in that period, the vast majority of which could be traced to Shockley Semiconductor Laboratory (founded 1956) and to Fairchild not long thereafter – the latter spawned so many subsequent chip companies that it fairly earned the moniker “Fairchild University.”

The Big Kahuna of the silicon companies in Silicon Valley was and is, of course, Intel. The company’s x86 chip architecture, launched in 1978, powered the PC revolution and remains to this day by far the dominant architecture in personal computers, laptops, workstations, and even the cloud computing business. It’s even a big player in the supercomputer segment. Name a big chip company and chances are good a big part of its business is x86-based chips. It’s much the same story as the IBM System/360 architecture, which has dominated the mainframe business since the 1960s. Yes, there is still a market for mainframes, albeit not so big as it used to be.

The x86 architecture has been so dominant for so long that for maybe the last two decades or so, the venture capital business – a business that owes its current form largely to its role financing all those chip companies way back when – has pretty much stopped funding chip startups. Until recently, that is.

I was doing a bit of research on the current AI revolution (there have been others: folks of a certain age will remember the “expert systems” AI hype in the late 80s) when I found something pretty interesting – well, to me, at least. Over the last couple of years, VCs have invested hundreds of millions of dollars in more than 20 new chip companies. All of which seem to have one thing in common: they are developing chips optimized for AI applications. Chips with non-x86 architectures.

I said I found this interesting. That’s because it suggests to me that while lots of folks are talking about how various tech-enabled and tech-driven revolutions are on the cusp of changing the world for people who use technology, and how to frame that as an investment opportunity, far fewer folks are talking about how those revolutions might change the world for today’s biggest producers of technology, and how to frame that as an investment opportunity. So, for example, while there are plenty of people talking about how AI might disrupt the smartphone business and the personal transportation space, not so many are talking about how AI might disrupt the businesses that provide the components that power those businesses. Who, that is, will be the next “Intel Inside”?

I’m betting it won’t be Intel – for the same reason IBM didn’t follow its System/360 mainframe architecture with something like an x86 architecture of its own. When you are king of a big mountain, and Intel is still sitting at the top of a pretty big one, you tend to think more about defense than offense. Protecting your realm, not cannibalizing it. Just ask Kodak – the folks who invented digital photography.

So here is an investment hypothesis: Assuming (and this is a big assumption) that today’s AI revolution will be even nearly as big as the hype suggests, I’ll bet when the dust settles there is a new sheriff in the silicon part of Silicon Valley. Though if you look at where those chip-hungry VC investors have been spending their money, it just might be an out-of-towner.


Employee Turnover: A Cloud with a Silver Lining

Paul Jones, co-chair of Venture Best, the venture capital practice group at Michael Best, has been selected as a regular contributor to OnRamp Labs, the newest blog addition to the Milwaukee Journal Sentinel, covering start-ups and other Wisconsin technology news.

Paul’s most recent contributed piece, “Employee Turnover: A Cloud with a Silver Lining,” can be found in the Business Blog section under the Business tab of the Journal Sentinel’s website.

Here is a short excerpt: “In a recent Washington Post commentary, DC-area entrepreneur Joel Holland cites four reasons he believes account for the recent emergence of the nation’s capital as a modest but real center of startup-driven innovation and venture investing. One reason he cites is a business and social culture that – in sharp contrast with Silicon Valley, he notes – supports more stable employment relationships. Holland posits that lower employee churn is something that gives DC-area innovators a competitive advantage over their Silicon Valley counterparts.

I beg to differ.”

Read more of Paul Jones’ OnRamp Labs blog post under the Business tab of the Milwaukee Journal Sentinel’s website, www.jsonline.com.

Expanding Market for Technologies to Clean Wastewater from Hydraulic Fracturing

By: Geoffrey R. Morgan

Since 2005, U.S. production of natural gas from shale has increased exponentially, from a negligible amount to almost 7.5 trillion cubic feet in 2011. The U.S. is now the largest producer of natural gas in the world.

The new-found supply of this energy source has also had a significant effect on public policy. Domestic energy production, and natural gas in particular, is caught in a battle among the proponents of sustainable energy sources such as wind and solar, the interests of traditional coal-fired plants, national security advocates intent on reducing dependence on foreign energy sources, environmentalists, and the proponents of natural gas itself.

The epic increase in the supply of natural gas has come from the effectiveness of hydraulic fracturing. In the hydraulic fracturing process, water mixed with chemicals and sand is injected into a well at ultra-high pressure to shatter and hold open the rock below and release the gas. According to the U.S. Department of Energy, the hydraulic fracturing fluid is composed of approximately 95% water, 4.5% sand and 0.5% chemicals. These chemicals can number up to about 65 and include benzene, glycol ethers, toluene, ethanol and nonylphenols, all of which have been linked to human health disorders when exposures and concentrations are too high. Because the percentages are by weight, it is estimated that approximately 20 tons of chemicals are added to each million gallons of water; a typical hydraulic fracturing procedure involves 4-7 million gallons of water, so about 80-140 tons of chemicals. Each well thus requires millions of gallons of water (which separately is leading to confrontations over water supply in drought-stricken states). Some of the water comes back up immediately, along with additional groundwater; the rest returns over months or years.
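Those tonnage figures are easy to sanity-check. Below is a quick back-of-the-envelope sketch in Python; the weight constants are my own assumptions rather than anything from the DOE source (a US gallon of water weighing about 8.34 pounds, a short ton of 2,000 pounds), combined with the 95%/0.5% weight fractions quoted above.

    # Rough check of the chemical tonnage figures quoted above.
    # Assumed constants (not from the DOE source): a US gallon of water
    # weighs about 8.34 lb, and a short ton is 2,000 lb.
    LB_PER_GALLON = 8.34
    LB_PER_TON = 2000.0
    WATER_FRACTION = 0.95       # fluid is ~95% water by weight
    CHEMICAL_FRACTION = 0.005   # fluid is ~0.5% chemicals by weight

    def chemical_tons(water_gallons):
        """Tons of chemicals mixed into a given volume of water."""
        water_tons = water_gallons * LB_PER_GALLON / LB_PER_TON
        # Chemicals scale with the water: 0.5 parts chemicals per 95 parts water.
        return water_tons * CHEMICAL_FRACTION / WATER_FRACTION

    print(chemical_tons(1_000_000))  # ~22 tons per million gallons (~20 in the text)
    print(chemical_tons(4_000_000))  # ~88 tons for a 4-million-gallon job
    print(chemical_tons(7_000_000))  # ~154 tons for a 7-million-gallon job

The result lands right around the article’s figures: roughly 20 tons of chemicals per million gallons, and on the order of the 80-140 tons cited above for a typical 4-7 million gallon procedure (the text rounds to 20 tons per million gallons; the unrounded constants give slightly higher numbers).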

A major issue is how to deal with the wastewater, the volume of which is significant. In most cases, the contaminated water is pumped into disposal wells, but this is not without risk: the wells and pumps can leak, allowing disposal water to contaminate existing aquifers. In Texas alone, the increase in wastewater is striking. According to The New York Times, the state has more than 8,000 active disposal wells, and the amount of wastewater being pumped into those wells has increased to approximately 3.5 billion barrels in 2011 from just 46 million barrels in 2005. A recent study of the Marcellus Shale formation, which stretches from New York to Virginia, indicates that wastewater from hydraulic fracturing could soon overwhelm the region’s general wastewater treatment infrastructure. So cleaning this wastewater is important, and it represents a significant economic opportunity.

Insurers who write coverage on these environmental risks acknowledge that premiums are favorably impacted by the presence of effective technologies to clean the wastewater.

Water technology is a rapidly growing industry. Global Water Intelligence estimates the global water industry at $483 billion per year, growing by several percent annually. Water technology hubs are emerging to encourage and facilitate economic development, notably in Milwaukee, Singapore, Ontario and Israel.

Technologies are already being developed to treat wastewater from hydraulic fracturing. A new desalination process developed at MIT can scrub the contaminants from the wastewater while using significantly less energy, and with less complexity, than other desalination techniques. The technique is a carrier gas process: water is sprayed into warm air and vaporizes; the vapor, which consists of pure water only, is then bubbled through cool water, where it condenses. Researchers at the University of Minnesota have developed a process of creating centimeter-sized silicon beads that have chemical-degrading bacteria inside them; the beads are porous enough for the chemicals to enter but not porous enough for the bacteria to leave. These represent just two of the developing technologies to treat the wastewater. This alone will become a multi-billion-dollar industry in the coming years.
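For the curious, here is a toy mass-balance sketch in Python of the carrier-gas idea: warm air holds more water vapor than cool air, and the difference is the pure water moved per pass. The humidity numbers are generic saturation values assumed for illustration, not data from the MIT work.

    # Toy mass balance for a carrier-gas (humidify/condense) cycle.
    # Warm air picks up pure water vapor from the contaminated stream,
    # then sheds it when bubbled through cooler water and condensed.

    def water_moved_per_kg_air(humidity_warm, humidity_cool):
        """Kg of pure water carried per kg of air in one warm-to-cool pass.

        Arguments are the kg of water vapor one kg of air can hold at the
        warm (pickup) and cool (condensation) stages, respectively.
        """
        return humidity_warm - humidity_cool

    # Illustrative saturation values: ~0.027 kg/kg near 30 C, ~0.008 kg/kg near 10 C.
    per_pass = water_moved_per_kg_air(0.027, 0.008)
    print(per_pass)         # ~0.019 kg of fresh water per kg of carrier air
    print(1000 / per_pass)  # kg of air to cycle to recover one metric ton of water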

Private equity and venture capitalists should take note. There is a distinct need for this technology and a rapidly increasing, lucrative market. The economic and societal benefits of cheap, plentiful natural gas cannot be denied. Hydraulic fracturing makes it happen. And hydraulic fracturing requires billions of gallons of water annually which need to be treated and/or disposed of.

Some Thoughts on Telecommuting

By Paul A. Jones

Marissa Mayer, over at Yahoo!, is making a fair amount of news these days, most recently for reversing the company’s longstanding embrace of telecommuting and insisting that employees work at the company’s offices. A lot of people were surprised by the move, either because it seems to go against the idea that the workplace of the future, enabled by modern technology, is better defined by where people are than by where they are supposed to be, or perhaps because it was assumed that Mayer, as a new mom herself, would want to make it easier for working moms by letting them work from home. The tide of opinion seems to be that Mayer’s new policy is at least strange, and likely a mistake.

As a telecommuter myself at various points in my career, including to some extent now, my take is rather more nuanced.  Telecommuting is a great option for some employees at most businesses, even at Yahoo!  But for most businesses and most employees, working in a communal environment offers important advantages that can make or break a business. Done right, the office environment offers two key competitive advantages over the dispersed (i.e. home) working environment: it fosters collaborative thinking and it promotes esprit de corps.

As for facilitating collaboration: as society becomes more complex and interconnected, innovation, the lifeblood of a thriving business, becomes more and more a collaborative process. It is just plain easier to collaborate with people face to face than smartphone to smartphone. As for esprit de corps, its importance is too often overlooked in a culture that more often celebrates the individual than the team. But whether we are talking about sports or business, whether it involves teammates or workmates, other things being equal, more esprit de corps is going to result in more success. Just ask the folks at Google or Apple, or, in my own experience in and around the high-impact entrepreneurial and investing space, the folks at just about any young, high-risk/high-reward startup out to change the world.

To be clear, I am not suggesting that telecommuting is always a bad idea. There are jobs, as there are sports, that are basically individual activities; for example, the proverbial bond trader who plies his trade from his home office in Vermont. There are people who work better without the distractions of a busy workplace; for example, some journalists I’ve met. And there are situations where the fit/synergies between the business and the employee are greater than the real costs of working remotely; for example, some entrepreneurial lawyers I know. But at the end of the day, I still think the best question for Marissa Mayer is, “What took you so long?”


Back to the Future with Watson

By: Paul A. Jones

When I began my career way back in 1985, in Silicon Valley, artificial intelligence (AI) was one of the trendier technology plays. I always had – and still have – a problem with the notion of artificial intelligence, mostly because I don’t think we can really define intelligence. And, in fact, the AI folks in the 1980s pretty much agreed, if only reluctantly, and it did not take long for the more practical among them to scale back their grander talk of machine intelligence to the narrower concept of the “expert system.” Expert systems were basically programs that encoded all of the known (including, in the best cases, probabilistically known) information about a particular field (say, searching for oil), with the idea that they could then answer – to the extent current knowledge included an answer – any question about that field of inquiry.
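To make the expert system idea concrete, here is a minimal sketch in Python of the basic pattern: a hand-coded knowledge base of if-then rules plus a simple forward-chaining loop that applies them until nothing new can be concluded. The rules and fact names are hypothetical, loosely echoing the searching-for-oil example, and are not drawn from any actual 1980s system.

    # A toy "expert system": encoded knowledge plus mechanical inference.
    # Each rule pairs a set of required facts with a conclusion to add.
    RULES = [
        ({"porous_rock", "organic_sediment"}, "possible_reservoir"),
        ({"possible_reservoir", "impermeable_cap_rock"}, "recommend_seismic_survey"),
    ]

    def infer(facts):
        """Forward-chain over RULES until no new conclusions appear."""
        known = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, conclusion in RULES:
                if conditions <= known and conclusion not in known:
                    known.add(conclusion)
                    changed = True
        return known

    # "Ask" the system about a prospect described by observed facts.
    print(infer({"porous_rock", "organic_sediment", "impermeable_cap_rock"}))

Note what the loop does and does not do: it mechanically recombines knowledge its human authors encoded, but nothing in it creates knowledge, which is exactly the manipulate-versus-create distinction at issue here.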

The venture community financed a bunch of expert system startups in the 1980s. They were premised on the idea that while machine sentience – more or less the idea (there is no generally accepted definition of sentience) that a machine could not only “manipulate” (“remember,” if you are of a Platonic bent) but “create” knowledge – was impractical, machine manipulation of knowledge was itself sufficient to create value, if not new knowledge. Alas, the underlying computer technology of the time was on the whole not up to the task of creating commercially exciting machines that could outperform human experts even in relatively narrow fields, and by the early 1990s AI and its less ambitious expert system cousins had pretty much disappeared from the commercial technology landscape.

Which brings us to 2011, and IBM’s Watson, a computer system that more or less handily defeated the two most successful human champions of the popular game show Jeopardy. It was quite an accomplishment, one that some observers suggest heralds a new era of AI research and, in the not too distant future, AI commercialization. But does it?

My own take is that Watson does not herald the re-emergence of AI, but rather the re-emergence of expert systems, this time powerful enough to be commercially useful managers of knowledge. But managers are not creators. Manipulating data, even vast amounts of data, accurately, more or less precisely (Watson made some humorous mistakes, for example suggesting that Toronto was a city in the United States) and blazingly fast, is not the same thing as discovering heretofore unknown information. Watson, in other words, may be able to parse the writings of a Shakespeare, but Watson, at least the Watson I saw playing Jeopardy, did not impress me as being able to write original literature of Shakespearian proportions. Watson is no more (or less, I think) sentient than the best expert systems of the 1980s, which is, finally, to say it is no more intelligent than those systems.

Which is a good thing. Personally, I am glad that Watson heralds a new era of expert systems robust enough to be useful servants of mankind. A sentient Watson, on the other hand, would be a more problematic prospect. How far out that is, now there is a, if not the, question….