January 25, 2005
Radiation detection portals
Some of you might have seen the CBP (Customs and Border Protection) announcement today about the deployment of radiation detection portals at borders. The idea is to interdict trafficking of nuclear materials (among other things) across US borders. If these portals are effective, border agents will have the ability to intercept nuclear weapons as they are brought across the border. This would obviously be a good thing.
How might such a detection system work? I'm going to discuss this in very general terms because I have some misgivings about revealing potentially sensitive information. Consider a uranium bomb with, say, 12 kg of weapons-grade uranium and a tungsten "tamper" that acts as a radiation shield. In a sense, this is a conservative weapons model (derived from Fetter et al.): it is more likely that a terrorist group would use a gun-type bomb, which would require about 50 kg or more of highly-enriched uranium.
Such a bomb would emit neutrons and gamma rays, but the number of emissions observable at a detector may be smaller than the background rate of neutrons/gamma rays coming from cosmic rays, natural radioactivity, etc. So this presents an interesting problem of resolving signal from noise.
How can you make this detection problem easier? One obvious way is to move the detectors closer to the sources. Another is to increase the exposure time. To explain the latter point, consider a source that contributes 20 neutrons/sec at the detector. Suppose the neutron background is 50/second; for a Poisson process the standard deviation equals the square root of the mean, so about 7/second here. Now if you see counts per second of 70, 75, 68, 75, 70..., you might notice a consistent run of roughly 3-sigma excursions and conclude that there is a source in your field of view emitting about 20 neutrons/sec above background. The same logic works for much smaller excursions if you watch over more intervals, because the evidence accumulates: the standard error of the average count falls as the square root of the number of intervals. A string of counts such as 59, 61, 64, 60, 59, 63, 56, 58, 60..., each barely 1-sigma above background on its own, might lead you to conclude that you are seeing a smaller but still definite excess (perhaps 8-9 neutrons/second) above the background. So, given longer exposure times, it is possible to definitively detect weaker sources of radiation.
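The longer-exposure argument can be checked with a quick Monte Carlo sketch. This is illustrative only: the 50/second background and the weak 8-neutron/second source are the numbers from the examples above, and the function names are my own. The key fact is that for Poisson counts the standard error of the average over n one-second intervals shrinks as the square root of n, so a source that is invisible in a single interval becomes obvious over a hundred.

```python
import math
import random

random.seed(42)

BACKGROUND = 50.0   # mean background neutrons/sec at the detector
SOURCE = 8.0        # weak source: barely 1-sigma above background per second

def poisson_sample(lam):
    """Draw one Poisson(lam) variate (Knuth's multiplication method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        p *= random.random()
        k += 1
    return k - 1

def z_score_of_mean(counts, background_mean):
    """How many standard errors the observed mean sits above background.

    For a Poisson process the per-interval variance equals the mean,
    so the standard error of the mean over n intervals is sqrt(mean/n).
    """
    n = len(counts)
    mean = sum(counts) / n
    stderr = math.sqrt(background_mean / n)
    return (mean - background_mean) / stderr

# A single 1-second look: the weak source is buried in the noise.
one_look = [poisson_sample(BACKGROUND + SOURCE)]
# A 100-second exposure of the same weak source.
long_look = [poisson_sample(BACKGROUND + SOURCE) for _ in range(100)]

print(f"1 interval:    z = {z_score_of_mean(one_look, BACKGROUND):.1f}")
print(f"100 intervals: z = {z_score_of_mean(long_look, BACKGROUND):.1f}")
```

Running this, the single-interval z-score bounces around 1 (indistinguishable from an ordinary background fluctuation), while the 100-interval z-score comes out around 11: the same source, just more patience.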
The truck or other vehicle pulls up to or passes through the portal (a few meters wide) at pedestrian speeds (say 5 mph). This provides both proximity and exposure time, aiding detection. Even so, this is a tricky problem, as noted earlier. Further, maximizing detection time is at odds with the goal of increasing throughput by reducing delays.
The above remarks apply primarily to passive detection, which consists of measuring gamma/neutron counts and flagging counts that exceed a specified threshold. There is also active detection, which involves probing the contents of a truck or car with gamma rays or X-rays and using the results to infer the presence of nuclear materials (conceptually similar to taking an X-ray image). This works quite a bit better, but since it is an invasive procedure that irradiates any humans inside the vehicle, it is less popular than passive detection. It might still be feasible at border checkpoints, though, where passengers could be required to step out of the vehicle for the duration of the inspection.
January 24, 2005
Looking for Sales reps in the Northeast
Sorry to interrupt the more "serious" blogging for a commercial but one of my portfolio companies is looking for sales folks in the Northeast and I would welcome any referrals.
Ideal candidates would be folks who have experience in selling software platforms to financial services companies . . . now back to the serious stuff.
January 21, 2005
Homeland Security spending dissected
I just finished reading an interesting paper by Veronique de Rugy of the American Enterprise Institute for Public Policy Research. The paper is titled 'What Does Homeland Security Spending Buy?'. The study looks at the growth of Homeland Security-related spending and where that money is going. It points out some glaring problems in the funding allocation process as well as in the projects that get funded. Pork-barrel politics rather than a sound cost-benefit analysis founded on risk management principles appears to be driving investment decisions:
While the quantity of funds is significant [Total spending directed to Homeland Security activities (est.) for FY 2005 is $47 billion], the funds are not being allocated according to a plan that was devised by security experts... In keeping with the way Washington spreads federal taxpayers' money to the states,... DHS follows in part a formula set by Congress that provides every state with a guaranteed minimum amount of state grants regardless of risk or need. Specifically, the formula written into law by Congress into the Patriot Act after September 11th guaranteed each state 0.75% of the total amount appropriated to DHS for state terrorism preparedness grants... It amounts to 40% of the total pot of money being divided up equally among the states, regardless of size, risk or need... The political formulas used now to allocate the money disconnect the funding from the risk of being attacked.
After this 40% ... is allocated to states, the 60 percent left over is apportioned among states based on population, not on risk.
This spending formula translates to per-capita spending of $5.05 for New York State, compared to $4.69 for California and $35.30 for Wyoming.
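The skew in those per-capita numbers falls mechanically out of the formula. Here is a back-of-the-envelope sketch; the $2 billion grant pot and the 2004-ish population figures are my rough assumptions for illustration, not numbers from the report.

```python
# Sketch of the DHS state-grant formula described above: each state is
# guaranteed 0.75% of the pot, and the remainder is apportioned by
# population. The pot size and populations are illustrative assumptions.

POT = 2.0e9          # assumed total state-grant pot, in dollars
N_STATES = 50
MIN_SHARE = 0.0075   # 0.75% guaranteed minimum per state

US_POP = 293e6       # approximate 2004 US population
populations = {      # approximate 2004 state populations
    "California": 35.9e6,
    "New York": 19.2e6,
    "Wyoming": 0.51e6,
}

# The equal minimums consume 0.75% x 50 states = 37.5% of the pot
# (roughly the 40% cited above, once territories are counted);
# the rest is split by population share.
equal_pot = MIN_SHARE * N_STATES * POT
pop_pot = POT - equal_pot

def per_capita(state):
    """Dollars per resident a state receives under the formula."""
    pop = populations[state]
    grant = MIN_SHARE * POT + pop_pot * pop / US_POP
    return grant / pop

for state in populations:
    print(f"{state:>10s}: ${per_capita(state):.2f} per resident")
```

Even with these rough inputs, the sketch lands within a couple of dollars of the report's per-capita figures, which suggests the small-state skew really is a mechanical consequence of the guaranteed minimum rather than any risk assessment.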
Another interesting/scary tidbit: 95% of TSA's budget for 2005 (which is $5.3 billion) is devoted to aviation security alone (you know, checking people's shoes, removing plastic knives from carry-ons, that sort of stuff).
That would leave only about $265 million (the remaining 5%) for the other stuff - securing the public transportation infrastructure, installing WMD detectors at borders and ports of entry, shipping container security initiatives, etc.
January 18, 2005
The blogosphere is buzzing about the Wiki wars. Matt Marshall has an article in the Mercury News about it. Ross Mayfield felt compelled to write about it and give his point of view on the news. Here is the REAL news about Wikis - CUSTOMERS CARE AND ARE WILLING TO PAY FOR IT!!!!!
At this early stage of the market, both companies should cheer each other on when they land customers, since every win validates the market and moves people from asking "What is a wiki?" to "Which wiki software should I buy?"
So, Ross, forget about trying to explain away Disney's move to JotSpot and rejoice that people are buying wiki software, even if it is not yours.
Better late than never - a short intro
Arun Natarajan blogged recently about our blog.
Arun - thanks for noticing, and for the kind words. However, it's important to clear up any misperceptions: while some of the recent posts on the blog are mine, a quick check of the history will show that Chari has been the more prolific blogger, and his posts are way more popular (at least the weblogs think so ;-)).
Chari, Srikrishna Devabhakthuni (the third blogger here), and I have been best friends since our college days at Caltech. We have all been entrepreneurs - I co-founded a company called Trigo, which was acquired by IBM, while Sri and Chari co-founded Tropos Networks. Sri and Chari still have real jobs, while I switched over to the dark side and became a VC.
Chari and I felt that a blog combining the perspectives of a VC and an entrepreneur would be interesting to folks, and hence the beginning of this journey . . . .
The Stockdale Paradox
The Stockdale Paradox is named after Admiral Jim Stockdale, who was the highest-ranking US military officer imprisoned in Vietnam. He was held in the “Hanoi Hilton” and repeatedly tortured over 8 years. Jim Collins, in Good to Great, describes going to lunch with Stockdale (can you imagine?) and trying to understand how he survived 8 years as a POW while so many others died after just months in captivity.
Here’s how Stockdale put it. “I never lost faith in the end of the story. I never doubted not only that I would get out, but also that I would prevail in the end and turn the experience into the defining event of my life, which, in retrospect, I would not trade.”
“Who didn’t make it out?”
“The optimists. They were the ones who said ‘we’re going to be out by Christmas’. And, Christmas would come and Christmas would go. Then they’d say, ‘We’re going to be out by Easter.’ And Easter would come, and Easter would go. And then Thanksgiving, and then it would be Christmas again. Then they died of a broken heart.”
So, on the one hand, it was about unswerving faith that one will ultimately prevail, while on the other, it was about banishing all false hopes. As usual, the guy who lived it says it best.
“You must never confuse faith that you will prevail in the end – which you can never afford to lose – with the discipline to confront the most brutal facts of your current reality, whatever they might be.”
Holding those two seemingly contradictory notions in his head simultaneously was the key to Stockdale surviving, even thriving, in his experience. And, I believe, it is a perfect summary of the mindset you’ve got to have in starting a company. You have to believe that your vision will come to pass. You’ve got to do everything you can to make it happen. But, you can never let your belief and faith cloud your confrontation with reality.
January 14, 2005
Prisoners of context
One of the elements of my job that I love is the ability to connect and talk with some of the top executives in the technology field. While I like to think it's my cheerful personality that makes that happen, I know that in reality it's driven by my role as a VC.
I have had a chance over the last few months to talk to quite a few senior execs from a variety of software application companies. A constant theme emerges:
- The software business has changed irrevocably and has matured
- The glory days of growth are over
- Technology does not matter; it's all about maintenance revenues and consolidation
- No white space remains in the software landscape
Hearing all their lamentations reminds me of an episode from my salad days, when I had a real job. I was at Microsoft in the mid-90s and was lucky enough to be one of a few Microsofties invited to have dinner with Bill Gates and Mike Maples (they had a program that selected 30-odd employees every month to have dinner with the big cheeses). After dinner, Bill and Mike would run a 30-minute question-and-answer session. I asked Bill: if he were starting his career in the 90s and wanted to create a big company, what would he do? Bill earnestly said (and I felt he was being very honest) that the big opportunities in technology were done and that he would do something in biotech. He felt then (this was 1993) that the software industry had matured and there was not going to be much growth anymore. Mike Maples agreed. Obviously, in retrospect, they were wrong. They missed the Internet. They missed BEA, Siebel, Veritas, Verisign, and countless other B2B software companies that created billions in value.
The moral of the story is that even very smart people can be blinded by the context of their environment. Most senior software execs are living in a tough environment of long sales cycles, even longer implementation cycles, impossible-to-nonexistent upgrade paths, and a services budget of 3-5x the license cost just to implement their software. And they are right - that market is DEAD!!!! Customers don't want that anymore.
What they are not seeing is that the new opportunity is precisely to make and sell software with shorter sales cycles, very fast implementation timelines, easy upgrades and maintenance, and a small services component. The new software companies will do exactly that, and they will eat the legacy players alive, as history has shown many times.
January 06, 2005
End of Meritocracy in America - I think not!!!
The Economist is one of my favorite magazines; I try never to miss an issue. That is why it disappointed me to read their special survey on meritocracy in America. Registration is sometimes required to access the article, so for those who cannot access it, the Economist makes the following points:
- Income inequality has reached levels not seen since the Gilded Age
- The past few presidential races have featured "dynasty" candidates
- Percentage of people who move from the bottom fifth of society to the top fifth has declined
And they end the piece with this conclusion:
In his classic “The Promise of American Life”, Herbert Croly noted that “a democracy, not less than a monarchy or an aristocracy, must recognize political, economic, and social distinctions, but it must also withdraw its consent whenever these discriminations show any tendency to excessive endurance.” So far Americans have been fairly tolerant of economic distinctions. But that tolerance may not last for ever, if the current trend towards “excessive endurance” is not reversed.
I feel they have made quite a few leaps of logic to make this conclusion:
- Income inequality is not by itself an indicator of a lack of meritocracy. If meritocracy means that people are rewarded on the basis of merit, then unless we all have the same merit, we should be rewarded differently. In fact, a rise in income inequality might actually signal more meritocracy, because we have gotten better at rewarding merit and punishing non-merit!!
- Looking at presidential candidates to reach a conclusion is the kind of anecdotal armchair analysis that you don't expect of the Economist. Better statistics to look at are those for congressmen and senators, as well as for city and state seats. In the last 10 years the percentage of women in American politics has gone up considerably. And if you want anecdotal evidence, Louisiana elected two congressmen of immigrant stock in 2004 (one Asian Indian and one Arab American). That's anecdotal progress.
- The percentage of people moving up from the bottom to the top is not meaningful unless we also know something about their ability or merit. Meritocracy does not mean musical chairs. It means society rewards people based on their merit rather than on their race, class, or pedigree.
From a personal standpoint, I find the Economist article very off base. Every week I meet roughly 10 companies looking for money, and more than half of them have been founded by entrepreneurs who are immigrants or folks from modest backgrounds. Of the five companies on whose boards I sit, three were founded by immigrants who came to this country with no money and no connections. I am sure my experience is not dissimilar from that of other venture capitalists.
My dear Economist correspondent, you are wrong in your conclusion, and if you don't believe me, just head down to Silicon Valley and check out some of its startups.
January 04, 2005
Open Source and the Enterprise Infrastructure Stack
One question I keep thinking about is how much of the enterprise infrastructure stack open source will penetrate. People like Marc Fleury at JBoss believe that open source will keep climbing the stack, moving up from application servers to the middleware platform.
One recent article that does a terrific job of capturing the lifecycle of open source development is Craig James's The Care and Feeding of FOSS. He believes that open source projects start really taking off when a product reaches the maturity stage:
With the slow pace of innovation of the Maturity phase, the FOSS community begins to slowly but inexorably erode the technical lead held by the commercial offerings. FOSS versions of the technology may have been present all along, but the pace of innovation during the Expansion phase often left them in the dust. But now, with the technology mature and the pace of innovation slow, FOSS becomes the proverbial turtle, plugging along toward the finish line, slow but unstoppable. Feature by feature, the FOSS developers eat away at the commercial products.
The commercial suppliers are doubly cornered. First, the product is no longer cutting edge, so staffing is reduced and management interest is low. Since there's little innovation, R&D costs are low, which means profits are high. Developers who want to innovate are discouraged, because there's little potential return on investment.
Second, the technology has expanded to the logical boundaries, and additional features are less and less relevant to the core technology. These two factors slow innovation dramatically in the commercial sector.
Sooner or later, the FOSS product not only matches the commercial products feature-for-feature, but the nature of open-source software makes the FOSS product more reliable, higher performance and (where security is a concern) more trusted.
This worldview postulates that commercial software companies will continue to innovate at the cutting edge, while the open source community comes to dominate the more mature technology categories such as operating systems, databases, app servers, etc.
Larry Ellison in an interview that was blogged on Always On has a different take on open source:
The interesting thing is that for an open source product like MySQL to get a lot of traction, they're going to have to walk down the same road that Linux did, which is to get a lot of very large companies to support them. There is this myth that Linux was created and popularized by a bunch of guys who worked by day at hobby shops. Then supposedly they'd go home and program in Linux in their free time. But in fact, the biggest supporters of Linux are businesses like IBM. IBM is not a hobby shop. Oracle, we're not a hobby shop. Hewlett Packard. There are huge companies supporting Linux and the open source movement. MySQL doesn't have that same kind of support behind it. SAP is the first large company to begin to support MySQL, but again, if you compare that landscape to the number of companies that are willing to launch Linux...there are just a bunch more companies supporting Linux. So I don't think you can just paint with a broad brush and say it's "open source versus not open source." It's "open source that has support by the technology industry."
If you believe Ellison's worldview, then open source can only start climbing the stack if there are large technology players behind it. I don't buy that. I am closer in my thinking to Craig James than I am to Larry.
One concern neither of them mentions is legal indemnification. Authorship in open source is by its very nature muddled, and it's often hard to know whether contributors provided original code or polluted it with copyrighted code. Until the indemnification issue is solved, or at least until large companies can stand behind the open source code, most large enterprises are going to feel uncomfortable taking the plunge.
January 03, 2005
IBM and the App Business
Bill Burnham has an interesting blog post on whether IBM should get into the applications business. He covers the issues IBM faces on the infrastructure front and speculates that IBM might just let its software margins wither:
On the other hand, IBM may simply fight a rear guard action and be content to gradually let its software growth and margins decline as software isn't really its core business now anyway. From this perspective, IBM will accept gradually lower margins and growth in its software business as simply the price to pay for keeping its services and hardware businesses in the black. After all, IBM had its chance to get into the applications market by buying Peoplesoft, and instead of playing the white knight (which Peoplesoft clearly desperately wanted IBM to do), it just stood by and let Oracle acquire Peoplesoft without batting an eye.
I have thought about this issue a lot, and I actually think IBM made a smart move by not playing PSFT's white knight. As tempting as it might have seemed to Steve Mills and his team, PSFT's license business is struggling, and they are mainly staying afloat on their maintenance revenue stream. The core ERP market has both the wrong software delivery model (behind the firewall, long implementation cycles, huge customization) and the wrong business strategy of trying to sell big-bang enterprise licenses. Neither is how customers want to buy software.
I think the future comprises composite apps built by combining web services, and IBM should focus on building the infrastructure that powers that future. They might also consider buying someone like JBoss, offering that as a low-end app server, and positioning WebSphere as their high-end middleware platform.