This is an essay inspired by Dave Winer's call for comment on the shape of the next technological revolution for the Seybold 2001 conference. I don't think I'm really saying anything new here, but I feel that as technologists (or geeks!) we are as guilty as everyone else of failing to see the wood for the trees. It's so easy to get wrapped up in tracking Microsoft, or arguing over the most elegant way to design a network API, that we lose sight of the big picture. Much of this is just a recapitulation of trends that others have seen, but I do offer some implications and technologies at the end. The cynic is welcome to dismiss it all with "that's so five minutes ago". It's still true nonetheless.

Exponential Laws
We all know these, but think about them again.

  • Processing power - Moore's Law - Processing power (and memory and anything else dependent on silicon) doubles every 18 months

  • Bandwidth - Gilder's Law - Bandwidth grows three times faster than processing power (or Moore's law).

  • Storage space - (un-attributed?) - The rate of growth in storage space is accelerating. It's currently doubling every 12 months.

  • Network Value - Metcalfe's Law - The value of a network is proportional to the square of the number of endpoints. Some commentators argue that it increases faster than this: not N^2 but 2^N or even N^N.

Rick Rashid, the director of Microsoft's research division, suggests, "Whenever you have a really large quantitative change, you begin to see a really large qualitative change in the way people think about things. We're not even going to be able to guess exactly what the applications are going to be two or three years from now, but we know it's going to change."

As humans we are very bad at coping mentally with exponential growth. We expect everything to stay pretty much the same from day to day. But in exponential growth, the change looks small right up until the last doubling, when it is vast. Suddenly the landscape has changed completely. This has gross effects on our ability to plan: if we don't see that final change coming, we are very likely to under-react in the days before it and over-react as it hits us. For a real-world example, we've spent the last 20 years with not enough disk space. Suddenly the PC you buy this year has more space than you will use in the lifetime of the machine. This is extremely significant, because so much of our design is based on scarcity thinking: centralize and hoard resources because they are precious.
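
As a rough, back-of-the-envelope illustration of how these curves compound, here is a small Python sketch. The doubling periods are the commonly quoted figures from the laws above; the time horizons and endpoint counts are purely illustrative.

    # Back-of-the-envelope growth under the exponential laws above.
    # Doubling periods are the commonly quoted figures; everything else is illustrative.

    def growth_factor(years, months_per_doubling):
        """How many times bigger after `years`, doubling every `months_per_doubling` months."""
        return 2 ** (years * 12 / months_per_doubling)

    for years in (1, 3, 5, 10):
        cpu = growth_factor(years, 18)    # Moore's Law: ~18 months per doubling
        disk = growth_factor(years, 12)   # storage: ~12 months per doubling
        print(f"{years:2d} years: CPU x{cpu:7.1f}   disk x{disk:8.1f}")

    # Metcalfe-style network value as the number of endpoints N grows:
    for n in (10, 100, 1000):
        print(f"N={n:5d}: N^2 = {n ** 2:>9,}   (2^N and N^N grow faster still)")

After ten years that is roughly a hundredfold increase in processing power but a thousandfold increase in storage, which is exactly the sort of gap that scarcity-era assumptions were never designed for.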

Ubiquitous, always-on, high-bandwidth connection
Those of us who work in the industry forget what happened when we got always-on internet in the office and no longer had to use dial-up modems. It changes the way you think about the Internet and how you use it: you no longer keep local copies of documentation, for instance. Then there's another change we take for granted when access is not just always on but high-bandwidth as well: there ceases to be much difference between local resources and remote resources. In the last 12 months we've added an extra twist to this, which is wireless. It's now feasible to make this always-on, high-bandwidth access completely ubiquitous. Your Handspring is a peer to your laptop and a peer to every other endpoint on the net, along with your digital camera, your printer and any other semi-intelligent device you have.

If we can just sort out the last mile problem and make this sort of access cheap, then everyone can potentially have the access that we've become used to. How does it change the use of the internet if this level of access is really democratized and available throughout society instead of just the computing elite?

Price
There are a number of factors that are used to maintain prices. Chief among these is the cycle of continual obsolescence and continual upgrades. We accept an unchanging price of roughly $1,000-$2,000 for a PC; we just get twice as much for our money each year. But we also accept a bundle of software licenses in that price. If we go back to last year's model and remove the license costs, then a fully functional PC could be built for under $500, and possibly even under $200. Several third-world efforts are underway to do exactly that and flood their countries with these "Simputers". Even in the West, there may well be a market for such a device.

And when I say fully functional, I mean it. It may not play the latest games very well, but it can do everything else. This is exactly the by-product of all those exponential laws: there comes a point when last year's model is "good enough". This year's model has twice as much, but we don't actually need the extra half to do useful work.

Usability
The last big factor I want to look at is usability. As geeks, we're occasionally annoyed and frustrated by awkward installs, inadequate documentation, and obscure .conf files. But we also see them (perhaps unconsciously) as an intellectual challenge. Microsoft and Apple, however, have made great strides towards hiding all that complexity, and have allowed the masses to join the game as well. We must not lose sight of this if we're going to play our part in building technology revolutions.

Some conclusions
So. We have all these factors pushing us towards ubiquitous, cheap, local processing devices with plentiful local storage, universally connected via wireless, always-on, high-bandwidth links. What are we going to do with it? Well, the first thing to do is to push function away from the center and out to the edge where all this power is located. Decentralization. Make every device a server as well as a client. This poses some huge conceptual problems, because all of our thinking so far has been driven by the combination of scarcity thinking (hoard resources) and aggregating power (the only way to generate revenue).
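
To make "every device a server as well as a client" concrete, here is a deliberately toy sketch using nothing but the Python standard library. The port number and the peer address are invented for the example; the point is simply that one process can do both jobs at once.

    # A single process that both serves content and consumes it from its peers.
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    LOCAL_PORT = 8000                                        # assumption: any free port
    PEERS = [f"http://127.0.0.1:{LOCAL_PORT}/",              # ourselves, to show it works
             "http://192.168.1.23:8000/"]                    # hypothetical other edge device

    class PeerHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Serve a scrap of locally held content to whoever asks.
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"hello from the edge\n")

    # The server half runs in the background...
    threading.Thread(
        target=HTTPServer(("", LOCAL_PORT), PeerHandler).serve_forever,
        daemon=True,
    ).start()

    # ...while the same process acts as a client of its peers.
    for peer in PEERS:
        try:
            print(peer, "->", urlopen(peer, timeout=2).read().decode().strip())
        except OSError as err:
            print(peer, "unreachable:", err)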

If this is the landscape we're facing in the near future, we should be looking for technologies that facilitate its use. We should be focussing on ideas that generate gnarly emergent behaviour from the vast numbers of connected servers: auto-discovery of edge resources; instant integration between my program and yours; massively distributed databases; massively distributed publishing; automated local mirroring; de-centralized search.
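
As one sketch of what auto-discovery of edge resources might look like, here is a rough illustration built on a plain UDP broadcast. It is not any particular protocol (the real contenders of the day are things like Jini and UPnP), and the port number and message format are invented:

    # Toy service discovery: every device answers a broadcast probe with a short
    # description of what it offers. Port and message format are invented here.
    import socket

    PROBE_PORT = 50000
    PROBE = b"WHO-IS-OUT-THERE"

    def announce_forever(description: bytes):
        """Run on each device: reply to probes with what this node offers."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", PROBE_PORT))
        while True:
            data, sender = sock.recvfrom(1024)
            if data == PROBE:
                sock.sendto(description, sender)

    def discover(timeout=2.0):
        """Broadcast a probe and collect whatever answers within the timeout."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.settimeout(timeout)
        sock.sendto(PROBE, ("<broadcast>", PROBE_PORT))
        found = []
        try:
            while True:
                data, sender = sock.recvfrom(1024)
                found.append((sender[0], data.decode()))
        except socket.timeout:
            pass
        return found

    # e.g. announce_forever(b"printer: laser, duplex") on one device,
    #      print(discover()) on another.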

Two Applications
To finish, I'll toss in a couple of potential applications. The first was considered by the Media Lab 20 years ago: personalization of news, or "The Daily Me". All of us are consumers of news. The question is what form this should take when the generators of news and content are all online and effectively every network node is a generator of content. Even if they are not a primary source, they are potentially a secondary source creating "Letters to the Editor". We are on the verge of having all this content available in a machine-readable form, so how are we going to collate and present it in a form that is digestible, easy to use and tailored to the individual reader?
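
One deliberately crude sketch of the plumbing: pull a handful of machine-readable feeds and rank the items against a personal interest profile. The feed URLs, the keyword weights and the assumption of plain RSS (an <item> with a <title> and a <link>) are all illustrative.

    # A crude "Daily Me": fetch a few feeds and rank items by personal interest.
    from urllib.request import urlopen
    from xml.etree import ElementTree

    FEEDS = ["http://example.com/news.rss", "http://example.org/tech.rss"]  # hypothetical
    INTERESTS = {"wireless": 3, "storage": 2, "decentralized": 5}           # my profile

    def fetch_items(url):
        tree = ElementTree.parse(urlopen(url, timeout=5))
        for item in tree.iter("item"):
            yield item.findtext("title", default=""), item.findtext("link", default="")

    def score(title):
        text = title.lower()
        return sum(weight for word, weight in INTERESTS.items() if word in text)

    ranked = []
    for url in FEEDS:
        try:
            ranked.extend((score(title), title, link) for title, link in fetch_items(url))
        except OSError:
            pass  # an unreachable source simply drops out of today's edition

    for s, title, link in sorted(ranked, reverse=True)[:20]:
        if s > 0:
            print(f"[{s}] {title}  {link}")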

The second is the implication for small and medium-sized businesses. So far, online business transactions have been driven and controlled by big business, with big-business price tags. The same trends that will revolutionize consumer use of the network will also revolutionize SME use. It's perfectly feasible now for a small business to build a shopping cart system on the web and take orders online. But if we shift this out to the edge, to their own premises, we can tightly integrate it with their own general ledger and stock control. A packaged, easy-to-install solution for online SCM and CRM, combined with the classic back-office functions, levels the playing field. It would allow us to build a web of commerce to go with the web of content.
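
A toy sketch of the kind of integration that becomes easy once the shop runs at the business's own edge: taking an order updates stock control and posts double-entry records to the general ledger in one place. The item names, prices and account names are all made up.

    # Toy edge-hosted shop: one order updates stock and the ledger together.
    stock = {"widget": 40, "gizmo": 12}            # on-hand quantities
    prices = {"widget": 4.99, "gizmo": 19.50}
    ledger = []                                    # (account, debit, credit) postings

    def take_order(item, qty):
        if stock.get(item, 0) < qty:
            raise ValueError(f"not enough {item} in stock")
        stock[item] -= qty                         # stock control updated immediately
        total = round(prices[item] * qty, 2)
        ledger.append(("accounts receivable", total, 0.0))
        ledger.append(("sales revenue", 0.0, total))   # double-entry posting
        return total

    take_order("widget", 3)
    print(stock)
    print(ledger)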

The Lesson
If there's any single lesson in all this, it is perhaps that the near future is not going to be "more of the same". The quantitative changes are happening so fast that a qualitative change is inevitable.