Science, technology & design
In the early 1960s, the science fiction writer Arthur C. Clarke formulated three ‘laws’ for prediction, the first of which was that: ‘When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong’. How true. Scientists, like futurists, have a rather chequered history when it comes to prediction. Notable failures include Malthus’s* prophecy that overpopulation would lead to the starvation of vast swathes of people (he was right with his population forecasts but got scientific developments around agriculture totally wrong). Nearly two centuries later, the ecologist Paul Ehrlich got it wrong again by saying: ‘In the 1970s and 1980s hundreds of millions of people will starve to death’. We’ve also been oversold on nuclear annihilation, global pandemics, Y2K and perhaps AIDS (although not in parts of Asia or Africa). So why is it that scientific prediction is so inaccurate? One answer is that predicting anything, especially anything set in the distant future, is damn hard. Unexpected inventions, discoveries and events keep tripping things up. Nevertheless, you’d think that scientists, of all people, might be better at it than most. Sadly not.
Scientists are under pressure from government and media alike to make simplistic statements that can subsequently be taken out of context. Moreover, the way that science funding works means that scientists are under pressure to make promises, especially promises that link to economic benefit. No promises, no funding. And this, in a nutshell, is the problem. Making predictions in private is one thing, but the moment predictions enter public consciousness they start to influence real-world research and policy. Moreover, predictions invite reaction that runs counter to the reflective deliberation and logic of science. In one way, you can argue that none of this is science’s fault. However, when scientists make predictions that are proven to be way off the mark, the public becomes cynical about science in general and tends to lose interest in future predictions. A good example might be Martin Rees, the President of the Royal Society in the UK. In his book Our Final Hour: A Scientist’s Warning, Rees claims (predicts) that ‘the odds are no better than 50% that our present civilisation on Earth will survive the end of the present century’. Big prediction. It is also, to a large extent, a prediction that falls outside his area of expertise (astronomy) and, while he claims that this is his personal view (not that of the Royal Society), the connection is already made by many people.
Ref: The Australian (Aus), 14-15 November 2009, ‘Science takes its best and worst guess on the future’, S. Blackman
Source integrity: ****
Search words: Prediction, future, science
Trend tags: Anxiety
* OK, he wasn't a scientist.
Scientists in Japan are developing a number of paper-thin screens that are bendable and even stretchable. So what? Well, if they become cheap enough, I’d expect screens and ambient devices to start appearing everywhere in the future. We will cover office desks, office walls, restaurant tables, fridges, sofas, clothes, cars and even whole buildings with screens, and eventually our interface will become the entire world. This will, in turn, create new businesses, new business models and new societal attitudes and behaviours. We will be able to talk to these screens and they will answer back with geo-coded information in real time. At least that’s the theory. At the moment the technology doesn’t quite work, but it might. Currently, there are two broad ways of making interactive or ambient devices. The first way is to use printing technology to add various electronic elements to surfaces ranging from paper and plastic to glass. Companies such as Seiko Epson and Ricoh are already investigating this technology. The second way is to weave together electronic components (dots, effectively) into, on or around other objects. When can we expect to see this happen? Give it about five to 15 years.
Ref: Nikkei Weekly (Japan), 17 August 2009, ‘Technological advances promise thin, bendable electronic devices’, M. Ito. www.nni.nikkei.co.jp/
Source integrity: ****
Search words: Screens, displays
Trend tags: -
The idea of controlling a machine just by thinking about it has been the stuff of science fiction for ages. It’s also science fact. Moving a cursor around a computer screen via thought control has been possible (in research laboratories) for years.
And you can now buy a toy that works on brain waves – the Star Wars Force Trainer – for under $200. Why would people want to do such a thing? One reason would be gaming. Virtual worlds and virtual wars would feel more real. Another, more serious, application is for the disabled or paralysed. Linking thought control to artificial limbs or wheelchairs could transform the quality of life for many people. However, don’t get your hopes up too much just yet. Placing electrodes on the scalp (e.g. via a headset) works up to a point, but a much better method is to implant the electrodes inside the head: the signal becomes much clearer and the instructions much more precise. Where will such technology go in the future? The answer is anyone’s guess, but it’s interesting to speculate what could happen when you blend brain–machine interfaces, gesture-based computing, verbal search and artificial intelligence.
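To make the idea concrete, here is a deliberately toy sketch of how a scalp-electrode device might turn brain-wave activity into a command. Everything here is an illustrative assumption rather than anything described in the article: the sampling rate, the 8–12 Hz ‘alpha’ band, and the threshold are all invented for the example.

```python
import numpy as np

# Toy sketch (assumed values throughout): classify a window of
# EEG-style samples as a 'move' or 'rest' command based on how much
# power sits in the 8-12 Hz alpha band.

FS = 256  # samples per second (assumed sampling rate)

def alpha_power(signal):
    """Mean spectral power in the 8-12 Hz (alpha) band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

def command(signal, threshold=1000.0):
    """Crude thought-control rule: strong alpha -> 'move', else 'rest'."""
    return "move" if alpha_power(signal) > threshold else "rest"

# One second of synthetic data: a strong and a weak 10 Hz oscillation.
t = np.arange(FS) / FS
strong = 5.0 * np.sin(2 * np.pi * 10 * t)
weak = 0.1 * np.sin(2 * np.pi * 10 * t)
print(command(strong), command(weak))  # prints: move rest
```

Real systems are far noisier than this, which is the article’s point about why implanted electrodes give clearer signals than scalp ones.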
Ref: The Economist (UK), 13 March 2010, ‘Connecting to the brain: Thinking about it’, www.economist.com
Source integrity: *****
Search words: The brain, interfaces, intelligence, thinking
Trend tags: -
Are you sick of the term ‘cloud computing’? You are not alone. The term ‘cloud fatigue’ has already entered the Silicon Valley vocabulary. Nevertheless, the idea of internet-based computing as a utility is not about to disappear any time soon.
The parallel here is electricity. Once energy from different generators could be aggregated and distributed over a grid (using something called a rotary converter), energy could be traded as a commodity. With computing, the first shift was from mainframe to client–server in the 1980s, but virtualisation was the real paradigm buster. Virtualisation allows the physical separation of software from hardware, so that it’s possible to access computing power from anywhere.
Once this happens it’s also possible to buy computing power on demand (with real-time pricing). This gives rise to ‘cloud brokers’ (firms that help customers switch between cloud suppliers). If this is all too confusing, the best way to explain it is by way of an example. Amazon, the online retailer, has a subsidiary called Amazon Web Services that auctions off unwanted (unused) Amazon computing power in real time. The cost of using the capacity depends on demand and will rise and fall much like the price of oil or gold. Some people say that this is a model for how computing will evolve in the future. Computing will develop in much the same way that energy or financial markets have evolved, which means arbitrage, derivatives and hedging. Others argue that legal barriers will prevent such a move. In Europe, for instance, it is illegal to export some kinds of data. Moreover, it is hardly in the interest of most firms to have customers zipping between suppliers. Rather, they will seek to lock in customers and sell them additional, higher-value services. Perhaps what we’ll see, then, is a hybrid model involving a mixture of public and private clouds and a blend of free markets and walled relationships.
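The commodity-market model above can be sketched in a few lines. This is a minimal illustration, not how Amazon Web Services actually prices capacity: the supplier names, base prices and the demand-driven pricing rule are all assumptions invented for the example.

```python
# Toy sketch of demand-driven spot pricing plus a 'cloud broker'.
# All prices, names and the pricing formula are illustrative only.

def spot_price(base_price, demand, capacity):
    """Price rises and falls with utilisation, like a traded commodity.
    Here the price doubles as utilisation approaches full capacity."""
    utilisation = min(demand / capacity, 1.0)
    return round(base_price * (1.0 + utilisation), 4)

def cheapest_supplier(quotes):
    """A 'cloud broker': pick the supplier offering the lowest price."""
    return min(quotes, key=quotes.get)

# Hypothetical suppliers: A is cheaper at idle but currently busy.
quotes = {
    "SupplierA": spot_price(0.10, demand=800, capacity=1000),  # 0.18
    "SupplierB": spot_price(0.12, demand=200, capacity=1000),  # 0.144
}
print(cheapest_supplier(quotes))  # prints: SupplierB
```

The point of the sketch is the broker function: once prices float in real time, switching between clouds becomes a simple optimisation, which is exactly why incumbent suppliers would rather lock customers in.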
Ref: The Economist (UK), 13 March 2010, ‘Clouds Under the Hammer’, www.economist.com
Source integrity: *****
Search words: Cloud computing, internet
Trend tags: Internet
Can Apple’s tablet perform a miracle?
Apple’s new iPad, referred to by some as Apple’s ‘Jesus tablet’, has been a long time coming. Its lineage goes back to Apple’s launch of the Newton in 1993. The Newton failed, as have most other tablet computers since. So what makes the iPad different? The answer is timing. Most early tablets were expensive and had dreadful interfaces and limited capabilities. But developments in battery technology, visual displays and micro-processing have enabled Apple to do what it always does, which is re-invent and improve an old idea. Add some design and marketing smarts and you have a potential winner.
In the US the iPad costs $499 and up, which is as cheap as silicon chips. The device, essentially an iPhone on steroids, is also one of the world’s first truly convergent devices. As a result the iPad has got the computing industry, the telecom industry and parts of the media industry very excited indeed. Is the hype justified? Time will tell, although early indications are good. The iPad will undoubtedly get more people reading books, newspapers and magazines on screens and it could have an impact on video watching and game playing too.
There’s also the possibility that some people will start using an iPad instead of a laptop. This could mean an erosion of MacBook sales, but a more likely scenario is that the iPad will eat into the $11.4 billion netbook market instead. So what’s next? If Apple’s history is any guide to its future, we can expect the iPad to unleash a torrent of competition, and content creators will start knocking on Apple’s door. As for media companies, it’s a mixed blessing. The stronger newspaper companies will probably get a boost and may even be able to persuade people to pay for online content.
However, this could be the final nail in the coffin for the weaker papers, especially mid-ranking metropolitan newspapers. As for books, it could be good news in that the power of Amazon (maker of the Kindle) could be reduced, although margins are likely to suffer all round. As for predictions that all media content will end up being digital thanks to this and other devices like it, I think this is still a bit far-fetched. Nevertheless, devices like the iPad will certainly help revolutionise how people consume media in offices, homes and schools around the globe and will accentuate the existing shift away from paper to pixels. I’m not saying this is a good thing, but it is something to cogitate about.
Ref: The Economist (UK), 30 January 2010, ‘The book of Jobs’, www.economist.com
Source integrity: *****
Search words: iPad, Apple, e-books, tablets
Trend tags: Virtualisation, digitalisation