Wednesday, May 21, 2008

One Last Post...

One of the more annoying aspects of the NewsCloud is its tendency to simplify, then exaggerate.

Take science vs. religion.

The mere fact that you have some idea of what I'm talking about is disgusting. How can such a puerile expression serve as a useful mental hook for something so complex and profound (and so misunderstood) that it escapes almost all serious consideration?

In the last 24 hours the British parliament has been debating and voting on a series of issues associated with abortion (should the last date at which an abortion can happen be lowered from 24 weeks), hybrid embryo research, and saviour siblings.

Some tasteful sketches of a foetus from Leonardo da Vinci

All these are important issues. I won't comment on them because there are facets of the debate (particularly those associated with abortion) which I simply don't know about.

These issues are important and should be treated as such. This is not, nor has it ever been, about "science vs. religion."

That expression "S vs. R" begs so many questions it is almost impossible to dissect it without falling into the trap of dignifying the debate, but I will attempt it.

Science is a tool. It is a way of thinking about, and observing the world. Empirical evidence (input from our sensory apparatus, both biological and artificial) and recorded data acquired through empirical means are considered.

Once these have been considered, scientists think of ways to explain any patterns, or lack of pattern, in the data. They create a hypothesis.

They then attempt to disprove the hypothesis, devising experiments intended to do exactly that. If a hypothesis stands up to this treatment, and the work of one scientist is corroborated by the work of many others, then the hypothesis is accepted as a scientific theory.

Yes, I know there's more to it than that. Check out this document on the scientific method for more details on the subject.
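The hypothesise-then-try-to-falsify loop described above can be sketched in code. Everything here is invented for illustration — the toy claim, the simulated "measurements", and the trial counts stand in for real empirical work:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def hypothesis(x):
    """Toy claim under test: every measurement is positive."""
    return x > 0

def run_experiment(n=100):
    """Simulated measurements; real science would use empirical data."""
    return [random.gauss(10, 2) for _ in range(n)]

def attempt_to_falsify(claim, trials=50):
    """Run many experiments, actively hunting for a counterexample."""
    for _ in range(trials):
        for observation in run_experiment():
            if not claim(observation):
                return observation  # counterexample found: claim rejected
    return None  # survived every attempt: provisionally accepted, never proven

counterexample = attempt_to_falsify(hypothesis)
print("falsified" if counterexample is not None else "survived (for now)")
```

Note the asymmetry: a single counterexample kills the hypothesis, but surviving every attempt only earns provisional acceptance, never proof.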

Religion is many things to many people. To me it is yet another system of control, external to myself.

It is also a panacea in moments of weakness. It is a crutch and it is community. Sometimes it is good and sometimes it is bad. Sometimes it is right. Usually it is wrong.

I do not judge those who have faith. I know what faith is. It is like a powerful drug, and it can make difficult things ... less difficult. I have had faith.

It's glowing! It must be SCIENCE!

However I am entirely within my rights to call anyone who believes in the afterlife a fool and anyone who thinks the universe is run by some dude with a beard who isn't a science fiction writer of some sort (if God can't be a science fiction writer, or is not a full-time [and published] science fiction writer I quit - god is wrong and it is immoral to have faith ;-)) an idiot.

Anyway I despise how this argument is corrupted and dragged through the dirt by slavering hacks wanting to churn out copy on a "controversial" debate.

Addendum: I am entirely aware that this article is without evidence, empirical or otherwise. It is also fairly badly written. So sue me.

Business and Capitalism

In recent months there has been an extensive debate in the NewsCloud (I'm fed up with talking about newspapers, media, the press, the blogosphere - the NewsCloud will suffice) about capitalism; where it is going, where it is now, and how it got here.

Two articles in the Cloud today highlight two different issues:

1) Luke Johnson writing in the FT comments:

"Innovation and progress come from embracing markets and encouraging entrepreneurs. The world is more competitive than ever; we cannot rely on old industries and the state to maintain our standard of living."

I happen to agree with this. When commentators go on about how awful the credit crunch is and how evil all these usurious capitalists are for dragging us into this mess, they always fall foul of the fact that they have no coherent alternative strategy.

I also agree with Peregrine Worsthorne that a squeeze on the financial industry might lead to an egress of talent away from finance and towards more useful things like medicine, pharmaceutical research, and entrepreneurship.

Johnson goes on to say:

"Markets are naturally dynamic, whereas governments resist change and fresh thinking. According to the Global Entrepreneurship Monitor, overall early-stage entrepreneur activity in Britain involves about 5.6 per cent of the population, a much lower rate than in the US, Brazil or China."

An Entrepreneur

A nation of shopkeepers? I think not. However Johnson makes the point that:

"A slowdown in the economy and rising unemployment might just stimulate more to start their own business as an alternative. This would be the silver lining of the credit crunch cloud."

Although the UK is not openly hostile towards entrepreneurs, they are not afforded the same respect as accountants, physicians, architects, or academics. Johnson describes entrepreneurship as just as much a calling as these respected professions, but (partly because of our confused and irritating emphasis on class) in the UK "entrepreneur" is not listed on the job sheet.

2) The second article is from Edward Pearce in The Guardian:

"Modern capitalism has become etiolated. It has flourished lately upon deals ever more remote from raising capital investment for steel mills and biscuit factories, upon leverage and derivatives, upon credit and the ghost of credit, upon financial rice paper."

Speculation seems to be endemic to capitalism. Fortunately all this credit crunch nonsense seems to be having a negligible effect on actual global economic growth. China makes things.

From a science fictional perspective there is something reassuring about this. Times change, technologies change, but wherever there are financial markets there are speculative bubbles, and crashes and crunches.

The two ends of capitalism: the rarefied ivory tower of derivatives of derivatives (George Soros et al.) and the coalface of business and wealth-creation (Felix Dennis, Richard Branson), with the in-betweeners, capital allocators like Warren Buffett, in the middle.

How it worked in the good old days

The whole wagon will continue rushing into the future. If it all breaks down completely (a situation where "end of the world" insurance would come into play, from Pearce:

"The existence of such manic trade created secondary explosions (or do I mean secondary deposits?) in the insurance world. Here the rule is the greater the likelihood of damage, the higher the premium. But the least probable horrors may be insured against at modest cost. The top point is called "end of the world" insurance, the unthinkable: Hugo Chávez takes over the White House, the moon coming perceptibly nearer. It's so remote it's cheap, $2,000-$3,000 a year rents $10m worth. Or it did. That volume now sets you back $20,000-$30,000."

I know! WTH?) then at least capitalism, or at least the concept of trade, will survive.
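Pearce's numbers do invite a bit of back-of-envelope arithmetic. The ratio of premium to payout is, very roughly (ignoring the insurer's expenses, profit margin, and the time value of money), the annual probability of the event at which the cover breaks even. The midpoint premiums below are my own reading of the quoted ranges:

```python
# Rough reading of the quoted "end of the world" insurance figures.
def implied_annual_probability(premium, payout):
    """Break-even probability: the insurer loses money if the event is
    more likely than premium/payout (ignoring costs and margin)."""
    return premium / payout

payout = 10_000_000   # $10m of cover, as quoted

old_premium = 2_500   # midpoint of the quoted $2,000-$3,000
new_premium = 25_000  # midpoint of the quoted $20,000-$30,000

print(f"Old implied probability: {implied_annual_probability(old_premium, payout):.4%}")
print(f"New implied probability: {implied_annual_probability(new_premium, payout):.4%}")
```

On this crude reading, the market's estimate of the unthinkable rose tenfold, from roughly 0.025% to 0.25% a year.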

A Self-Hating Pedant

...or should that be "A Pedant Who Hates Himself" or "A Pedant Which Hates Himself Because He Is A Pedant" or "A Pedant That Hates Himself, Due To His Pedantry."


I am a pedant. I am not especially articulate, and I am not especially critical of others in most circumstances. However I have a verbal tic.

Every time someone is grammatically incorrect in speech, or mispronounces a word, I will respond with a correction.

Sometimes I manage to bite my tongue and get away with just thinking the criticism very loudly.

I am aware that this is annoying and boorish, and I can also bring to mind several occasions when it has landed me in little social faux pas.

It is an artifact of my upbringing (and probably one that will, on balance, do more good than harm for me over the course of my life). One of my particular annoyances is when I want to use a word and suddenly realise that although I know perfectly well what it means and how it is spelt I don't know how to pronounce it.


What? Exactly! Is the "g" like the "j" in "just" or is it like the "g" in "grandma?"

Thank goodness for Wikipedia and the phonetic alphabet.

Anyway Marcel Berlins has written a stock journalistic article: "let's do something really straightforward and easy to make the world a better place."

A long time ago USAmericans, Canadians, and Australians (and New Zealanders, possibly) rationalised their versions of English by pronouncing clerk as "clerk" rather than clerk as in "Clarke" (as in Arthur C...). They also changed the spelling of "colour" to "color" and did a whole load of other sensible things.

But in the UK these words remain irrationally pronounced and spelt.

The reason for this is that there is a very strong vein of illogical, bloody-minded stupidity in the British (the English, in particular)...

[ouch! my future self just dropped a few points in the speculative polls or whatever the hell the media uses to cripple the democratic process 20 years hence ... don't worry Future Self, you'd never make it as a Tory (you went to comprehensive school for gawd's sake). Go and try to get elected in Scotland. Bashing the English would probably win you some votes there. Go squander what remains of the oil money...]

...that results in things like this (crappy video link, SSM).

It also results in the sort of people whose sense of morality is based around the sort of trash Melanie Phillips writes in the Daily Hate Mail (she's only doing it because she gets paid more as a "right wing" blowhard than a "left wing" blowhard --- and more power to her for it!) getting shirty because something profoundly "British" like inches, pounds, ounces, and pronouncing ghoti "fish" (Google it or read Berlins) is being "attacked" by meddling bureaucrats from Brussels.

None of that was actually very clear, was it?

Essentially a key component of Britishness is doing something stupidly perverse just because you've always done it like that.

Beyond the point of being funny or endearing.


Also: the first two comments on that Marcel Berlins article have a rather lovely bit of pedantry...

A Commentary on Commentaries

At any given time there is a smattering of articles in the dead tree press, blogs, websites, and magazines worthy of perusal by anyone with a healthy interest in what is said about what goes on in the world.

Collected here are a few items that I feel are worthy of comment (I'm going to have one post per article, 'cause it's easier that way).

Privacy and social networking are two key components of the zeitgeist of social debate in the first decade of the 21st century. Zoe Williams, writing in The Guardian, writes of teenagers and online exhibitionism:

"...trying to inculcate discretion at a time when everybody is seeking exposure is like teaching abstinence at a time when all they want to do is have sex. Never mind the rights and wrongs of it, it doesn't work..."

There is no doubt that adolescence is a time when children are emotionally crippled by their own biology until they emerge, as if from a chrysalis, into the neurotic grab-bag of talents, proclivities, and questionable ethics that makes up what passes for a fully-functioning adult and denizen of the 21st century (that's an awful sentence, on two levels, but I will keep it because I enjoyed writing it - damn it!). However. I don't think teenagers are necessarily stupid.

This brings us on to the next key point in Williams' article. Something that has already occurred to most journos and commentators is that all this rubbish that is stuck up on social networking websites will (theoretically) still be there in the year 2020, when yours truly might be thinking of running for election to political office.

What's to be done? Williams suggests:

"...that 15 years hence, people won't need to be protected from their past excesses, because the very fact that this is a universal impulse that social-networking sites merely cater to, will mean that tomorrow's politicians will all have as many skeletons in their closets as one another. In fact, if you don't have a YouTube video from when you were 16, dancing to Britney Spears's Toxic, then it'll be as much an impediment to your public approval rating as being single is today."

This point is well made. I will now smatter this blog with spelling mistakes and grammatical errors, safe in the knowledge that people will draw from this the conclusion that I am "genuine" and "honest about my mistakes."

However they could also conclude that I am too computer-illiterate to spellcheck my post!

[However if Ray Kurzweil is right, by 2020 the computers will have taken over in an event already being labelled as "the technological singularity" - if I'm campaigning on a pro-singularity ticket my spelling mistakes will be interpreted as an early and tacit recognition of the need to augment my feeble human intellect with a Mighty Processor. On the other hand if I'm going to campaign on an anti-singularity platform my PC-illiteracy will be seen as being evidence of my inherent suspicions of technology.]

The agony of indecision! I feel the way the press says Gordon Brown must be feeling.

I don't owe the person I will become anything. I would vote for him, but only after a close examination of the policies he supports on a variety of issues and the relative positions of his opponents.

In conclusion if, by 2020, we're still going on and on about politicians' personalities as if they mattered a gnat's shite then Dog help us, Dog help us all.

Wednesday, May 07, 2008

And You're Only Just Realising This?

There comes a point in every man's life when he realises that he is almost certainly never going to create one of the fundamental circuit elements of electronics.

Also: one of the things that struck me as odd about the recent discovery/invention was how old-fashioned a discovery it seems.

We are no longer used to "fundamental" breakthroughs in areas other than the biological sciences, as Charles Stross comments in this interview:

"We seem these days to be seeing new ground-breaking theoretical developments at a rate of one every six months to a year: breakthroughs on the same order as general relativity or quantum theory. (You don't see such breakthroughs routinely in physics, which is a relatively mature field, but if you look into the biological sciences equivalent breakthroughs appear to be coming thick and fast.)"

There is something wonderfully retro-1950s-buttoned-down-white-labcoat-Brylcreem-and-horn-rimmed-glasses about the invention/discovery of the "memristor."

Sadly my knowledge of electronics is ever-so-slightly too limited to truly grasp the theoretical implications of this. However the practical implications look extremely interesting:

"Today, most PCs use dynamic random access memory (DRAM) which loses data when the power is turned off.
But a computer built with memristors could allow PCs that start up instantly, laptops that retain sessions after the battery dies, or mobile phones that can last for weeks without needing a charge."

I spend at least ten minutes every morning waiting for my PC to power up at work (yes I know I could agitate for a better PC... but [deleted due to imminent curtailment of career prospects - free speech go hang]).

Imagine all the time you've spent waiting for a PC to power up: adding up all those two to three minute gaps could make a lot of difference in the world. You probably wouldn't even notice power cuts.
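That "lot of difference" is easy to put a number on. Assuming the ten-minute morning wait mentioned above and a typical 250 working days a year (both figures illustrative, not measured):

```python
def hours_lost_per_year(minutes_per_day, working_days=250):
    """Cumulative time spent watching a PC boot, in hours per year."""
    return minutes_per_day * working_days / 60

print(f"{hours_lost_per_year(10):.0f} hours a year")  # roughly a full working week
print(f"{hours_lost_per_year(3):.1f} hours a year")   # even a 3-minute wait adds up
```

A ten-minute daily wait costs about 42 hours a year, a full working week of staring at a boot screen.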

Of course my reading of this is that "instantly" means within a second or two and that the computer would retain the current session.

Anyway there's another thing off my list of things to do before I die...

C'est la vie.