Sunday, July 29, 2012

These are our defenders

I had no real intention to return to blog posting because it can become time-consuming. However, even though writing for The Register is enjoyable, there are things that don't fit the brief. So I'm back to occasionally post my overflow, so to speak.

I wish that self-appointed defenders of my freedom - like Anonymous - would stop interfering with it.

I don’t like data retention, but I do like political debate. And political debate has to include the existence of ideas I might not like - even if only so they can be debated and defeated in public.

There’s even a Web link at which I can detail my responses to the paper that the Attorney-General’s department has published seeking public opinion.

Which brings us to Anonymous, which instead of entering the debate, is trying to coerce the government with the threat of hacking attacks - which include publishing private details of individuals who have nothing to do with data retention.

Some people are happy with the idea of "collateral damage". I'm not.

What about government policies I like, but Anonymous doesn’t? Are they to be stalled or aborted at the behest of the nameless?

This isn’t just fantasy. I had an exchange with yet another Twitter account claiming to speak in the name of Anonymous; its operator considers carbon trading to be a banking conspiracy, and therefore something that should not happen. Will that be enough to justify a "no carbon trading" hack campaign?

The Twitter account is called EngageDaMedia, which describes itself as “Anonymous Media”; its operator took exception to something I Tweeted regarding climate change. I’ll give you the exchange in full from there (I have taken out the handle of a bystander who happened to get copied on some Tweets).

@EngageDaMedia: @R_Chirgwin Well I'm a sceptic. Whatever the banks sell. I don't buy it! #Rothchilds. Man Made #ClimateChange My (nice) Ass! :)

@R_Chirgwin:@EngageDaMedia Banks aren't selling climate change. They're selling carbon credits. Science isn't banking.

@EngageDaMedia:
Yes they R & so is BigOil. How else would they get CC trading established?  Ive had this argument way 2 may times Research it :)

@R_Chirgwin: I read my first explanation of fossil fuels and climate change in a Pournelle article ... in the mid 1970s. Research? Yep.

@EngageDaMedia: @R_Chirgwin #NWO Plan 50 yrs in advance. Its Amazing! Was it Rockefeller/Rothchilds funded? Did ya read a/thing on the suns effects in 70's?

@R_Chirgwin: @EngageDaMedia So Rockerfeller and Rothchild owned Jerry Pournelle and Isaac Asmiov? Ask Jerry, since Asmov's dead...

@R_Chirgwin: @EngageDaMedia Don't let me ruin a good Evil Jews Are Running Climate Change scenario. Just keep it for the left-hand side of the bell curve

@EngageDaMedia: @R_Chirgwin It's NOT a Jewish Conspiracy. It's a Banking Cabal Zionist One. The 'illuminated' Ones! Read this: http://t.co/n3xRjXt0

[I’ll discuss that link later. In the meantime, don’t bother: it’s insane – RC]

@EngageDaMedia: Only polluter will pass on to the people. CarbonTax! @R_Chirgwin

@R_Chirgwin: @EngageDaMedia Which still doesn't say that simple chemistry >100 years old is somehow Zionist. Are you able to disaggregate the concepts?

@EngageDaMedia: @R_Chirgwin 100 years? Its goes far beyond that. Read the article I sent ya. Know What/Who we are up against. Let me know your thoughts

@R_Chirgwin: @EngageDaMedia What's that? The conspiracy to turn climate change into a banking instrument is >100 years old? Excuse incredulity...

@EngageDaMedia: @R_Chirgwin Committee of 300. 13 Illuminati Bloodlines  CFR Club of Rome etc Go!

@EngageDaMedia: @R_Chirgwin Nooo. Talking about the rulers of the planet! NOT a Jewish Conspiracy. Read the BIG article dude. It's Accurate. Scientific too

@R_Chirgwin: @EngageDaMedia Forgive me, I'm still trying work out when the properties of carbon dioxide (19th century) enter the conspiracy.

@EngageDaMedia: @R_Chirgwin You need to understand the mentality of these ppl to understand the grand plan/s. READ THE FUKN ARTICLE. :))

Apparently in response to someone else …

@EngageDaMedia: That's Right … But you cant blame the masses. #Fluoride, #GMO's #MSM #BPA #Drugs FM! Total onslaught from evey direction  @R_Chirgwin

Perhaps I should have stopped sooner, or perhaps I should have trolled for longer. When the discussion reached the use of fluoride, I decided I’d had enough.

The link, by the way, leads to full-on-kook conspiracy theorizing of the first water, ticking all the boxes from fluoride in water to Freemasons in the 18th century, complete with extra-terrestrial interventions in Earthly politics – it really isn’t worth your time.

I’d like to return now to where I began this article: Anonymous’ intervention in the data retention debate.

Australia also proposes a carbon price which will, ultimately, morph into an emissions trading scheme. Anonymous, at least in the voice above, not only opposes this: it considers our “carbon tax” to be the creation of the Rothschilds and Rockefellers. The voice that claims to speak for me and be acting as my defender is beholden to the most infantile fantasies imaginable.

Speak for me? Give me a break…

Thursday, February 26, 2009

How's this for tasteless?


It's a good thing that the Canberra Times is trying to add sense to the debate surrounding bushfires and prescribed burning (article). But the Google advertisement, placed over the headline, seems a bit tasteless to me. Here's how it appeared in Firefox ...

Tuesday, February 24, 2009

Solar component prices in Australia

Further to my post the other day about solar prices, I have conducted a brief study.

I should at the outset make it clear that this is not comprehensive. I did not include every solar system component from every outlet, because that would take ages and nobody's paying for this. This is a brief, personal study. It does, however, indicate that a more complete analyst's study would be valuable: perhaps if the solar industry is under scrutiny, prices might fall.

And I'm not going to name names. I don't have the resources to fend off companies who, under criticism, decide to launch legal attacks. But I will nominate the best-price sources for a few items, since I suppose nobody's going to object to being identified in that way!

It should also be noted that the prices I worked with were published on February 19 and February 20, 2009. Prices move, and may not be valid by the time you read this.

This study covered system component prices from 12 online suppliers in Australia.

In this small study, I have focussed on a few core solar components: solar panels, batteries, regulators, and inverters. I have ignored hardware such as frames, and I haven't had time to add chargers to the list.

I have not included eBay prices in this small study. It's often difficult to identify the manufacturer of equipment sold on eBay, which makes it impossible to assert the validity of even a small study.

I have focussed on components from “name” manufacturers. For solar panels, these include Sharp, BP and Kyocera. For inverters, the prices cover Lantronic, Selectronic, Powertech, YK and X-Power. Batteries include Fullriver, Rolls, Trojan, Crown, Concorde, Federal, Exide, Remco and Lifeline (because of the huge price difference in battery technology, I focussed mostly on lead acid batteries). Regulator manufacturers include Steca, Morningstar and Plasmatronics.

Finally, the methodology needs explaining. Panels come in different sizes, batteries have different capacities, and regulators and inverters are designed for different loads. To normalise these, I calculated the following parameters (a small worked example follows the list):

Panels – Price per rated watt of output power.

Batteries – Price per rated amp-hour of capacity at 12V (allowing comparison between different battery voltages).

Regulators – Price per rated amp output for 12V regulators.

Inverters – Price per rated watt of continuous output.
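To make the comparison concrete, here's a rough sketch of the normalisation in Python. The panel figures are made up to illustrate the method (chosen to sit at the ends of the range found below); they are not the actual catalogue entries from the study.

    # Normalise solar component prices so items of different sizes can be compared.
    # All figures here are illustrative, not the study data.
    panels = [
        {"model": "example 180W panel", "watts": 180, "price": 1390.0},
        {"model": "example 120W panel", "watts": 120, "price": 1430.0},
    ]
    for p in panels:
        print("%s: $%.2f per rated watt" % (p["model"], p["price"] / p["watts"]))

    def battery_price_per_12v_ah(price, volts, amp_hours):
        """Normalise capacity to 12V-equivalent amp-hours, then divide.
        A 6V 225Ah battery holds the same energy as a 12V 112.5Ah battery."""
        return price / (amp_hours * volts / 12.0)

    # Regulators: price divided by rated output amps (12V models).
    # Inverters:  price divided by rated continuous output watts.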

Results

Panels – The price per rated watt of output ranged from $7.72 to $11.92, so the most expensive panels cost about 54 per cent more per watt than the cheapest. The best price per watt in the study was available from Solaronline for a Sharp 180W panel.

Batteries – Cost ranged from $131.68 per rated Ah up to $665.71 per Ah, making the most expensive batteries more than five times the price of the cheapest. The cheapest battery was a 202 Ah Federal battery available from Australia Wide Solar. Differences in battery prices should, however, be treated with caution, since this comparison takes into account neither expected battery lifetime nor the different levels of maintenance required by different battery designs.

Regulators – Price per ampere for 12V-capable regulators (this includes 12 / 24V dual-voltage models) ranged from $6.17 to $11.18, so the most expensive is about 81 per cent dearer than the cheapest. The best price was for a Steca 30A regulator from Energymatters.

Inverters – Inverter prices showed the greatest variation of any of the core components studied, from $0.31 per continuous watt of output up to $2.50. It should be noted that there is high variability in the features offered by different inverters; however, even within brands, the highest price may be double the lowest price for the same unit. The best price was for a YK 1500W, 12V inverter from Solar Online.

Saturday, February 21, 2009

Quick observations on the solar industry

I've spent part of today costing out solar systems for home use, because soon I'll be starting to build a holiday shack on a little plot in the Southern Tablelands.

I settled on price per watt as a handy benchmark because it was easy to calculate, and there were some surprises.

The first is that the price variations are huge - the most expensive at retail are twice the price of the cheapest, even though I found the range of brands available quite small (so it can't be put down to the old "quality" standby excuse for high prices).

The second is that eBay isn't a great place to buy solar. Only a handful of the eBay listings that actually sold went for much below normal retail prices, and many of the cheapies could not be identified by brand.

The third is that I went back over some old prices, courtesy of the Internet Archive's Wayback Machine, and have come to the conclusion that the solar industry knows a boondoggle - courtesy of government subsidy - when it sees it. So much for saving the planet; we're really about stripping the punters, and who's going to notice a price hike when the government's picking up 2/3 of the tab?

Finally, I really hate the sorts of companies that run the same business under 20 domain names just so they can drag in Google searchers. Once I've dismissed someone as overpriced, I'm not going back under some other domain name.

I will be able to put together a solar system I can afford, but the industry doesn't make it easy. Three out of ten, solar industry, and try giving up the extra fat and getting rid of the ripoff merchants.

Monday, February 16, 2009

Let's burn a few social media straw men

Perhaps it's the insecurity born of being the new kid on the block, but social media experts are full of criticism of journalists for not understanding or embracing social media. Their critiques are almost exclusively built on straw men, and I figure it's just as well to identify these.

1) Journalists don't use Twitter.

That's odd, because I know a bunch of Twittering journalists. In my immediate experience, journalists have grabbed Twitter with enthusiasm. Some journalists do, some don't, but that's not surprising; Twitter is still a minority activity – a couple of million users out of a billion Internet users.

2) Journalists don't understand how to get stories from Twitter.

If it was true yesterday, it won't be true tomorrow. Journalists pick up story leads and tips wherever they happen to appear. Just because they prefer selling their output to a paying customer such as a publisher, rather than spilling their livelihood in 140 characters for free, doesn't make the journalist clueless.

3) Journalism is old media

I know quite a few journalists who avoid the term “journalist”, because they've bought the new media fan's cry that journalism is old media. Actually, journalism is an activity. It's something that people do, and the activity is easily differentiated from both the medium (old or new media) and the business model. Avoiding a word because some idiot uses it as a term of abuse means you've accepted the other person's mindset; you're defining yourself in their terms, instead of your own.

Generally, those who say journalism==old media publishing reveal their own lack of understanding. The journalist can exist in a huge range of places that aren't (say) News Corporation or Fox, and always has.

4) Twitter is the best / only place for contacts

It sounds so silly, put like that, but there are people who believe it. The best place to get a contact is wherever the story comes from. If your contact is on Twitter, that's the place; if your contact is in the pub, then the pub's the place. Anyone who can only interpret life in terms of one channel is, themselves, unidimensional.

5) Twitter is first

The first time I read a story outlining how Twitter outruns the old media, I took the analysis at face value and thought “that's interesting”. The second time, I was watching Twitter unfold at the same time as the story. Since Twitter's stories came from other media (eg, RSS feeds, including 'old media' sites), the claim that it was outrunning its own primary sources is demonstrably nonsense.

As with all media, Twitter has the ability to be first. Any moderately real-time medium can be first; it depends on the channels and reporters (I don't restrict “reporter” to “professional journalist employed by major publisher”) that exist between the event and the reader.

Thursday, February 12, 2009

Self Interest or Public Interest? - Google Maps Versus Crown Copyright

The story goes like this: seeing that the Country Fire Authority Website was struggling (as, for that matter, was the Geosciences Australia Sentinel site), Google decided to run a mash-up on Google Maps to display fire locations and help people get the information they needed.

It was able to secure information very quickly from the CFA, which provides a feed of fire information covering private land, but fires on public land are tracked not by the CFA but by Victoria's Department of Sustainability – which didn't give the okay for use of its copyright.

In this ZDnet story, Google's Alan Noble complains that the refusal to provide information has its roots in Crown Copyright, which stands in the way of expanding the geospatial information available to the public.

It's all pretty cut-and-dried, isn't it? On one side, we have a go-getting company devoting resources to a public service, only to be baulked by a hidebound bureaucracy.

But that's not all there is to things.

The first is that Google is well aware how mapping copyrights are handled in Australia. The general reader, I guess, doesn't need the entire end-to-end detail, but the short version is that most of the data used in detailed maps (street directories, for example, or the property boundaries known as cadastral maps) is managed by PSMA – the Public Sector Mapping Agencies, whose copyright notice appears on Google Maps Australia's home page next to MapData Sciences Pty Ltd.

Roughly, the PSMA acts as the copyright clearing house for map data at detail finer than that offered for free by Geosciences Australia. In general, this means maps at scales finer than 1:100,000 (this is the potted version; a full description of Australian geodata would need a book).

States own the copyright in their mapping information, and sell it for fairly decent slices of lettuce.

So the first issue in asking for any give-away of state mapping data is simply that there are probably several approvals to go through to get the sign-off; and anyhow, I would guess that a public servant who okayed the free use of data other users have to pay for would be in a tight spot at some point in the future.

But that doesn't fully satisfy me.

For the last three years or so, in my capacity as an analyst, I have become intimate with geospatial tools and various spatial data sources.

One thing I can tell you is this: while a particular map, or a database of locations, is subject to copyright, there's nothing to stop a person discovering the latitude and longitude of a particular place for themselves, and using that latitude and longitude in a publication. They may well reproduce the co-ordinates held in someone else's database, but the co-ordinates themselves aren't subject to copyright.

So there's something other than a simple issue of Crown Copyright at stake: Google wanted a particular dataset, to enable a particular programming approach, and it is this that was denied.

There's another remark made by Noble that's worth examining:

"It's ironic that I can download detailed NASA satellite imagery [of Australia] more readily than I can get satellite imagery from the Australian government,” he is reported to have said.

That's not surprising: most of the world's satellite imagery comes from NASA, and the Australian government doesn't own the copyright. I can (and do) download NASA images directly into Grass-GIS, the open source geographic application. If you want copyright material, go to the copyright owner.

As for its call for mapping data to exist in the public domain, it's worth remembering that PSMA copyright notice at the bottom of Google Maps Australia. There is a clear self-interest here, because Google has to pay a license fee for the PSMA maps – and these don't come for free.

In other words, Google's stance could safely be dismissed, except for one thing.

I happen to agree.

The prices charged for government geospatial data are outrageously high. We paid for the mapping in the first place, via various state lands departments and so on; to buy the maps, we have to pay again – and to buy (say) detailed national street maps runs well into the tens of thousands.

Greater availability won't just support Google: it would open the gate to lots of innovation and, heaven help us, lots of competition, because most of the value-add over the top of geo data is not intrinsic to the data itself.

Saturday, February 07, 2009

A Non-Rigorous Calculation: Are E-Books Green?

Just ask Steve Jobs: the best kind of marketing you can get is where your customers do the marketing for you. Buzz is everything.

E-books are one of those markets. Users are absolutely convinced they're right about everything associated with e-books, and are happy to do the marketing on behalf of the industry (how stupid can you be, to act as an unpaid sales rep for anyone?). A quick round-up of e-book arguments runs, roughly:

  • Convenience – Why carry one paper book when you can have an entire library everywhere you go?
  • Cost – E-books are cheaper.
  • Inevitability – This is the way of the future. Anyone who won't get with the program is a luddite.
  • Environment – E-books are greener.

Since the environmental argument is the clincher-of-the-day, I'd like to run it over the back of an envelope and see how it stacks up.

Warning: this is not an academic paper. If you want rigorous analysis, it won't be here, because I don't do rigour for free.

Let's start with the simplest case: all households in Australia have three e-book readers and no longer buy paper books. With roughly 7 million households in Australia, we need 21 million e-book readers to make paper books redundant.

I don't have a carbon footprint figure for e-book readers, but there's a common enough number for mobile phone manufacture – about 60kg of carbon per phone. Let's give e-book readers the same number.

That works out to 1.26 million tonnes of carbon to equip Australia for a future free of dead-tree books.

What's the carbon footprint of Australia's book consumption?

That's not too hard to provide, at least as a thumbnail. Australians buy about 130 million books a year, and according to the US Book Industry Study Group, a book's footprint is about 4 kilograms. That comes to 520,000 tonnes of carbon in Australia, if the American figures hold true.

This, however, ignores a couple of things.

The first is the electricity consumption of e-book readers. This is imponderable, because the one thing I can't tell you is how many people will leave the wall-warts plugged in and warm even when the reader is charged.

The biggie, however, is the matter of product turnover. No manufacturer can stay solvent by selling a device once and never selling any more. So let's assume that just to keep the suppliers available to us, we'll need to turn the inventory of e-book readers over every ten years – roughly two million devices sold each year.

That puts a premium on the existence of the market; just for the back-of-the-envelope, we'll put that premium at 120,000 tonnes of carbon (the footprint of 2 million replacement readers each year).

The other item is the network that delivers the books.

I'm going to use a couple of rough estimates here. The first is from Tom Worthington, who calculates that 20kB crossing the network equates to one gram of carbon (he was talking about e-mail, but let's just borrow the number for PDFs as well).

If each book, as PDF, is 1MB, then its network cost would be 50 grams. If Australia's book consumption remained at 130 million books on e-book readers rather than paper books, the network cost would be 6500 tonnes of carbon per year; nowhere near the millions, but neither is it insignificant.

Let's take the whole thing over five years.

Paper books, at 130 million per year sold in Australia, would generate 2.6 million tonnes of carbon.

For e-books, the assumptions (to repeat myself) are 21 million devices, plus 2 million per year new sales to sustain the market, plus 6500 tonnes of carbon per year for downloading books.

The total, at 1.9 million tonnes, is less than the paper book – but it's not as huge as advocates would have us believe, and it certainly doesn't make the e-book some magical “green” alternative.
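For anyone who wants to check the arithmetic, here's the whole back-of-the-envelope as a few lines of Python, using the assumptions stated above; change them and see how the comparison moves.

    # Five-year carbon back-of-the-envelope: paper books vs e-book readers.
    # All assumptions are the ones stated in the text.
    years = 5
    books_per_year = 130_000_000       # Australian book purchases
    kg_per_book = 4                    # US Book Industry Study Group figure
    readers = 21_000_000               # three readers x 7 million households
    kg_per_reader = 60                 # borrowed from mobile-phone manufacture
    replacements_per_year = 2_000_000  # ten-year turnover of the reader fleet
    network_tonnes_per_year = 6_500    # downloading 130 million 1MB PDFs

    paper_tonnes = books_per_year * kg_per_book * years / 1000
    ebook_tonnes = (readers * kg_per_reader / 1000
                    + replacements_per_year * kg_per_reader * years / 1000
                    + network_tonnes_per_year * years)

    print("paper: %.2f Mt, e-readers: %.2f Mt"
          % (paper_tonnes / 1e6, ebook_tonnes / 1e6))
    # paper: 2.60 Mt, e-readers: 1.89 Mt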

And there are a few other cost items that I didn't include.

The first is simple: if the e-book reader needs more carbon than the mobile phone – say 80 kilos of carbon instead of 60 – then the advantage is nearly zero.

The second is that I've ignored disposal. E-book readers will either add to the toxic metals stream in landfill, or they'll need energy spent on their disposal.

And finally, there's the assumption that the two markets are exclusive. That is, the e-book isn't just another must-have consumer toy: it really is an exclusive replacement medium.

If this last is not true – if e-books create a distinct market that co-exists with paper books – then e-books are an environmental disaster: they will add to the carbon footprint, not reduce it, while at the same time delivering marginal consumer benefits at the cost of being leg-roped into licensing agreements, dragging publishing under the net of DMCA-type legislation, and reducing our freedom to read, share, resell, and give away our books.

Wednesday, June 04, 2008

Testing a new distro...

I haven't posted for a while, partly because I've been busy with other things (including paying work), partly because I've been busy trying to get Linux to work properly.

Eventually, I had to give up on the Freespire experiment on this laptop (Acer 5315). It's too easy to brick Freespire, thereby putting myself in a place from which the only escape is re-installation.
For example: when I connected a second screen to the machine, Freespire detected it straight away. I couldn't get the second screen to behave like a second desktop, but something's better than nothing, right?

Except that in auto-configuring the second screen, something (no, I don't know what or why) happened, and thereafter the machine declined to boot unless the second screen was connected.

Facing the nth reinstall, I decided that Freespire had had its day, and am now an OpenSuSE user.

But getting things bedded down and stable was non-trivial. OpenSuSE is just as breakable as Freespire was if you're trying to install anything unusual.

The great paradox of Linux is this combination of stability and fragility. Once you get things set up and working, the golden rule is "touch nothing", because if you do anything rash, you'll get into trouble.

On OpenSuSE, what got me into trouble was trying to get the trackpad to behave itself. The 5315 has a hyper-sensitive trackpad and a badly-behaved cursor, and the only way I can make the machine usable is to attach an external mouse and switch the trackpad off.

To fix this, I tried to get the Synaptics driver working; this broke the Xorg configuration, which meant the machine would only boot to the command line. Thankfully, I was able to recover the original configuration without a complete re-install.

I have, however, found to my satisfaction that OpenSuSE has a much better understanding of the wireless configuration. Ndiswrapper with the right driver worked completely without a hitch.

Now I just have to win the battle with the machine's sound card, learn how to run two screens properly, and I'll have an almost functional machine.

Tuesday, May 06, 2008

About the ITIF's Broadband Rankings

It's nice to see a broadband measure that gives Australia a reasonable score. It's a pity there are so many flaws in the data's assumptions. I'm referring to the ITIF study published here.

Since last year, when I looked at the then-inadequate OECD data with Market Clarity's Shara Evans (my then employer) – the data has improved immeasurably since, but still has some unfortunate measures and assumptions that I'll get to in a minute – I have approached this debate with some trepidation. Broadband measures are a religious thing: they seem to exist primarily so that policy-makers and advocates can verify the opinions they already hold, rather than to inform the debate and spark new ideas. Saying “the broadband league tables don't mean that much” is a good way to get screamed at from every direction.

Hence any measure that says “(insert your country name here) is (leading the world) / (trailing the world)” will be accepted uncritically by whichever side agrees with the figures, and rejected by the other side. The figures themselves are irrelevant to the debate.

Well, here's what the ITIF has found: Australia is placed 12th in the OECD according to a measure of penetration, average download speed, and price per megabit per second.

Only one of those measures – penetration – is valid as a measure. As for the other two:

Average download speed – measured, according to ITIF, by “averaging the speeds of the incumbent DSL, cable and fiber offerings [sourced from the OECD's April 2006 figures, so 12 months out of date – RC] with each assigned a weight according to that technology's relative percentage of the nation's overall broadband subscribership”.

The ITIF admits that “average plan speeds” is a flawed measure, and tries to weight the average plan speeds by each technology's share of subscribers. But that's silly: the core measure (average plan speed) is meaningless, and anyhow, all OECD nations publish statistics placing subscribers in speed bands (similar to the ABS's ongoing Internet survey in this country). Why not use a direct measure rather than massaging an indirect one?

Lowest price per Mbps – This measure is even worse than the last, because the measure gives the best score to the worst service. The provider that most oversubscribes its infrastructure will yield the lowest price per Mbps, but at the cost of the real throughput delivered to the customer. A better measure would be to analyse price on a per-Gigabyte, per-month basis, and the only reason I haven't done such an analysis on an international basis is that nobody's paying me to do so.

Another point: international price comparison should be based not on a “raw” translation to US dollars, but on a proportion of average income. That way, if an Australian Gigabyte per month costs X% of income and a South Korean Gigabyte costs Y%, we can confidently identify which consumer is paying more.
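Here's a minimal sketch of what I mean; the prices and incomes below are invented purely for illustration, not real figures for any country.

    # Compare broadband affordability as a share of income, not raw dollars.
    # Both figures per country are invented examples.
    countries = {
        "Country A": {"price_per_gb": 5.00, "monthly_income": 4500.0},
        "Country B": {"price_per_gb": 2.50, "monthly_income": 1800.0},
    }
    for name, c in countries.items():
        share = 100.0 * c["price_per_gb"] / c["monthly_income"]
        print("%s: a gigabyte costs %.3f%% of average monthly income" % (name, share))
    # Country B is cheaper in raw dollars, but its consumers give up a bigger
    # slice of their income for the same gigabyte.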

Still, these flaws could arguably be forgiven if the authors were able to show that they do not affect the relative outcomes between countries.

What's interesting is the overall conclusion from ITIF that “non-policy factors account for roughly three-fourths of a nation's broadband performance”.

In other words – as I have argued for years – government policy is not what broadband is all about. Broadband is a consumer product, just like 4WD vehicles, plasma TVs and the rest, and beyond the single question “is it available?” there's not much the government can do.

And why, anyway, is there some magic about broadband which makes it deserving of government funds, but not an LCD TV or a new fridge?

So what are the factors, according to ITIF, which most influence a country's broadband scores?

Urbanicity, age, and Internet usage.

The last is a no-brainer: people don't buy broadband if they're not connecting to the Internet at all. The age of a population can be discussed at length, so I'll leave that for another time.

But urbanicity caught my eye, because (again with Market Clarity) I ran this analysis in 2006. At that time, we concluded that urbanicity was a huge factor in broadband adoption – simply because concentrations of people attract telecommunications infrastructure investment. It's nice to see ITIF agreeing with that analysis.

Thursday, May 01, 2008

Digital Dividend or Free Spectrum?

So what's the value of the "digital dividend"?

The question arises because that silly expression is so ineradicable in the political lexicon. A "digital dividend" is out there somewhere, we just have to (as Senator Conroy put it) "put in the hard work" and we'll reap the rewards.

The commercial data that led to America's recent spectrum auctions raising $19 billion aren't on the public record. We don't know how many people Verizon and AT&T consider to be the addressable population of their spectrum. But the price tag provides a hint: the premium paid for the spectrum tells us that the spectrum will be used to deploy broad-based services. The "Internet socialist" ideal, that TV spectrum could deliver broadband to those without fast wired services, won't come true in America.


If we assume that the spectrum is destined for broad-based services, then population is a good way to look at the value of spectrum from the outside. The auctions raised $19 billion; America has just over 300 million people; so the per-person value of the spectrum is about $63.


And if that figure is applicable to Australia, the digital dividend would be just over $1.2 billion - or three years' interest on the Communications Fund.
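The arithmetic, for anyone who wants to poke at it, is nothing more sophisticated than division over round population figures:

    # Outside-in value of spectrum: US auction proceeds scaled by population.
    # Round figures only.
    us_auction_proceeds = 19e9   # dollars raised in the US auctions
    us_population = 303e6        # just over 300 million
    au_population = 20e6         # a round figure for Australia

    per_person = us_auction_proceeds / us_population
    au_dividend = per_person * au_population
    print("about $%.0f per person, or roughly $%.2f billion for Australia"
          % (per_person, au_dividend / 1e9))
    # about $63 per person, or roughly $1.25 billion for Australia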


While $1.2 billion is a lot of money, it's not much in terms of the total economy, which is close to $650 billion - and there's no guarantee that investors in Australia would put the same value on new spectrum anyhow.


There was, however, a very interesting nugget in Senator Conroy's speech to the ACMA RadComms conference last week: the Ofcom estimate that radio spectrum contributed £42 billion to the UK economy. It's not actually news (the report was published in 2006), but still interesting.


Instead of focussing on the price tag, though, the employment impact is worth noting: Ofcom claimed that spectrum use contributes to 240,000 jobs in Britain, or 0.7% of the workforce. That would be about 70,000 jobs in Australia.

The way to maximise jobs growth based on spectrum would be to make spectrum applications irresistibly attractive.

If you accept that the likely "digital dividend" is going to be small (and $1.2 billion only sounds big), perhaps the idea that the spectrum should be opened up for free (or close to it) isn't so silly. Those who entered the fray would be able to focus on infrastructure and services rather than having to design the business plan around recovering money over-spent at auction.


Look again at the US experience. What are Verizon and AT&T going to do with their expensive spectrum? Verizon is talking 4G cellular services, while AT&T is talking about a network that can support 3G iPhones.


Somehow, it's hard to wax lyrical about spectrum bought to support a network-locked toy phone (with no apology to Apple fanboys) in a new cellular network. Whatever is claimed for new mobile networks, they remain focussed on the urban market, because that's where the people are.


The widespread belief that the digital dividend will bring new rural and remote services is a delusion. Unless political thinking changes - and unless somebody pops the wishful expectation of a huge payoff to general revenue - the digital dividend will end up with the same urban focus as is clearly emerging in the US.

Monday, April 28, 2008

Farewell CDMA...

The CDMA network was never going to last forever. Did AMPS last forever? Even GSM will be killed off one day.
However, as the CDMA switch-off approaches, I can say one thing that most of the Sydney-centric IT press can't say: I have seen first-hand that there are places where CDMA works and NextG doesn't work.

Last week, I took a trip down to Bendigo to see some relatives, some of whom live to the south somewhere near a town called Harcourt.

Mostly, phone coverage is a pretty dull topic, hardly in the “barbecue-stopper” category, but head out to the world of patchy coverage and people get quite heated on the subject. So it was that I found myself looking at a CDMA phone and a NextG phone side by side. Guess which one couldn't get a signal?

Telstra confidently asserts that NextG has more coverage than CDMA. This may well be true: what has irritated the country users is that it's quite possible to cover more area with the new technology while still leaving out parts of the country.


Telstra is big on telling us that NextG covers two million square kilometres – which is still only a bit over a quarter of the landmass. Most of the uncovered area is pretty empty of people. But to a degree, this is sleight-of-hand. Nobody is asking Telstra whether NextG is “bigger” than CDMA: they're asking whether NextG replicates the service they currently receive.


The answer is “probably, nearly”, but there are still gaps out there.

Monday, April 14, 2008

How to Build an Insecure Environment

One of the worst habits in the IT industry is for engineers to do things "because they can".

Today's manifestation of "because we can" comes from Cisco, which is introducing application server capability to routers. The idea is that since all your traffic's going to pass through a router, why not put the application right there in the router, so that the application is where the traffic is?

It's a very simple idea, and very obvious. So simple and obvious that it has, in fact, been tried before. Back a decade or more, I recall looking at enterprise network equipment with modules designed to host Novell NetWare servers.

It didn't take off then, and I hope it doesn't take off now.

In case you hadn't noticed, Cisco's name makes regular appearances in CERT advisories. Now, this isn't to say "Cisco's been slack", but rather to point out that its equipment, like all the equipment on the Internet, will be subject to vulnerabilities.

Only fools put all their eggs in one basket.

Where I will criticise Cisco is that in this case, the marketing pitch is at odds with the customers' interests. The customer's interest is to have a properly segregated environment; Cisco's interest is to be in command of as much of the customer's infrastructure as possible.

In previous iterations, "servers in network infrastructure" failed not because of security concerns - we all lived in a more innocent world 15 years ago. Rather, customers were concerned about tying their applications to a single-vendor execution platform, and expected that server vendors would run a faster upgrade cycle than network platform vendors (for whom servers were a sideline).

This time around, security should be the deal-breaker - if the customers are paying attention.

Thursday, April 10, 2008

Telemetry and 3G

Telstra is spruiking the wonders of telemetry over the 3G network.

It's an interesting thought, although not quite bleeding-edge. After all, telemetry over cellular networks has been with us for a while - even if it's hard to find in carriers' financial reports.

The reason it's hard to locate in the financials is, of course, because the telemetry market is so small that it doesn't warrant a break-out line item of its own.

There's a good reason for this: telemetry is a misfit with the way carriers like to market their mobile networks. The best mobile subscriber is the one that generates lots of billable events (calls, SMSs, MMSs, browsing and so on).

Why do you think prepaid dollars have a use-by date? The last thing any carrier wants is to have someone buy $50 of prepaid minutes and leave them sitting there for a year. Among other things, the expiry of pre-paid minutes is designed to encourage people to make calls. Bundled free calls with post-paid mobiles share the same aim (in social engineering terms, anyhow): if you habituate the user to make lots of calls, there's a good chance they'll keep happily racking up billable calls after their free calls are consumed.

The telemetry SIM might stay quiet for days - which is why, despite the hoopla Telstra generated with its "telemetry over 3G" story, this kind of application remains "white noise" in carrier financials.

That's quite reasonable. Take a weather station, for example: it's sufficient for the device to collect its data and return that data to base by making a few extremely brief calls each day. There may only be a hundred kilobytes to transfer with each call, after all (temperature samples, humidity samples, rainfall totals).

It's the same with many telemetry applications. Even with continuous data collection, you don't need continuous data transfer, unless there's an exception. If you're monitoring remote equipment, you don't need "up to the second" updates that simply say "two o'clock and all's well".

What you want is something which says "here's the data" at a specified period, but which can call for help if something goes wrong. And that's what most cellular telemetry does today - which means that compared to a human user, telemetry is very frugal in its use of the communications channel.
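To put a rough number on "frugal", here's a sketch with figures invented purely for illustration:

    # How little data a well-behaved telemetry device moves: invented figures.
    calls_per_day = 4          # scheduled "here's the data" reports
    kb_per_call = 100          # temperature, humidity and rainfall samples
    days_per_month = 30

    monthly_mb = calls_per_day * kb_per_call * days_per_month / 1024.0
    print("telemetry device: about %.0f MB per month" % monthly_mb)
    # about 12 MB a month - a rounding error next to a human subscriber
    # making calls, sending MMSs and browsing on the same network.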

What Telstra wants is to create demand for telemetry applications that are more talkative: hence the focus on video streaming and live remote control in its telemetry press release. The ideal is to encourage applications that use the data channel as much as possible.

Another small point. Telstra made much of the usefulness of telemetry in remote locations. This is perfectly true: if you're monitoring a device that's 50 km from the homestead, you want to avoid travelling to the device wherever possible.

But who is Telstra kidding? For all of its claims that the mobile network covers 98% of Australia's population, the geographical coverage is a different matter.

So here's your quiz starter for ten points: is remote equipment likely to be close to the population centre?

As an idea, remote telemetry for the rural sector is wonderful - but only if the device is in the mobile footprint.

Tuesday, April 08, 2008

Who will save us from bad software?

No, this is not part of my ongoing battles with Linux (the sound's died again, by the way. Damned if I'm reinstalling...).

What's on my mind here is the more general issue of software quality. More precisely, what's going to be the impact of crap software on the world at large?

My washing machine is a case in point. Unlike the old world of washing machines, where a complex and breakable machine (a timer / controller) ran the wash cycles, the modern machine is computer-controlled. That means its functions are controlled by software.

And that means the behaviour of the machine is at the mercy of the brain that conceived the software. In this particular case, whenever something like an unbalanced load upsets the washer, instead of going into a "hold" state and simply resuming its cycle when you re-balance it (as the old electro-mechanical timer would have done), the software designer has decided the machine should revert to the beginning of the cycle in which it unbalanced. If it unbalances at the end of a rinse cycle, it returns to the beginning of the rinse cycle - unless I or my wife runs over to the machine and resets its timer to a more appropriate spot.

But wait, it gets worse: its rinse cycle is actually multiple rinses. So if it unbalances when emptying from "first rinse" to begin "second rinse", it often decides to reset itself - it returns to the beginning of "first rinse" automatically. It's quite capable of spending hours getting half-way through its rinse cycle and returning to the start (which means you can't leave it home alone).
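The difference between the two behaviours is only a few lines of control logic. A toy sketch of the idea – obviously not the real firmware, just an illustration of "hold and resume" versus "start the cycle again":

    # Toy sketch of two ways to handle an unbalanced load. Not real firmware.
    CYCLE = ["wash", "rinse 1", "rinse 2", "spin"]

    def position_after_rebalance(position, design):
        """Where should the machine continue once the load is re-balanced?"""
        if design == "hold-and-resume":    # the old electro-mechanical timer
            return position                # carry on exactly where it stopped
        return CYCLE.index("rinse 1")      # my washer: back to the first rinse

    stopped_at = CYCLE.index("rinse 2")
    print(CYCLE[position_after_rebalance(stopped_at, "hold-and-resume")])  # rinse 2
    print(CYCLE[position_after_rebalance(stopped_at, "restart")])          # rinse 1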

The result of bad software in the washing machine is a machine that is a relentless waster of water and electricity, and which needs constant supervision - neither of which would or should be the intention of the designer.

Since software is increasingly at the basis of everyday life, software quality is important.

Last week, I was at a press lunch hosted by Websense. That company is putting a lot of effort into new technologies designed to protect companies against the damage caused by botnets, trojans, drive-by Websites planting malicious code in browsers, and so on.

All of this is laudable and important - but like the water-wasting washer, the dangerous Web application has its roots in bad software: insecure operating systems, very poorly coded Web applications built in a rush and hyped to the max, and software that's simply a dangerous idea in the first place (why, oh why, do people think it's a good idea to host your desktop search and office applications out in Google?).

Websense was quite happy to agree that bad software is the starting point of bad security. Happily for that company, there's no immediate prospect of things improving...

The Old “Peak Speeds” Game Again

I'll have more to say on this in Communications Day next week – I need a little time for research – but it may as well be blogged as well.

Part of the ongoing muddying of the waters in Australia's WiMAX-versus-mobile debate comes from confusion about the intended applications of the technologies (for example, the claim that fixed WiMAX is older than mobile WiMAX and is therefore obsolete).

But the misuse of speed specifications also plays a role here.
Hence we're told by breathless journalists that the next generation of LTE-based mobile cellular technologies will deliver speeds of 40 Mbps.

So if anyone's listening – it has to happen eventually – here's a three-second primer.


1) Peak speed is meaningless


The peak speed game is played over and over again: years ago, someone discovered that 802.11g (peak speed 54 Mbps) didn't deliver those speeds to end users. But the press still gets very excited at peak speeds. Whatever the 40 Mbps represents in LTE, it doesn't represent the user data download speed from a base station to a device.


2) Users share channels


This is not unique to cellular environments. In Ethernet, users eventually share the total available bandwidth (even if it's the backplane of the switch). In WiFi, users share the base station bandwidth. In DSL, users share the capacity of the DSLAM. In LTE, as in all cellular environments, users will share the data capacity of the cell. I'm still doing the research to put a number to this, but I can state categorically that a user will only get the base station's top speed if that user is the only one running data through the base station at any given moment.
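The arithmetic of sharing is not complicated; here's a two-minute illustration, with a made-up cell capacity:

    # Users share the cell's data capacity; the headline figure is a ceiling.
    # The 40 Mbps capacity here is a made-up round number for illustration.
    cell_capacity_mbps = 40.0
    for active_users in (1, 4, 10, 25):
        print("%2d active users -> about %.1f Mbps each, best case"
              % (active_users, cell_capacity_mbps / active_users))
    # Only the lone user ever sees the headline number, and only under
    # ideal radio conditions at that.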


3) Remember Shannon's Law


The 40 Mbps now carelessly cited all over the place for LTE is a best-case test result. Any white paper from the industry makes it clear that in LTE, as in every communications technology, Shannon's Law still holds true. In the case of cellular data, the practical upshot is that as you move away from the base station, your speeds drop.
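For the curious, the relationship is the Shannon-Hartley formula: capacity depends on the channel bandwidth and the signal-to-noise ratio, and SNR falls as you move away from the tower. A small sketch, with illustrative numbers only:

    # Shannon-Hartley: C = B * log2(1 + SNR). Illustrative numbers only.
    from math import log2

    bandwidth_hz = 20e6                  # a 20 MHz carrier
    for snr_db in (20, 10, 3, 0):        # SNR falls with distance and interference
        snr = 10 ** (snr_db / 10.0)
        capacity_mbps = bandwidth_hz * log2(1 + snr) / 1e6
        print("SNR %2d dB -> ceiling of about %5.1f Mbps" % (snr_db, capacity_mbps))
    # The further from the base station (the lower the SNR), the lower the
    # ceiling - and real systems sit well below this theoretical bound.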


4) Carriers provision services


The service speeds carriers eventually offer over LTE won't just depend on what the cells can do. They'll be constructed according to the different plans the carrier wants to offer. I'll bet a dollar, here and now, that carriers won't be rushing to sell users $29 mobile broadband plans with 40 Mbps download speeds.


So the next time someone says "the next generation of cellular wireless will give us download speeds of 40 Mbps", the correct answer is "bulldust".

Monday, April 07, 2008

Could the New Zealand Institute's Fibre Co Work?

For some time, I have believed that the structure of Australia's telecommunications industry and networks should have been sorted out before privatisation of Telstra went ahead, rather than after.

The problem is that when they're bent on a privatisation, governments are happy to take the biggest cheque they can get – something like structural reform of an industry might reduce the sale price, so it gets left on the shelf. Afterwards is too late: any government that halved the value of a private company would be in line for lawsuits. So it's too late to easily restructure the Telstra monolith.

New Zealand is grappling with a similar problem, which is why the New Zealand Institute has suggested an independent monopoly (the so-called Fibre Co) be put in charge of the last mile. That way, the NZI argues, the upgrade of NZ's customer access network can proceed without tensions between the incumbent and its competitors getting in the way.


A long time ago now – back when Australian Communications was in publication, and therefore in the late 1990s – I advocated a very similar structure for Australia, with the wholesaler existing as a monopoly whose shares were owned by the carriers using its infrastructure.


Since I'm an advocate of such a model – well, since I was prepared to consider such a model as viable – I'm going instead to ask the opposite question: what's wrong with the idea of a wholesale infrastructure monopoly?

1) Bureaucracy

The biggest problem is that there are more than 150 carriers in Australia. Some of them might not need or want membership of a Fibre Co, but even 50 companies might prove unwieldy, with different agendas, commercial imperatives, and geographies to be serviced.

Without offending people on the other side of the Tasman, New Zealand has a much smaller market. There are fewer carriers to participate, and there's less geography to cover; so a Fibre Co might well be manageable.


2) Clout

A simple “shareholding by size” model has this against it: the Fibre Co would be at risk of being under the control of its largest shareholder (which is also its largest customer). So there's a legitimate question about the degree to which one carrier's interests should dominate the process, versus the degree to which minor carriers should influence decisions reaching far beyond their own sphere.


3) Flexibility

Some commentators in New Zealand have criticised the Fibre Co model as a “one size fits all” proposal. It may be so – I haven't yet read the entire New Zealand Institute report – but that doesn't mean any implementation of Fibre Co has to be inflexibly chartered to deliver “this solution and only this solution”.


For some reason, people treat consumer telecommunications as a different creature to business telecommunications. Any sensible business starts building its network by defining its requirements: work out the applications you need to support, and build the network to support those applications. But in the consumer world, people insist on defining the technology first, without reference to requirements.


If a government were to charter the establishment of a Fibre Co, it should do so without reference to a particular technology. Rather, it should mandate a simple charter of requirements, and leave the infrastructure owner to meet those requirements with whatever technology best fits the circumstances.


But there is a second “flexibility” issue – one that goes with the structure of a Fibre Co. If the structure is too difficult to manage, bureaucratic, or too dominated by narrow interests, then inflexibility may be the result.


Can it be done?

The answer is “yes”. A Fibre Co could be made to work, and we can even see an example in Australia of a co-operatively managed carrier, in which there exist dominant participants and minor participants, but which has a clear enough mandate that it stands as a success story of the industry.


It's called AARNET.

Saturday, April 05, 2008

Where on Earth is my Microphone?

I recall reading research that says Linux users do it partly for the fun. As a full-time, no-safety-net, no-dual-boot Linux user myself, I'm mystified by the definition of "fun" these people must use.
Philip K Dick once described psychosis as being typified by an inability to see the easy way out. Dammit, I can see the easy way out, but I've already put too much of my stuff into the Linux box.
During one of my sessions trawling around to get the USB headset working, I managed to do something (heaven knows what) to stop Freespire booting: it hung while loading the desktop. Eventually, failing every other possibility, I connected the external backup hard drive, learned how to mount it at the command line, copied most of my data (forgetting the mailbox, dammit) to the drive, and re-installed.
At this point, I gave up on the USB headset and reverted to an ordinary headset. This, at least, functions. Next, I tell myself, I need to get a softphone.
None of them work with Freespire sound, or ALSA, on the Acer 5315. All of them have output, but none of them respond to the microphone. To save time, I'll quickly list the softphones I've tried: KPhone, Gizmo, X-Lite and Twinkle. KPhone crashes when I try to make a call; Gizmo can't see the microphone; X-Lite barely launches; and Twinkle won't install properly (yes, I added the unbelievable host of libraries it needs).
So I decided to see whether any other application would see the microphone.
KRecord is happy with the microphone, as is Audacity. GNUSound sort of works, but it has the sort of catastrophic crashes that boosters tell you don't happen under Linux. So the problem has to be the interaction between the softphones and the sound system.
But it's not just Linux that gives pain.
Somewhere in the world, there's a designer of such unrestrained malice or stupidity that he or she has made the touchpad of the Acer 5315 almost completely unusable.
The design decision was this: "Wouldn't it be great if, instead of having to click, we made a self-clicking cursor? That way, you only need to hover the cursor over something, and it will act as if you clicked on it."
It's the worst idea I have encountered in a long time. While you're typing, the cursor regularly activates other parts of the screen, and the user has no input into this whatever. The only way to get a usable machine is to connect an external mouse and de-activate the touchpad completely.

Thursday, April 03, 2008

Opel Coverage: Why is WiMAX Obsolete?

A day after the news story broke, there isn't much to add to the facts themselves: the Australian government cancelled the contract for the Opel (Optus / Elders) consortium to build a broadband network combining ADSL2+ and WiMAX; the Opel partners are upset; the opposition is outraged; Telstra is gloating; and so on.
The decision to unplug the Opel contract received plenty of media attention, as you might expect. And, as might also be expected, the mainstream media has managed some bloopers in its coverage of the affair. I thought I'd pick out a few highlights.
From Malcolm Maiden of Fairfax: “The technology Opel was relying on was also questionable, because it relied, in part, on outdated WiMAX fixed wireless microwave signal propagation...”
WiMAX outdated? Nonsense: there is legitimate debate about whether the technology is the best one for a particular application, but that has nothing to do with the alleged obsolescence of WiMAX. WiMAX is, for all its faults, a current and ongoing technology development.
I don't mean to take the stick to Fairfax, but its commentators seem to be hunting in a pack on the question of WiMAX obsolescence: Jesse Hogan of The Age also pursued this line: “The problem was that Coonan wanted to use an obsolete fixed (non-portable) version of WiMAX instead of the newer mobile version...”
Hogan simply doesn't fully understand the technology. Yes, mobile WiMAX is a newer standard than fixed WiMAX. But that doesn't mean the requirement for fixed wireless applications will disappear. Again, we have a confusion between two questions: “Does the world have a use for fixed wireless broadband data services?”, which anyone familiar with this industry would answer “Yes”; and “Is fixed WiMAX the best technology for fixed wireless broadband data services?” for which the answer is “Some people say yes, others say no”.
Elizabeth Knight, also of Fairfax, had the good sense to stay away from technology commentary, but couldn't resist repeating the government's line about network duplication: “The risk of overlap is real ... remember the Telstra and Optus overbuild of cable in metropolitan areas. It was a multibillion dollar commercial gaffe.”
I agree that it would be silly for the government to fund two networks, and I also agree that the way HFC was rolled out was a disaster. But neither of these make a general rule about infrastructure duplication. Mobile networks are duplicated, yet they're a commercial success – and as we see in most telecommunication markets, infrastructure duplication (another word is “competition”) drives prices downwards, which is why it's cheaper to buy a data link from Sydney to Melbourne than from Brisbane to Cloncurry.
If “WiMAX obsolescence” is the pack-line from Fairfax, News Limited's favourite line is that Optus is a “lazy competitor”: this came from both John Durie and Terry McCrann. The Australian, however, has had the good sense to stay away from the WiMAX technology debate!

Tuesday, April 01, 2008

The Life of the Linux Newbie

OK: it's not a proper blog. For a start, there's no link-farming. Second, no plagiarism (bloggers often don't dig ideas like "fair use" and "attribution". For them, it's about hitrates: whatever gets clicks is good). And third, I promise to try and offer moderately readable articles.

Mostly I'll be talking about telecommunications, sometimes computing, sometimes life in general.

But first, I want to talk about my life with Linux. Since I've just set up on my own, I had to find a low-cost way of doing what I wanted to do, and that meant finding a laptop that could support Grass-GIS, my favourite mapping and geographic analysis system. Having tried and failed to get it running on Vista, and having run it on Windows before but been frustrated by its slowness, I decided to give Linux another spin (my third attempt since about 2000).

I fear my lessons can't be taught. You have to feel the pain yourself. Still...

Lesson 1: Research

All smug Linux experts, please leave the room now. Yes, I know you have to find out in advance whether your laptop will run with Linux. That's easier said than done, since models turn over faster than pancakes, and different models get different monikers in different countries, and even the claim that “this laptop will run Linux” can't always be taken at face value.

So I bought an Acer Aspire 5315: cheap and low-powered, but most of what I do doesn't need muscle, and I didn't buy this little box expecting a multimedia powerhouse.

The problem is that the Acer Aspire 5315 has an Atheros chipset for which Linux doesn't really have decent driver support yet, so I have no wireless networking. And its sound system doesn't seem to like Linux, so I have a silent machine. More on this in a minute...

Lesson 2: The Distribution

It might sound easy: choose your distribution carefully. But there are still way too many distros, and Linux distribution research is difficult. You may as well say “buy a dictionary and choose the word that suits your needs”. Choosing words comes naturally if you're a native speaker of the language you're using. Choosing a Linux distribution comes easily if you're intimate with Linux. For second-language people, in computers as in conversation, the right choice is not second nature (the fact that every Linux distro believes itself to be the best is no help at all. Could we leave the marketing-speak over in the world of commercial software, please?).

After trying and failing to make a sensible choice of distribution, I decided to waste some CDs and take a more active approach. The end result is that I'm using Freespire, but there were some burns along the way.

My favourite distribution is actually Puppy Linux (http://www.puppylinux.org). The problem is that Puppy would not find any networking of any sort – not even the wired Ethernet would come alive. I wandered around somewhat, searched for drivers, downloaded various extra packages to try and install them, and achieved nothing. Perhaps I'll give it another swing some other day, but this time around I had to give up.

Next, I investigated Knoppix. I quickly gave up on that line of investigation. The problem with Puppy, I reasoned, was that nobody had caught up with the laptop's Ethernet drivers yet, and the latest build of Puppy was only a few weeks old. As far as I could tell, all the Knoppix distributions I could locate were more than a year old (this was just before the 5.8 release was posted, as I have just discovered). It didn't seem likely that older software would have out-of-the-box support for newer hardware, so I moved on.

Next came Ubuntu. I decided to go for an install to the hard drive, intending to use the boot loader to dual-boot the box, but I was never offered a Windows boot again. The Windows directories and partitions were still in place (as I discovered later), but the machine simply skipped through to booting Ubuntu without asking my opinion on the matter. This may not have mattered, but for one thing: Ubuntu would not boot.

Not at all.

It just hung during the boot cycle.

Overnight is long enough to wait, so when I came out in the morning to a machine that still hadn't booted, I decided that Ubuntu wasn't what I wanted. So I downloaded a Freespire Live CD ISO to test that one – and somewhere around this time, I noticed the aforementioned no-Windows issue. This, I must say, brought me out in a bit of a sweat. I didn't relish the idea of reinstalling Vista from cold, but I was worried that whatever had happened to Ubuntu might also happen to other Linuxes.

Thankfully, Freespire did boot, and it booted with Ethernet if not with wireless, so it's been Freespire that I've stayed with.

If you're just a user – and yes, I am just a user – you shouldn't have to burn midnight oil just to get to the starting gate.

Lesson 3: The Application Question

Thankfully, Freespire will run Grass-GIS (given a certain amount of bloody-mindedness. Fortunately, I found some lying around in the shed...). But getting the installation happening was

very

very

painful.

You see, Grass-GIS is not offered on Freespire's preferred software installation site, CNR.com (which, by the way, needs more consistency: what's the point of offering software for installation, only to give the user the message “not available for your distribution”?).

I started by asking myself “what's the underlying distribution beneath Freespire?” and, since the answer seemed to be Debian, I tried installing the Debian port of Grass-GIS. This went very, very badly, because Grass-GIS needs lots of supporting libraries, some of which were difficult to locate “the Debian way”, and others of which seemed to dislike the versions of themselves installed by Freespire ...

So I broke the installation and had to start again from scratch. When things were working again on a clean install, I made a list of the libraries Grass-GIS needed, installed as many as possible from CNR, got others direct from their source rather than via the Debian repositories, and ended up breaking everything again.

Oh.

Next, I picked the computer up off the road and glued it back together – OK, that's exaggeration – I re-installed Freespire again, and decided that I didn't want the Debian port after all. Would the Ubuntu port work?

Yes, it would. Not only that, but once I'd let the computer know where to download libraries from, I managed to get the libraries installed in only a couple of steps, Grass-GIS in a couple more steps, and away we went.
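
For anyone retracing those "couple of steps", here's a rough sketch of what "letting the computer know where to download libraries from" boils down to on a Debian-family system: point apt at the extra repository, refresh the package index, and let it resolve the supporting libraries. The repository URL is the Ubuntu build linked in the "On the Loose Again" post below; the "DISTRIBUTION" placeholder and the standard Debian/Ubuntu package name "grass" are assumptions of mine, not details recorded at the time, so treat this as a shape rather than a recipe.

```python
# Rough sketch (run as root) of pointing apt at an extra repository and
# installing Grass-GIS from it. "DISTRIBUTION" is a placeholder for whatever
# release the repository actually carries, and "grass" is the usual
# Debian/Ubuntu package name -- both are assumptions, not details from the
# original install.
import subprocess
from pathlib import Path

REPO_LINE = "deb http://www.les-ejk.cz/ubuntu/ DISTRIBUTION main\n"
SOURCES = Path("/etc/apt/sources.list.d/grass-gis.list")

def main() -> None:
    # 1. Tell apt where the extra packages live.
    SOURCES.write_text(REPO_LINE)
    # 2. Refresh the package index so the new repository becomes visible.
    subprocess.run(["apt-get", "update"], check=True)
    # 3. Install Grass-GIS and let apt pull in the supporting libraries.
    subprocess.run(["apt-get", "install", "-y", "grass"], check=True)

if __name__ == "__main__":
    main()
```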

Other things are less forgiving – by which I mean that instead of a painful process with success at the end, some software declines to run at all under Freespire, even when the brain-dead CNR.com installer thinks (a) the software is available and (b) the installation went just wonderfully well, thank you.

Things that don't work include Google Earth (which induces a complete and catastrophic crash – all the way back to the user login); the MySQL query browser (thankfully I can run MySQL from the command line – and before you ask, I have the database sitting behind the geographic stuff); the Navicat database query application; the SeaMonkey browser; and various HTML editors offered on the CNR.com Website.

So I can't really recommend Freespire on any basis other than “at least it boots”, which has to be the lamest recommendation I can imagine.

Lesson 4: Some Things Work Well

I'm glad I don't really care about sound, because I would be really upset if I had a sound-crippled laptop when sound really mattered to me. I'm a little less pleased about the wireless, but thankfully I was able to pull an Ethernet cable to the right part of the house, and with any luck it won't get rat-bitten.

But there are upsides, and the best one of all is in Grass-GIS.

Geographic information systems can be pretty demanding applications. Back in an office environment, before I struck out on my own, I was accustomed to running Grass-GIS on a very high-powered MacBook Pro carrying as much memory as you could install. I was also accustomed to Grass-GIS being so demanding that some maps might take half an hour to display on the screen (which made me handle the mouse with care: accidentally reloading a map could mean “we're staying late at the office tonight”).

Grass-GIS on Windows machines could be even worse, because it had to run under the Cygwin libraries. So you have an operating system running a compatibility layer for another operating system, and on top of that layer you are running Grass-GIS itself. Doing things this way requires very serious grunt.

But Grass-GIS on Linux is so fast that my first response was “what happened?”. Here is a box which, compared with its predecessor, is little better than a cripple in both processing power and memory. Moreover, it's running both the GIS system and the MySQL database feeding it, yet it cruises along.

So after the pain, there has been a payoff. All I need next is to find out why there's no sound and wireless...

On the Loose Again

Some time ago - about two years ago! - I found it too hard to combine life as an analyst and life as a blogger, for two reasons: I had to self-moderate so as not to spoil the professional life with the fun life; and there just wasn't time to blog.

Now I'm freelance again, and so with any luck I will get to blog again. I wish I could find some way to move all the old posts into an archive directory, but either I'm stupid or I just haven't worked things out yet. Time, time ...

I'd also like to mention that I've acquired some skills over the last few years which may be of use to the industry I work in. In particular, I have had great fun learning how to operate the Grass-GIS open source geographical analysis and mapping software. At the same time, I've been learning how to run Linux on a laptop (Freespire, not because I love it but because it did the best job of finding my LAN).

And I've got a couple of quick items for other people wanting to use Grass-GIS on Freespire.

The first is that the Grass-GIS build that works best on Freespire seems to be the compile for Ubuntu, here: http://www.les-ejk.cz/ubuntu/

Second: If you want to experiment with QGIS (http://www.qgis.org) as a front-end for Grass-GIS, do not install QGIS from the CNR service. The CNR installation wiped my Grass-GIS 6.2 and replaced it with 6.0, which I didn't really need.
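
If you want to be sure that CNR (or anything else) hasn't quietly swapped your Grass-GIS version – and to nail the package down once you're happy with it – something like the sketch below would do it. It assumes Grass went on as the standard Debian/Ubuntu "grass" package, which is my assumption rather than anything I noted at the time.

```python
# Check which Grass-GIS version dpkg thinks is installed, then mark the
# package "hold" so apt/dpkg won't replace it behind your back.
# Assumes the package is named "grass" (the usual Debian/Ubuntu name).
import subprocess

def installed_version(package: str) -> str:
    """Ask dpkg for the currently installed version of a package."""
    result = subprocess.run(
        ["dpkg-query", "-W", "-f=${Version}", package],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def hold(package: str) -> None:
    """Put a package on hold so upgrades and downgrades need a deliberate step."""
    subprocess.run(
        ["dpkg", "--set-selections"],
        input=f"{package} hold\n", text=True, check=True,
    )

if __name__ == "__main__":
    print("grass version:", installed_version("grass"))
    hold("grass")  # needs root
```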

Saturday, May 20, 2006

Rules for Google Reporting

Google Maps has discovered Australia: and the press gave a vivid illustration of the rules Australian media follow when reporting on Google.

First and foremost, the Prime Directive is "go soft". Google is surrounded by a scepticism-removal field, and the Australian media can't get rid of that bit of cultural cringe which says "Australia needs to be noticed". Put together, the end result is a welter of "Google noticed Australia!" stories which ignore the quality of the maps themselves.

Google's Australian maps are really awful.

It hasn't bothered with searchability beyond the grossest features. So I can search for Sydney but not Penrith or Parramatta or Chatswood or Brisbane.

The maps are occasionally years out of date, so that a major freeway near my home is shown following its intended route instead of its real route.

Strangely, the accuracy of the maps is different in adjoining sections, so while Lilyfield Road carries a years-out-of-date one-way diversion, Olympic Park is accurate and the Metro Light Rail is shown.

The corresponding satellite images don't correspond all that well with the maps, because they show a different point in time. Hence Darling Park is in the satellite photo, but with the map superimposed it clearly can't exist because the Western Distributor runs right through it.

And the interface is flaky, so that "hybrid" only works properly if you start with the map. Start with the satellite photo, and hybrid doesn't always draw the map overlay.

And so on. The whole thing is an appalling mish-mash that looks like it was knocked together for no particular reason other than to generate some positive downunder publicity.

What's embarrassing is not that Google has slapped lipstick on a pig and called it product, but that the Australian press acted just as Google knew it would act: like a toadying retainer to a medieval lord, thanking Google for showering grace upon this insignificant backwater.

The whole thing is out of the half-bakery. But Google's media objective of "attract no evil publicity" has remained intact, courtesy of fans-with-typewriters.

Monday, May 15, 2006

Thought is not property

"Patents and copyrights are the legal implementation of the base of all property rights: a man's right to the product of his mind."
So wrote Ayn Rand, a deity of libertarian thought.
Her cry has been taken up by "intellectual property" lobbyists the world over (example here), but most particularly in America, where Rand is still considered to be a great philosophical thinker.
Rand was wrong, and it's important to understand why.
The sole "product of mind" is thought. Everything else made by humans needs "mind and muscle", so to speak: there is an action following thought which results in the expression of that thought.
The thought itself, however, is not property. The mind is mine, but the nature of property makes it inapplicable to mind.
I'm going to avoid the very abstract parts of this discussion. Much of the debate over intellectual property (from one side) focuses on "exhaustibility" and "exclusivity". That is, that when I own this hammer, you can't use this hammer; and because there's a finite supply of hammers, my ownership changes your position. A debate focussing on these characteristics ends up in a quagmire, so I ignore it.
Instead, let's look at two other aspects of property: evidence, and enforcement. I can provide evidence that I bought the hammer and therefore I own it; and on its theft, I can appeal to an authority to enforce my ownership of the hammer.
Exclusivity and exhaustibility are abstract economic concepts; evidence and enforcement are more concrete. More to the point, evidence and enforcement don't have to take economics into account. The heart of the matter is that I own the hammer, not that there are plenty of other hammers around to own; and that the state agrees that I own it, and agrees that my right of ownership can be backed by the law.
Thought, divorced from effort, cannot be owned: ownership cannot be established by any reasonable evidence, and a claim of ownership is unenforceable.
Until someone sets pen to paper, or finger to keyboard, or tool to timber, the thought has no expression. I can be the first person to make a polished-maple space shuttle, but I cannot ever reasonably claim to be the only person to have thought of making a polished-maple space shuttle.
Copyright and patent protect not the thought, but the first expression of the thought. Until the thought has been published, it has no protection, for very good reason: there's no way to prove primacy (that you thought of it first), and there's no way to enforce exclusivity (to stop somebody else thinking of it).
Back to Ayn Rand. As I said, the product of the mind is thought. But what if, as Rand asserted, thought were to be treated as property?
The only body both enabled and empowered to provide the evidentiary and enforcement basis of property is the state.
In other words, to make thought into property is to invite the state to police the contents of the mind.
As far as I can tell, only the most oppressive state mechanisms in history have ever attempted that: the policing of thought, for whatever reason, is the extreme end of totalitarian behaviour.
Thought is private, but not property.

Tuesday, November 22, 2005

The $100 publicity device

It's one of those stories that people just can't resist: self-publicist and acclaimed guru Doctor Nick - sorry, Nicholas Negroponte - pops up in Tunis at the WSIS Summit promising $100 PCs for the world's starving children.

Rather than critique the stunt itself, let's take a look at what happened in the media, which was simple: it was taken as gospel. You know that guru-worship has reached ridiculous heights when someone showing a not-particularly-functional prototype is written up as if he'd launched a commercial product – which is how the show-and-tell was swallowed at the first pass.

By the third day or so, a very few media outlets had started to pick over the entrails and decided it wasn't such a hot story after all. As far as I can tell, The Inquirer (http://www.theinquirer.net) was the first to pour scorn on the idea.

But that's not going to be enough: the lack of scepticism at the beginning has created an international belief that Doctor Nick's done it again - a stroke of brilliance from the guru, save the world and feed the starving, how do you nominate people for the Nobel, and so on.

Media guru-worship is bunk; it's the inverse of "ad hominem", where all you need to turn bad ideas good is the endorsement of a guru. So I've long since given myself a guru-ectomy.

There are a great many questions any journalist could have asked if he or she hadn't been sleepwalking at the time. But to me, the outstandingly obvious one is this: Why would authoritarian kleptocrats spend money buying up PCs for their citizens at the same time as denying them food, suppressing communications, and repressing information or debate?

Some context is instructive here. Journalists attending the summit were seeing first-hand the effects of government Internet censorship in Tunisia – by no means the worst offender on the "control citizens' access to information" league table. The Register (http://www.theregister.co.uk) reported hilariously that a Swiss tourist information site (www.swissinfo.org) was being filtered out of the Tunisian Internet.

But a huge part of the premise of the $100 laptop is that it gives the third-world villager access to the Internet. Yet neither Doctor Nick nor the waiting acolytes in the press thought to ask "what if the government blocks that access?" – in which case the village child gets a $100 Western doorstop.

I would also have asked for evidence supporting the article of faith that you can't get an education without a computer - and by evidence, I mean real, peer-reviewed, non-industry-supported, independent research, not an arrogant American metaphor about sharing pencils around a classroom.

This piece of world-wide blue-sky media dog-whistling was nothing more than a publicity stunt.




Wednesday, November 09, 2005

Sony's Rootkit Gets the Go-By From Australian Journalists

By now most people with a vague interest in stories about computers know that Sony tried to distribute, with music CDs, copy-protection software that had the characteristics of a rootkit. The result was that playing the CD on a home PC would install the software; the software would hide itself; and it was difficult and dangerous to uninstall.

What's fascinated me about the story is that it had almost no play in the Australian IT press.

If I look at Google News, I find that this morning (7.15am, November 9, Sydney time) there are around 249 stories posted about the rootkit.


Filter the Google News to location:Australia and the number drops to a mere handful. Just how many is a little difficult to fix because small variations in search terms change the result, but fewer than 20.


At first glance, it looks like it got a run in most publications: The Sydney Morning Herald (http://www.smh.com.au), Smart House (http://www.smarthouse.com.au), The ABC Online (http://www.abc.net.au), ZDNet (http://www.zdnet.com.au), Linux World Au (http://www.linuxworld.com.au), IT News (http://www.itnews.com.au), Australian IT (http://www.australianit.com.au), ARNnet (http://www.arnnet.com.au).


However, with only two exceptions I can see, the coverage was entirely syndicated, either from a wire or from international mastheads.


The locals were Smart House and the ABC. That's it.


Now, on efficiency terms, that might seem like a "so what?" After all, there's no reason to rake over the story when the US has already covered it, right?


There are three reasons to give the story local treatment.


1) Was the software distributed in Australia? If the answer is "yes", it gives rise to all kinds of journalistic fun, such as whether the software breaks the law here, what product liability issues may arise, and so on.


2) Sony's attitude to copy-protection has already brought it to prominence in Australia, where mod-chips have been declared legal by the High Court after a long battle initiated by Sony.


3) Some of the wire pieces were nothing more than press release rewrites anyhow, like the wire story saying "Sony hoses down hidden file fears" (a poor choice of syndication since by the time it ran, the patch it referred to had already been found wanting).


As a postscript to this story, the original rootkit discoverer, writing at http://www.sysinternals.com, found that the software was "phoning home" after installation.


Now: Australia has many, many people who are expert in security or privacy, and whose profile ranges between media-savvy and media-tart. Comment was available, local angles were certainly available - only the media, it seems, were not available.

Saturday, October 01, 2005

They Drop Like Flies in Korean Cafes!

It's easy to understand why a daily might give space to a suspect wire story on its Web page: there are too many wire pieces and there's too little time.

However, it's fair to say that the wires have a very, very long tradition of falling prey to urban legend. One of my favourites, "Man Dies in Internet Cafe", has made its annual appearance again, with a wire story spending more than a month floating around and getting picked up by dailies as gospel.

Funny story: I mentioned the mythology to a journalists' Website, which gave it a paragraph. One of the Australian news organisations, or should I say News organisations, went ballistic and the story was un-posted. Well, let them go ballistic at me if they take exception to this post on this blog...

In 1969, the late AP Herbert - a long-time humourist (writing for Punch, I think) whose speciality was making fun of English law - described how his "Negotiable Cow" (a piece of silliness which discussed the laws relating to cheques, and whether a cheque could be written on a cow) escaped via a BBC comedy to traverse the wire agencies and finally land in The Memphis Press-Scimitar as straight news.

More recently - just a couple of years ago in fact - a US county was made fun of throughout the world because it allegedly maintained "Klingon translators" among its mental health services.

This story, again, was a blooper run by the wires. There never was a position for Klingon speakers - it was a programmer's joke which took off on the wires and became an ineradicable belief.

There are still people who run the story along by trying to "document" the scandalous waste of taxes in America because so many public sector organisations are hiring Klingon interpreters.

Here's another which falls into the "reasonable cause for scepticism" category. A couple of years ago, mobile phone batteries all over the world were exploding in pockets. Manufacturers solemnly issued warnings that users should never install after-market batteries (just like "use only Toyota original parts", I guess!), and investigations were promised.

At the time, with CommsWorld still live, I received the "no imitation parts" press release from a manufacturer and promised an exchange: if they could supply a photograph of a blown-up battery, I would run the story.

Not only did I never get the requested photograph, but after a very long time, the story took an unexpected turn. AMTA issued a statement saying that while some fake batteries had been observed to overheat, no batteries exploded, and there were never injuries - at least in Australia.

Actually, a moderately-competent chemist would have sufficed for debunking that piece. "Are there any potentially-exploding chemicals in batteries?" "No." "Oh. Thanks." Or you could ask yourself: in our paranoid world, just how long would anyone be allowed to sell potential explosives as a ubiquitous consumer product?

On to deaths in Internet cafes.

Urban legends on the wires tend to have very predictable characteristics.

They're always located "somewhere else". In recent years (although it's been around since 1981), people have preferred to die of game-playing in South Korea, Taiwan or Vietnam. It's never happened in America, Britain, Australia or New Zealand. Even the geographical details can be hints; in the "man dies" story of two years ago, the event was placed in "Kwanju, 260km south-west of Seoul" - which according to my Atlas moves Kwanju into the Yellow Sea by about 50km (interestingly, but of no particular significance: this year's cafe death happened not only in the same country as the victim of two years ago, but within a fairly short commute. I have to conclude that the southern end of South Korea has very dangerous Internet Cafes.)
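
If you'd rather check the geography than take my atlas's word for it, a few lines of back-of-the-envelope code will do. The city coordinates below are approximate values I've assumed for central Seoul and central Kwangju (Gwangju); they're not figures from the wire story.

```python
# Sanity-check the "260km south-west of Seoul" wire copy with the haversine
# formula. Coordinates are assumed approximate city-centre values.
from math import radians, degrees, sin, cos, asin, atan2, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlam = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees (0 = north)."""
    p1, p2 = radians(lat1), radians(lat2)
    dlam = radians(lon2 - lon1)
    y = sin(dlam) * cos(p2)
    x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlam)
    return (degrees(atan2(y, x)) + 360) % 360

seoul = (37.57, 126.98)    # assumed: central Seoul
gwangju = (35.16, 126.85)  # assumed: central Kwangju (Gwangju)

print(f"Seoul to Kwangju: {haversine_km(*seoul, *gwangju):.0f} km, "
      f"bearing {initial_bearing_deg(*seoul, *gwangju):.0f} degrees")
# Prints roughly 270 km at a bearing close to due south (180 degrees) --
# not 260 km to the south-west (225 degrees), which is how the wire copy
# manages to park the city somewhere out in the Yellow Sea.
```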

The urban legend often involves unnamed characters. Hence we have a victim for whom only the first name is given, if any (imagine the editor asking the journalist to track down someone called Lee or Kim in South Korea!); "police officials" and "government officials" without names, and unnamed witnesses.

Another characteristic is that any additional research adds no facts, only local commentary. The BBC's coverage of this story, more than once over the years, is a wonderful example: every time "man dies" hits the wires, the dear old Beeb suffers corporate amnesia and sends someone to quiz medical experts in Britain, computer games experts, market experts - anything except for confirming the original facts of the story. Well, once the Beeb managed to do a "local colour" piece, and even ran a photo of such astonishingly poor quality that you have to wonder whether someone was stringing them along.

Urban myths love the moral angle: the British have a strange legal system (cheques and cows); public officials are wasting our money; fake phone batteries are dangerous; too much computer gaming is bad for you. Quite often, the moral angle is married to the kind of society where governments believe the media has an obligation to uphold public behaviour.

It's also worth observing that urban legend participants are generally cut-out stereotypes: nobody ever died in a Korean Internet cafe without being an unemployed 20-something who lives at home.

And, of course, the best urban legends happen when the wires crib the local media and get facts scrambled along the way...

Media Advocacy

The question you never ask about a blog is whether anyone's reading it. They're not and they don't, unless of course you proactively point someone at it. Well, I have kids and a job and all sorts of things to keep me busy other than this.
But after a long silence, there are a couple of things which warrant some words, and this is the only outlet.
First, there was this piece of silliness about the matter of "computers in education".
http://newsletters.silicon.cneteu.net/t/81893/642015/71164/0/
Teachers fear computers in the classroom
IT interferes with 'genuine' book-based learning, study finds
By Andy McCue
Tuesday 13 September 2005
Silicon.com
"Schools are failing to take advantage of IT in the classroom as teachers worry computers will interfere with traditional book-based learning, according to a new academic study."
IT journalists work from the presupposition that their job is advocacy rather than journalism. Because they're there to promote the industry - if not individual advertisers, then at least the entire sector - they forget any kind of healthy scepticism.
As a sceptic, I will note the following aspects of this kind of story.
1) Saying "failing to take advantage" presupposes a benefit. The study also gives the impression that all subjects are equal in the face of IT - that is, whatever you're doing, you can do better with a PC in front of the student.
2) The journalist doesn't mention the considerable industry sponsorship behind the study. Does that taint the research? I don't know one way or the other - but it taints the reporting of it.
3) The research came to the stunning conclusion that "creative subjects" "suffer worst" at the lack of computers. Why would a painter, to pick one, be "suffering" without a computer? Why is it a surprise to the researcher?
4) Another revealing quote: "many teachers simply lack the confidence to take the risk of using technology in their subject areas". "Risk" is not an obligation. And, of course, it could be that the teachers themselves are making an informed judgement.
Of course, the naysayers don't get a look-in. It was a single-source story written from the press kit.

Wednesday, June 15, 2005

Grafedia: Conditioning users to insecurity

The press release view of Grafedia is that it's a new social phenomenon.

Fine. I've never held the sort of rosy view of the world which expects a perfectly comprehensive and accurate description of something in a press release. But surely one of the jobs that media can claim as its own is having sufficient knowledge and scepticism to identify the whole of an elephant even when the press release only describes its leg?

Apparently not.


Grafedia, if I were to believe the cargo-cultists and gullibles on the promotional side, is one of those new things that demonstrate the "interface between the e-world and the real-world".

Here's how it works: you see an address written in chalk on the footpath or stuck with tape to the power pole; you use your mobile to send a message to the address; and you get something back (like "pop art on the mobile phone" – now there's something worth having...) by return message.

Inspired by nothing more than a press release, outlets like CNN (http://www.cnn.com/2005/TECH/ptech/06/13/cell.phone.markers.ap/index.html), Wired (http://www.wired.com/news/culture/0,1284,66992,00.html), CNet (http://news.com.com/Photos+Grafedia+sightings/2009-1026_3-5694946.html?part=rss&tag=5694946&subj=news) and others have tossed out their critical brains and gone with the "wow" angle.


Umm - getting a message in return for a message hardly seems wow to me, but that's another matter.


Have these guys forgotten that viruses are spread by users who open messages from unknown sources?


Have they forgotten that sending messages to unknown places is a good way to end up on spam lists?


Ignoring for a moment that Grafedia is just another one of those lame attempts at breaking through the marketing noise by painting footpaths, it's really dumb to create a "social phenomenon" whose side-effect is to condition users to behave in an insecure manner.