Using that second display: 4 news visualisations of questionable utility

For both your and my ever-decreasing attention spans, in the race to distinguish and spice up the daily news product, here’s more news, shallower and faster.

MSNBC Spectra screenshot

Spectra from MSNBC is a pretty terrible romp into 3D. Pretty, but completely unusable and just rather useless. You select which channels of news you want, and as you do, a selection of stories from each channel floats into the display as a rotating ring. It wouldn’t be so bad if you could actually click on the floating news items. But no, that does something completely unexpected: it ejects that entire ring of stories. To get to a story you want, you have to navigate via a ridiculous horizontal scrollbar. I thought we had learnt in the 90s that 3D interfaces like this just don’t work. From Information Aesthetics via Data Mining.

Newsmap

Moving from the realms of insanity to the merely slightly overwhelming comes Newsmap, based on Google News.

Digg\'s \"Big Spy\" visualization

Digg\'s \"Stack\" visualization

From the very epitome of fickle and populist news rivers comes a selection of cool-looking, fast-moving and not especially value-adding visualizations in their Labs section.

Mapped Up screenshot

Finally comes a low-key (and the most embeddable of the lot) Flash widget that just rotates geo-coded stories on a world map.

Will WinFS return? Will anyone care?

Guns N’ Roses started recording their ‘upcoming’ album Chinese Democracy in 1994. George Broussard started work on Duke Nukem Forever in 1997. Both titles have become standalone jokes in the music and game industries respectively, commonly regarded as having a release date in the vicinity of Armageddon. In some form, Microsoft has been working on WinFS (on and off, and not always under that name or scope; to be honest, I’m probably not being entirely fair to them in this paragraph) since 1990, promising at some point that it would ship with multiple versions of Windows, and always pulling it before release. When it appeared in the Longhorn betas (the OS that would become Vista), it was slow and incomplete, and the eventual news that it would be pulled from Vista too wasn’t terribly shocking. Vista itself was at risk of becoming a perpetual vapourware joke like Duke Nukem Forever, and after five years of development MS was very painfully aware that they needed to get something out the door. So, to much jeering at having once again over-promised and under-delivered, one of the three pillars of Vista was dropped.

Not that there wasn’t good reason for taking a long time about it. It was actually a really, really tricky problem they were biting off. Or an entire set of problems. WinFS was Microsoft’s latest iteration on their ongoing attempts to unify (or at least bridge) the concepts of a filesystem and a database. It’s the sort of proposal that automatically intrigues computer scientists (as can be seen in the many other attempts at it). Why the hard separation between data stored as files and data stored in a database? Surely the two could be unified, and imagine what it would bring! You could store your music and movies in a normal directory structure, and access them through the normal file access means, but browse them by artist, genre, or with queries such as “All movies directed by Steven Spielberg from the 1990s”.

Why it died

WinFS, as a monolithic entity integrated into Windows, died for a number of reasons:

The Web – The concept of an Object Filesystem was something MS had been touting since 1990. It also made more sense back then. In 2006, with the web taking off and obviously becoming the new place for data to live, it made rather less. Why bother maintaining your contacts as entries in a local database so you could perform interesting queries on them, when Facebook et al could do a better job for the most part, in a centralised location? And if the trend of apps moving to the web continues, then WinFS as a client-side application is weakened drastically as your data moves out of its purview.

Desktop search: good enough – The biggest use case for why WinFS would be awesome inevitably worked out to be desktop search. But when Apple introduced Spotlight for OS X, which was just a simple (compared to what WinFS hoped to achieve) file indexing service, it made a mockery of the need for such a complex solution to the problem. The release of Google Desktop for Windows pressed on this wound. Eventually Microsoft released their own desktop search client for XP, an embarrassing and face-saving move given that file indexing had already existed as a turned-off-by-default option in XP.

Embedded databases: good enough – The other client-side stories for why WinFS would be a good thing often involved media files. Everyone likes movies and music, and they’ve got lots of meta-data associated with them: genre, year, length, artists, albums, etc. Lots of ways to order and query a collection. The problem for WinFS was that it was never clear why this couldn’t be just as easily handled by an application-specific database. Like iTunes. And the tides were shifting on this front: cross-platform development had become important again, and SQLite is a lot more Mac and Linux-friendly than a new Windows-only API would be. It’s also a lot more existent. Developers like that.
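
To make the “good enough” point concrete, here’s a minimal sketch of the application-specific approach, assuming Node.js with the better-sqlite3 package purely for illustration (the table, file paths and data are all invented; this isn’t anything iTunes or WinFS actually shipped). The media files stay on the ordinary filesystem; only the metadata goes into the embedded database, and the “Spielberg in the 1990s” query from earlier becomes a one-line SELECT.

```javascript
// Illustrative only: an application-specific media database, the "iTunes approach".
// Assumes Node.js with the better-sqlite3 package (npm install better-sqlite3).
const Database = require('better-sqlite3');

const db = new Database('media.db');
db.exec(`
  CREATE TABLE IF NOT EXISTS movies (
    path     TEXT PRIMARY KEY,   -- the file itself stays on the ordinary filesystem
    title    TEXT,
    director TEXT,
    year     INTEGER,
    genre    TEXT
  )
`);

const insert = db.prepare(
  'INSERT OR REPLACE INTO movies (path, title, director, year, genre) VALUES (?, ?, ?, ?, ?)'
);
insert.run('/movies/jurassic_park.avi', 'Jurassic Park', 'Steven Spielberg', 1993, 'Adventure');
insert.run('/movies/the_matrix.avi', 'The Matrix', 'The Wachowskis', 1999, 'Sci-Fi');

// "All movies directed by Steven Spielberg from the 1990s" -- the canonical WinFS
// demo query, answered by a plain embedded database with no filesystem integration.
const spielberg90s = db.prepare(
  'SELECT title, year FROM movies WHERE director = ? AND year BETWEEN 1990 AND 1999'
).all('Steven Spielberg');
console.log(spielberg90s);
```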

But who can really say what “dead” means?

Jon Udell has an interesting interview[1] (for certain definitions of “interesting”) with Quentin Clark, a manager at Microsoft who used to be the project manager on WinFS.

Take-away points:

– Microsoft has rolled the WinFS development into its backend software efforts, ADO.NET and SQL Server. This isn’t news. And it makes sense in its own way. If web-apps are going to be where we store our data in the future, then the databases backing them are going to become our filesystems in a sense. Although if you think about it that way, then we’re a good way already towards the WinFS vision, and WinFS as it was originally envisioned is once again undermined by good-enough approaches.

In the interview Clark talks about several features in SQL Server 2008 that keep alive the WinFS dream:

– They’ve added a FILESTREAM column type, which references files on the underlying NTFS partition rather than storing the data itself in the database, making for better performance with large binaries.

– They’ve added a hierarchical column type. You know, hierarchical, like a directory structure.

– They’re going to add a Win32 namespacing function, which will expose the database to Windows as another file storage device you can then browse and do all the usual fun stuff with. WinFS by complex stealth. There’s more than one project for Linux that does the same thing through FUSE.

So in short, SQL Server 2008 will be able to store large files just as well as NTFS can. It will be able to describe hierarchical data structures. It will be accessible from the Win32 file-system APIs. It’s pretty much offering everything WinFS did, except for the client-side specific schemas (such as contacts, music, etc).

It’s also still as useless for most users.

I think the interesting part about all this (certainly SQL databases are a subject I struggle to get excited about) is that once you examine the WinFS idea from the server end of things, stripped of the client-side schemas and vapourous UIs and dubious use-cases, it’s pretty mundane. That is, it’s everyday stuff. The critical step has already taken, or is taking, place: we’re moving our own data off the filesystems and into the cloud, where it’s shaped behind the scenes into whatever schema is best for the application, with an interface on the front-end designed to fit that structure. Files can exist in multiple folders in Google Docs. Social networking sites deliver on much and more of the contacts functionality originally promised by WinFS. iTunes structures your music collection as a simple database and a collection of playlists built from queries into it (dynamically if you wish). The battle WinFS was going to fight has already been won. The next one is one Microsoft was never going to fight anyway: the battle for the structure and open exchange of this data.

[1] Via OS News

I’ve got brick and ore for wheat

In the world-popular board game Settlers of Catan, there are five types of land tiles that you can occupy in order to obtain resources to fuel your growing empire: quarries for brick, mountains for ore, forests for wood, plains for wheat, and grasslands for sheep. In your bog-standard Settlers game, the most important two to secure are invariably brick and wood: you need them to build settlements and roads, and the game will often go to whoever can expand the fastest. But in the expansion set Cities and Knights, the games take longer, and the dynamic shifts. Invariably I find the most important land tile to secure over the long run is plains. In the longer game you run up against the limits of land. You reach the point where you can’t gain points from expansion anymore, and so you must develop what you have. Your settlements have to turn into cities, and that takes wheat (and ore). Then you need to raise and maintain knights to defend them, and that takes wheat too. The thing about the game is that if you only realise this need once you’re starting to turn your settlements into cities, you’re going to be starved by the players who occupied all the prime spots from the start. The game’s often made or lost by what tiles you initially claim.

Food supplies are one of those things you have to think long-term about if you want to win. And two of the latest Stratfor podcasts[1] give an insight into how one of the governments better known for its long-term thinking is viewing this particular phase of play, given the current food and fuel crisis.

The first discusses a proposal by China’s agricultural ministry to provide incentives (think tax breaks, subsidies for farm buildings and projects, etc) for companies to purchase or lease farmable land in foreign countries, and particularly Australia. This indicates the long-term aspect of their planning: currently four out of ten Chinese are farmers (albeit in a peasant economy), and the country is still a net food exporter. But the podcast cites a McKinsey report saying that China’s urban population will reach 1 billion by 2030. By 2025 there’ll be 219 cities in the country with individual populations over 1 million. And for a country with currently 19.6% of the world’s population, it’s only got 9% of the world’s arable land. Additionally, the East Asia analyst who starts talking partway through points out that food imports are historically important to China as a result of their Great Famine (although looking back over the list, they’ve had a few).

So they’ve been acquiring foreign land for this purpose since the mid 90s in places like Cuba and the Philippines. This is not new. That they will be looking to Australia is. To continue the Settlers allusion, Australia can be thought of as two of the world’s largest quarry and plains tiles.

As the second podcast goes into detail to explain, although Australia is commonly regarded as drought-stricken (which it still is in the wheat and rice-producing parts, though it’s hopefully emerging from this), its northern states in particular have great potential for agriculture (i.e., it rains there in tropical amounts). And it’s very underdeveloped in this respect. The rice and wheat growers are partly where they are (i.e., under drought) for historical reasons as much as anything else, and prefer the southern climate. Australia stands to gain from it in a small way (93% of all foreign-owned land in Australia is Crown leasehold, rather than freehold, and Victoria at least imposes a 20% additional land tax on foreign owners [2]), but it’s probably not how the government would like to see things going down. It’s far better for them to actually export the food to China, and make use of the Free Trade Agreement they’re currently negotiating. I wouldn’t be surprised if the Australians do pass some more protectionist laws in this regard. Its use-of-local-labour and property ownership laws are strong when it comes to foreign entities. Additionally, in the north most land isn’t freehold but leasehold [3]. That makes things easier for the government, but they are still very much constrained by the free market. A lot of farmers there are hurting from the drought, and China’s willing to spend large on long-term assets.

Other countries China’s interested in (and there are a few, in the interests of risk management by diversification) include that other big grain bowl of the world, Russia. But it’s one thing for them to talk about buying land in Australia, and another in their large, border-sharing neighbour. The Russians are even more sensitive about foreign ownership of land, and if the new president Dmitry Medvedev is anything like his patron Putin (who was rather strident in nationalizing gas and oil supplies), then he probably isn’t going to be a strong advocate for free market liberalization and foreign investment. The Stratfor podcast concludes that if similar farm purchases happen in Russia, we won’t be hearing much about them due to this sensitivity. I expect that if it does start happening in Australia, particularly near an election cycle, we’ll be hearing plenty about it. The Australian reaction will be worth watching.

This can be regarded as an over-cautious strategy in some ways. China’s economy will soon be the world’s largest again. It’ll have the money to buy its food for a long time. So this is interesting (even if it doesn’t happen) in how it shows how sensitive the Chinese government is to food pricing. To keep a population that large happy, you need a lot of bread, and a lot of circuses.

[1] If anyone’s feeling particularly generous and for some reason wants to spend several hundred dollars on a birthday present for me next month, I’d appreciate a StratFor subscription. Failing that, I’ll just keep on going with their free content features. Their daily podcasts I can particularly recommend as being a neat way to go beyond the typical media depths (shallows) of analysis.

[2] Doug Cocks – Use with Care: Managing Australia’s Natural Resources in the Twenty-First Century

[3]  http://www.bmr.gov.au/education/facts/tenure.htm

John Resig just released something rather neat

John Resig just released a rather awesome Javascript library that implements the Processing language.

Jaw status: dropped. I empathise with the Reddit commenter who had nothing more to say on this release than “I give up on programming now.”

Make sure you check out the extensive list of demos. The long-predicted competitor to Flash that Javascript + <canvas> could be may soon be upon us. Good. Or at least, some neat games should come out of it.
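
For a sense of the raw material Processing.js builds on, here’s a minimal hand-rolled sketch against the bare <canvas> 2D API: a ball bouncing back and forth, the “hello world” of Flash-style animation, in nothing but Javascript. The canvas id and dimensions are assumptions for the example; Processing.js wraps this kind of drudgery in the far friendlier Processing syntax.

```javascript
// Minimal sketch, assuming the page contains:
//   <canvas id="stage" width="300" height="150"></canvas>
var canvas = document.getElementById('stage');
var ctx = canvas.getContext('2d');
var x = 20, dx = 3;

setInterval(function () {
  ctx.clearRect(0, 0, canvas.width, canvas.height);    // wipe the previous frame
  ctx.beginPath();
  ctx.arc(x, canvas.height / 2, 10, 0, Math.PI * 2);   // a 10px-radius ball
  ctx.fillStyle = '#c00';
  ctx.fill();

  x += dx;
  if (x < 10 || x > canvas.width - 10) dx = -dx;       // bounce off the edges
}, 30);                                                // roughly 33 frames per second
```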

The line between post-modernism and madness…

… is a social construct. Or so it would seem from the story of the lecturer at Dartmouth College who decided to sue her students for harassment when French literary theory, and how it applies to science, didn’t go down too well with them.

From the Wall Street Journal:

Priya Venkatesan taught English at Dartmouth College. She maintains that some of her students were so unreceptive of “French narrative theory” that it amounted to a hostile working environment. She is also readying lawsuits against her superiors, who she says papered over the harassment, as well as a confessional exposé, which she promises will “name names.”

Ms. Venkatesan’s scholarly specialty is “science studies,” which, as she wrote in a journal article last year, “teaches that scientific knowledge has suspect access to truth.” She continues: “Scientific facts do not correspond to a natural reality but conform to a social construct.”

The journal article in question. I can’t imagine why her science students objected to any of these arguments.

[COI declaration: I tried to read Foucault a few weeks ago. It was a 10 page essay. I got 3 pages in before I gave up.]

Via Gawker and Julian Sanchez.

Graceful Degradation, or Progressive Enhancement?

There’s a question of design philosophy in software that describes two diametrically opposite ways of theoretically getting the same results: top-down or bottom-up? Traditionally we’re supposed to do the former, designing the big picture first and then filling in the details until we’ve built all the way down from abstracted design to concrete reality. We usually do the latter, building little lego bits and then trying to connect them into a structure approximating the original design.

But in a sense, in the world of web application design, where “best practice” isn’t just a moving target but one moving in quite different directions, the situation is reversed. We’re doing top-down experience design, when we should really be doing bottom-up. The distinguishing issue is that on the web, we’re not just creating one design, we’re creating a suggested design that will then be rendered in a whole multitude of ways.

Normal practice in web design/development is to work out what you want to do functionally, then make the call on what technology (Flash, Shockwave (remember that?), Java, AJAX, ActiveX, PDF, or even Silverlight) would be best for making that happen, evaluating “best” as a measure of time, expense, longevity, security, and market support. And then, if time allows, you design fallbacks for clients without those technologies.

Chris Heilmann has done a good job advocating the opposite philosophy of progressive enhancement. This is the philosophy of starting your site/web-app design with the lowest common denominator and producing a functional product at that tech level. If it can’t be done, you need a good reason why. Then you progressively layer on “richer” technology. It’s the humble and unassuming philosophy: you don’t presume more than you must about your user and their circumstances.

They’re two opposing philosophies that theoretically should give the same results. You start high-tech and work backwards, or you start low-tech and move forwards.

The problem that works against this is Hofstadter’s law: work has a knack of taking longer than you expect. Unexpected new things to work on arise, and then you start budgeting your time and triaging things you shouldn’t. In the first design model, you would design low-bandwidth HTML versions of your all-Flash site. Unless a new feature request came in and you had to implement that first in the Flash. Eventually you just give up and require that your clients all use Flash. Then you wonder why Google isn’t doing such a hot job of indexing your site anymore. Or you bite the bullet and spend a lot of time doing things properly. As soon as you start prioritizing the high-tech experience as the primary and complete version, you’re constraining yourself against future flexibility. And then you sometimes end up irrationally justifying that primary experience in places where it shouldn’t really exist.

The positive reasons for progressive enhancement then start flowing out of varied examples. There are increasing numbers of users who use something like the Flashblock extension (I use it because I’m sick of Flash-based ads, especially the ones that start streaming video, sucking bandwidth without my permission). Similarly, people have taken to using NoScript, an extension that imposes a white-list on allowed Javascript. And don’t forget the disabled: screen readers for the visually-impaired do a really bad job of handling Javascript. So does the Google web spider, for that matter. Or take the iPhone, a suddenly popular new platform that completely eschewed Flash. If you had invested in a site that required Flash, you were inaccessible. If you had built a site around progressive enhancement, you were much better equipped to support mobile Safari. So adopting a philosophy of progressive enhancement in these cases improves support for niche users, accessibility, search engine coverage, and an unforeseen new platform.

This means things like coding HTML that’s completely functional without Javascript or Flash. They’re technologies it’s often reasonable to assume the average client will have. But unless you can really justify the assumption, you shouldn’t make it.

It involves things like not using links with href="javascript:doThis()" or onClick event handlers hard-coded into their HTML. Instead, just give the links decent ids and then add the event handlers dynamically from Javascript. It’s not hard to do, if you do it right the first time.
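
As a minimal sketch of that pattern (the ids, filenames and markup here are all invented for illustration): the HTML stays a plain link that works everywhere, and the script layers the richer behaviour on after the page loads. Without Javascript the link simply navigates to comments.html; with it, the same markup gets the in-page version.

```javascript
// Assumed markup, fully functional without Javascript:
//   <a id="show-comments" href="comments.html">Comments</a>
//   <div id="comments-panel" style="display: none">...comments rendered here...</div>
window.onload = function () {
  var link = document.getElementById('show-comments');
  var panel = document.getElementById('comments-panel');
  if (!link || !panel) return;        // elements missing: degrade silently

  link.onclick = function () {
    panel.style.display = 'block';    // enhanced path: reveal the comments in-page
    return false;                     // cancel the default navigation to comments.html
  };
};
```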

There are some surprising offenders in this class. Try accepting a friend request on Facebook with Javascript turned off. You can’t actually click the button, and there’s no reason that should be so. Why did I run into that?[1] Well, if you’re the site owner, does it matter?

I had a Dynalink switch with firmware that broke the rule too. It used Javascript-powered links for its navigation, instead of plain HTML. I wouldn’t have noticed, if it weren’t for the Javascript not actually working on browsers that weren’t Internet Explorer. There was no earthly reason for those links to use Javascript, and every time I had to load up IE (particularly if it involved a reboot to Windows to do so) just to configure my switch, it didn’t do much for my opinion of Dynalink.

If you’re a web developer and you’re not already doing this or haven’t heard of the idea before, I strongly encourage you to read Chris’ full article on progressive enhancement. If you haven’t, but you’re exercising sound development principles (separation of code and content, observing standards, using semantically sensible markup, designing with accessibility in mind, etc.) you’re probably already most of the way there. But do skim over it all the same. It’s a descriptive philosophy that successfully captures much of what we should already be doing, but for reasons that have previously fallen under different hats.

So Trent Reznor’s feeling generous

NIN logo

It was only a few days ago that I described how I bought NIN’s latest album Ghosts for US$5 on the strength of Trent Reznor putting the first quarter up for free, and selling it in a high-quality DRM-free format. Then he released a single, Discipline, for free. Discipline was alright, but I’ve been loving the instrumental Ghosts. It makes for great programming or reading music (ambient yet interesting, without vocals or annoying bits), and at 110 minutes it’s long enough to suck you in.

I can’t call Ghosts his latest album anymore. The ID3 tag on the Discipline file said to visit NIN.com on May 5. Reznor seems to have been bottling up the creative urges over the years, waiting until he could be released from the shackles of his contracts, because he’s somehow just released another album. Ghosts was only released on March 2. It’s called The Slip, and includes Discipline. And like the single, the album’s free to download. I would offer a first-listen review right now, but I’m at Uni at the moment and am going to wait until I can download it from somewhere where the bandwidth doesn’t cost so much (a situation whose oddity I was reminded of last week by a new PhD student here from the Netherlands complaining that we have to pay at all. Would Page and Brin have been able to start Google in a CS dept where they were paying 2.5c/MB during the day? But I digress).

There’s a promised CD version coming soon, but the monetization strategy (beyond just generating good will and fan interest) came in another email (downloading from NIN involves handing over your email address. It’s a pretty fair trade):

Nine Inch Nails is touring the US and Canada this summer. Premium tickets for all NIN headline dates will be made available to registered nin.com members in advance of public on sales. Pre sale tickets are personalized with the members legal name printed on the face of the ticket and ID will be required for pickup and entry into the venue on night of show. Pre sale ticket supplies are limited and available on a first come, first serve basis. Our goal is to put the best tickets in the hands of the fans and not in the hands of scalpers and/or brokers. Register at nin.com and check the performance page for additional tour updates.

And then it lists 26 concert dates and venues. It’s similar to the higher quality purchase options that were offered with Ghosts, which went all the way from a $5 download, to a $10 double CD, to a $75 deluxe edition, to a US$300 limited edition collector’s box which probably included a handcrafted figurine of Trent or something. They sold out of all 2500 of those. Clearly Reznor’s realised the value in catering to both the long end of casual listeners and the short end of dedicated fans. He’s now doing what many have predicted will be the best long term new business model for music: give away the music to act as a promotion for the concerts, and be varied in the product range you offer.

Update: Downloaded the album, and it’s great.