It’s been a long time since I was last involved in making or promoting a website. I did several of varying quality as an occasional, casual job during high school, for businesses that ranged from very small to merely small. The most dynamic they ever got was one site with a single page that could be edited by the owner, courtesy of a hacked TinyCMS. But it managed to turn into my first ‘proper’ job when I worked over one summer for a local software development firm, which at the time had a headcount of about 200. It was a pretty normal office boy job, except for my primary task, which was to work on ways of improving the company’s website ranking in search engines.
My thinking at the time was that all that mattered was Google, but as this was before they were a household name, I wasn’t able to easily convince my boss of this. So most of my time was spent browsing the forums at Webmaster’s World and the affiliated SearchEngineWatch, and, given how little influence I would have on the site itself, learning how inadequate all the measures I could recommend or implement would be. The age when meta tags in a page actually meant something was quickly retreating. Remember the advertisements for programs that would automatically submit your website to hundreds of different search engines? This was the tail end of that era. Anything beyond those sorts of frivolities was not something I could really change. I remember spending a lot of time after that on tasks like writing Python scripts to check the site’s ranking on various search queries, filing papers in binders, and burning CDs. Everything but improving search presence. In fact, the Python was just an excuse to program, to do something to keep my mind interested.
That was five or six years ago. So when Cameron told me that he was going to be working on the website for the new Campus Church at the University of Canterbury, I was more than happy to offer my support. Collaboration on a project’s easy when your bedrooms share a wall. And then, in a completely objective manner, I pushed him towards a web framework I’d been wanting to try out on a real project for some time. Django turned out to be a rather fantastic little framework, and we’re very happy with the results. I might have to dedicate another post to singing its praises. jQuery too.
It’s been five or six years since that summer job, and the trends that were emerging then have solidified. And by trends I really mean just one trend: that Google is all that matters in search, particularly since the others have all since adopted variations of the PageRank algorithm into their own engines. Before I continue, it’s probably worth noting that search exposure is only one prong of a publicity campaign. Our target demographic (university students at a particular university) is extraordinarily suitable for a Facebook campaign, especially with how cheap Facebook ads are at the moment, and of course we’re planning more traditional advertising (read: talking to people) come the rush of Orientation Week. At the moment we’ve had more visits from Facebook than from search engines. But onwards with the search angle.
But of course, as the recent Scientology Google-bombing shows (FYI, Scientology is a dangerous cult), by far the most important factor in your Google PageRank is still inbound links. This, I feel, is still the case despite the mass PR drop late last year. I remember suggesting during my summer job ways to get links to the site onto more popular sites. It didn’t happen. But something has changed slightly since then that can hamper such efforts, and that’s the emergence of nofollow.
If you’re not familiar with it, the short of it is that adding rel="nofollow" to an HTML link tag acts as a flag to Google that the link shouldn’t count as any sort of recommendation of quality or vote of confidence. Google of course uses links as its primary measure of a site’s value: the more links to a site, the more importance it has, and links from important sites are more valuable than links from unimportant ones. The primary rationale at the time was to take the reward out of blog comment spamming, which was rather big back then.
There are already some bookmarklets like this available if you search Google for “nofollow bookmarklet”, but mine’s slightly more flexible: it allows the rel attribute to have multiple values alongside nofollow (as it increasingly will as more semantic linking is adopted).
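To illustrate the idea (this is a rough sketch of how such a bookmarklet can work, not my exact code): the rel attribute is a space-separated token list, so you split it into tokens rather than doing a plain string comparison; that’s what makes something like rel="external nofollow" match too. Then you walk every link on the page and mark the distrusted ones.

```javascript
// Sketch of a nofollow-highlighting bookmarklet (illustrative, not my exact code).

// rel is a space-separated list of tokens, so split it and look for "nofollow";
// a simple equality test would miss rel="external nofollow".
function hasNofollow(rel) {
  if (!rel) return false;
  return rel.split(/\s+/).indexOf('nofollow') !== -1;
}

// Outline every nofollow link on the page in red and return how many were found.
function highlightNofollow(doc) {
  var links = doc.getElementsByTagName('a');
  var count = 0;
  for (var i = 0; i < links.length; i++) {
    if (hasNofollow(links[i].getAttribute('rel'))) {
      links[i].style.outline = '2px solid red';
      count++;
    }
  }
  return count;
}
```

To turn this into an actual bookmarklet you’d wrap it in a javascript: URL, something like javascript:(function(){ /* code here */ highlightNofollow(document); })(); and save that as a bookmark.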
So install it, and see the hidden web of distrust in front of you. The user-content-driven sites are the most interesting.
YouTube doesn’t trust its users. I can’t really blame them:
Neither does Wikipedia, which makes a lot of sense for them:
But legendary geek news site Slashdot’s a bit more ambiguous. I can understand the nofollow on links in comments (although maybe it’d be a good motivation for users if they were to remove it on comments moderated above a certain level), but if they allow it on the standard user profile Homepage link, why not the front page link in the user’s name?:
Slashdot has a PR of 9/10. That’s massive, and it seems something of a wasted resource that they don’t use it better. Its younger competitors Reddit and Digg (both PR 8/10) don’t use nofollow, and they’re even automated, unlike Slashdot, which still uses human editors to approve all front-page items.
Maybe the web’s ready for a finer-grained method of trust annotation?
And how do I bring this ramble full circle? Well, we go back to my summer job and one thing I tried to suggest several times. There were actually several things I tried to suggest several times, including maybe not storing the list of all the client website usernames and passwords in plaintext for all – myself included – to see on the internal network. But another thing I did suggest was a press release designed to appeal specifically to the Slashdot crowd, and thus earn a juicy front-page link: announce their then-secret work on a Linux port. Get that burst in traffic, and the link. It would’ve worked. But it didn’t happen.