Rick Hurst – Web Developer in Bristol, UK


Month: March 2011

Social media withdrawal tools part 1

I have occasional periods where, for one reason or another, I want to distance myself from social media for a bit. One of the things that usually stops me is the idea of missing communication aimed specifically at me. Facebook lets you control which types of communication (e.g. a message) result in an email, and twitter can email you when someone sends you a direct message, but (as far as I know) there is no built-in way to be notified when you are “mentioned” on twitter.

Here is one technique to get round this – presuming you use an RSS reader of some sort (I use Google Reader), you can go to search.twitter.com, search for your username and then subscribe to the RSS feed for the results.
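
If you’d rather have a script check for mentions than rely on a feed reader, something like this rough PHP sketch would do the same job. It assumes the search results are exposed as an Atom feed at a URL of the form below (check the feed link on the results page for the exact format), and the username is just a placeholder:

<?php
// Rough sketch: fetch the twitter search feed for @username and print any mentions.
// The feed URL format is an assumption – use whatever the search results page links to.
$username = 'example_user'; // placeholder – use your own username
$feed_url = 'http://search.twitter.com/search.atom?q=' . urlencode('@' . $username);

$feed = simplexml_load_file($feed_url);
if ($feed === false) {
  die("Couldn't load the search feed\n");
}

foreach ($feed->entry as $entry) {
  // each Atom entry is one tweet mentioning the username
  echo (string) $entry->published . ' ' . (string) $entry->title . "\n";
}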

Now simply uninstall your twitter client and step away from the firehose, and if someone mentions you, you’ll know next time you read your RSS feeds. Less immediate, but that’s part of the point!

My Idea of March – a decentralised microblogging/ chat system

Chris Shiflett has suggested a blog revival, and I’m having one of those days, trapped between sorting out domestic chores and procrastination and not being entirely productive, so what better time to blog than right now!

One of the current topics being discussed is how twitter have asked developers to stop creating new twitter clients (for the non-technical, a “twitter client” is a program or “app”, such as TweetDeck, that lets you use twitter on your computer or phone without visiting the twitter website). Apart from being an annoyance to people who want to create new clients, this looks to many developers, myself included, like a sign of the corporate machinery clanking into action to control how twitter is used, so they can monetise it more efficiently.

I think this is kind of inevitable – I’ve never seen a service adopted by the masses so quickly, and most long-term users will remember when twitter struggled to scale as it suddenly became a network of multi-millions rather than a few hundred or a few thousand. Any social network system like this needs a critical mass of users to make it useful – but looking at my own needs, that critical mass consists of a couple of hundred people who are mostly friends and colleagues in the web/ digital media industries. I don’t use the “trending” stuff and I’m not interested in following celebrities, so a critical mass for me would be for all or most of those people to adopt another system. Yesterday there were mumblings about identi.ca, and sure enough you’ll see that I’ve reserved my username (after some signup confusion and accidentally signing up to a mailing list instead).

Identi.ca is an open source solution and I look forward to seeing what develops with it, but I can’t help feeling that the real solution (especially for the technically inclined) is a decentralised system, just like blogs have always been – you host your blog either on your own site, or using a service such as blogger, and people consume them directly or via RSS readers. This is another example of geek-led innovation – the geeks were doing it first, the mainstream followed later.

I’ve had a quick look around for a decentralised micro-blog system, and it seems there are a few already out there, but before I look closer to see if any of them would fit my needs, here’s how I envisage it working:-

  • You self-host a micro-blog on your site – it’s just like a blog, but each post has a 140-character limit
  • The micro-blog has an RSS feed, a variant that can optionally include extra info such as “in reply to” and “location”
  • You use a self-hosted micro-blog aggregator to follow other people’s micro-blogs (this could of course include a twitter stream, using RSS not their restrictive API) – see the rough sketch after this list
  • The aggregator could be on your site, or even running locally on your machine, and of course you could build ANY DAMN CLIENT YOU LIKE to view, interact with, and post to your micro-blog
  • You could hook into most of the existing services for things like link shortening, image hosting etc.
  • Private messaging would need some thought, but it’s essentially like having a contact form on your website (and the same spam considerations).
  • Popularity contest “follower” stats would be optional – it would actually be pretty difficult to work out how many people are following you, other than analysing RSS stats, which can be misleading.
  • Global search would be tricky without a central database, but I rarely use that.
  • No central point of failure means the platform would be very resilient, and there would be no massive server-farm or staff to fund.
  • No owner means the users run it – people may develop services around it to make money, but there would be no central owner
  • The core tech should be really, really simple – the basic service should need no integration with other services or APIs, and no software installation requirements that might scare people off.
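
To make the aggregator idea a bit more concrete, here’s a rough sketch of how the core of it might work in PHP – fetch the feeds you follow, merge the posts, show the newest first. The feed URLs are placeholders, and a real version would obviously need caching, error handling and Atom support as well as plain RSS:

<?php
// Rough sketch of a self-hosted micro-blog aggregator: fetch each feed you
// follow, merge the posts and display them newest first, timeline-style.
// The feed URLs below are placeholders for illustration only.
$feeds = array(
  'http://example.com/microblog/rss',
  'http://another-geek.example.org/updates.rss',
);

$posts = array();
foreach ($feeds as $url) {
  $feed = @simplexml_load_file($url);
  if ($feed === false) {
    continue; // a dead feed only affects that one person – no central point of failure
  }
  foreach ($feed->channel->item as $item) {
    $posts[] = array(
      'time'   => strtotime((string) $item->pubDate),
      'author' => parse_url($url, PHP_URL_HOST),
      'text'   => (string) $item->title, // the 140-character post
    );
  }
}

// Newest first, like a twitter timeline
usort($posts, function ($a, $b) { return $b['time'] - $a['time']; });

foreach ($posts as $post) {
  echo date('Y-m-d H:i', $post['time']) . ' ' . $post['author'] . ': ' . $post['text'] . "\n";
}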

So, once the geeks have invented the platform, there would be a similar barrier to entry for participating as there is for starting a blog – you would either need to install it, roll your own, or sign up with one of the hosted services I can imagine popping up a few months later. Mainstream adoption would therefore be much, much slower, and because of the way I want to use a service like this, that isn’t a bad thing.

Looking at my twitter profile, I currently follow 372 people, a number that I’d like to get down, but social etiquette dictates that I only unfollow people if they get ridiculously noisy or off-topic. I bet if I analysed how many people I regularly interact with or find unmissably interesting, I would get that number down to fewer than 100. If I ever get round to building it, I wonder if I could persuade 100 geeks to try it? Of course, as mentioned earlier, I could build an aggregator to follow people’s twitter streams via RSS, so I don’t necessarily need other people to adopt the platform immediately – maybe that’s where I will start. I wonder if scraping RSS feeds counts as use of the twitter API?

archived comments

Some great ideas here Rick. After another fail whale this evening decentralized seems like a good idea.

Joe Leech 2011-03-16 22:08:33

The de-centralized nature of XMPP would work really well as a solid back end for a new service such as this. It already supports (via the pubsub mech) the idea of posts as well as private messages and has the advantage of support for private as well as public multi user chats. All running on privately or publicly owned servers communicating with each other using an open source protocol. Sounds ideal!

There’s also the additional advantage that XMPP is real time so no polling.

You just need to design the experience to feel less like “IM” and more like twitter which would be achievable.

Stefan 2011-05-25 11:17:10

A few gotchas when your Drupal site is being deployed behind caching servers and proxy_pass

I recently launched a Drupal-based site that forms part of the website for a global company (being a freelancer working through a design agency with an NDA, I can’t talk about it any more than that, or stick it on my portfolio, unfortunately!). I thought I’d make a few notes about some of the issues I had to overcome.

The site itself is hosted on a dedicated VPS, but served to the world through Akamai caching servers, which means that everything is cached by default. Therefore in this set-up, the CMS is only available at a different URL, where caching is bypassed. Gotcha no.1 is that cookies set and read via PHP do not work in this scenario* – most requests are answered from the cache, so the PHP that would set or read the cookie never runs for them. Fortunately JavaScript cookies can still be used.

In addition to the caching, the site is served up as part of a much bigger site, /deep/down/in/the/url/structure, so proxy_pass is used (before caching) to rewrite the paths. Gotcha no.2 is that base_path() in Drupal picks up the path of the origin server, so I had to add a condition like this in my settings.php:-


if (!empty($_SERVER['HTTP_X_FORWARDED_HOST'])) {
  // request came through the proxy – build $base_url from the public host
  // and the public path prefix rather than the origin server's own host
  $base_url = 'http://' . $_SERVER['HTTP_X_FORWARDED_HOST'] . '/path/to/my/proxied/site';
}


The clue to gotcha number 3 is in that last example. $_SERVER['HTTP_HOST'] reports the host of the origin server, rather than the public host, so if it is used anywhere in your code, it may cause issues. I ended up adding a getHost() function that I use to return the appropriate host, depending on where the site is being viewed:-


// Return the public-facing host if the request came through the proxy,
// otherwise fall back to the origin server's own host.
function getHost() {
  $host = $_SERVER['HTTP_HOST'];
  if (isset($_SERVER['HTTP_X_FORWARDED_HOST'])) {
    $host = $_SERVER['HTTP_X_FORWARDED_HOST'];
  }
  return $host;
}
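
Anywhere the code would otherwise reach for $_SERVER['HTTP_HOST'] to build an absolute link, it can use this helper instead – a hypothetical example (node/123 is just a stand-in path):

// Hypothetical usage: build an absolute URL using the public-facing host.
// base_path() picks up the proxied prefix thanks to the $base_url override above.
$canonical_url = 'http://' . getHost() . base_path() . 'node/123';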


Gotcha no.4 was image paths in optimised CSS – both the Drupal-provided CSS caching and some external stuff I had set up using Minify rewrote the image paths to use those from the origin server. This meant that CSS-applied images worked in the non-cached editing environment but not on the live site. I haven’t come up with a solution for that one yet, other than to leave some of the stylesheets non-minified.

* maybe there is a solution to this, but assume in this case there is little or no scope to request server config changes.