Dalton Caldwell recently posted about his experience using Orkut back in the day. It was a great place, until the community drifted away from what he wanted it to be. In the rest of the post, he talks about Critical Mass and Network Effects.
In short: for a social site to become sustainable, it needs to reach critical mass. Users come back because enough is going on to bring them back. This in turn creates more value, leading to the characteristic hockey-stick growth. But that value can't keep growing indefinitely, due to constraints in how the medium is structured and our own limits in processing information. This is the fun part, and the challenge that a lot of these sites are struggling with (Facebook, Twitter, etc.). Users won't stop connecting to more users, whether they are close new friends or people you don't know at all (e.g. a new follower on Twitter). To sustain critical mass discourse, some things need to change.
The one that's been stuck in the back of my mind is decreasing the amount of “entropy” created by the growth of information in the network.
The larger your graph becomes and the more information you share, the more you start looking at sharing differently. With each new piece of information added to the network (in the form of updates, photos or new connections), value is added, but the returns start to diminish. More information means more to consider.
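The diminishing-returns idea can be sketched as a toy model (entirely my own illustration, with made-up numbers, not anything measured): each new item in the stream adds a bit less value than the last, while the attention it demands stays roughly constant, so the net value of the stream peaks and then declines.

```python
# Toy model of diminishing returns in an information stream.
# Assumption (mine, for illustration): the n-th item contributes
# marginal value ~1/n, while each item costs a fixed amount of attention.
def net_value(n_items, attention_cost=0.01):
    # Total value grows like the harmonic series (~log n);
    # total attention cost grows linearly with the number of items.
    value = sum(1.0 / k for k in range(1, n_items + 1))
    cost = attention_cost * n_items
    return value - cost
```

Under these assumptions net value rises at first, peaks where the marginal value of an item drops to the per-item attention cost (here around 100 items), and falls thereafter: more information, more to consider, less benefit.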
Let me try to explain with a metaphor (and perhaps a lame platitude):
To get a car driving at top speed, it is easier and faster to start from 1st gear. Once it kicks in, acceleration increases, gears change (value is still being added), top speed increases… up to a point where any further addition to the top speed brings new problems that weren't evident before. Driving faster means a greater chance of an accident and more considerations to take into account. Instead, the car and driver tend toward a sustainable equilibrium speed.
The same goes for social networks, yet these sites haven't looked at how to trend towards an equilibrium of critical mass discourse. Recently, however, that has begun to change. There is less informational baggage hanging about.
Facebook, with its new Timeline feature, summarises old content. Unless you know where to find it, it's nigh impossible to just scroll through it all. Users you haven't interacted with don't show up in your feed; everything is becoming more and more filtered. Facebook is running at critical mass. It accelerated from a small college site, adding features along the way and growing into the behemoth it is today.
Twitter is a great example as well. The system is thankfully still relatively simple (not too much to consider). It is, and always has been, focused on “What's happening”. This is fleeting, and it shows: there is less old information bogging new interactions down. It is much easier to run at critical mass with a system like Twitter. However, Twitter isn't doing any filtering, and the result shows: critical mass discourse occurs in groups of fewer than 150 people (http://arxiv.org/pdf/1105.5170).
What it comes down to is that running a complex system at critical mass is difficult. You won't know in advance whether the system you built will be the cause of its own death, just as the car only shows its new troubles at higher speeds. Trying to change such a system at that stage can confuse users, so instead these sites will have to change constantly.
P.S. I’m currently researching information overload on microblogging services. Sometimes it is just nice to write in a non-academic way and be able to use unbacked opinions (that hopefully will become scientifically backed opinions).