Crowdsourcing III: One More To Tell You About, Recap & Results of The Playlist Experiment
Bravo. You guessed it. They were all built using crowdsourcing.
You got that because you read my first 2 articles on crowdsourcing, right?
Well, just in case you didn’t, read below for a quick recap.
Couldn’t care less either way? Then what are you doing here? Did you do a search on the Eiffel Tower?
OK I lied about the Eiffel Tower, but the other two are bang on.
Here’s where we’ve got to so far (for those of you who are actually, definitely and genuinely interested in Crowdsourcing):
Recap 1: Crowdsourcing: What it is and Why You Should Care
In the first article on Crowdsourcing, we covered what it is and why you should care. Well, you should care because it’s not just a growing trend, it’s all around us in today’s world. We are constantly providing feedback to big companies, knowingly or unknowingly, and that’s shaping their products and services – and that’s crowdsourcing.
Recap 2: Some Crowdsourcing Examples
In the second article on Crowdsourcing I gave you 10 examples. I tried to keep these varied and interesting so you can get an idea of the scope of this thing called crowdsourcing. Obviously there are more examples, and I’ve been sent a few since writing that article, but there’s one more in particular that I wanted to tell you about here, for no other reason than I think it’s a pretty cool example of the power of crowdsourcing. It’s a mapping app called Waze and it’s a great example of how crowdsourcing can be really useful. Will it topple Google Maps? Probably not. Does it even need to? I have no idea, but I thought it seemed like a pretty cool concept. Here it is…
Another Great Crowdsourcing Example: Waze
What better way to trigger a movement to a cool crowdsourced app than to take a loyal ‘crowd’ (happy iPhone users), hide one of the most popular apps (Google Maps) and replace it with your own useless version (the very underwhelming Apple Maps).
Nice one Apple.
So iPhone users have been switching to the turn-by-turn GPS navigation app Waze in droves.
So what is Waze? It crowdsources real-time traffic and accident data so users don’t sit impatiently on the highway any longer than they need to (and they can feel good about helping all the cars behind them). It even allows users to close off roads (i.e. if a road is closed off in real life for any reason – accidents, floods, on-going work, debris… then they can update the app). This way, through the power of crowdsourcing, Waze can change maps in real time to reflect reality.
The road closing feature is simple from a usability perspective but very complex behind the scenes – to prevent people from screwing up the app’s reliability, the Waze algorithm will only mark a road closed after a certain number of drivers ranked highly in the app have also reported it.
Pretty cool, right?
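Just to make that idea concrete, here’s a rough sketch of how a trust-weighted closure threshold could work. To be absolutely clear, this is purely my own illustration (Waze hasn’t published its algorithm), and the reputation scores and threshold below are completely made up:

```python
# Illustrative sketch only: not Waze's actual algorithm, which isn't public.
# Assumption: each driver carries a reputation score, and a road is only
# marked closed once the combined reputation of the reporters passes a threshold.

from dataclasses import dataclass


@dataclass
class Driver:
    name: str
    reputation: float  # e.g. 0.1 for a brand-new user, up to 1.0 for a trusted regular


def road_confirmed_closed(reports, threshold=2.5):
    """Return True once enough trusted drivers have reported the closure."""
    combined_trust = sum(driver.reputation for driver in reports)
    return combined_trust >= threshold


# One brand-new user alone can't close a road...
print(road_confirmed_closed([Driver('newbie', 0.1)]))  # False
# ...but a few highly ranked drivers agreeing can.
print(road_confirmed_closed([Driver('a', 1.0), Driver('b', 0.9), Driver('c', 0.8)]))  # True
```

The point is simply that a single report isn’t trusted blindly; it’s the agreement of several trusted members of the crowd that actually changes the map.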
Another Simple Crowdsourcing Example: IMDB
OK I couldn’t resist giving you one more example because it’s something I use a lot (in fact we have an ‘IMDB rule’ for choosing a film between friends with wildly different taste when on our skiing holidays but that’s a different story…) and it’s simple but very effective – IMDB.
The reason it popped into my head is that, like Waze, IMDB weights contributions using a ‘trust factor’ algorithm for user submissions. I have no idea what that algorithm actually is, by the way, but I know it’s there.
So when I want to know how good a film is, what it’s about and so on, I punch the title into IMDB and it tells me all about it, including that all-important rating and how many people have given their verdict. I very much enjoyed watching a superb film recently after checking out its rating on IMDB first: The Intouchables.
Another great example of crowdsourcing. That particular film has a score (at the time of writing) of 8.6 (which is VERY good for IMDB) from over 161,000 users.
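Purely as an illustration of what a ‘trust factor’ could mean in practice (IMDB doesn’t publish its formula, so the weights below are entirely my own invention), a trust-weighted rating might look something like this:

```python
# Illustrative sketch only: IMDB doesn't publish its weighting formula.
# Assumption: each vote carries a weight reflecting how much the voter is trusted,
# and the displayed score is the weighted mean rather than the raw mean.

def weighted_rating(votes):
    """votes is a list of (rating_out_of_10, voter_weight) pairs."""
    total_weight = sum(weight for _, weight in votes)
    if total_weight == 0:
        return None  # no trusted votes yet
    return sum(rating * weight for rating, weight in votes) / total_weight


votes = [(9, 1.0), (8, 1.0), (10, 0.2), (1, 0.1)]  # two trusted voters, two low-trust ones
print(round(weighted_rating(votes), 1))  # 8.3, whereas a plain average of the same votes would be 7.0
```

In other words, a handful of low-trust (or mischievous) votes can’t drag the score around the way they would in a simple average, which is presumably why both Waze and IMDB bother with this kind of weighting at all.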
Note: I tend to find the scores on IMDB pretty reliable, but I’ve noticed that when I do disagree with a rating it tends to be a small disagreement, and more often than not over a new film. My thinking (and I could be wrong) is that new films generally tend to get higher scores, probably down to the excitement of a new release – so that’s perhaps one thing to watch out for.
But again, a pretty good example of the power of crowdsourcing.
Can you think of any more?
Our Own Little Crowdsourcing Experiment: The LTG Playlist
So, just for a bit of fun and to give a real, participative example, we ran our own small crowdsourcing experiment for 7 weeks. I promised that I’d share with you some of my thoughts, the results and findings from that experiment…
Until now the page with the playlist on it has not been publicised at all, except via email with those taking part in the experiment. It’s been there on the blog but hidden from the articles homepage when published (yes, I have some tricks up my sleeve to do that kind of thing) and used only for this experiment. This was mainly to keep the experiment, and the emails coming back to my inbox, contained – but feel free to leave comments or even add your own votes if you like now that the experiment is finished. You can see the playlist, all of the songs and the majority of the people who took part (in the graphic) on the LTG Playlist Page.
So first I’ll tell you the results of a simple survey we conducted with the participants of the LTG Playlist experiment (the experiment by the way had 213 participants, some more active than others) and then I’ll add some observations of my own in conducting this experiment and how it relates back to the overall crowdsourcing topic we were exploring.
LTG Playlist Survey Responses
Q1: Do you listen to the LTG Playlist?
Subtext: Did we create (or prove that you can easily create) a useful product?
50% said Yes (of these, exactly half listened only once and the other half more than once)
42% said Not yet but they intended to
8% said they hadn’t and didn’t intend to (even though they took part in the experiment and the survey)
Conclusion: Though only really meant as a proof of concept, with this simple experiment we actually created something of use – i.e. a playlist people do actually listen to. (I’d assume the 8% who don’t were interested only in the mechanics of the experiment, as they still took the time to complete the experiment and the survey even though they weren’t listening to the music at all.)
Q2: How many new songs did you learn from this exercise?
Subtext: Was there a decent learning opportunity presented via the crowdsourcing option?
100% (i.e. all) of those who listened learned at least 2 new songs from taking part in this experiment.
The minimum number of new songs learned was 2 and the maximum was 11.
Conclusion: Through taking part in a crowdsourcing experience the participants had the opportunity to learn something new (in this case new songs based on other people’s music knowledge and taste).
Q3: Are there any songs you dislike on the playlist, if so, which ones?
Subtext: Due to its ‘open’ nature, does the crowdsourcing experiment risk producing something of lower perceived quality because of the diversity of the inputs?
87.5% chose the response of ‘Other’ and left comments, most of which were explaining their choice not to pick a number, such as ‘None that I hated’, ‘I generally don’t dislike music’ or ‘None’.
12.5% said they disliked a track, in each case choosing only 1. The most disliked tracks were track 1 and track 7.
Note: In our experiment, had it continued, this could obviously have been further refined by ‘voting off’ less popular tracks via the crowd, though we didn’t take it that far.
Q4: Can you share any thoughts, stories or insights you’ve had from this experiment?
Subtext: Feedback from the participants’ perspectives
Here are some of the responses:
“Shows that the sum of the parts is more than the individual effort. I’m impressed by the ratio of people on the list / people actively taking part. It’s much higher than I would have originally estimated. Also shows how breaking things down into smaller tasks helps to achieve more. The crowd sourcing element is very interesting….”
“Exposure to some new music, part of something new and learning about crowdsourcing.”
“I was interested to see if my song would be picked. I’ve been interested to see what went on the list.”
“I’m not really sure I understand it”
“Honestly, I am not fond of being part of an ongoing experiment that I need to take time out of my day to participate in when I am not sure it will directly affect me in a positive manner. I love the idea of crowd-sourcing but not to the point where I must do something every week. Once is fine.”
“It is interesting to see people get on board. Also I am intrigued if there is a tipping point where it really goes big.”
“It was interesting to see, out of all the songs that could have been chosen, that I knew quite a few of them. For example John Denver, Roger Whittaker, Louis Armstrong – I was surprised to see older songs in the list.”
“There is a wide variety of genres and time periods represented by the list. A lot of them are recent (last 10 years) songs, but there are still others that are from the way back machine.”
“Because we had two pretty big projects in mid January I’ve missed some of what has happened with this experiment. I understand and like the concept. We even have plans for doing some crowd sourcing with another project that is in our pipeline now.”
“This has taken me back to the days where we used to make mix tapes and really treasure our songs – like digest them – have them encompass a memory, or an entire period in our lives. It’s filled me with the nostalgia of those days where summer breaks were summed up in albums and trips abroad meant getting together playlists for the car or the plane. I miss the days where music really meant something, something that was precious and weighed in our hearts and minds.”
“Really great to see how people engage – music brings them together, man!”
Special Thanks To…
I’d like to say a REALLY big thanks to Andrew C., Andy H., Andy A., Barry, Carole, Angela, Hedeel, Jonathan, Razwana, Sylvia & Yvonne, who were the most active participants in this experiment by far (aside from a few who were really actively participating but wish to remain anonymous).
Many, many thanks guys!!
LTG Playlist Experiment – Further Observations
1: It’s just a playlist
At the end of the day, despite the experiment being interesting and a success, it’s just a playlist.
In fact, the proof of concept in quickly and easily creating a playlist via crowdsourcing was satisfied by the very first question, when I asked my email list ‘What’s your favorite song?’.
The LTG Playlist was created there and then, in minutes. The on-going experiment added videos, a page, votes, a graphic and who knows what else, and it could have gone on to create an actual product – e.g. a CD you could buy online.
The reason we didn’t go that far is because I didn’t feel there was enough momentum (more on that shortly).
In conclusion, the initial crowdsourcing experiment (‘What’s your favorite song?’ -> LTG Playlist in minutes) was great, but perhaps the on-going experiment for those interested should have taken a different direction, or had more scope, rather than just continuing with a playlist.
2: It’s all about the numbers
I’m an even bigger fan of crowdsourcing after this little experiment. Even with such a small and basic thing, there are several lessons to be learned. One area which is very interesting is the engagement of those involved. Though we had over 200 participants in this experiment, most of them were very passive and the ‘core’ of very active participants was relatively small. With larger numbers, say 2,000 participants, I’m certain that the core would have been significantly more powerful, with a more self-sustaining engagement – i.e. perhaps that ‘tipping point’ mentioned in one of the feedback comments above would have been reached.
With the size of the experiment being as it was, it was probably right to run it for a short time and stop when we did, even though there were possibly a few other things we could have done with it – this was only ever meant to be a small experiment. Interesting though that someone mentioned a tipping point as I’m sure that’s the sort of thing that can happen very easily via crowdsourcing when the numbers start to get just a little bigger. I still think what we’ve achieved here with this small experiment shows us quite a lot.
3: People are busy and need incentives
One commenter was quite clearly ‘bothered’ by the on-going experiment. I find this fascinating, as they didn’t need to take part and could easily have ‘opted out’ at any time, but instead actually took the time to tell me via the survey that they didn’t have time. Amazing. That being said, I do respect that we have busy lives. Nobody has the right to ‘expect’ anything, and the nature of this particular experiment was most definitely a ‘join in if you want’ kind of endeavor.
So it’s worth noting that anything like this has to have a clear outcome that people understand (WII-FM: What’s In It For Me? – by the way, the answer to that question in this case was always meant to be simply “Being part of a small community experiment and just seeing where that takes us…”), and that you can’t make the ‘rules of engagement’ clear enough to people (even if you do, people are busy, so they may ignore them).
4: Any kind of interaction is good interaction
From day 1 of this, i.e. the very first mail I sent asking for people’s favorite songs, I got some great interaction. I got to know some of my readers much better (some of whom I’d now call ‘friends’ rather than ‘readers’), I got to learn some new things (and discovered some really nice new songs myself – I’d never heard of Angus & Julia Stone), and I got some great feedback, not only on this experiment but in general.
There’s really no such thing as negative feedback, just feedback. Everything has a reason and I respect that, so please do continue to give me your honest feedback, whatever it is.
The feedback and experience were overall extremely positive, with a couple of very touching conversations thrown in, particularly when it comes to music reminding people of loved ones or awakening especially treasured moments in their lives (one of these happened by accident via a song chosen by somebody else).
Honestly, it’s been a privilege to be a part of this and to get to know some of you a little better than before.
What a fun experiment. 50% is a good number of people who listened to the playlist. I think it would be a good number even if it was lower than that. Maybe helping create something makes people more likely to interact with it later.
I’ve never heard of Waze before. That’s a cool app. Since I have an iPhone I think I would love to use this app, as Apple’s maps are definitely not as good as Google’s. That’s a great idea to use algorithms to establish trust. Seriously, I think you can get an algorithm to do just about anything. I know I’m exaggerating, but they do seem to be more important than ever. They’re at least helpful in Waze and IMDB.
I definitely think the WIFM concept is key here – by the end, I was wondering where this was going. It’s fun to start off with seeing where it goes, but after email 3, I was thinking ‘OK Alan, either close this thing down or let’s make it huge’.
It was fun to participate, and to hear songs I hadn’t heard before.
Are you going to crowdsource your way to something else?
– Razwana
Fair enough 😉
I’m not sure how huge a playlist can get, but the whole point of crowdsourcing is that if it was going to grow, it had to be via the effort of the crowd, & I just wasn’t ‘feeling’ it at the decision point around 6 weeks, hence I chose your 1st option and made the following week the last one.
Probably, though I have a LOT of projects at the moment (some of which you know about) so not in the near future (though I guess technically there are elements of crowdsourcing in a few of the things I’m doing).
Interesting Recap, Alan,
It is fun to see what became of the experiment.
I can’t remember if, in the beginning when you asked for a favorite song, you stated there would be a crowd sourcing project. Or did you mention that after the songs had been collected?
Hi Yvonne,
The initial question was put to 100 people from our email list: simply, ‘What’s your favorite song?’. I got something like 22 or 23 replies fairly immediately, so within minutes a playlist was created (there were then a few more that trickled in, which we added on the end to make the list you see now).
The question was asked before I wrote the first article on crowdsourcing, to feed into that post, so that I could show in a very real way just how easy and effective crowdsourcing can be. It was all we really needed to show you the concept in action.
Someone then suggested we continue (can’t remember who) either in the comments or via email, so that’s where the on-going experiment came from. If the ‘crowd’ had been keen enough, we could easily have produced a ‘shelf’ product with this – but as I said above, it was just an experiment.