Re-thinking the way review scores work (Vergecast 054)
Hey guys, I just want to throw an idea out there real quick.
When you discussed reviews on the one-year anniversary episode of the Vergecast, I enjoyed getting a breakdown of why you use a number system and what it's supposed to do for the reader.
Scores don't work well. You can't compress an entire device or experience into a single number that means the same thing to everyone. It's simply not possible. Isn't the score supposed to help the reader? Yet it causes more uproar than help, and it encourages people to skip the actual review, which is detrimental. Here is my proposal: dynamic scores.
How would that work? Well, Verge scores are already almost there. Products are separated into different pillars (design, software, etc) and numbers are assigned to each, which then generate a 'main' score.
How about putting it into the users' hands instead? I often read reviews on the Verge that say something like "if you find ___ important, this is great; if you want ___, you might be disappointed," etc. Why not plug that right into the scoring system?
Give each category a scale where the user inputs how important that capability is to them, on a 10-point scale: 1 being "I don't need this capability / it isn't very important to me," 10 being "this is a vital capability / very important to me." Put several categories or capabilities (screen, build, software, OS, hardware specs, etc.) at the bottom of the review, and the user simply fills them out to generate their own personal score based on what they personally find important or unimportant in a device.
There would obviously have to be a back-end set of numbers that the reviewer assigns, but it wouldn't be front and center as "this is the score." It would be personalized, and it would make more sense. And of course there could be an aggregate score averaging everyone's personal scores as well. But that isn't even necessary; the score is between the lines, and people should read more.
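To make the idea concrete, here's a minimal sketch of how a personal score could be computed: the reviewer's per-category numbers get weighted by the reader's importance ratings. The category names and all the numbers below are hypothetical, just to show the math.

```python
# Sketch of a "dynamic score": weight the reviewer's per-category
# scores by how important each category is to this particular reader.
def personal_score(reviewer_scores, importance):
    """Weighted average of reviewer scores (0-10), using the reader's
    importance ratings (1-10) as weights. Returns a score out of 10."""
    total_weight = sum(importance[c] for c in reviewer_scores)
    weighted = sum(reviewer_scores[c] * importance[c] for c in reviewer_scores)
    return round(weighted / total_weight, 1)

# Hypothetical reviewer scores for some phone (not real Verge numbers).
reviewer = {"screen": 9, "build": 8, "software": 6, "battery": 7}

# A reader who cares most about software and battery life.
reader = {"screen": 3, "build": 2, "software": 10, "battery": 9}

print(personal_score(reviewer, reader))  # → 6.9
```

The same reviewer numbers give a different score to a reader who cares most about screen and build quality, which is exactly the point: one review, many scores.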
Anyway, I hope this idea makes sense. I've described it as best I can, and I really think something like this could be a great way to reinvent how product review scores are given. Different products mean different things to different people, and so should a score.
Just a thought. Thanks!