The Verge User Reviews: First Impressions

I am excited to see that the staff of The Verge understand that the usability and features of the website are just as important as the content itself. We have been hearing, from Joshua and others, that the website will be dynamic and will evolve over time, and it is great that User Reviews is one of the first post-launch features added.

Of course, with any major change to a website, the response from end-users will often be extreme. We techies are generally very opinionated people, and the staff of The Verge are the lucky individuals who get to decide whether each criticism is over the top or something to really consider.

Having said that, below are my initial recommendations for User Reviews, based on an admittedly short time using the new feature. Overall, I think the tool is great and I am very excited to start using it. Most of the recommendations below are not so much critiques as ideas on how to ensure the website obtains accurate observations from end-users with minimal effort. Get ready for a lot of opinions...

1. Category Descriptions

I really enjoy the slimline look of the review charts on The Verge. They provide a very clear, concise, and bold view that allows readers to quickly understand the rating system. However, now that end-users will be reviewing, I believe it would be helpful to show a short description for each category, perhaps in a mouse-over or just below the category header. This would help ensure consistency among reviewers. There is currently no guidance on the review page (or even linked from it) explaining the categories or the 1-10 rating descriptions that the staff use, as described on the How We Score page.

While most of the categories are straightforward (Battery Life), others may not be (Ecosystem, Software). Take Software, for example: are we discussing only the software that comes with the device itself, or all software available for the device? I consider the Ecosystem to encompass the inter-connectivity of my life (email, calendar sync, music sync, etc.), but I can imagine some people grouping Software Availability (such as the App Store vs. the Android Market) under Ecosystem while others group it under Software.

2. Final Score

A few thoughts about the Final Score:

  1. To be consistent with the verbiage used within The Breakdown section of each review, consider naming this Overall Score rather than Final Score ("we reserve the right to tweak the overall score"). I believe Final Score indirectly implies a direct calculation (an average), whereas Overall does not (just a personal thing, of course). I also feel like I am taking an exam when I see the word Final; it seems so End of Days.
  2. Allow for a more granular Final Score (0.1 increments)...The Verge staff get to use a decimal, why can't we?!? With whole numbers only, if I want to rate two products differently, I either have to give them the exact same rating (which would be unusual) or rate one a full point - 10% of the scale - higher or lower than the other; that's a big swing. Consider the iPhone 4S vs. the Galaxy Nexus, for example: would most people give them the exact same rating (probably not), but is one really 10% better than the other (perhaps not...)?
  3. While I agree that the Final Score does not need to be a direct average of the sub-scores, I believe it would help end-users if the average were shown on the rating scale at all times (maybe a gray arrow just below the bottom scale). I realize the average is calculated first and I am then allowed to change it, but it would be nice to see, at all times, how my overall rating compares to the even-weight average. For example, if I rate a product and see that the average is 8.6, I can easily look up at my sub-scores and decide that the Performance category (for example) is most important to me and should be weighted more heavily; since I can see I gave it a 10, I will slide my Final Score up accordingly. The average should not be shown to readers - I just think it provides good guidance during editing. Oftentimes, people have an overall score in mind before even beginning the review, but if they complete the sub-categories and realize the average is a much different number, it may make them think twice about why they are forcing the Final Score to be something different. (A rough sketch of this idea follows the list.)
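
To make that concrete, here is a rough sketch of the even-weight average hint, in TypeScript. To be clear, this is purely illustrative - I have no visibility into The Verge's actual code, and every name below is my own invention:

```typescript
// Purely hypothetical sketch - these names are my assumptions, not
// anything from The Verge's actual code.

interface SubScores {
  [category: string]: number; // each sub-score on the 1-10 scale
}

// Even-weight average of the rated categories, rounded to one decimal.
function evenWeightAverage(scores: SubScores): number {
  const values = Object.values(scores);
  if (values.length === 0) return 0;
  const sum = values.reduce((total, value) => total + value, 0);
  return Math.round((sum / values.length) * 10) / 10;
}

// The 8.6 example from above:
const myScores: SubScores = {
  design: 9, software: 8, performance: 10, battery: 8, camera: 8,
};
console.log(evenWeightAverage(myScores)); // 8.6 - the gray arrow position
// Seeing the 8.6, I notice Performance (a 10) matters most to me and
// slide my Final Score above the even-weight average accordingly.
```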

3. Average Calculation (don't forget the zero!)

There are many websites that don't allow ratings of zero, which has always bothered me! Sometimes, although rarely, a user really does want to assign a zero. In The Verge's implementation, a 0 rating is handled as null, meaning it will not show up as a line item when viewed by others and it is not included as a 0 when calculating the average for the Final Score. I may be in the minority here, but I would prefer to require a response for all categories and allow scores of zero. I can see this maybe being abused by trolls, but it is no different from them simply selecting a 1 today.
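
To illustrate why this matters for the math, here is a quick sketch of the two behaviors (again in TypeScript, and again purely my own assumption of how the null handling works):

```typescript
// Contrast: dropping a 0 as null vs. counting it as a real score.
// Hypothetical sketch - The Verge's actual implementation is not public.

type Rating = number | null; // null = how a 0 appears to be handled today

// Current behavior (as I understand it): a 0 becomes null and is
// silently excluded from the average.
function averageSkippingNulls(ratings: Rating[]): number {
  const rated = ratings.filter((r): r is number => r !== null);
  return rated.reduce((sum, r) => sum + r, 0) / rated.length;
}

// Proposed behavior: every category is required and 0 counts.
function averageWithZeros(ratings: number[]): number {
  return ratings.reduce((sum, r) => sum + r, 0) / ratings.length;
}

// A product with one truly terrible category:
console.log(averageSkippingNulls([8, 7, null, 9])); // 8.0 - the zero vanishes
console.log(averageWithZeros([8, 7, 0, 9]));        // 6.0 - the zero counts
```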

4. Random...

  1. Consider using a slightly thicker line so users can more easily click directly on the line rather than having to click-and-drag. It doesn't have to be much thicker, but a little would be helpful.
  2. There is a bug where, in a few situations, I was able to select a rating of 1 but the "1" did not populate on the right-hand side of the scale. I don't know if it failed to register my rating or just failed to update the GUI accordingly. I am currently using Firefox on Windows XP. I'll try to reproduce it on my Mac tonight.
  3. In Joshua's announcement post, the sub-scores (7, 6, 5, 8, ...) do not align vertically with each other...it appears this may have already been fixed on the website itself, as I am unable to reproduce it.
  4. Consider including the number of reviewers along with the Score in each product header. I look at a User Review rating with 17 reviews much differently than one with 235. I know the count is listed down in the details; I just think it is important enough to promote up top.