Lumia 900 and the broken scoring system.
When reading the review of the 900, you're given the impression that the Windows Phone Marketplace is stagnating or dead. That perception permeates the review, and it isn't just a "Verge" thing. It's also perfectly fair for someone not to 'like' the Windows Phone Marketplace.
Here's the problem though.
When you break it down, there are two types of information: quantitative and qualitative. The first is the "numbers": the "50% of people bought brand X" type of data.
The second, qualitative, is based on personal opinion and experience: the "how much do you like it" factor.
The quantitative data on the Windows Phone Marketplace and Nokia's apps breaks down like this:
In the six months between The Verge's reviews of the Lumia 800 and 900, the marketplace actually doubled in size, from 40,000 apps to 80,000. In fact, the marketplace is seeing an accelerating growth rate, to the point where Microsoft just announced that app submissions will take a week to process instead of its older three-day goal.
By the numbers, 1.5 years after launch, app growth in the Windows Phone Marketplace is outpacing Android's by a good 25% or so, and is only about 10% behind iOS's. It would be accurate to say that the Windows Phone Marketplace is the second-fastest-growing app ecosystem behind the App Store, and that its trajectory suggests it's accelerating, not slowing. That is not an insignificant piece of quantifiable data.
When it comes to Nokia specifically, in the time between the two Lumia reviews, Nokia has improved some of its brand-specific apps and launched new ones (neither of which is touched on in the review).
In the cases of both the ecosystem at large and Nokia specifically, there have been quantifiable improvements.
Another set of numbers:
The Verge scored the Lumia 800 ecosystem a 6 out of 10
The Verge scored the Lumia 900 ecosystem a 4 out of 10
Going by the quantitative data, there is absolutely zero reason why the 900's Ecosystem score should drop by 33%. If anything, it should be the same or marginally higher, because the ecosystem demonstrably improved in the six months between the Lumia 800 and Lumia 900.
If, however, the scores are completely subjective, then any reviewer can choose whatever number he wants to assign and be done with it.
The problem is that with multiple reviewers, opinions of a particular ecosystem can vary wildly, so ignoring quantifiable information in favour of a purely subjective score creates unnecessary disparity between two similar products.
A score on "Battery" is easy, as it's almost purely quantitative: create a standard usage model and see how long the phone lasts relative to other devices in its category.
A score on "Design" is similarly easy, because it's almost purely qualitative, though an argument could be made about weighing aesthetics against build quality.
This makes "Ecosystem" hard to score, as it needs to take both real numbers and personal opinion into account. As it stands, it appears to readers that there are zero internal guidelines on how The Verge weighs the objective against the subjective in the ecosystem score.
The Verge rating process:
this is a non-weighted average, we reserve the right to tweak the overall score if we feel it doesn't reflect our overall assessment and price of the product. Read more about how we test and rate products.
This suggests an attempt at objectivity, or at least at using quantifiable data, in the individual score categories, with the reviewer reserving the right to adjust the overall score as they see fit. In the case of the Lumia 900, it feels like the Ecosystem (and possibly Software) scores were unfairly downgraded to land closer to the overall number Josh wanted to deliver.
Though some of the reaction has been extreme, to say the least, I don't think Josh and The Verge crew should be either surprised or disappointed by the negative reaction in general (though I'm sure they're pleased with the traffic).