I feel like this concept has actually been discussed, albeit many, many years ago. Back in my mod days (which ended 8 years ago), I was involved in these sorts of discussions on a regular basis, and they informed some of the reviewing newsletter topics that would later end up in "Reviewing Handbook".

I think that reviewing reviewers could work as a voluntary measurement for public reviews. I see a few factors that should be considered in an overall score: length, the credits given to individual reviews by a specific member, ratings by anyone who wants to rate, and perhaps a survey of raters selected at random. (Note: when I say ratings, I don't mean star ratings. I think a different numeric measurement would need to be employed; perhaps sporting events where results are determined by judges' scores could be a useful starting point.)

A degree of automation would need to be employed to calculate the overall scores, with surveys being the most manual part. That said, I'd be inclined to lean on the surveys a bit more, as they can suss out whether a line-by-line edit is really more helpful than providing 2-3 suggestions. Sometimes I've only provided 2-3 suggestions but have delved into quoting passages from the piece where I see room for improvement (along with my rewrite suggestions). A survey could level the playing field in this regard, especially if the people completing it come from a diverse crop of site members (not just the most active).

In short, I think reviewing the reviewer is something that could be done, but we'd need to break away from the star-rating mindset in order to make it work.

For all your quirky needs, stop by the "Gift Stop"
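Just to make the idea concrete, here's a rough sketch of how the overall score could be computed. Every weight and function name here is my own illustrative assumption (nothing official), and the judge-style aggregation (dropping the highest and lowest rating before averaging) is borrowed from how some judged sporting events work:

```python
# Hypothetical sketch of the overall reviewer score described above.
# All weights, names, and scales (0-10) are illustrative assumptions,
# not site policy.

def trimmed_mean(scores):
    """Judge-style aggregation: drop the single highest and lowest
    score before averaging, as in some judged sporting events."""
    if len(scores) <= 2:
        return sum(scores) / len(scores)
    middle = sorted(scores)[1:-1]
    return sum(middle) / len(middle)

def overall_score(length, credits, ratings, surveys):
    """Combine the four factors into one 0-10 score.

    length  -- automated score for review length (0-10)
    credits -- score derived from credits given to the member's reviews
    ratings -- list of numeric ratings from anyone who chose to rate
    surveys -- list of scores from randomly selected survey takers

    Surveys carry the most weight, since they're best at telling
    whether a line-by-line edit really helped more than 2-3 suggestions.
    """
    weights = {"length": 0.15, "credits": 0.20,
               "ratings": 0.25, "surveys": 0.40}
    return (weights["length"] * length
            + weights["credits"] * credits
            + weights["ratings"] * trimmed_mean(ratings)
            + weights["surveys"] * trimmed_mean(surveys))
```

The surveys would still be collected manually; only the final arithmetic is automated.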