Talk:Wikipedia Statistical Survey
Ideas for quantitatively evaluating edits being made on Wikipedia
Does anyone have any proposals for ways to evaluate the quality of edits made by users? My initial concept is to look at how much of each edit is still there after a certain period of time. Another consideration may be to give more weight to edits that remained after a large number of other edits were made - as opposed to edits which remain but are among the most recent, so it's unclear whether anyone has even noticed them yet. The logic behind this weighting is that the more edits occur after a particular edit, the more likely that edit was of high enough quality that it no longer warranted attention. I am open to variations on this concept, as well as any other ideas on how we might quantitatively measure the quality of edits being made. Trevor Parscal 21:34, 25 February 2009 (UTC)
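The weighting idea above could be sketched roughly as follows. This is only an illustration of the proposal, not an agreed metric: the function name, the character-based survival measure, and the constant `k` are all assumptions for the sake of example.

```python
def survival_score(chars_surviving: int, chars_added: int,
                   later_edits: int, k: float = 5.0) -> float:
    """Fraction of an edit's added text still present, discounted when
    few later edits have had a chance to review it.

    `later_edits` is the number of edits made to the page after this one;
    `k` is an arbitrary constant (assumed here, not part of the proposal)
    controlling how quickly the review weight approaches 1.
    """
    if chars_added == 0:
        return 0.0
    survival = chars_surviving / chars_added
    # Weight rises from 0 toward 1 as more subsequent edits accumulate,
    # reflecting the idea that surviving many edits implies implicit review.
    review_weight = later_edits / (later_edits + k)
    return survival * review_weight
```

For example, an edit whose text fully survives but which is the very latest edit scores 0, while the same edit after many subsequent revisions approaches a score of 1.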
- The most fundamental question as far as quality goes is whether an edit was reverted; reverted edits are almost always vandalism or similar problematic edits (using an article as a "sandbox" for testing, for example). Ideally the Wikipedia database would store a hash for each page version, the way it does now for images (where hashes detect duplicates), but this hasn't been implemented; still, there are ways to do this - see, for example, this graph.
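The hashing idea can be illustrated with a small sketch: hash each revision's full text, and treat a revision whose hash matches an earlier, non-adjacent revision as a revert that undid everything in between. The function names and list-based input here are assumptions for illustration, not part of the MediaWiki schema.

```python
import hashlib

def sha1_of_text(wikitext: str) -> str:
    """SHA-1 of a revision's full text, analogous to the checksum the
    database already stores for images; identical hashes mean identical
    page content."""
    return hashlib.sha1(wikitext.encode("utf-8")).hexdigest()

def find_reverts(revisions: list) -> list:
    """Given revision texts in chronological order, return the indices of
    revisions that restore an earlier version verbatim - i.e. the edits
    between the two identical versions were reverted."""
    last_seen = {}
    reverts = []
    for i, text in enumerate(revisions):
        h = sha1_of_text(text)
        # A match with a non-adjacent earlier revision means at least one
        # intervening edit was undone.
        if h in last_seen and i - last_seen[h] > 1:
            reverts.append(i)
        last_seen[h] = i
    return reverts
```

On a history like `["stable text", "vandalized text", "stable text"]`, the third revision is flagged as a revert of the second.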
- After that, while it is possible to measure how long any given text lasts (see this academic paper, for example), that raises the question of "So what?"
Suppose the results of the number crunching are:
Editor group = number of edits in July 2008, excluding reverts of vandalism:

| Survivability of edits | 1-9 | 10-99 | 100-999 | 1000+ |
|---|---|---|---|---|
| Less than one month | 60% | 30% | 15% | 10% |
| One to three months | 30% | 30% | 30% | 30% |
| More than three months | 10% | 40% | 55% | 60% |
What actionable conclusions could be drawn from data like the above? That less frequent editors aren't adding material of as high quality as high-volume editors? (Again, so what?) Or suppose the data shows that there isn't much variation in quality among the groups - what implications does that have for usability - that there are no problems with it? John Broughton 17:28, 27 February 2009 (UTC)
Improving Wikipedia's accuracy: Is edit age a solution?
From the abstract for the article of that title (December 2007):
- Overall, the results do not provide support for the idea of trusting surviving segments attributed to older edits because such edits tend to add more material and hence contain more errors which do not seem to be offset by greater opportunities for error correction by later edits.
Click rate of edit tab
Hi. Is it known whether the click rate of the edit tab has gone up? Emijrp 15:02, 22 March 2010 (UTC)