Converging to "Good" Content

I've found myself at the intersection of a variety of conversations lately -- with a Wikipedia board member on the editing and document model for his work; with our SunSpace engineers on the value of community equity; and editing work with two co-authors on a book about WordPress (an interesting task given the existence of the WordPress Codex).

In each case, the key question is "How do we know we're converging or improving the quality of what we have?" This problem shows up in various ways, depending upon the editing context. SunSpace is our internal wiki for collecting technical expertise about our products, services and industry applications of them. It has all of the benefits of a wiki (ease of editing, multi-author editing, revision history) but the downsides as well (constant churn from frequent edits, no clear indication of whether the new version is better than the old, and the occasional keyboard error that displaces valuable content).

One of the biggest problems with a large wiki, with an even larger volume of rapidly changing content, is that it's hard to ascribe value to what's in it. We've tied a notion of community equity to SunSpace, giving equity kickers for creation, re-use, and participation. The last point is the creative one, because it encourages ranking, voting and manipulation of what's in the wiki. There's very little value in being a write-only memory; there's tremendous value in knowing what stimulated conversation, contention, and competition. What I like about community equity is that it captures the value of expertise, and rewards interactions, not just outputs.
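
To make the idea concrete, here's a minimal sketch of what an equity tally could look like -- the event names and point weights below are hypothetical illustrations, not the actual SunSpace community equity model:

```python
# Hypothetical community-equity tally: points for creating content,
# for having it re-used, and for participating (votes, comments).
# The event names and weights are illustrative assumptions only.
EQUITY_WEIGHTS = {
    "create": 10,   # authoring a new page or section
    "reuse": 5,     # someone links to or re-uses your content
    "vote": 1,      # rating or voting on someone else's page
    "comment": 2,   # commenting on or discussing a page
}

def equity_score(events):
    """Sum equity over (event_type, count) pairs."""
    return sum(EQUITY_WEIGHTS.get(kind, 0) * count for kind, count in events)

# An author who wrote 3 pages, saw 4 re-uses, and cast 12 votes:
print(equity_score([("create", 3), ("reuse", 4), ("vote", 12)]))  # -> 62
```

The interesting property is that participation earns equity too, so reading, rating, and arguing about a page counts for something, not just writing it.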

A team meeting two weeks ago with the SunSpace engineers got me thinking about a long-standing discussion I'd had on the editorial and content management model for Wikipedia. In short: how do you know that the quality of an entry is improving? It's possible to tie a ranking engine like community equity to a Wikipedia entry, and use references (people who land on the page and read it) as well as voting (like/dislike buttons) to measure the surface area quality of the entry.
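
A back-of-the-envelope version of that surface measurement might combine readership with net votes -- the formula here is an assumption for illustration, not how Wikipedia or community equity actually scores a page:

```python
import math

def surface_quality(views, likes, dislikes):
    """Hypothetical quality score for one version of an entry: net
    approval from like/dislike votes, scaled by the log of readership
    so a few votes on a rarely-read page don't dominate the signal."""
    total_votes = likes + dislikes
    if total_votes == 0:
        return 0.0
    net_approval = (likes - dislikes) / total_votes   # ranges -1.0 .. 1.0
    return net_approval * math.log10(views + 1)

print(surface_quality(views=5000, likes=40, dislikes=10))  # ~2.2
```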

But now add a time element to it, and look at the overall equity of the entry as it undergoes revision and extension. Is it trending upward, in which case the crowdsourcing of the content is a valuable effort? Or is it gyrating, perhaps with a large dynamic range, indicating that successive edits are trending toward opinion and interpretation, and less based on facts or measurable, objective evidence? If enough people like or appreciate the net changes to an entry, then it's "good enough" even if it rubs the original entry author or subsequent editors the wrong way. I had this exact experience adding my own thoughts to the Wikipedia entry on Princeton's Colonial Club, where I felt capturing a bit of the '70s and '80s would flesh out a much more recent history. It seems the page authors didn't agree with me, and my edits soon vanished. I'd prefer that the decision be made by the readers and consumers of the page, rather than an arbitrary editorial board. Of course, for content that triggers the whipsawing of public opinion, it's time to bring in the professional encyclopedic editors.
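
Whether an entry is converging or gyrating could then be read off its revision history -- the sketch below fits a simple least-squares slope to per-revision scores and compares it to the dynamic range, with thresholds chosen purely for illustration:

```python
from statistics import mean

def revision_trend(scores):
    """Classify a series of per-revision quality scores: 'improving' if
    there's a steady upward slope, 'gyrating' if the swings dwarf the
    trend. Thresholds here are illustrative assumptions."""
    n = len(scores)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(scores)
    # Least-squares slope of score versus revision number
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, scores)) / \
            sum((x - x_bar) ** 2 for x in xs)
    swing = max(scores) - min(scores)   # dynamic range across revisions
    if swing > 2 * abs(slope) * n:      # swings dwarf the overall trend
        return "gyrating"
    return "improving" if slope > 0 else "declining"

print(revision_trend([1.0, 1.2, 1.5, 1.6, 1.9]))   # improving
print(revision_trend([1.0, 2.5, 0.8, 2.7, 0.9]))   # gyrating
```

The particular thresholds don't matter much; the point is that the revision history, not any single version, carries the signal about whether the content is converging.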

Wikis don't obviate the need for good content publishing and production processes, as we've learned with our SunSpace work, but they do give us a platform on which to build and measure equity in a broad sense.
