An Italian court has convicted three current and former Google executives of privacy violations stemming from a 2006 incident in which students at an Italian high school uploaded a video they made while bullying a schoolmate with Down syndrome. You can read more on the story at NPR or The New York Times.
In essence, the charge alleges privacy violations by Google in connection with the posting of the video. Google says it took down the video within two hours of being notified by police and that it cannot be expected to monitor all user-uploaded content, given that approximately 20 hours of video are uploaded worldwide every minute. The Italian court countered that because the video topped the “most entertaining videos” category with 5,500 hits and 800 user comments, Google should have noticed it sooner. Google further argued that no one would file charges against a mail carrier for the content of the letters it delivers, or against a telephone operator for what’s said during a call.
So who’s to blame?
Given the open nature of the Internet, I tend to side with Google (at least in this case): it should not be held responsible for individual users’ actions. If it were, where would it end?
What to do?
Social networking sites live and breathe by user participation. Users drive the sites by sharing content, and that content in turn drives the growth of new users. The way I see it, rather than making Google responsible for what others are doing, why not give some of that responsibility back to those who are already out there? There are plenty of ways Google could tap the Internet community itself for the help it needs to police its content, while at the same time demonstrating its intention to be a good corporate (and on-line) citizen.
Enter the “Google Content Squad”
Google already provides mechanisms for users to flag offensive content on sites like YouTube. Such flags prompt Google to review the suspect content more closely. However, this requires that individual users who find content offensive 1) know that they can flag it in the first place (and understand what flagging means), and 2) be willing to actually do it rather than just shaking their heads and clicking off to something else.
What if Google were to establish a volunteer program that allowed users to join a “team” of Internet content police, a kind of volunteer Neighborhood Watch keeping an eye on content in their spare time? Google could set the criteria for what sorts of content to flag and how. And as an incentive to participate, Google could track team members’ contributions, including the quality of those contributions, then award points or credits toward “stuff” (e.g., digital media downloads, participating vendors’ products, etc.) for high performers. I’m sure more creative minds than mine can think of other kinds of rewards that motivate.
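To make the incentive idea concrete, here is a minimal sketch in Python of how such a points system might work. Everything here is hypothetical (the `ContentSquad` class, the point values, the member names): the key design choice is that a flag only earns points after a staff reviewer confirms it, so quality is rewarded rather than raw flagging volume.

```python
# Hypothetical sketch of a volunteer-flagging rewards tracker.
# A confirmed flag (reviewer agreed the content violated policy)
# earns points; a rejected flag costs a little, which discourages
# spurious mass-flagging.

from collections import defaultdict

CONFIRMED_POINTS = 10
REJECTED_POINTS = -2

class ContentSquad:
    def __init__(self):
        self.points = defaultdict(int)
        self.flags = defaultdict(lambda: {"confirmed": 0, "rejected": 0})

    def record_review(self, member: str, confirmed: bool) -> None:
        """Update a member's tally after staff review a flag they filed."""
        if confirmed:
            self.flags[member]["confirmed"] += 1
            self.points[member] += CONFIRMED_POINTS
        else:
            self.flags[member]["rejected"] += 1
            self.points[member] += REJECTED_POINTS

    def top_performers(self, n: int = 3):
        """Members with the most points -- candidates for reward credits."""
        return sorted(self.points.items(), key=lambda kv: -kv[1])[:n]

squad = ContentSquad()
for outcome in [True, True, False]:   # two confirmed flags, one rejected
    squad.record_review("alice", outcome)
squad.record_review("bob", True)
print(squad.top_performers())  # [('alice', 18), ('bob', 10)]
```

A real system would obviously need far more (identity, abuse protection, appeal paths), but even this toy version shows that tracking contribution quality, not just quantity, is straightforward.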
Google looks to the world-wide on-line community to provide the content it indexes and from which it makes $$$ BILLIONS $$$ in advertising revenue. Why not ask Google to make it easier (and advantageous) for that same community to help it in return? At the end of the day, we may all be better off for it.
What do you say, Google?
We’re listening . . . and watching.