That is perhaps the most benign interpretation of the AP's recent announcement that it would institute a "wrapper" technology to prevent infringement of its copyrighted material, including headlines and abstracts.
Like most bloggers, my initial response was that the AP's management must be made up of nincompoops. While that may well be true, I'm not sure that what the wire service is proposing will actually have any effect on what those of us who follow "Fair Use" practices do when we cite AP content.
In a Columbia Journalism Review interview with Jane Seagrave, the AP's senior vice president of global product development, Ryan Chittum obtained the following quote:
"We want to stop wholesale misappropriation of our content which does occur right now--people who are copying and pasting or taking by RSS feeds dozens or hundreds of our stories. Are we going to worry about individuals using our stories here and there? That isn't our intent. That's being fueled by people who want to make us look silly. But we're not silly."
Okay, so it would seem that bloggers and small content aggregators are not the target, then. But this is where I believe the AP has gotten itself confused. Seagrave estimates the lost revenue caused by this "wholesale" piracy of its content as "in the tens if not the hundreds of millions."
If she means dollars, I'd beg to disagree, at least until we see the evidence. CJR's Chittum notes wryly that a "firmer estimate" would be nice, a sentiment we can all agree on.
In over a decade and a half of surfing content sites, I have seen but a handful of examples of the kind of cut-and-paste journalism decried by Seagrave and other AP execs. And the ones I've discovered are mostly on tiny web operations overseas, not professionally produced web pages with robust business models.
It's hard for me to imagine these pilferers are able to turn much of a profit, if any; oftentimes they don't even have ads on their mini-sites, so making money seems not to be their main agenda. All of the big operators in online media (Google News, Yahoo, etc.) already pay the AP for the rights to its heds and content blurbs. So if money is being left on the table at those destinations, AP needs to negotiate more lucrative contracts, not blame the Internet at large for this imaginary wholesale highway robbery that nobody seems able to quantify.
In a related development this weekend, Saul Hansell reported in The New York Times that the Silicon Valley start-up Attributor is approaching media companies with "an automated way for newspapers to share in the advertising revenue from even the tiniest sites that copy their articles."
But Attributor's approach is based on an assumption that most of the so-called "pirate sites" use ad networks like Google's AdSense to collect revenue. Attributor, therefore, plans to scan only those Web pages on behalf of its partner publishers. From there, it apparently plans to press the ad networks to share revenue with the copyright owners.
This sounds like a potentially more promising approach than the AP's software-wrapper concept, because at least it targets the pirates' sweet spot: where they try to collect their rent money. But once again, I have to question how much money is at stake here. Attributor cites an internal study that indicates publishers are losing $250 million a year to the pirates.
Before buying into what this startup (which has been funded by three VC rounds to the tune of $20 million) is offering, publishers, including the AP, may want to carry out a somewhat more definitive survey of how their content is circulating on the web, rather than relying on what may turn out to be phantom estimates.