Friday, July 08, 2005

P2P will eat itself

P2P seems like such a good way to distribute content - legal and licensed or ripped and stolen - but it suffers from its very democratic nature.
There are simply too many competing systems, too many versions of the same file and too many unstable connections to make it a practical and reliable way to distribute anything.

Solutions like BroadCast Machine (which still needs porting to IIS!) make it really simple to put content out there and seed it without any complicated setup or experience, but when it comes to finding content it's another matter.

Say I was looking for a copy of a public domain documentary. I know the name, so I visit the various tracker sites and eventually find not one coherent source but half a dozen different trackers, all in different states of repair. Not one of them is guaranteed to be complete, and although they may all be sourced from the same original there's no way I can tie them together to work as a team.

Sure, it's a great idea... one file shared between thousands of peers.
But when you have thirty trackers, each with a dozen different seeds, and no way of consolidating them or recognising that they're all sharing one and the same file, it's little better than 1:1 downloads, especially when over half the downloaders are just leeching and not sharing back.
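The maddening part is that the duplication is detectable in principle: two torrents can have different info hashes (BitTorrent hashes the torrent's metadata, so a different piece size or file name gives a different swarm) while carrying byte-identical payloads. Here's a minimal sketch in Python - the file names are hypothetical - of spotting duplicates by hashing the content itself rather than the torrent metadata:

import hashlib
from pathlib import Path

def content_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """SHA-1 digest of a file's full contents, read in 1 MiB chunks."""
    sha = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            sha.update(chunk)
    return sha.hexdigest()

# Hypothetical copies of the same documentary, pulled from different trackers.
copies = [Path("documentary_trackerA.avi"), Path("documentary_trackerB.avi")]
digests = {p.name: content_hash(p) for p in copies}
print(digests)  # identical digests mean identical payloads

If trackers (or a directory sitting above them) published a content hash like this alongside each torrent's info hash, clients could recognise those thirty separate swarms as one file and merge them. Nothing out there does this, which is exactly the problem.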

The record and movie industries are worried about P2P. I wonder if they just need to wait and watch the continued fragmentation make these networks unworkable, or at least stop them ever gaining the critical mass that the proponents claim will topple 'big media'.

What will work is closed networks (such as the cable operators') where the operator can use the customer's connection and always-on set-top box to distribute on-demand content in a secure, controlled manner and know it's getting optimum use out of the system.
