
Coral

September 22, 2004

I'm kind of torn here. Via inbound links, I found this post from Nathan McFarland. He's the guy who proposed automatically Coralizing URLs as they get downloaded (and wrote a patch to get_enclosures to do just that). I'm not sure that's a good idea. My main problem is that the big media people might freak out about these unauthorized mirrors, all of which have a full copy of their "content" (or bundles of passion if they are lucky). It might sound stupid, but big companies and their lawyers get worked up about stupid things and crush good ideas for reasons just like this. Automatically Coralizing the URLs takes away from any publisher the ability to publish as they want.
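For anyone who hasn't seen it, Coralizing a URL just means rewriting it so the file gets fetched through the Coral CDN, by tacking .nyud.net:8090 onto the hostname. Here's a rough sketch in Python of the kind of rewrite such a script does (the actual patch is against get_enclosures, so this is only an illustration, and the example URL is made up):

```python
from urllib.parse import urlsplit, urlunsplit

def coralize(url: str) -> str:
    """Rewrite a plain http URL so it fetches through the Coral CDN
    by appending .nyud.net:8090 to the hostname."""
    parts = urlsplit(url)
    # Only plain http on the default port is handled here; anything
    # else is passed through untouched.
    if parts.scheme != "http" or not parts.hostname or parts.port:
        return url
    netloc = parts.hostname + ".nyud.net:8090"
    return urlunsplit((parts.scheme, netloc, parts.path,
                       parts.query, parts.fragment))

print(coralize("http://example.com/podcasts/show.mp3"))
# -> http://example.com.nyud.net:8090/podcasts/show.mp3
```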

In a technical sense, what happens if your script forces every download through one caching system and the URL publisher uses a different one? What if I publish my URLs using freecache, and the script automatically Coralizes them? What if I'm paying Akamai to mirror for me, and explicitly don't want anyone else doing it? What if Coral goes down and/or has the plug pulled on the project?
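To make that freecache case concrete: freecache works by prefixing the original URL with http://freecache.org/, so blindly Coralizing one of those links wraps one cache around the other. A quick illustration, using the same rewrite rule as the sketch above and a made-up URL:

```python
from urllib.parse import urlsplit, urlunsplit

# A freecache link is the original URL prefixed with http://freecache.org/.
url = "http://freecache.org/http://example.com/podcasts/show.mp3"

# Applying the Coral rewrite on top nests one caching system inside
# another: the download now depends on Coral AND freecache being up.
parts = urlsplit(url)
wrapped = urlunsplit((parts.scheme, parts.hostname + ".nyud.net:8090",
                      parts.path, parts.query, parts.fragment))
print(wrapped)
# -> http://freecache.org.nyud.net:8090/http://example.com/podcasts/show.mp3
```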

I think this needs to remain in the hands of the publisher of the feed. However, that's not to say that I think Coralizing is bad, just that automatically forcing it via the scripts is. As an experiment, I'm publishing today's MP3 for direct download as a Coralized link. I'm not redoing any older ones, because that would force all the iPodder scripts to redownload them (each one would have a different URL). I'll do this one that way and see how it goes.