I’m kind of torn here. I found via inbound links this post from Nathan McFarland. He’s the guy who proposes automatically Coralizing URLs as they get downloaded (and did a patch to get_enclosures to do just that). I’m not sure that’s a good idea. My main problem is that the big media people might freak out about these unauthorized mirrors, all of which have a full copy of their “content” (or bundles of passion if they are lucky). It might sound stupid, but big companies and their lawyers get worked up about stupid things and crush good ideas for reasons just like this. Automatically Coralizing the URLs takes away from the publisher the ability to publish however they want.

In a technical sense, what happens if your script forces the use of one caching system and the URL publisher uses a different one? What if I publish my URLs using freecache, and the script automatically Coralizes them? What if I’m paying Akamai to mirror for me, and explicitly don’t want anyone else doing it? What if Coral goes down and/or has the plug pulled on the project?

I think this needs to remain in the hands of the publisher of the feed. However, that’s not to say that I think Coralizing is bad, just that automatically forcing it via the scripts is. As an experiment, I’m publishing today’s MP3 for direct download as a Coralized link. I’m not redoing any older ones, because that would force all the iPodder scripts to redownload them (since each would have a different URL). I’ll do this one that way and see how it goes.
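For anyone unfamiliar with what “Coralizing” a link actually does: it’s just a hostname rewrite, appending the Coral network’s suffix so the request routes through their caching proxies. Here is a minimal sketch in Python, assuming the `.nyud.net:8090` suffix Coral uses (the example URL is illustrative, not one of mine):

```python
from urllib.parse import urlsplit, urlunsplit

def coralize(url):
    """Rewrite a URL so it is fetched through the Coral CDN by
    appending .nyud.net:8090 to the hostname."""
    parts = urlsplit(url)
    coral_host = (parts.hostname or "") + ".nyud.net:8090"
    return urlunsplit((parts.scheme, coral_host, parts.path,
                       parts.query, parts.fragment))

print(coralize("http://example.com/audio/show.mp3"))
# http://example.com.nyud.net:8090/audio/show.mp3
```

This is exactly why the rewritten enclosure counts as a different URL to the download scripts: the hostname itself changes.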

Published by

Dave Slusher is a blogger, podcaster, computer programmer, author, science fiction fan and father.

2 thoughts on “Coral”

  1. On a practical note, I just tried using Coral to cache and then fetch a file. It seems a bit flakey. On my first attempt, it tried one server, which gave me an HTTP 500 error. The next attempt used a different IP address; that worked OK, albeit slowly (59.64K/s).

    Then it got worse. The next attempt went to yet another server – I was getting 312.51 bits/s (that’s BITs per second).

    Then another gave me 2.32K/s.

    Something’s not right (and my server sits on a 1000 Mbit/s link to the world’s research networks). The file I’m attempting to transfer is

    or, coral’ised as

  2. So on Gordon’s topic, Coral is a bit flakey. No worse than torrent in my mind, but flakey. I would not use it as the sole way to get files, but it is an easy way to take some load off of the servers. I personally have never experienced the very slow times, just the bad responses. If I get a bad response, I move on to the non-coralized URL. That is why a coralized script is good, but a coralized feed is not. Coral is not reliable. But it is a way to save bandwidth.

    And that is why I don’t have Dave’s problem with Coral going away. (Nor with the cache-of-a-cache problem: if it doesn’t work, you fall back.) Nothing should rely solely on Coral, so if the whole network goes down, the content will still get delivered. That is why I like Coral over Torrent. Torrent locks authors in. It changes feeds. And things do permanently drop off BitTorrent. That is a huge problem when there is no fallback.

    As for the legal/control issues – well those are the first criticisms of coralizing that I see as persuasive. But those issues are already faced by the Coral network – I see that as their problem. And if they go down because of it, well we have a fallback.
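The “bad response, move on to the non-coralized URL” approach this commenter describes is easy to sketch. Here is a minimal, hypothetical version in Python (the function name and the injectable `fetch` parameter are mine, added so the fallback logic is testable; a real podcatcher would just use its normal HTTP layer):

```python
import urllib.request

def fetch_with_fallback(original_url, coralized_url, fetch=None):
    """Try the Coralized URL first; on any network/HTTP error,
    fall back to the original URL, so nothing relies solely on
    the Coral network."""
    if fetch is None:
        # urllib.error.URLError subclasses OSError, so the except
        # clause below covers HTTP errors and connection failures.
        fetch = lambda url: urllib.request.urlopen(url, timeout=30).read()
    try:
        return fetch(coralized_url)
    except OSError:
        return fetch(original_url)
```

Because the fallback is built into the client, the Coral network disappearing entirely just means every download takes the second branch.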

Comments are closed.