aggregate rss or atom feeds
Jun. 8th, 2005 @ 04:27 pm
I want a tool or service that I can give multiple RSS and/or Atom feed URLs to, and get back an aggregation.
LiveJournal itself is pretty great at this... but I need the aggregation to be programmatic and on-the-fly.
This might be best solved with source code, though I'd imagine someone already rents out a service that does this well.
Open to ideas. YOUR ideas.
Date: June 9th, 2005 12:13 am (UTC)
not sure i understood what you want.. do you want a script or something that will collect items from several feeds and present them in a unified feed? is that it?
i don't know any service that will do that.. but i've coded a small php script using mysql that collects news from several feeds and inserts the items on a mysql table.
the whole script needs to be revised and fixed.. so it's not ready for "publication" yet.. but if you need some help on one or two issues, i'll be glad to help. although i think what you really wanted was a service which does that, not code one yourself. am i right?
if so, i have no idea if one exists or not.
you have it right. i would welcome a script or a service. someone has done the work of aggregating multiple xml feeds into one html presentation. ultimately i will do it myself but i hope there is code and/or an existing service somewhere.
my LAMP skills could be sharper. i'd love to just take a look at what you've done.
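For the aggregation step itself, here is a minimal sketch in Python using only the standard library. The inline feed strings and titles are made up for illustration; a real version would fetch each URL (e.g. with urllib.request) instead of using hard-coded XML.

```python
# Merge items from several RSS documents into one list, newest first.
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

# Hypothetical sample feeds standing in for fetched feed content.
FEED_A = """<rss version="2.0"><channel><title>Feed A</title>
<item><title>a1</title><pubDate>Wed, 08 Jun 2005 16:00:00 +0000</pubDate></item>
</channel></rss>"""

FEED_B = """<rss version="2.0"><channel><title>Feed B</title>
<item><title>b1</title><pubDate>Wed, 08 Jun 2005 18:00:00 +0000</pubDate></item>
<item><title>b2</title><pubDate>Wed, 08 Jun 2005 12:00:00 +0000</pubDate></item>
</channel></rss>"""

def merge_feeds(feed_xml_strings):
    """Collect <item> elements from each RSS document and sort them
    newest-first by pubDate."""
    items = []
    for xml_text in feed_xml_strings:
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            stamp = parsedate_to_datetime(item.findtext("pubDate"))
            items.append((stamp, item.findtext("title")))
    items.sort(key=lambda pair: pair[0], reverse=True)
    return [title for _, title in items]

print(merge_feeds([FEED_A, FEED_B]))  # -> ['b1', 'a1', 'b2']
```

The same idea would carry over to Atom feeds by looking for `entry`/`updated` elements instead of `item`/`pubDate`.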
Date: June 9th, 2005 12:24 am (UTC)
i can show you but i'd prefer to do it privately. do you have msn? if not, irc will do.
use this address: andr3.pt (a
On the fly?
That'd be a bad idea - a few refreshes and you'd find your script banned from many of the feeds for hitting them too often. Plus, with more than a couple feeds, it'd take ages for the script to execute.
i could cache the rss / atom content, of course. why would the script be slow?
Date: June 9th, 2005 01:11 am (UTC)
because you'd be making n connections in a row, n being the number of feeds you're viewing.
like i told you on msn, if you have the script update feeds only at certain intervals, not only do you avoid getting banned, you also cut the cost of fetching the xml file for each feed every time.
Using PHP (for example) to connect to an external site and pull its contents down usually takes at least a second, often quite a bit more if it's a slower site.
So, if you were doing it on-the-fly, you'd have a couple seconds times the number of feeds. I read about 120 feeds, so that'd be a probable minimum of two minutes to process. More likely 5-10 minutes.
So, it'd absolutely have to be cached, and you'd likely want the update script running as a background cron job.
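The interval-based caching described above can be sketched in a few lines of Python. The `fetch_fn` parameter and the 15-minute TTL are illustrative assumptions, not anything from the thread; the point is just that a fetch only happens when the cached copy is older than the interval.

```python
# Serve a cached copy of each feed, refetching only after a TTL expires.
import time

class FeedCache:
    def __init__(self, fetch_fn, ttl_seconds=900):  # assumed 15-min refresh
        self.fetch_fn = fetch_fn       # whatever actually downloads the feed
        self.ttl = ttl_seconds
        self.store = {}                # url -> (fetched_at, content)

    def get(self, url):
        now = time.time()
        entry = self.store.get(url)
        if entry is not None and now - entry[0] < self.ttl:
            return entry[1]            # fresh enough: no network hit
        content = self.fetch_fn(url)   # stale or missing: refetch
        self.store[url] = (now, content)
        return content

# Usage sketch: count how many real fetches happen.
calls = []
cache = FeedCache(lambda url: calls.append(url) or f"<rss>{url}</rss>")
cache.get("http://example.com/feed")
cache.get("http://example.com/feed")   # second call is served from cache
print(len(calls))  # -> 1
```

Run from a cron job, the same structure works in reverse: the job refreshes every cached feed on its schedule, and page views only ever read the cache.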
I had hoped to do these queries in parallel, but you're right that there's a bit more delay here than I'd thought.
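Parallel fetching does help with the wall-clock time, even if caching is still needed to avoid hammering the feeds. A rough sketch, with `time.sleep` standing in for network latency (a real version would call `urllib.request.urlopen` per URL): total time is roughly the slowest single feed rather than the sum.

```python
# Fetch several feeds concurrently so latencies overlap instead of adding up.
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    time.sleep(0.2)                 # simulated per-feed network latency
    return f"<rss>{url}</rss>"

urls = [f"http://example.com/feed{i}" for i in range(5)]

start = time.time()
with ThreadPoolExecutor(max_workers=len(urls)) as pool:
    results = list(pool.map(fetch, urls))
elapsed = time.time() - start

# Sequentially this would take ~1.0 s; in parallel it's roughly one latency.
print(len(results), elapsed < 0.5)
```

With 120 feeds you would cap `max_workers` at something modest (say 10-20) to stay polite, which still cuts the total time by an order of magnitude.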
Date: June 9th, 2005 01:10 am (UTC)
do they support exporting your feed's entries in xml format?