Hi guys, if you visit http://my.yahoo.com you'll notice that they show RSS feeds as "dynamic news" on the page. My website www.pinkpt.com has an RSS feed. Does anyone know how I can offer the news to other websites using this method?
Search Google for 'rss2html' and you will find packages that allow you to syndicate content from a feed to another website. So you could have other people use that package to syndicate your content on their website. Also there is a service called RSS Digest that does something similar.
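The rss2html packages mentioned above are PHP, but the underlying idea is simple enough to sketch in a few lines of Python: fetch the feed's XML, pull out each `<item>`, and emit an HTML list. The function name and markup here are just illustrative, not taken from any particular rss2html package.

```python
# Minimal sketch of the rss2html idea: render the <item> entries of an
# RSS 2.0 document as an HTML list. Titles and links are escaped so feed
# content can't inject markup into the host page.
import xml.etree.ElementTree as ET
from html import escape

def rss_items_to_html(rss_xml, max_items=10):
    """Return the feed's items as a small HTML <ul> fragment."""
    root = ET.fromstring(rss_xml)
    parts = ["<ul>"]
    for item in root.findall(".//item")[:max_items]:
        title = escape(item.findtext("title", "(untitled)"))
        link = escape(item.findtext("link", "#"), quote=True)
        parts.append('<li><a href="%s">%s</a></li>' % (link, title))
    parts.append("</ul>")
    return "\n".join(parts)
```

A site syndicating your feed would fetch the XML (with `urllib.request` or similar), pass it through a function like this, and drop the fragment into its template.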
You can see it in action on a couple of my sites. I use one of the PHP RSS2HTML packages. Here is a site about General Aviation News; both the Beech Aero Club site (look middle left) and the International Aeronauts League site (middle right) syndicate its content.
Carp can convert your RSS into JavaScript, or you can use it to display a newsfeed. There's also Magpie.
I use one on my SEO site (http://search-engine-optimization.online-site.net) where the front end pulls the rss feed from the blog behind it to form the frontpage. I'll have the tgz of the script lying around somewhere if you want it.
Looks great, guys. I installed rss2html, but do you run into speed issues? http://www.pinkpt.com/rss2html/rss2...ews/sitefeed.xml&sample-template=&MAXITEMS=10 When I load that URL it takes a while, and when I called it from a site using PHP includes, it slowed page loads down a lot. Is this the correct way to call it?
Here's an example of Carp output: http://blog.bkweddings.com/wp-content/carprss.php I haven't tried Magpie myself, but Carp is pretty quick. That page of yours is really slow, but your Atom feed is actually pretty quick. I think to work with Atom feeds you have to use Grouper Evolution, which is another product from the same company, but it can use a regular RSS feed if you have one.
Thanks, everyone, for your suggestions and replies ^_^ I'll try to implement them and let everyone know how it goes.
Speed can certainly be a problem when pulling data from a third-party site. Caching the results and reusing them for an hour or so helps significantly. In my case I get around it because all of the sites run on the same server, and I also cache the feeds for 90 minutes.
You'd want to cache the results not only for speed, but also to avoid hitting the feed site every time someone loads your page. Many sites (Slashdot is one example that does this automatically, as far as I know) will block your IP if you request their feeds more than once an hour.
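The caching the last two posts describe can be sketched like this (in Python rather than the thread's PHP; the 90-minute TTL matches the figure mentioned above, and the paths/URL are just placeholders):

```python
# File-based feed cache: refetch the feed only when the cached copy is
# older than the TTL, so page loads don't hammer the remote site.
import os
import time
import urllib.request

CACHE_TTL = 90 * 60  # 90 minutes, in seconds

def fetch_feed_cached(feed_url, cache_path, ttl=CACHE_TTL):
    """Return the feed XML, preferring a fresh-enough cached copy."""
    if os.path.exists(cache_path):
        age = time.time() - os.path.getmtime(cache_path)
        if age < ttl:
            with open(cache_path, "rb") as f:
                return f.read()
    # Cache missing or stale: fetch once and refresh the cache file.
    with urllib.request.urlopen(feed_url) as resp:
        data = resp.read()
    with open(cache_path, "wb") as f:
        f.write(data)
    return data
```

With this in place, the remote site sees at most one request per TTL window no matter how many visitors you get.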
Yeah, I run cron jobs to grab feeds periodically and write the results into an HTML file that I then include in my PHP code.
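The cron-job approach above can be sketched as follows: a script run on a schedule writes the rendered feed to a static file, and the PHP page just `include()`s that file, so visitors never wait on the remote feed. This is a Python sketch; the crontab schedule and file path are illustrative assumptions, not the poster's actual setup.

```python
# Write a pre-rendered HTML feed snippet for the page to include.
# Example crontab entry, refreshing twice an hour (illustrative):
#   */30 * * * * /usr/bin/python3 /var/www/bin/update_feed.py
import os

def write_snippet_atomic(html, out_path):
    """Write to a temp file then rename, so readers never see a partial file."""
    tmp = out_path + ".tmp"
    with open(tmp, "w", encoding="utf-8") as f:
        f.write(html)
    os.replace(tmp, out_path)  # rename is atomic on POSIX filesystems
```

The atomic rename matters because the web server may read the snippet at any moment; writing directly to the live file could serve a half-written fragment mid-update.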