One question I got at the O'Reilly conference from someone who has 20 million users: "how can I make syndication scale?" He's concerned by a few things:

1) RSS and Atom feeds are pulled down by news aggregators (like the NewsGator I use) every hour. Multiply that by 20 million people and that's a bandwidth bill many times higher than the one from web browsers (because browsers only visit occasionally, and not every day).
2) RSS and Atom feeds pull down all the content every time, even content that hasn't changed since the last visit. Is there a way for him to send a feed only when it has changed, or, even better, a partial feed with only the new items? (My feed sends down 75 items each time, whether they're new or not.)
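For that second concern, there's no standard HTTP mechanism for partial feeds, but here's a hedged sketch of what a feed server could do on its own: honor If-Modified-Since, answer 304 when nothing is new, and otherwise respond with only the newer items. Everything in it (FEED_ITEMS, the plain-text body standing in for real RSS, the port) is hypothetical:

```python
# Sketch of a feed endpoint: returns 304 Not Modified when nothing
# changed, and otherwise sends only items newer than the client's
# last fetch. FEED_ITEMS is a hypothetical stand-in for a real feed.
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical feed data: (published, title) pairs, newest first.
FEED_ITEMS = [
    (datetime(2004, 4, 2, tzinfo=timezone.utc), "Newest post"),
    (datetime(2004, 3, 30, tzinfo=timezone.utc), "Older post"),
]

class FeedHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        newest = FEED_ITEMS[0][0]
        header = self.headers.get("If-Modified-Since")
        since = parsedate_to_datetime(header) if header else None

        if since is not None and newest <= since:
            # Nothing new: a short 304 goes over the wire, not the feed.
            self.send_response(304)
            self.end_headers()
            return

        # Send only items the client hasn't seen yet (plain text stands
        # in for real RSS/Atom generation here).
        items = [title for date, title in FEED_ITEMS if since is None or date > since]
        body = "\n".join(items).encode()
        self.send_response(200)
        self.send_header("Last-Modified", format_datetime(newest, usegmt=True))
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), FeedHandler).serve_forever()
```

Strictly speaking, conditional GET is all-or-nothing, so trimming the 200 response like this is a server-side convenience rather than anything the spec promises; the 304 half, though, is exactly the mechanism Dave describes below.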
He says that these concerns are keeping him from adopting RSS or Atom at this time. I'm not using his name here because he didn't want his company to be identified, but he's a very senior person at a very well-known company with very high traffic (and, yes, I would love to subscribe to this company's syndication feeds).
I was going to comment for the hell of it. Off the top of my head, I figured cookies and a script to generate RSS feeds would work. But Dave Winer said:
Actually, HTTP anticipated some of your friend's concerns. If the feed hasn't changed, and if the client and server are "304-aware" (many if not most are), nothing goes over the wire other than a very short message that says "Nothing changed."

Tell your friend, if you want, that we'll work with him on getting a scalable system deployed if he hits a scaling wall. We've been looking for a case to do this with. Even the most popular feeds don't get enough traffic in 2004 to make a big difference in bandwidth. UserLand is hosting the NY Times feeds, for example, and they're quite popular, as you might imagine, and they're holding up fine. I'd be happy to talk with them.
And he's going to talk to them.
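For the curious, here's roughly what that "304-aware" exchange looks like from the aggregator's side. This is a minimal sketch using Python's standard library; the feed URL is a placeholder, and a real client would also remember the ETag and send If-None-Match the same way:

```python
# Conditional GET: only download the feed if it changed since last time.
import urllib.error
import urllib.request

FEED_URL = "http://example.com/index.xml"  # placeholder
last_modified = None  # remembered from the previous fetch

def fetch_feed():
    global last_modified
    req = urllib.request.Request(FEED_URL)
    if last_modified:
        # Tell the server what version we already have.
        req.add_header("If-Modified-Since", last_modified)
    try:
        with urllib.request.urlopen(req) as resp:
            last_modified = resp.headers.get("Last-Modified")
            return resp.read()  # something changed: here's the full feed
    except urllib.error.HTTPError as err:
        if err.code == 304:
            return None  # "Nothing changed" -- a few bytes, not 75 items
        raise
```

Run hourly, that turns a day of full downloads into a handful of tiny 304s plus one real download whenever something is actually posted.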