If you want to rsync to a server, you need the rsync daemon running on that server. If you want it "bidirectional", then you need the rsync daemon running on both servers. Or, you could run rsync over ssh, in which case you wouldn't need an rsync daemon on either machine. Your question was quite brief, so please clarify if I've misunderstood it.
Well, let's say I have web1, web2, web3, and web4. I want to run rsync on each box but keep the most current version of every file. For example, let's say each box receives its own updates, so I want to ensure that the other boxes end up with the most current files. Ideas?
Ah, I see. That's messy. I would suggest that you pick one server as your master, do your updates there, and have the other servers run rsync as clients against it. Otherwise you can run an rsync server on every machine and create a cron job on each machine that rsyncs to every other machine every hour or so. But that means a lot of rsync runs per hour, because with no 'master' server, every single server has to sync with every other server. It should work, though, since each file will converge to its most recent version.
The problem with the "master" idea is: what happens when the master goes down? I would imagine that creates an even messier situation. We are trying to build a truly fail-safe setup for a client.
I have never used a solution like the one you need, but Unison is a Unix program that I think will do what you want: http://www.cis.upenn.edu/~bcpierce/unison/ It uses a technology similar to rsync's, which reduces transfer sizes and compresses files.
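For what it's worth, Unison is usually driven by a profile file rather than flags. A sketch of what one might look like for this case, assuming a host named web2 and a document root of /var/www (both placeholders):

```
# ~/.unison/default.prf -- hypothetical example profile
root = /var/www
root = ssh://web2//var/www
batch = true        # run without interactive prompts, e.g. from cron
prefer = newer      # on conflict, keep the most recently modified copy
times = true        # propagate modification times
```

Unlike the rsync mesh, Unison tracks state between runs, so it propagates deletions correctly in both directions instead of resurrecting removed files.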
When your master goes down, you could promote a slave to master. Or do without the master until you rebuild it. You could even rebuild it by rsyncing from a slave on a one-shot basis.