I've been looking at some server specs and would appreciate it if someone could explain the difference in bandwidth between a server that offers transfers of 1 Mbps vs 10 Mbps unmetered.
Get the 1 Mbit connection. Think of it as the size of the pipe: it has nothing to do with the server itself, only with its internet connection. If you have a 1 Mbit feed, one connection can download at about 1 Mbit/s, and two connections get about 0.5 Mbit/s each, because all connections share an equal fraction of the total bandwidth. With a 10 Mbit feed, you could have 20 users each getting a 0.5 Mbit/s feed, again because all users share the total allocation equally.

To look at it another way, 10 Mbit/s works out as: 10 megabits/second × 60 × 60 = 36,000 megabits per hour. 36,000 megabits/hour ÷ 8 bits/byte = 4,500 megabytes per hour = 4.5 GB per hour available for download. That's over 100 GB per day if the connection were wide open. A 1 Mbit connection allows a tenth of that, or about 10 GB per day wide open. (I've heard the rule of thumb that a 10 Mbit connection moves roughly 400 GB per month in real-world use, far below the theoretical maximum.)

To put this in perspective, 1 Mbit/s ÷ 8 bits per byte = 0.125 MB per second = 125 KB per second. So if your pages are a healthy size, say 30 KB each, you can serve about 4 pages per second, every second of the day. And of course, not only is it unlikely that you have that kind of traffic, but most requests don't arrive simultaneously; even if three requests hit at exactly the same instant, each page would just take roughly three times as long to serve, still well under a second instead of about a quarter of a second. Not noticeable.

If your current host has MRTG graphs, those will show how much bandwidth you're actually using. Otherwise you're likely guessing. Unless your site uses serious bandwidth (and I'm not talking 25 GB per month, but a lot higher), the 1 Mbit connection should be fine for you.
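The arithmetic above can be sketched in a few lines of Python. This is just the back-of-the-envelope math from the post (equal sharing between connections, and megabits-to-gigabytes conversion); the function names are my own, and all figures are theoretical maxima for a fully saturated link.

```python
# Back-of-the-envelope bandwidth arithmetic (theoretical maxima, link fully saturated).

def per_connection_kbps(link_mbps, connections):
    """Each simultaneous connection gets an equal share of the link, in kbit/s."""
    return link_mbps * 1000 / connections

def max_transfer_gb_per_day(link_mbps):
    """Megabits/s -> GB/day: divide by 8 bits per byte, multiply by 86,400 s, divide by 1,000 MB per GB."""
    return link_mbps / 8 * 86400 / 1000

print(per_connection_kbps(10, 20))            # 20 users on a 10 Mbit feed: 500.0 kbit/s each
print(round(max_transfer_gb_per_day(10), 1))  # 108.0 GB/day wide open
print(round(max_transfer_gb_per_day(1), 1))   # 10.8 GB/day on a 1 Mbit feed
```

Note these are decimal (SI) units throughout, which is how carriers rate links; real throughput will be lower once protocol overhead and idle time are counted.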