Hi guys, apologies in advance for the noob question, but I'm confused and need it answered. When it comes to monthly bandwidth usage, what's the difference between gigs and Mbps? Basically, I am transferring my site from one host to another. My site is growing, so I need more room and bandwidth. My current host gives me 1,500 gigs (which has turned out to be nothing!) and I am moving my site to an unmetered dedicated server with 10 Mbps of bandwidth a month. Now, is 10 Mbps the speed of the transfer, or an amount of gigs I can use per month? The more traffic I have (bandwidth used), do I need more Mbps? Please help. Thanks in advance, Tony
I don't recall the exact math, but I can give you a general explanation. When you buy 1,500 gigs of bandwidth, that is your allowance for the month. When you buy a pipe size, as in 10 Mbps unmetered, in theory you get a lot more bandwidth, but you may have less when you need it. 10 Mbps × 60 × 60 × 24 × 30 (the number of seconds in a 30-day month) is your maximum per month, and you only hit that if your traffic is spread evenly over the whole 24-hour period, which it never is. Here is a link that may help you do the math: http://lyberty.com/encyc/articles/kb_kilobytes.html
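If it helps to see that math written out, here is a small Python sketch of the same calculation. It assumes decimal units (1 GB = 1,000,000,000 bytes), a 30-day month, and a port that is saturated around the clock, which real traffic never is; the function name is just mine for illustration.

```python
# Minimal sketch: maximum monthly transfer implied by a given port speed.
# Assumes decimal units and a 30-day month, with the port saturated 24/7.
def max_monthly_transfer_gb(port_mbps, days=30):
    seconds = days * 24 * 60 * 60        # seconds in the billing period
    megabits = port_mbps * seconds       # total megabits the port could carry
    return megabits / 8 / 1000           # bits -> bytes, then MB -> GB

print(max_monthly_transfer_gb(10))       # 3240.0 GB for a 10 Mbps port
```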
While 10 Mbps is more bandwidth (~3,240 GB), it limits the pipe speed to a maximum of 10 Mbps. Since you can't go over 10 Mbps, you would notice a slowdown during peak hours. It's best to look at your current traffic graph and see if you have ever exceeded 10 Mbps; if you have, you are better off getting a server from someone who offers 3 TB or more on a 100 Mbps port. 1 Mbps ≈ 324 GB of data transfer per month. Mbps / Gbps / Xbps refer to port speed, not bandwidth; KB / MB / GB are usually used as the measure of bandwidth.
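To go the other way, here is a rough sketch of how much port speed a given monthly transfer implies, and why peak hours matter. The 4× peak factor is a made-up illustrative number, not anything from this thread; real traffic is bursty, so peak demand is usually several times the average.

```python
# Rough sketch: average port speed implied by a monthly transfer figure,
# plus a hypothetical peak based on an assumed peak-to-average ratio.
def average_mbps(monthly_gb, days=30):
    seconds = days * 24 * 60 * 60
    return monthly_gb * 1000 * 8 / seconds   # GB -> MB -> megabits, averaged over the month

current_usage_gb = 1500                  # the original poster's current allowance
avg = average_mbps(current_usage_gb)     # ~4.6 Mbps average
peak_factor = 4                          # hypothetical peak-to-average ratio
print(avg, avg * peak_factor)            # ~4.6 average, ~18.5 at peak -- more than a 10 Mbps port can carry
```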
With that amount of transfer, I would take a 100 Mbps port, not 10 Mbps. Ask your new vendor if you can get 3,000 GB of transfer per month on a 100 Mbps switch port.
Jesus... this guy's a dumbo, not knowing the difference between Mbps and gigs. It's like not knowing the difference between petrol and diesel: one will work fine, the other will mess your engine up!
Hi,

The difference between the two is more complicated than you think. The first thing we should clear up is that what you mean by bandwidth is more accurately called throughput: the amount of traffic you can transfer. Bandwidth is the theoretical maximum amount of data you can transfer through a medium at one time. 10 Mbit/s is an example of a measure of bandwidth.

1,500 GB is somewhat ambiguous, but I will assume the host means 1,500 GB where 1 GB = 10^9 bytes. We will also assume that 10 mbps actually means 10 Mbit/s (1 Mbit = 10^6 bits), and remember that 8 bits = 1 byte.

10 Mbit/s ÷ 8 bits/byte = 1.25 MB/s
1.25 MB/s × 60 s/min = 75 MB/min
75 MB/min × 60 min/h = 4,500 MB/h
4,500 MB/h × 24 h/d = 108,000 MB/d
108,000 MB/d × 30 d/month = 3,240,000 MB/month = 3,240 GB/month

So 3,240 GB is the theoretical maximum you can push through a 10 Mbit/s connection. In real life, if you really pushed it, you might be able to get around 50%-80% of that (1,620 GB - 2,592 GB per month), and I would expect your average usage to be much lower. You also need to think about whether you need to burst: are you ever going to have enough visitors at one instant that 1.25 MB/s isn't enough to satisfy them all? (See the rough sketch at the end of this post.)

The guy came here to ask a question, not to be insulted. You should be more tolerant, as there are certainly areas in which we are all woefully ignorant. I'm also compelled to point out that your comparison does not fit the topic at hand. Not knowing the difference between 10 Mbit/s and 1,500 GB/month will not cause any damage to the site or require it to be dismantled and thoroughly cleaned. In the worst case it may result in extra charges or the site being disabled by the host.
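On that burst question, here is a hedged back-of-the-envelope sketch. The page size and load-time numbers are assumptions I picked purely to illustrate the idea, not figures from this thread, so substitute your own.

```python
# Hedged back-of-the-envelope: how many visitors can a 10 Mbit/s pipe serve
# at the same moment? Page weight and target load time are illustrative assumptions.
port_bytes_per_s = 10 * 1_000_000 / 8    # 10 Mbit/s = ~1.25 MB/s of raw capacity

avg_page_kb = 500                        # assumed average page weight (HTML + images)
target_load_s = 2                        # assumed acceptable load time per visitor

per_visitor_bytes_per_s = avg_page_kb * 1000 / target_load_s
concurrent_visitors = port_bytes_per_s / per_visitor_bytes_per_s
print(round(concurrent_visitors))        # ~5 visitors all loading a page within the same 2 seconds
```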