Hello, I run a free image hosting site on a VPS: 2 GB DDR3 RAM, 4-core AuthenticAMD CPU (QEMU Virtual CPU version 1.0.50, 2199.998 MHz, 512 KB cache). The site sits behind Cloudflare, a free CDN service that caches all the images for me.

The site is growing very fast and now serves about 4 TB of bandwidth per day across 18,000,000 hits. (With Cloudflare in front I only consume a few GB of origin bandwidth, because it caches everything.) However, I have now overloaded their servers, so they are disabling my account, which is fair enough. I have about one week before the account is blocked, and after that I will have to handle those 18,000,000 requests with Apache alone. Is that possible?

I tried disabling Cloudflare and the site immediately went offline under the load. I don't think the VPS itself is the problem: even with Cloudflare disabled, CPU load peaks at about 20% and I always have about 20% of RAM free. Apache crashes with the message "server reached MaxClients setting, consider raising the MaxClients setting". Only when I re-enable Cloudflare does the site come back up.

This is my MaxClients configuration:

<IfModule prefork.c>
StartServers 5
MinSpareServers 5
MaxSpareServers 10
ServerLimit 150
MaxClients 150
MaxRequestsPerChild 150
</IfModule>

I also used netstat to count connections when the site goes offline: around 1,700 connections (it goes past 2,200, but at that point the site is offline or transfers are extremely slow, like half a page loaded after 5 minutes).

I used Apache's server-status and took two snapshots. This one is with Cloudflare active; if I deactivate it, within a few seconds there are hundreds (maybe thousands) of Apache connections:

http://pastebin.com/CCfCFrhW

This one is from about one second after disabling Cloudflare (after roughly 5 seconds the site goes offline):

http://pastebin.com/9iycVkjb

I hope you can propose some solution. Thank you.
I'm now running the site with nginx and it seems better, meaning it doesn't crash. But the site is still very, very slow: about a minute after I disable Cloudflare, pages start taking around a minute to load.

MY QUESTION: is there a way to limit the bandwidth (load) for an image that gets, for example, more than 200 requests in a minute? Some of my images get posted on popular sites like Facebook, and I would like only the first 200 users to see the image immediately; the rest should have to wait a while. I think this could save some bandwidth and server load. Thank you for your help.
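One way to approximate that with nginx is the limit_req module keyed on the request URI. This is only a sketch (the zone name, zone size, rate, burst, and file extensions are assumptions you would need to adjust for your setup): requests to a given image beyond roughly 200 per minute are delayed, and anything past the burst gets an error instead of being served.

# In the http {} block: one shared zone, keyed on the URI, ~200 requests/minute per image
limit_req_zone $uri zone=per_image:10m rate=200r/m;

# In the server {} block that serves the images
location ~* \.(jpg|jpeg|png|gif)$ {
    # Requests over the rate are delayed; anything beyond the burst gets a 503
    limit_req zone=per_image burst=50;
}

Keying the zone on $uri makes the limit per image rather than per visitor, which is closer to what you describe, but it will also throttle legitimate traffic spikes from sites like Facebook.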
"Because some images are posted on popular site like Facebook, and i want that only first 200 users can see the image, the others must wait some times to see it." Can't think of any easy way to do this & this would kind of defeat the purpose of sharing the images in the first place. Have you looked at actually upgrading your server with your host?
There is definitely some tweaking of the Apache configuration that can be done, but it would be a band-aid solution. Damon (the Damon from CloudFlare?) is correct: you're going to have to upgrade with that amount of traffic.

You will find some good information about tuning Apache, including a rough formula for setting MaxClients, here: http://2bits.com/articles/tuning-the-apache-maxclients-parameter.html

That said, I think a VPS is not the best solution for your situation. VPSs have limitations in a number of areas beyond just RAM or CPU, so you might consider a dedicated server.

You don't mention server-side caching. Even though you're using CloudFlare you might well benefit from a cache like Varnish. Finally, consider the application your website is built with: if you're using a CMS to manage your website or image sharing, you might want to look into an opcode cache.
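For reference, the gist of that formula, with purely illustrative numbers (the per-child size and the RAM you reserve for the OS and database are assumptions you have to measure on your own server):

# Average resident size of an Apache prefork child, in MB
# (the process may be named httpd instead of apache2 on your distro)
ps -ylC apache2 | awk 'NR > 1 { sum += $8; n++ } END { if (n) print sum / n / 1024 }'

# Rough estimate:
#   MaxClients ~= (total RAM - RAM reserved for OS, MySQL, etc.) / average child size
#   e.g. (2048 MB - 512 MB) / 20 MB ~= 75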
"(the Damon from CloudFlare?)" That would be me "Even though you're using CloudFlare you might well benefit from a cache like Varnish." I've heard good things from customers using both CloudFlare and Varnish (have to do a slight tweak in Varnish...at least based on what customers have told me).
Thank you guys for the help, I really appreciate it. I'm going to test Varnish: I've installed it on port 80, in front of the nginx web server on port 8080. Now if I run "curl -I http://www.s3.mysite.com/" I get:

HTTP/1.1 200 OK
Server: nginx/1.2.2
Content-Type: text/html; charset=utf-8
Vary: Accept-Encoding
X-Powered-By: PHP/5.3.15
Set-Cookie: PHPSESSID=96fed0300a3b707760d23609d5c7c619; path=/
Expires: Thu, 31 Dec 2037 23:55:55 GMT
Cache-Control: max-age=315360000, public
Pragma: no-cache
Set-Cookie: login_password=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=mydomain.com
Content-Length: 14294
Date: Fri, 10 Aug 2012 22:45:34 GMT
X-Varnish: 226451812
Age: 0
Via: 1.1 varnish
Connection: keep-alive

I'm using this configuration: pastebin.com/FmYrtMt3

Thank you again for the help. Best regards
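One thing that stands out in those headers: the backend sets cookies (PHPSESSID), and with the default VCL Varnish won't cache a response that carries Set-Cookie, nor serve from cache a request that arrives with a Cookie header. A common tweak for static images, sketched below in Varnish 3 syntax (the file extensions and the one-day TTL are assumptions; it would need merging with your existing VCL from the pastebin), is to strip cookies on image traffic:

sub vcl_recv {
    # Ignore client cookies on image requests so they can be served from cache
    if (req.url ~ "\.(jpg|jpeg|png|gif)(\?.*)?$") {
        unset req.http.Cookie;
    }
}

sub vcl_fetch {
    # Don't let the backend's Set-Cookie prevent caching of images
    if (req.url ~ "\.(jpg|jpeg|png|gif)(\?.*)?$") {
        unset beresp.http.Set-Cookie;
        set beresp.ttl = 1d;
    }
}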
I need a little help with balancing the server load.

ab -kc 1000 -n 10000 http://www.s4.example.com/index.html (protected with Cloudflare)

When I run this command against the site behind Cloudflare, about 50% of the requests come back as "Failed requests". I think this is a protection against abuse, to avoid overload. When I run the same command against the same website without Cloudflare in front, my nginx web server answers 100% of the requests as complete. That can be good, because nginx + Varnish are doing a great job, but if there are tons of requests there could be problems. How can I enable that kind of protection myself, like Cloudflare does? Thank you
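For capping what a single client can do, nginx has the limit_conn and limit_req modules keyed on the client IP. This is only a sketch (the zone names, sizes, and limits are made up and need tuning, and it won't reproduce everything Cloudflare does):

# In the http {} block: per-client-IP zones (names and sizes are arbitrary)
limit_conn_zone $binary_remote_addr zone=per_ip_conn:10m;
limit_req_zone  $binary_remote_addr zone=per_ip_req:10m rate=10r/s;

# In the server {} or location {} block
location / {
    # At most 20 simultaneous connections and ~10 req/s per client IP;
    # excess requests get a 503, similar in spirit to Cloudflare shedding load
    limit_conn per_ip_conn 20;
    limit_req  zone=per_ip_req burst=20;
}

Keep in mind that a benchmark like ab run from one machine comes from a single IP, so it will trip these limits almost immediately and report many failed requests.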
This is my benchmark on localhost:

ab -c 100 -n 10000

This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, Licensed to The Apache Software Foundation

Benchmarking localhost (be patient)
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
Completed 10000 requests
Finished 10000 requests

Server Software:        nginx/1.2.2
Server Hostname:        localhost
Server Port:            80

Document Path:          /banner.jpg
Document Length:        168 bytes

Concurrency Level:      100
Time taken for tests:   4.947 seconds
Complete requests:      10000
Failed requests:        0
Write errors:           0
Non-2xx responses:      10007
Total transferred:      4182915 bytes
HTML transferred:       1681176 bytes
Requests per second:    2021.41 [#/sec] (mean)
Time per request:       49.470 [ms] (mean)
Time per request:       0.495 [ms] (mean, across all concurrent requests)
Transfer rate:          825.72 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    1   2.6      0      26
Processing:     0   48  74.5     26    1120
Waiting:        0   34  50.2     21    1031
Total:          0   49  74.6     27    1121

Percentage of the requests served within a certain time (ms)
  50%     27
  66%     40
  75%     50
  80%     58
  90%    108
  95%    176
  98%    307
  99%    386
 100%   1121 (longest request)
It appears that you are looking for creative ways to limit access to certain parts of your website, which is counter to what most people want from a website. The bottom line is that if you have a high level of traffic and your server is choking, you're going to have to upgrade once you exhaust all of your optimization options.

If you're unwilling or unable to upgrade to a more suitable server, you might want to look into mod_cband. From the module author: "mod_cband is an Apache 2 module provided to solve the problem of limiting users' and virtualhosts' bandwidth usage. The current versions can set virtualhosts' and users' bandwidth quotas, maximal download speed (like in mod_bandwidth), requests-per-second speed and the maximal number of simultaneous IP connections (like in mod_limitipconn)"
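As a rough illustration of what a mod_cband virtual host might look like (the numbers are placeholders, and the directive names and argument order should be double-checked against the mod_cband documentation):

<VirtualHost *:80>
    ServerName images.example.com

    # Per-virtualhost cap: ~4096 kbps, 10 requests/s, 30 open connections (placeholders)
    CBandSpeed 4096 10 30

    # Per-remote-client cap: ~100 kbit/s, 3 requests/s, 3 connections per visitor (placeholders)
    CBandRemoteSpeed 100kb/s 3 3
</VirtualHost>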