
limiting bandwidth usage

  • 24-05-2007 11:44am
    #1
    Registered Users Posts: 648 ✭✭✭


    hi
    i have a high-traffic website that i'm trying to limit bandwidth on.
    it ran over the limit again this month.

    i know keeping the pages small and using as few images as possible is the best way, but are there any other tricks of the trade for this?


    tnx


Comments

  • Registered Users Posts: 1,452 ✭✭✭tomED


    Not really sure what your best options are on this apart from those obvious ones you've mentioned.

    What about using your robots.txt file to block some search engines from spidering the website? That obviously depends on how much you value each search engine, but they can really eat up your bandwidth.

    You'll probably notice your site goes over its bandwidth limit after Google's bot has been through.

    I'm not saying ban Googlebot, but which search engines send you little traffic while still using up a lot of bandwidth?

    Another option is to improve your website's monetisation so that you can at least pay for your bandwidth usage! :)

    My two cents....

    PS I'm not condoning the banning of search engine bots - just merely stating it as an option! :)
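    The robots.txt approach above might look something like this - a minimal sketch, where the bot name and paths are just placeholders for whatever crawlers and folders you actually want to restrict:

```
# Block a bot that sends little traffic but crawls heavily (example name)
User-agent: SomeHungryBot
Disallow: /

# Let everything else crawl, but keep it out of heavyweight folders (example paths)
User-agent: *
Disallow: /images/
Disallow: /downloads/
```

    Worth remembering that robots.txt is advisory - well-behaved crawlers honour it, but it isn't an enforcement mechanism.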


  • Registered Users Posts: 37 simplybetter


    Hi ChicoMendez,

    I had the same problem as you before and this is what I did:

    1. Use a decent basic graphics package (Paint Shop Pro etc.) to optimise your images - an optimised image can be almost half the size of an un-optimised one.

    2. If you have large chunks of JavaScript that don't get called all the time, put them into a separate file and only load it when needed - works for CSS as well.

    3. Google's bots can be hungry, but they are also responsible for directing a lot of your traffic - use a robots.txt file to make sure Googlebot is only trawling the folders/files you want it to.

    4. If it is your own server (or you can ask your hosting provider to), check that HTTP compression is turned on.

    5. You could even have your images hosted by Flickr etc.
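    For point 4, if the server happens to be Apache with mod_deflate available, turning on HTTP compression for text content looks roughly like this (a sketch, not something to paste in blind - check which modules your host actually has enabled):

```
# httpd.conf / virtual host: compress text-based responses
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>
```

    Images and other already-compressed files shouldn't be included - recompressing them wastes CPU for no real saving.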


    try it and let us know


    cheers


  • Registered Users Posts: 7,739 ✭✭✭mneylon


    Server-side compression can help
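    To give a feel for how much compression can save on text content, here's a quick Python sketch using the standard gzip module (the sample markup is made up, so the exact ratio will differ from a real page, but repetitive HTML generally compresses very well):

```python
import gzip

# Stand-in for a typical HTML page: markup is highly repetitive
html = ("<div class='post'><p>Lorem ipsum dolor sit amet.</p></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes ({ratio:.0%} of original)")
```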


  • Registered Users Posts: 15,065 ✭✭✭✭Malice


    If you are hosting videos or other large files, could you reduce access to them? Maybe provide them to registered users only?


  • Closed Accounts Posts: 2,161 ✭✭✭steve-hosting36


    If you're mainly worried about going over your limits from a financial point of view - you can throttle your connections to ensure you stay within limits, or it might be worth approaching your host - they might sponsor or otherwise support your site if it has a relevant audience.
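    How you throttle depends entirely on the server stack, but as one concrete illustration, Apache 2.4's bundled mod_ratelimit can cap per-connection transfer speed (a sketch - the path and rate here are arbitrary examples):

```
# Limit downloads from this directory to ~400 KiB/s per connection
<Location "/downloads">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 400
</Location>
```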


  • Closed Accounts Posts: 270 ✭✭CoNfOuNd


    Out of interest, how much bandwidth are you using a month?


  • Registered Users Posts: 648 ✭✭✭ChicoMendez


    tnx for all your replies...

    - no, there are no large media files
    - i did notice the editors were not compressing the new JPGs, so they were 6 times the size they should have been
    - i've also enabled GZIP page compression

    - bandwidth is currently approx 300GB a month
    - unique visits are at 300,000, which we're hoping to grow to 1 million before the end of the year

    tnx
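    For what it's worth, those figures work out to roughly 1 MB of transfer per visit, so at the same page weight the hoped-for traffic would need about 1 TB a month - a quick back-of-envelope sketch:

```python
# Back-of-envelope projection from the figures quoted above
bandwidth_gb = 300           # current monthly transfer
visits = 300_000             # current monthly unique visits
target_visits = 1_000_000    # hoped-for monthly visits

mb_per_visit = bandwidth_gb * 1024 / visits
projected_gb = mb_per_visit * target_visits / 1024

print(f"{mb_per_visit:.2f} MB per visit")
print(f"~{projected_gb:.0f} GB/month at {target_visits:,} visits")
```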


  • Registered Users Posts: 2,031 ✭✭✭colm_c


    If you're going for 1 million visits, you'll need to increase your bandwidth allowance or you'll hit the limit fairly early in the month. I'd also look at your website's software/database and hardware - with that many concurrent connections you're bound to have some performance issues.

