
Colocation costs - worthwhile for me?

  • 29-07-2011 8:22pm
    #1
    Registered Users Posts: 1,977 ✭✭✭


    Hi,
I'm running four servers which are constantly drawing about 1.2kW each, so roughly 5kW in total.

    The machines are towers as opposed to rack mount.

I don't want to fork out for an extra 3.5kW of air conditioning, so I'm going to cool the room by extracting the hot air with a powerful fan, with intakes in the walls for cool air to replace it. I'm relying on the fact that the average summer temperature in Ireland is 19-20C, and I only need the room at 25C ambient.

In total it's going to cost me about 500 a month to run. I'm just wondering if it would be economical to switch to co-location. I know my cooling solution is not ideal, but the machines are not mission-critical and will automatically clock down should the ambient temperature outside be too warm.

    However I do have the option of an air conditioner for 500 installed if really required.

I did ring one group in Galway and they were 550+VAT I think; however, they had to add on another 500 due to the kW required.

I was talking to a company earlier today and they were using a lot of technical terms - not sure if they were trying to scare me into buying their solution, but they brought up hot spot issues and a term, "Delta T". They were assuming my setup was mission-critical, even though I told them it was OK for the machines to clock down when the weather was too warm (rare in Ireland, I think). The hot spot idea did scare me a little though.

It's a 6x4 metre room, and the air can be extracted at up to 2250m3/h with my fan.

    Any advice appreciated. Thanks.
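As a quick sanity check on those numbers, a sketch in Python (assuming standard air properties; the 5kW and 2250m3/h figures are the ones quoted above):

```python
# Rough check: can a 2250 m3/h extraction fan carry away ~5 kW?
# Steady-state temperature rise of the exhaust air: dT = P / (rho * cp * V_dot)

RHO_AIR = 1.2    # kg/m^3, density of air at ~20 C
CP_AIR = 1005.0  # J/(kg*K), specific heat capacity of air

def air_temp_rise(power_w: float, flow_m3_per_h: float) -> float:
    """Temperature rise (K) of air removing power_w watts at the given flow."""
    flow_m3_per_s = flow_m3_per_h / 3600.0
    return power_w / (RHO_AIR * CP_AIR * flow_m3_per_s)

dt = air_temp_rise(5000, 2250)
print(f"Air temperature rise: {dt:.1f} C")
# With ~20 C intake air, exhaust sits around 26-27 C -- close to the
# 25 C ambient target, but with little margin on a warm summer day.
```

So the fan is roughly sized for the load on an average Irish summer day, but any shortfall in intake airflow or a hot spell eats the margin quickly.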


Comments

  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    I think I'd be more likely to put more smaller fans in the cases of the towers rather than one big fan in the room they're in - the motion of the air through the cases is what you're trying to arrange. Even in a data centre with colo racks, the point of the racks is to have the hardware in the airflow within the room, not to keep the entire room cool.

    Remember, the room's ambient might be as low as 10-12C, while at the same time the CPU and hard drives could be closing on 100C if not exceeding it (I've had my laptop shut down in a cold room with a 100C overtemp during the cold snap this winter because of impeded airflow in and immediately around the case).

    And if you really think you have a problem, you can always buy liquid cooling systems from places like scan and overclockers as off-the-shelf kits (they're not terribly experimental anymore).

    And for four servers already built as towers, it'll probably be cheaper to add cooling to the towers than it will be to sort out colocation, buy new servers in 19" rack form factors and reinstall everything.

    Though... if it's not mission critical, have you considered buying managed servers? You could probably get enough hardware from various sites (and please don't ask for comparisons, as that's against the charter; but be aware it's an option) for less than your monthly running budget, and they'd be in a datacenter with cooling and so on.


  • Registered Users Posts: 1,977 ✭✭✭euser1984


    See next post.


  • Registered Users Posts: 1,977 ✭✭✭euser1984


    Sparks wrote: »
    I think I'd be more likely to put more smaller fans in the cases of the towers rather than one big fan in the room they're in - the motion of the air through the cases is what you're trying to arrange. Even in a data centre with colo racks, the point of the racks is to have the hardware in the airflow within the room, not to keep the entire room cool.

    Remember, the room's ambient might be as low as 10-12C, while at the same time the CPU and hard drives could be closing on 100C if not exceeding it (I've had my laptop shut down in a cold room with a 100C overtemp during the cold snap this winter because of impeded airflow in and immediately around the case).

    And if you really think you have a problem, you can always buy liquid cooling systems from places like scan and overclockers as off-the-shelf kits (they're not terribly experimental anymore).

    And for four servers already built as towers, it'll probably be cheaper to add cooling to the towers than it will be to sort out colocation, buy new servers in 19" rack form factors and reinstall everything.

    Though... if it's not mission critical, have you considered buying managed servers? You could probably get enough hardware from various sites (and please don't ask for comparisons, as that's against the charter; but be aware it's an option) for less than your monthly running budget, and they'd be in a datacenter with cooling and so on.

    Great reply.
Interesting what you say about CPU and HDD heat. I could put a big boxfan pointing at the side of the cases (there are vents there for airflow). I was also thinking about putting four 120mm fans on the side to blow air directly onto the graphics cards (it's a render farm). BTW, these towers are specifically designed for high airflow - I can't see switching to rack-mounted as affordable at the moment.

If the room heated up to 35C, would a boxfan still cool the cards? Somebody told me that even though the air blowing through a fan feels cooler on your skin, it isn't actually any cooler.

I suspect that the heat in the room will continue to rise without extraction though, as the machines are running at full speed 24/7 at about 1.2kW each.

How about extracting the air at the ceiling, with conveniently placed holes in the wall pulling external air in onto the sides of the machines? They do already have a fan to blow directly onto the cards (not a powerful one, though, hence the boxfan idea above).

    Then if required I could also put boxfans blowing the external air directly onto the machines.

Though at 2250m3/h extraction at the ceiling, I suspect there will be a good draught already. Just throwing out ideas here though.

    Looking forward to your opinion. Thanks.


  • Registered Users, Registered Users 2 Posts: 40,038 ✭✭✭✭Sparks


    euser1984 wrote: »
    Interesting what you say about CPU and HD heat up. I could put a big boxfan pointing at the side of the cases (vents there for airflow)
Well, it would help somewhat, I suppose, but you really want the fans inside the case. A fan on the outside, even only a few inches away, will blow air onto the case and it'll mostly bounce off and away into the room. Comparatively little will get into the vents -- what you'll be doing is basically cooling the physical case itself rather than the component parts. Remember, the motherboard is on standoffs away from the case, so getting heat off the CPU to the case isn't viable (hence the large market in fans that mount directly to the CPU). It might work to a degree for the hard drives if you haven't used flexible grommets for vibration dampening when mounting them, but again, you're trying to dump a lot of heat through a small contact area by conduction alone, which isn't going to be that effective.
    Mount fans inside the cases (and go nuts, put as many in there as you can) and you'll have air flowing over the components themselves and you'll be cooling them directly.
Or get a liquid cooling system and do the same thing, but with a fluid that has a higher thermal carrying capacity than air does.
    I was also thinking about putting four 120mm fans on the side to blow air directly onto the graphics cards (render farm).
    Most high-end graphics cards have their own fans mounted on them already, but yes, the more 120-140mm fans in the case the better (it'll be noisy, but that's not a problem with an office server normally).
    btw these towers are specifically designed for high air flow - I can't see switching to rack mounted as affordable at the moment.
    If you can't switch to rackmounts, you pretty much have to rule out colo; most data centers aren't set up for tower boxes, but only for 19" racks.
    If the room heated up to 35c would a boxfan still cool the cards
35C is still colder than the components you'll be cooling, so yes. Until the ambient temperature is hitting 50C, you'll still be cooling your components. Though honestly, if you were anticipating temperatures of 35C for any extended period, I'd look into liquid cooling. It might be overkill, but it's hardly pricey - Overclockers have a CPU cooling kit that'll handle most CPUs for €70.
I suspect that the heat in the room will continue to rise without extraction though, as the machines are running at full speed 24/7 at about 1.2kW each.
Are you sure each server is actually pulling 1.2kW? I mean, I know you can have a PSU rated that high, but your average CPU running at full tilt won't pull that much power (a fifth of that, maybe, for an i7 system). Your average hard drive will pull 8-10 watts when being accessed. High-end graphics cards, yeah, they'll pull a fair few watts alright, but to really pull 1.2kW you'd need something like an i7 running at full tilt, along with a RAID array and three or four high-end nVidia GPUs running in SLI mode, all going at full pelt. That's... rather an extreme load :D
    How about sucking the air out from the ceiling with conveniently placed holes in the wall where it pulls external air in onto the side of the machines which do already have a fan to blow directly onto the cards (not a powerful fan though hence the idea of boxfans above).
Honestly? Unless you have a monumental amount of hardware in the towers, all running flat-out and actually pulling over a kW each all the time, just opening a window would be enough ventilation for a room that size. Hell, you might even get away without that. You just want the worst of the heat taken out of the components, and case fans or liquid cooling will do that far better than any room ventilation system; and until your ambient temperature averages above 30C, you don't have real cooling issues. Not for a non-mission-critical server, anyway. At an ambient of 30C, the air is still 20-40C cooler than the components you're trying to cool with it.
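Sparks' scepticism about the 1.2kW figure can be sanity-checked with a rough component budget. Every figure below is a ballpark assumption, not a measurement; a plug-in power meter at the wall gives the real number:

```python
# Rough per-box power budget (all figures are ballpark assumptions).
components_w = {
    "quad-core CPU at full load": 130,
    "high-end GPU at full load": 250,   # per card
    "motherboard + RAM": 60,
    "hard drives (2 x ~10 W)": 20,
    "fans, misc": 20,
}
load_w = sum(components_w.values())

psu_efficiency = 0.85                   # typical supply, assumed
wall_draw_w = load_w / psu_efficiency
print(f"Estimated wall draw: {wall_draw_w:.0f} W")
# Even adding a second GPU lands well under 1.2 kW, which is why a
# metered reading is worth taking before sizing any cooling.
```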


  • Registered Users, Registered Users 2 Posts: 3,088 ✭✭✭Static M.e.


    Agree with above.

    Buy a thermometer, stick it in the room and watch the temperature.

With four servers I really wouldn't bother with much in a 6x4 room. Raise them off the ground to avoid dust and spread them out a little.

    See where that gets you first. Cooling systems are expensive to buy, run and maintain.


  • Registered Users, Registered Users 2 Posts: 115 ✭✭Boskonay


    Out of interest, what are the machines used for?


  • Closed Accounts Posts: 1,188 ✭✭✭UDP


    Buy a thermometer, stick it in the room and watch the temperature.
With regard to this, OP: buy a "TEMPer USB Thermometer" from eBay - it should be less than 20 euro (I bought one for less than 10). Combine that with a USB extension lead and this software, and you'll have a full graph/log of the temperature changes.
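For anyone who'd rather script the logging than use the linked software, a minimal sketch (the sensor-reading callable is a stand-in; the real call depends on whichever TEMPer driver you end up with):

```python
import csv
import time
from datetime import datetime

def log_temperatures(read_temp_c, path, interval_s=60, samples=None):
    """Append (timestamp, temperature) rows to a CSV file.

    read_temp_c is any zero-argument callable returning degrees Celsius
    (a stand-in for your sensor driver's read call). With samples=None
    it logs forever; pass a number to take that many readings.
    """
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        taken = 0
        while samples is None or taken < samples:
            writer.writerow([datetime.now().isoformat(), read_temp_c()])
            f.flush()  # so the log survives a crash or power cut
            taken += 1
            if samples is None or taken < samples:
                time.sleep(interval_s)

# Example with a dummy reader (swap in the real sensor call):
# log_temperatures(lambda: 24.5, "room_temp.csv", interval_s=300)
```

Point any spreadsheet or plotting tool at the CSV afterwards to get the temperature graph.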


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 92,367 Mod ✭✭✭✭Capt'n Midnight


Most systems these days have temperature sensors - just use them. Even if not, you may be able to get temps from the HDDs.

Air flow: damp cloths or similar will get you some evaporative cooling. Back in the Middle East over a thousand years ago they routed the incoming air through an underground passage. Could you use something like a chimney to get a draught to help the air flow?
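On getting temps from the HDDs: a sketch of pulling a drive temperature out of smartmontools output (assumes `smartctl` is installed and the drive reports the Temperature_Celsius SMART attribute):

```python
import subprocess

def hdd_temp_from_smartctl(output: str):
    """Extract the drive temperature from `smartctl -A` text output.

    Scans for the Temperature_Celsius attribute row and returns its
    RAW_VALUE column as an int; returns None if the drive doesn't
    report that attribute.
    """
    for line in output.splitlines():
        fields = line.split()
        if len(fields) >= 10 and fields[1] == "Temperature_Celsius":
            return int(fields[9])  # RAW_VALUE column
    return None

# Example usage (needs smartmontools installed and root privileges;
# /dev/sda is a placeholder for your drive):
# out = subprocess.run(["smartctl", "-A", "/dev/sda"],
#                      capture_output=True, text=True).stdout
# print(hdd_temp_from_smartctl(out))
```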

