
OT: Biggest expense

Posted Aug 4, 2009 14:21 UTC (Tue) by man_ls (guest, #15091)
In reply to: Notes from the Montreal Linux Power Management Mini-Summit by elanthis
Parent article: Notes from the Montreal Linux Power Management Mini-Summit

What? I would have sworn that making the payroll was a bigger expense than power costs, especially in organizations with lots of development going on. Thing is, I never paid attention to energy costs, so it may well be. Just curious: is it an impression of yours, or do you have numbers?



OT: Biggest expense

Posted Aug 4, 2009 20:41 UTC (Tue) by dlang (guest, #313) [Link] (3 responses)

we had a discussion on this on the LOPSA mailing list and found that the cost breakdown still seems to be

people
servers
power

even allowing for 2x power consumption (to cover cooling, etc.), servers on a 3-or-so-year replacement cycle would still cost more than the power they consume over that time (assuming max power draw the entire time)

power is a significant cost, and since it shows up as a single line item it jumps out at people, but it's still not as bad as people are making it out to be.

OT: Biggest expense

Posted Aug 4, 2009 22:09 UTC (Tue) by man_ls (guest, #15091) [Link] (2 responses)

It makes a lot of sense. Maybe at Google things are different, due to a couple of factors:
  • Their legendary ability to automate things: admins manage thousands of servers each. After a quick search it seems that the exact figure is 20k servers per admin.
  • Their huge purchasing power: if they buy 20M servers at a time I guess that they get a special price. Hey, if they used their own house brand it would be a big one, if not the biggest in the industry; after all they do that with web servers.
For the rest of us things are different. At 0.10$/kWh, one server using 1kW (a high powered beast) at all times costs ~900$/yr. For a 100k$/yr admin (fully loaded) the break-even point is at ~110 high-powered servers per admin -- kind of the industry average according to the first link. You have to manage more servers per admin than that to spend more on power than on people, so in an IT department with any development, payroll should win hands down.

Similarly, if each server costs 3k$, the break-even point is a lifecycle of just ~3 years. I would say that machines either cost more or use less juice, so servers should rank above power too.
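The arithmetic above is easy to check. A minimal sketch, using only the assumptions stated in the comment (0.10$/kWh, a 1kW server, a 100k$/yr admin, a 3k$ server):

```python
# Back-of-the-envelope check of the break-even figures in the comment above.
# All inputs are the comment's own assumptions, not measured data.

HOURS_PER_YEAR = 24 * 365      # 8760
price_per_kwh = 0.10           # $/kWh, as assumed above
server_draw_kw = 1.0           # a high-powered 1 kW box, at full draw all year
admin_cost = 100_000           # $/yr, fully loaded
server_price = 3_000           # $ purchase price per server

# Yearly power cost of one server:
power_cost_per_server = server_draw_kw * HOURS_PER_YEAR * price_per_kwh
print(power_cost_per_server)          # 876.0 -> the "~900$/yr" above

# Servers per admin at which power spend equals payroll spend:
print(round(admin_cost / power_cost_per_server))   # 114 -> "~110 servers"

# Lifetime (years) at which cumulative power equals the purchase price:
print(round(server_price / power_cost_per_server, 1))  # 3.4 -> "~3 years"
```

Anything that lowers the draw below 1kW, or raises the server price above 3k$, pushes both break-even points further out, which is the comment's point about people and servers still ranking above power.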

OT: Biggest expense

Posted Aug 4, 2009 22:30 UTC (Tue) by dlang (guest, #313) [Link] (1 responses)

I think that for most shops the server to admin ratio is well below 110:1

if you have any serious uses you need at least two people (probably 3) so that someone is available all the time (with vacations, sick time, etc.). a _lot_ of places that meet this criterion have fewer than the 220-330 servers that would be needed to maintain that ratio.

this ratio is also very dependent on how many different server configurations you have. google gets such phenomenal numbers of servers per admin because they have _lots_ of any one configuration. if they only had a couple thousand servers per configuration they would need far more admins than they do ;-) they also don't have their admins deal with failures, they just shut down the failed systems.

In many ways I would rather have another 50 servers to manage that fit one of my existing baselines than add 1 special exception box that is completely different.

OT: Biggest expense

Posted Aug 13, 2009 1:41 UTC (Thu) by deleteme (guest, #49633) [Link]

Well, Google admins aren't really in charge of 20k servers each, but of 5-50 computing clusters that developers and G* applications use as a single server.

One baseline is good but not achievable.


Copyright © 2025, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds