
Green IT

At the IBM Systems and Technology Group analyst briefing two days ago, IBM displayed three notable statistics:

 

 

  1. The global amount of information stored has been growing at 70-100% per year for the last 5 years, with the result that the amount of storage has been growing by 20-40% per year;
  2. Enterprise expenditures for datacenter power and cooling have grown more than 10-fold over the last 15 years, and are now around 16% of system TCO – equal to the cost of the hardware itself, although well below the also-rising costs of administration;
  3. Datacenter energy usage has doubled over the last five years.
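To put the gap between the first statistic’s two growth rates in perspective, here is a quick compounding sketch (the 85% and 30% midpoints are my own illustrative assumptions):

```python
# Compound the midpoints of the two growth rates over 5 years.
info_growth, storage_growth, years = 0.85, 0.30, 5

info_multiple = (1 + info_growth) ** years        # ~21.7x more information
storage_multiple = (1 + storage_growth) ** years  # ~3.7x more storage

print(f"information: {info_multiple:.1f}x, storage: {storage_multiple:.1f}x")
# The ~6x gap between the two presumably has to be absorbed by denser,
# better-utilized storage (compression, deduplication, consolidation).
```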

 

 

These statistics almost certainly underestimate the growth in computing’s energy usage, inside and outside IT. They focus on infrastructure in place 5 years ago, ignoring a highly likely shift to new or existing data centers in developing countries, which tend to be more energy-inefficient. They also ignore the tendency to shift computing usage outside the data center and into the small-form-factor devices, ranging from the PC to the iPhone, that are proliferating in the rest of the enterprise and beyond its virtual walls. Even without those increases, it is clear that computing has moved from an estimated 2% of global energy usage 5 years ago to somewhere between 3% and 4%. Nor has greater energy usage in computing led to a decrease in other energy usage; at most it has had a marginal effect. In other words, computing has not been used effectively to increase energy efficiency or decrease energy use by more than marginal amounts – not because the tools are not beginning to arrive, but because enterprises and governments are not yet using them to monitor and improve energy usage effectively.

 

And yet, since the very beginning there have been voices – mine among them – pointing out that this was a significant problem, and that there were ways to move much more aggressively. I remember giving a speech to IT folks in 2008, in the teeth of the recession, stressing that the problem would only get worse if ignored, that doing something about it would in fact have a short payback period, and that tools for making a major impact were already there. Here we are, and the reaction of the presenters and audience at the STG conference is that the rise in energy usage is no big deal, that datacenters are handling it just fine with a few tweaks, and that IT should focus almost exclusively on cutting administrative costs.

 

All this reminds me of a Robin Williams comedy routine after the Wall Street implosion. Noting the number of people blindly investing with Bernard Madoff – pronounced “made off”, as in “made off with your money” – Robin simply asked, “Was the name not a clue?” So I have to ask about “energy usage”: is the name not a clue? What does it take to realize that this is a serious and escalating problem?

 

The Real Danger

 

Right now, it is all too easy to play the game of “out of sight, out of time, out of mind.” Datacenter energy usage seems as if it can be easily handled over the next few years. Related energy usage is out of the sight of corporate. Cost pressures in a volatile global economy that stubbornly refuses to lift off (except in “developing markets”, with lower costs to begin with), not to mention innovations to attract increasingly assertive consumers, seem far more urgent than energy issues.

 

However, the metrics we use to reach these conclusions are out of whack. Not only do they, as noted above, ignore the movement of energy usage to areas of lower efficiency, but they also ignore the impact of the Global 10,000 moving in lockstep to build on existing solutions instead of replacing them.

 

Let’s see how it has worked up to now. Corporate demands that IT increase capabilities without increasing costs. The tightness of the constraints and the existence of less-efficient infrastructure cause IT to increase wasteful scale-out computing almost as much as fast-improving scale-up computing, and also to move some computing outside the data center – e.g., Bring Your Own Device – or overseas – e.g., to an available facility in Manila that is cheaper to provision but not comparably energy-optimized at the outset. Next year the same scenario plays out, only now with the even greater cost of rebuilding from scratch a larger amount of existing inefficient physical and hardware infrastructure. And on it goes.

 

But all this would mean little – just another little cost passed on to the consumer, since everyone’s doing it – were it not for two things: two Real Dangers. First, the same process that impels too-slow handling of energy inefficiency also impels a decreasing ability of the enterprise to monitor and control energy usage effectively, once it gets around to it. More of the energy usage that should be under the company’s eye is moving to developing countries and to employees/consumers using their own private energy sources inside the walls, so that the barriers to monitoring are greater and the costs of implementing monitoring are higher.

 

Second – and this is more long-term but far more serious – shifts to carbon-neutral economies are taking far too long, so that every government and economy faces an indefinite future of increasing expenditures to cope with natural disasters, decreasing food availability, steadily increasing human and therefore plant/office/market migration, and increasing energy inefficiency as heating/cooling systems designed for one balance of winter and summer become increasingly inappropriate for a new balance. While all estimates are speculative, the ones I find most realistic indicate that over the next ten years, assuming nothing effective is done, the global economy will underperform by up to 1% per year due to these effects, and by up to double that by 2035. That, in turn, translates into narrower profit margins, due primarily to underperforming consumer demand and to rising energy and infrastructure maintenance costs – hitting the least efficient first, but hitting everyone eventually.

The Blame and the Task

While it’s easy to blame the vendors or corporate blindness for this likely outcome, in this case I believe that IT should take its share of the blame – and of the responsibility for turning things around. IT was told that this was a problem five years ago. Even had corporate been unwilling to worry about the future that far ahead, IT should at least have considered the likely effects of five years of inattention and pointed them out to corporate.

 

That, in turn, means that IT bears an outsized responsibility for doing so now. As I noted, I see no signs that the vendors are unwilling to provide solutions for those willing to be proactive. In the last five years, carbon accounting, monitoring within and outside the data center, and “smart buildings” have taken giant leaps, while solar technologies, at whatever cost, are far more easily implemented and accessed if one doesn’t double down on the existing utility grid. Even within the datacenter, new technologies were introduced 4 years ago by IBM among others that should have reduced energy usage by around 80% out of the box – more than enough to deliver a decrease instead of a doubling of energy usage. The solutions are there. They should be implemented comprehensively and immediately – and, by and large, they have not been.

 

Alternate IT Futures

 

I am usually very reluctant to criticize IT. In fact, I can’t remember the last time I laid the weight of the blame on them. In this case, there are many traditional reasons to lay the primary blame elsewhere, and simply suggest that IT look to neat new vendor solutions to handle urgent but misdirected corporate demands. But that raises the question: who will change the dysfunctional process? Who will change a dynamic in which IT claims cost constraints prevent it from acquiring “nice to have” energy tools, while corporate’s efforts to respond to consumer “green” preferences only brush the surface of a sea of energy-usage practices embedded in the organization?

 

Suppose IT does not take the extra time to note the problem, identify solutions, and push for moderate-cost efforts even when strict short-term cost considerations seem to indicate otherwise. The history of the past five years suggests that, fundamentally, nothing will change in the next five, and the enterprise will be deeper in the soup than ever.

 

Now suppose IT is indeed proactive. Maybe nothing will happen; or maybe the foundation will be laid for a much quicker response when corporate does indeed see the problem. In that case, in five years the enterprise as a whole is likely to be on a “virtuous cycle” of increasing margin advantages over the passive-IT laggards.

 

Energy usage. Is the name not a clue? What will IT do? Get the clue or sing the blues?

A recent blog post by Carol Baroudi heralds a sea change in the responsibilities of IT – or, if you prefer, a complication in IT’s balancing act. She notes that “bring your own device”, the name given to the strategy of letting employees use their own smartphones and laptops at work, rather than insisting on corporate ones, may have major negatives if the enterprise is serious about recycling devices. In effect, Carol is pointing out that allowing employee computing to cross corporate boundaries may have bad effects on corporate efforts to achieve sustainability, and IT needs to consider that.

 

In my experience, these considerations are very similar to those of a previous IT balancing act: IT’s responsibility to provide support to its users balanced against the enterprise’s need to maintain the security of internal computing and data – security whose breaches may threaten the health or even existence of the enterprise. Thus, IT’s past experiences may help guide it in balancing sustainability and the other needs of IT.

 

However, I would assert that adding sustainability to IT’s balancing act should also require a real rethinking of existing balances between all three elements for which IT will be responsible:  support, security, and sustainability. Moreover, I would argue that the result of this rethinking should be a process redesign, not an architectural one, that makes all three elements more equal to each other than they have been before – in balance as an equilateral triangle, not a random intersection of three wildly unequal lines. Finally, I would claim that a best-practices redesign will deliver far more benefits to the enterprise than “business as usual.”

 

Below, I will briefly sketch out how I believe each element should change, and first steps.

 

Redesigning IT Support

 

Support is an often-underestimated part of IT’s job. Many surveys in the past found it useful to distinguish between three IT jobs: keeping the business running, supporting users (internal and external), and helping the corporation achieve competitive advantage. Over the last 10 years, as software has become critical to competitive advantage across a wider and wider range of industries, IT “innovation for competitive advantage” has begun to put its other two jobs in the shade. However, an enormous piece of IT’s part in achieving “innovation for competitive advantage” is to support the developers, corporate strategists, and managers who are the ones designing and creating the product and business-process software that delivers the actual advantage. In other words, the support that IT provides to end users is key to achieving two out of three of its jobs.

 

On the other hand, experience tells us that support of internal end users without control over the computing they are doing is extremely difficult, and dangerous besides. The difficulty comes from the fact that the average employee spends little time making sure the organization knows what his or her computing devices (including smartphones), Web usage, and software are – and so support is usually guesswork. The danger today comes from the fact that unexpected computing threatens to cause downtime and security leaks. Sustainability will add “carbon leakage” – the tendency of employees to shift to unregulated devices and software that produce greater emissions when controls that slow them down are placed on the data center.

 

To a certain extent, IT can piggyback on today’s security software in dealing with the new sustainability demands – by adding monitoring of “carbon leakage”, for example, to existing asset management protections against property theft. But IT support processes must also be redesigned to incorporate sustainability considerations. IT developers must bear their share of “going sustainable” by tilting their development form factors towards devices with lower emissions. Product designers must be encouraged or restricted in the direction of sustainability when designing new products. Corporate strategists should be made to factor IT sustainability into strategic decisions such as rightsourcing. End users should be encouraged and restricted likewise, both in their use of IT resources and in their uses of personal computing resources for corporate purposes – Carol’s example.
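As a concrete illustration of piggybacking on asset management, here is a minimal sketch of estimating per-device emissions and flagging unmanaged “carbon leakage” (all names, numbers, and the emission factor are hypothetical, not any vendor’s actual API):

```python
# Hypothetical sketch: extend an asset inventory with rough carbon estimates
# and flag "carbon leakage" -- devices in use but not under IT management.

from dataclasses import dataclass

GRID_KG_CO2_PER_KWH = 0.5  # assumed grid emission factor; varies by region

@dataclass
class Device:
    device_id: str
    avg_watts: float       # average draw while in use
    hours_per_week: float
    managed: bool          # True if tracked by IT asset management

def weekly_kg_co2(d: Device) -> float:
    """Energy (kWh) times grid emission factor gives rough weekly kg CO2."""
    return (d.avg_watts * d.hours_per_week / 1000.0) * GRID_KG_CO2_PER_KWH

def leakage_report(devices: list[Device]) -> tuple[float, float]:
    """Split estimated emissions into managed vs. unmanaged ('leaked')."""
    managed = sum(weekly_kg_co2(d) for d in devices if d.managed)
    leaked = sum(weekly_kg_co2(d) for d in devices if not d.managed)
    return managed, leaked

fleet = [
    Device("corp-laptop-1", 40, 45, True),
    Device("byod-phone-7", 5, 60, False),    # employee-owned, unmanaged
    Device("byod-laptop-3", 60, 30, False),
]
managed, leaked = leakage_report(fleet)
print(f"managed: {managed:.1f} kg CO2/week, leaked: {leaked:.1f} kg CO2/week")
```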

 

Such a process redesign demands as a prerequisite some overall sense of what internal end-user carbon emissions are (or whatever other sustainability metrics are appropriate), and how they are changing. My sense is that organizations now understand that they need to draw a line between a particular resource or process and its emissions, and that they have some handle on corporate assets in the data center and in corporate headquarters countries (including IT asset management and disposal). The biggest needs right now are to understand IT and employee computing resources outside the data center, and to get IT’s hands around the corporation’s capital across geographical boundaries – how computing and heating relate to emissions in developing countries, for example.

 

Rethinking IT Security

 

“Scare stories” like theft of a company’s private data are constantly in the news, making the importance of IT security relatively easy for corporate to understand – even if they don’t necessarily want to spend on it. At the same time, when security is implemented, its philosophy of “better safe than sorry” carries its own dangers. My favorite quote in that regard is Princess Leia’s remark in the original Star Wars movie: “The more you tighten your grip, Tarkin, the more star systems will slip through your fingers.” That kind of dynamic plays out in several ways: the inability of companies to see what’s going on outside, because they are not constantly, unconsciously, exchanging information for information; the lowered productivity of employees, as they fail to bring to bear on today’s problems the new technologies that IT could not possibly anticipate supporting, and that security therefore excludes; and the tendency of employees, when too much control is exerted over one form of computing, to flow to others that are easier to use but harder to keep track of – such as personal laptops instead of network computing.

 

When it comes to sustainability, security cuts both ways. On the one hand, as noted above, sustainability needs the kind of visibility into and control of emissions that security provides for corporate data and computing. On the other hand, sustainability badly needs to emphasize the carrot instead of security’s stick, else cultural resistance will make “carbon leakage” endemic. And the converse is also true: “bring your own device”, even if it can be made to incorporate personal recycling reliably, makes security’s job harder.

 

To be fair, IT security has made enormous strides over the last 20 years in its ability to achieve fine-grained availability of apps and data to the outside while protecting proprietary information. Still, I believe that the new equilateral triangle requires not only adjustment of security and sustainability to each other’s needs, but also a shift in the balance between IT’s support tasks and its security efforts. Today’s reactive, controlling approach to security does too much to hinder the organization’s ability to be agile in an environment that is far more uncertain and fast-moving than ever before, as well as its ability to respond to what are likely to be greater and greater demands for more and more sustainable business practices.

 

The change in the security component, therefore, should be threefold. First, security software should be made much more “virtual.” By that I don’t mean that the applications it monitors should become more “virtual” – that’s happening already. Rather, I mean that the security itself should as far as possible be protecting logical, not physical, objects. In a sense, that’s what already happens, when you talk about security in a service-oriented architecture: you monitor a particular cluster of apps as a whole, no matter what platforms they are split across. So, slowly but surely, organizations have begun to do so – and they should speed it up. However, I also mean that IT should apply the same thing to things like land, buildings, and equipment. IT support needs this, in order to support efficiently across geographies. IT sustainability needs this, in order to efficiently link people, corporate resources, and emissions. Above all, when disaster strikes, the “virtual office” needs instant security as it moves to another location.

 

The second security rethinking should, I would say, take its lead from the Arab Spring. An interesting article in the MIT Technology Review showed how rebels maintained their security in the face of intensive assaults by switching media rapidly – moving from cell phones to Facebook to face-to-face and back. Underlying the concept is the idea of “rolling” or “disposable” security, in which the organization is constantly adding new things to be protected and leaving others behind as less important. Obviously, this can’t be carried too far, as some run-the-business apps can never be unprotected. However, it does give the employee less of a feeling of being controlled, as some things become less controlled – as long as the shifts are done automatically, with new versions of the security software arriving via Continuous Delivery development processes, and without creating “bloatware.” I am not talking about constant security patches; I am talking about constant changes in what is being protected.

 

The third security rethink is to incorporate the idea that sometimes sustainability may mandate less (controlling) security instead of more. Employees are often ahead of management in their enthusiasm for sustainability – witness IBM incorporating a sustainability strategy as one of the top four only after employees told them they wanted it. Therefore, security to ensure corporate sustainability initiatives are being followed will just have to take second place to IT support for corporate and employee sustainability efforts. In other words, security levels will have to be carefully dialed down, where possible, where sustainability is involved.

 

Reimagining IT Sustainability

 

In many ways, the sustainability component of our equilateral triangle has the least design adjustment to make. Mostly, that’s because so much of IT’s sustainability component has yet to be implemented (and in some cases, defined). Emissions metrics are still in their early stages of incorporation into IT-available software; the proper relationship between the carbon-emissions focus and other anti-pollution efforts is not clear; and sustainability of a “carbon-neutral” organization’s business and IT model is still more a matter of theory than of real-world best practices.

 

Nevertheless, I would still recommend an exercise in reimagining what IT sustainability should be and how it should relate to IT support and IT security, because I believe that the organizations I talk to continue to underestimate the wrenching changes that lie ahead. Certainly, as late as a year ago, few corporations were talking about the effects of massive drought in Texas (anticipated by global warming models) on their data centers there. They do not yet appear to be considering the effects on employee hiring of the loss of flood-zone home insurance, as insurance companies decrease their coverage in those areas in anticipation of further climate effects like the ones that have driven up their disaster coverage costs sharply over the last 5-10 years. And this is not to mention similar once-in-100-years occurrences that have been taking place all over the rest of the globe in the last year and a half. Enterprises in general and IT in particular are wrapping their heads around what has happened so far; they do not yet appear to have wrapped their heads around the likelihood of a twofold or tenfold increase in these occurrences’ impact on the organization over the next 10 years.

 

IT needs to reimagine sustainability as if these effects are already baked in – as indeed they appear to be – but future effects beyond that are not. To put it in sustainability jargon, IT needs to add adaptation to the mix, but without compromising the movement toward mitigation in the slightest. Effectively, in the middle of a near-recession, IT needs to take on additional costs to implement virtual software and the “virtual office”, while maintaining or increasing present plans to spend on decreasing carbon footprint. Decreasing carbon footprint has a clear ROI; adaptation well ahead of time to future disasters does not. Still, as the saying goes: pay me now, or pay me a lot more later.

 

What reimagining sustainability means, concretely, is that IT sustainability itself should incorporate IT efforts to support a more agile software-driven enterprise via more rapid implementation of “virtual software” – and should point that software squarely at physical assets that are difficult to move, like offices, inventory, and tools. Also, IT sustainability software should incorporate security (and vice versa) in terms of roles instead of people, and resource types instead of physical plant and equipment. As an old saying put it, “in danger, the poor man looks after his few possessions first; the rich man looks after himself,” knowing that equivalent possessions can be bought later in another place as long as he survives. Likewise, for the corporation with massive resources, IT sustainability wisdom lies in agilely adapting when disaster strikes as well as seeking to prevent further disasters – not betting everything on riding out the storm with possessions intact where you are.
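A tiny sketch of what “roles instead of people, resource types instead of physical plant” might look like as policy data; every name and rule here is hypothetical, purely to show the shape of the idea:

```python
# Policies keyed to roles and resource types rather than to named people
# and physical assets, so they survive relocation of the "virtual office".

security_policy = {
    ("finance-analyst", "customer-data"): "encrypt+audit",
    ("field-engineer", "telemetry"): "read-only",
}
sustainability_policy = {
    ("any-role", "compute"): "prefer-low-carbon-region",
    ("any-role", "travel"): "offset-required",
}

def rules_for(role: str, resource_type: str) -> list[str]:
    """Look up rules by role/resource type; physical location never appears."""
    merged = {**security_policy, **sustainability_policy}
    return [rule for (r, t), rule in merged.items()
            if r in (role, "any-role") and t == resource_type]

print(rules_for("finance-analyst", "customer-data"))  # ['encrypt+audit']
print(rules_for("finance-analyst", "compute"))        # ['prefer-low-carbon-region']
```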

 

The Triangle’s IT Bottom Line

 

The key benefits of setting up an equilateral triangle of IT support, security, and sustainability should be apparent from my discussion above:

 

1. Improved IT and business agility, with its attendant improvements in competitive advantage and long-term margins;

2. Improved insurance against disaster and attack risks;

3. Overall, reduced costs, as energy and efficiency savings more than counterbalance the added costs of adaptation.

 

So my recommendation to IT is that they run, not walk, to the nearest recycling center and Recycle their old IT support-security act; then Reuse it in a new equilateral-triangle strategy that balances support, security, and sustainability more equally; and use the new strategy to Reduce costs, risks, and inflexibility. Reduce, Reuse, Recycle: I bet that strategy will be sustainable. 

I recently read a post by Jon Koomey, Consulting Professor at Stanford, at www.climateprogress.org, called “4 reasons why cloud computing is efficient”. He argues (along with some other folks) that cloud computing – by which he apparently means almost entirely public clouds – is much more beneficial for reducing computing’s carbon emissions than the real-world alternatives. As a computer industry analyst greatly concerned by carbon emissions, I'd like to agree with Jon; I really would.  However, I feel that his analysis omits several factors of great importance that lead to a different conclusion.

 

The study he cites compares the public cloud – not a private or hybrid cloud – to "the equivalent". It is clear from context that it is talking about a "scale-out" solution of hundreds or thousands of small servers, each with a few processors. This is, indeed, typical of most public clouds, and other studies have shown that in isolation these servers have a utilization rate of perhaps 10-20%. However, the scale-up, hundreds-of-processors servers that are a clear alternative, and which are typically not used in public clouds (but are often used in private clouds), have a far better record. The most recent mainframe implementations, which support up to a thousand "virtual machines", achieve utilization rates of better than 90% – roughly three times the carbon efficiency of the public cloud, right up front.
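A back-of-the-envelope sketch of that utilization argument (the simplifying assumption is mine: energy scales with provisioned capacity, ignoring scale-up servers' higher per-box power draw, which is presumably why the post's "three times" is more conservative than the raw ratio):

```python
# Crude model: if energy scales with provisioned capacity, work delivered
# per unit of energy (and of carbon) scales with utilization.

def efficiency_ratio(util_a: float, util_b: float) -> float:
    """Times more work per kWh platform A delivers than platform B."""
    return util_a / util_b

# Scale-up at 90%+ utilization vs. scale-out public cloud at 10-20%:
print(f"{efficiency_ratio(0.90, 0.10):.1f}x")  # 9.0x under this crude model
print(f"{efficiency_ratio(0.90, 0.20):.1f}x")  # 4.5x
# The "three times" figure leaves headroom for scale-up servers'
# greater power consumption per server.
```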

 

The second factor Jon omits is the location of the public cloud. According to Carol Baroudi, author of "Green IT For Dummies", only one public cloud site that she studied is located in an area with a strong record of carbon-emission-light electricity (Oregon). The others are in areas where the energy is "cheaper" because of fossil fuel use. That may change; but you don't move a public cloud data center easily, because the petabytes of data stored there to deliver high performance to nearby customers don't move easily, even over short distances. Corporate data centers are more movable, because the data storage sizes are smaller and they have extensive experience with "consolidation". While until recently most organizations were not conscious of the carbon-emission effects of their location, it appears that companies like IBM are indeed more conscious of this concern than most public cloud providers.

 

The third factor that Jon omits is what I call "flight to the dirty". The high up-front costs of more efficient scale-up servers lead unconsciously to the use of less energy-efficient scale-out servers. Controls over access to public and private clouds and data centers, and visibility of their costs, move consumer and local computing onto PCs and smartphones. The apparent cheapness of labor and office space in developing nations leads companies to rapidly implement data centers and computing there using existing energy-inefficient and carbon-wasting electrical supplies. None of these "carbon inefficiencies" is captured in typical analyses.

 

Personally, I come to three different conclusions:

1. The most carbon-efficient computing providers use scale-up computing and integrated energy management, and so far most if not all of those are private clouds.

 

2. The IT shops that are most effective at improving carbon efficiency in computing monitor energy efficiency and carbon emissions not only inside but outside the data center, and those inevitably are not public clouds.

 

3. Public clouds, up to now, appear to be "throwing good money after bad" in investing in locations that will be slower to provide carbon-emission-light electricity – so that public clouds may indeed slow the movement towards more carbon-efficient IT.

 

A better way of moving computing as a whole towards carbon-emission reductions is by embedding carbon monitoring and costing throughout the financials and computers of companies. Already, a few visionary companies are doing just that. Public cloud companies should get on this bandwagon, by making their share of carbon emissions transparent to these companies (and by doing such monitoring and costing themselves). This should lead both parties to the conclusion that they should either relocate their data centers or develop their own solar/wind energy sources, that they should move towards scale-up servers and integrated energy management, and that they should not move to less costly countries without achieving energy efficiency and carbon-emission reduction for their sites up front.

 

This post was originally written last fall, and set aside as being too speculative. I felt that there was too little evidence to back up my idea that “accepting limits” would pay off in business.

 

Since then, however, the Spring 2011 edition of MIT Sloan Management Review has landed on my desk.  In it, a new “sustainability” study shows that “embracers” are delivering exceptional comparative advantage, and that a key characteristic of “embracers” is that they view “sustainability” as a culture to be “wired into the business” – “it’s the mindset”, says Bowman of Duke Energy. According to Wikipedia, the term “sustainability” itself is fundamentally about accepting limits, including environmental “carrying capacity” limits, energy limits, and limits in which use rates don’t exceed regeneration rates.

 

This attitude is in stark contrast to the attitude pervading much of human history. I myself have grown up in a world in which one of the fundamental assumptions, one of the fundamental guides to behavior, is that it is possible to do anything. The motto of the Seabees in World War II, I believe, was “The difficult we do immediately; the impossible takes a little longer.” Over and over, we have believed that adjustments in the market, inventions and advances, daring to try something else, an all-out effort – something, anything – can fix any problem.

 

In mathematics, too, it was believed at the turn of the century that any problem was solvable: that any truth of any consistent, infinite mathematical system could be proved. Then Kurt Gödel came along and showed that in every such system, either you could not prove all truths or you could also prove false things. And over the next thirty years, mathematics applied to computing showed that some problems were unsolvable, and that others had a fundamental lower limit on the time taken to solve them, which meant that they could not be solved before the universe ended. By accepting these limits, mathematics and programming have flourished.

 

This mindset is fundamentally different from the “anything is possible” mindset. It says to work smarter, not harder, by not wasting your time on the unachievable. It says to identify the highly improbable up front and spend most of your time on solutions that don’t involve that improbability. It says, as agile programming does, that we should focus on changing our solutions as we find out these improbabilities and impossibilities, rather than piling on patch after patch. It also says, as agile programming does, that while by any short-run calculation the results of this mindset might seem worse than the results of the “anything is possible” mindset, over the long run – and frequently over the medium term – it will produce better results.

 

It seems more and more apparent to me that we have finally reached the point where the “anything is possible” approach is costing us dearly. I am speaking specifically about climate change – one key driver for the sustainability movement. The more I become familiar with the overwhelming scientific evidence for massive human-caused climate change and the increasing inevitability of at least some major costs of that change in every locality and country of the globe, the more I realize that an “anything is possible” mentality is a fundamental cause of most people’s failure to respond adequately so far, and a clear predictor of future failure.

 

Let me be more specific. As noted in the UN scientific conferences and recent additional data, “business as usual” is leading us to a carbon dioxide concentration of 1000 ppm in the atmosphere, of which about 450 ppm – 150-200 ppm over the natural amount – is already “baked in”. This will result, at minimum, in global increases in temperature of 5-10 degrees Fahrenheit. That, in turn, will mean, among other things: order-of-magnitude increases in the damage caused by extreme weather events; the extinction of many ecosystems supporting existing urban and rural populations – because many of these ecosystems are blocked from moving north or south by paved human habitations – so that food and shelter production must both change location and find new ways to deliver to new locations; movement of populations away from seacoast locations up to 20 feet above existing sea level; and adjustment of a large proportion of heating and cooling systems to a new mix of the two – not to mention drought, famine, and economic stress. And these are just the effects over the next 60 or so years.

 

Adjusting to this will place additional costs on everyone, very possibly similar to a 10% tax yearly on every individual and business in every country for the next 50 years, no matter how wealthy or adept. Continuing “business as usual” for another 30 years would result in a similar, almost equally costly additional adjustment.

 

Our response to this so far has been in the finest tradition of “anything is possible”. We search for technological fixes in the belief that they will solve the problem, since they appear to have done so before. Most of us – except the embracers – assume that existing business incentives, focused on cutting costs (costs that have not yet occurred), will somehow respond years before the impact begins to be felt. (Embracers, by the way, actively seek out new metrics to capture things like the negative effects of carbon emissions.) We are skeptical and suspicious, since those who have predicted doom before, for whatever reason, have generally turned out to be wrong. We hide our heads in the sand, because we have too much else to do and concerns that seem more immediate. We are distracted by possible fixes, and by their flaws.

 

The “embrace limits” mindset for climate change makes one simple change: accept steady absolute reductions in carbon emissions as a limit. For example, every business, every country, every region, every county accepts that its emissions are to be reduced by 1% each year. A business also accepts that its products’ emissions are to be reduced by 1% that year, no matter how successful the year has been. If a locality does better one year, it is still expected not to increase emissions the next year. If a country rejects this idea, investments from conforming countries are reduced by 1% each year, and products accepted from that country are expected to comply.
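For a sense of what that limit compounds to, here is a quick calculation (the 1% figure is the post’s; the arithmetic sketch is mine):

```python
# What a steady 1%-per-year absolute emissions cut compounds to over time.

def remaining_fraction(annual_cut: float, years: int) -> float:
    """Fraction of today's emissions left after compounding annual cuts."""
    return (1.0 - annual_cut) ** years

for years in (10, 25, 50):
    print(years, f"{remaining_fraction(0.01, years):.3f}")
# 10 -> 0.904, 25 -> 0.778, 50 -> 0.605: a steady 1% cut removes
# almost 40% of absolute emissions over 50 years, regardless of growth.
```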

 

But this is a crude, blunt-force suggested application of “embrace limits”. There are all sorts of other applications. Investors will no longer invest in equities that seem to promise 2% long-term returns above historical norms, and will limit the amount of their capital invested in “bets,” because those investments are overwhelmingly likely to be con jobs. Project managers will no longer use metrics like time to deployment, but rather “time to value” and “agility”, because there is a strong possibility that during the project, the team will discover a limit and need to change its objective.

 

Because, fundamentally, climate change is a final, clear signal that Gödel has won. Whether we accept limits or not, they are there; and the less we accept them and use them to work smarter, the more it costs us.

IBM’s launch of its new sustainability initiative on October 1 prompted the following thoughts: This is among the best-targeted, best-thought-out initiatives I have ever seen from IBM. It surprises me by dealing with all the recent reservations I have had about IBM’s green IT strategy. It’s all that I could have reasonably asked IBM to do. And it’s not enough.

Key Details of the Initiative

We can skip IBM’s assertion that the world is more instrumented and interconnected, and systems are more intelligent, so that we can make smarter decisions; it’s the effect of IBM’s specific solutions on carbon emissions that really matters. What is new – at least compared to a couple of years ago – is a focus on end-to-end solutions, and on solutions that are driven by extensive measurement. Also new is a particular focus on building efficiency, although IBM’s applications of sustainability technology extend far beyond that.

The details make it clear that IBM has carefully thought through what it means to instrument an organization and use that information to drive reductions in energy – which is the major initial thrust of any emission-reduction strategy. Without going too much into particular elements of the initiative, we can note that IBM considers the role of asset management, ensures visibility of energy management at the local/department level, includes trend analysis, aims to improve space utilization, seeks to switch to renewable energy where available, and optimizes HVAC for current weather predictions. Moreover, it partners with others in a Green Sigma coalition that delivers building, smart grid, and monitoring solutions across a wide range of industries, as well as in the government sector. And it does consider the political aspects of the effort. As I said, it’s very well targeted and very well thought out.

Finally, we may note that IBM has “walked the walk”, or “eaten its own dog food”, if you prefer, in sustainability. Its citation of “having avoided carbon emissions by an amount equal to 50% of our 1990 emissions” is particularly impressive.

The Effects

Fairly or unfairly, carbon emission reductions focus on reducing carbon emissions within enterprises, and emissions from the products that companies create. Just about everything controllable that generates emissions is typically used, administered, or produced by a company – buildings, factories, offices, energy, heating and cooling, transportation (cars), entertainment, and, of course, computing. Buildings, as IBM notes, are a large part of that emissions generation, and, unlike cars and airplanes, can relatively easily achieve much greater energy efficiency, with a much shorter payback period. That means that a full implementation of building energy improvement across the world would lead to at least a 10% decrease in the rate of human emissions (please note: the rate, not the absolute amount; I will explain later). It’s hard to imagine an IBM strategy with much greater immediate impact.

The IBM emphasis on measurement is, in fact, likely to have far more impact in the long run. The fact is that we are not completely sure how to break down human-caused carbon emissions by business process or by use. Therefore, our attempts to reduce them are blunt instruments, often hitting unintended targets or squashing flies. Full company instrumentation, as well as full product instrumentation, would allow major improvements in carbon-emission-reduction efficiency and effectiveness, not just in buildings or data centers but across the board.

These IBM announcements paint a picture, very optimistically, of 30% improvements in energy efficiency and increases in renewable energy over the next 10 years – beyond the targets of most of today’s nations seeking to achieve a “moderate-cost” ultimate global warming of 2 degrees centigrade, in their best-case scenarios. In effect, initiatives like IBM’s plus global government efforts could reduce the rate of human emissions beyond existing targets. Meanwhile, Lester Brown has noted that from 2008 to 2009, measurable US human carbon emissions from fossil fuels went down 9 percent.

This should be good news. But I find that it isn’t. It’s just slightly less bad news.

Everybody Suffers

Everyone trying to do something about global warming has been operating under a set of conservative scientific projections that, for the most part, correspond to the state of the science in 2007. As far as I can tell, here’s what’s happened since, in a very brief form:

1. Sea rise projections have doubled, to 5 feet of rise in 80 years. In fact, more rapid than expected land ice loss means that 15 feet of rise may be more likely, with even more after that.

2. Scientists have determined that “feedback loops” – such as the loss of ice’s ability to reflect back light and thereby limit ocean heating, a loss that in turn increases global temperature – are in fact “augmenting feedbacks”, meaning that they will contribute additional global warming even if we decrease emissions to near zero right now.

3. Carbon in the atmosphere is apparently still headed towards the “worst case” scenario of 1100 ppm. That, in turn, apparently means that the “moderate effect” scenario underlying all present global plans for mitigation of climate change with moderate cost (450 ppm) will in all likelihood not be achieved[1]. Each doubling of ppm leads to a 3.5 degrees centigrade, or 6 degrees Fahrenheit, average rise in temperature (in many cases, more like 10 degrees Fahrenheit in summer), and the start level was about 280 ppm, so we are talking about a 12 degrees Fahrenheit rise from reaching 1100 ppm[2] (see the sketch after this list), with follow-on effects and costs that are linear up to 700-800 ppm and difficult to calculate but almost certainly accelerating beyond that.

4. There is growing consensus that technologies to somehow sequester atmospheric carbon or carbon emissions in the ground, if feasible, will not be operative for 5-10 years, will not reach full effectiveness until 5-10 years after that, will not be able to take us back to 450 ppm for many years after that – and will not be able to end the continuing effects of global warming for many years after that, if ever[3].
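The doubling arithmetic behind point 3, as a sketch (the numbers are the post’s; the calculation is mine, using the standard logarithmic form):

```python
# Each doubling of atmospheric CO2 adds ~3.5 C (~6 F); base level ~280 ppm.
import math

def warming_f(ppm: float, base_ppm: float = 280.0, f_per_doubling: float = 6.0) -> float:
    """Fahrenheit rise implied by the number of CO2 doublings above base."""
    return math.log2(ppm / base_ppm) * f_per_doubling

print(f"{warming_f(450):.1f} F")   # ~4.1 F at 450 ppm under this formula
print(f"{warming_f(1100):.1f} F")  # ~11.8 F, i.e., the post's ~12 F at 1100 ppm
```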

Oh, by the way, that 9% reduction in emissions in the US? Three problems. First, it came under conditions in which GNP was mostly going down; as we return to moderate or fast growth, that reduction goes to zero. Second, recession aside, most of the reductions achieved up to now come from low-cost-to-implement technologies, which means that achieving the next 9%, and the next 9% after that, becomes more costly and politically harder. Third, at least some of the reductions come from outsourcing jobs – and therefore plant and equipment – to faster-growing economies with lower costs. Even where IBM is applying energy efficiencies to these sites, the follow-on jobs outside of IBM are typically less energy-efficient. The result is a decrease in the worldwide effect of US emission cuts. As noted above, the pace of worldwide atmospheric carbon dioxide rise continued unabated through 2008 and 2009. Reducing the rate of human emissions isn’t good enough; you have to reduce the absolute amount of direct human emissions, human-caused emissions (like reduced reflection of sunlight by ice), and follow-on emissions (like melting permafrost, which in the Arctic holds massive amounts of carbon and methane).

That leaves adaptation to what some scientists call climate disruption. What does that mean?

Adaptation may mean adapting to a rise in sea level of 15 feet in the next 60 years, and to an even larger rise in the 60 years after that. Adaptation means adapting to disasters that are, on average, 3-8 times more damaging and costly than they are now (a very rough calculation, based on the scientific estimate that a 3 degrees centigrade temperature rise doubles the frequency of category 4-5 hurricanes; the reason is that the atmosphere involved in disasters such as hurricanes and tornados can store and release more energy and water as temperature rises). Adaptation means adjusting to the loss of food and water related to ecosystems that cannot move north or south, blocked by human paved cities and towns. Adaptation means moving to lower-cost areas, or constantly revising heating and cooling systems in the same area, as the amount of cooling and heating needed in an area changes drastically. Adaptation means moving food sources in response to changing climates that make some areas better for growing food, others worse. Adaptation may mean moving 1/6 of the world’s population off the one-third of the world’s cultivable land that will become desert[4]. In other words, much of this adaptation will affect all of us, and the costs of carrying it out will fall to some extent on all of us, no matter how rich. And we’re talking about the adaptation that, according to recent posts[5], appears to be already baked into the system. Moreover, if we continue to be ineffectual at reducing emissions, each decade will bring additional adaptation costs on top of what we are bound to pay already.

Adaptation will mean significant additional costs to everyone – because climate disruption brings costs to everyone in their personal lives. It is hard to find a place on the globe that will not be further affected by floods, hurricanes, sea-level rise, wildfires, desertification, heat that makes some places effectively unlivable, drought, permafrost collapse, or loss of food supplies. Spending to avoid those things for one’s own personal home will rise sharply – well beyond the costs of “mitigating” further climate disruption by low-cost or even expensive carbon-emission reductions.

What Does IBM Need To Do?

Obviously, IBM can’t do much about this by itself; but I would suggest two further steps.

First, it is time to make physical infrastructure agile. As the climate in each place continually changes, the feasible or optimum places for head offices, data centers, and residences endlessly change. It is time to design workplaces and homes that can be inexpensively transferred from physical location to physical location. Moving continually is not a pleasant existence to contemplate; but virtual infrastructure is probably the least-cost solution.

Second, it is time to accept limits. The pretense that we need not reduce emissions in absolute, overall terms – because technology, economics, or sheer willpower will save us – has been practiced since our first warning in the 1970s, and it is failing badly. Instead of talking in terms of improving energy efficiency, IBM needs to start talking in terms of absolute carbon emissions reduction every year – for itself, for its customers, and for use of its products – no matter what the business’ growth rate is.

One more minor point: because climate will be changing continually, adjusting HVAC for upcoming weather forecasts, which only go five days out, is not enough. When a place that has seen four days of 100 degree weather every summer suddenly sees almost 3 months of it, no short-term HVAC adjustment will handle continual brownouts adequately. IBM needs to add climate forecasts to the mix.

Politics, Alas

I mention this only reluctantly, and in the certain knowledge that for some, this will devalue everything I have said. But there is every indication, unfortunately, that without effective cooperation from governments, the sustainability goal that IBM seeks, and avoidance of harms beyond what I have described here, are not achievable.

Therefore, IBM membership in an organization (the US Chamber of Commerce) that actively and preferentially funnels money to candidates and legislators that deny there is a scientific consensus about global warming and its serious effects undercuts IBM’s credibility in its sustainability initiative and causes serious damage to IBM’s brand. Sam Palmisano as Chairman of the Board of a company (Exxon Mobil) that continues to fund some “climate skeptic” financial supporters (the Heritage Foundation, at the least) and preferentially funnels money to candidates and legislators that deny the scientific consensus does likewise.

Summary

IBM deserves enormous credit for creating, today, comprehensive and effective efforts to tackle the climate disruption crisis as it was understood 3 years ago. But those efforts are three years out of date. IBM needs to use them as the starting point for creating new solutions within the next year – solutions aimed at a far bigger task: tackling the climate disruption crisis as it is now.



[1] Recent studies suggest that in order to limit warming to 5 degrees centigrade or 9 degrees Fahrenheit (via the 450 ppm atmospheric carbon dioxide long-term limit), carbon emissions must be limited to an average of 11 billion tons per year, perhaps less. The only scenario under which that clearly happens is global implementation of supplying a majority of energy needs from non-fossil fuels, almost immediately. Few if any countries presently have in place a plan that will make that happen within the next ten years. And most models generating scenarios do not take into account positive feedback loops.

[2] I am talking here about the US; the worldwide rise will be slightly lower. In 2009, atmospheric carbon dioxide reached 395 ppm at maximum, up about 41% from 150 years ago, and at its present rate of increase of over 2 ppm per year, should reach 400 ppm in 2011.  Because of feedback effects, scientists predict that this rate of increase will continue to grow. Growth in 2008, at 2.93 ppm, was the highest on record.

[3] For example, one scientist studying one of the most promising types of “geo-engineering” indicates that it will have little if any impact unless emissions are dramatically reduced before the geo-engineering is applied.

[4] I have left out a host of other adaptations with less obvious effects, such as wildfires, floods, destruction of more than 70% of species, and so on.

[5] See, for example, Heidi Cullen, “The Weather of the Future”, although she is overly optimistic, since her data goes only to the beginning of 2009 and her conclusions are scientifically conservative.

Green IT Revisited
04/12/2010

Paul Krugman has just posted an alarming article updating the consensus of reputable scientists and economists on global warming. Along with a book about which I recently did a blog post, it provides a 20,000-foot view of the increasing clarity of global warming’s likely future effects, and of the paths that the world will take as it seeks (or fails to seek) to mitigate (not eliminate) those effects.

 

A very broad-brush summary is that spending on “green” carbon-dioxide-emission reduction can take one of three paths:

 

  1. Do nothing. In that case, businesses of all stripes will need to decide what to do about urban areas no longer supported by their physical infrastructure (because the traditional climate will have migrated northward/southward with little ecosystem/farm ability to adapt); about serious droughts and more extreme weather events with much greater damage to the economy, especially in emerging markets; and about much greater political friction, as haves and have-nots quarrel more bitterly over changes that will disproportionately affect the have-nots.
  2. What Krugman calls the gradualist approach, in which government-led cap-and-trade or carbon-tax initiatives are slowly phased in, probably with loopholes, over the next 15-30 years. In that case, businesses should anticipate spending much energy ensuring that the resulting phase-in is as favorable as possible for a given industry – but also, there is a good possibility that they will still face the dilemmas outlined in Path 1, in slightly less dire form, over the next 30 years.
  3. What Krugman has dubbed the “big bang” approach, in which cap-and-trade or carbon taxes go into full force in the next 2-3 years, worldwide, with special attention paid to reducing coal emissions and production sharply. This might result in a global temperature increase of “only” 4 degrees Fahrenheit, which would avoid most of the dire effects cited in case 1. However, businesses will find it less productive to advocate for particular industries in that case, as their time will be fully taken up by figuring out how to handle the new industry pecking order and by minimizing the additional “tragedy of the commons” costs.

 

But what does this mean for IT, here and now, in the middle of struggling to get out from under the worst economic crisis since the Great Depression?

 

Let’s start with the way these three approaches will affect what business requires of IT over, say, the next five years. If the “do-nothing” approach is adopted, and carbon emissions continue to increase as in the last 40 years, it is likely that the worst consequences will not take place until after 5 years’ time. Thus, businesses content to maximize profits in the short run will demand little change from IT in the next five years, while proactive businesses will seek to virtualize their IT infrastructure as rapidly and comprehensively as possible, to reduce its dependence on any particular global physical location. IT should also consider decreasing outsourcing to particular emerging-country political hot spots, as these will be increasingly risky places to do jobs such as programming.

The “gradualist” approach shares with the “do-nothing” approach a significant likelihood that the worst will happen – but probably not until after 5 years’ time.  Therefore, as before, pure short-run businesses will not demand climate-related changes in their IT, while proactive ones will seek faster virtualization. However, businesses will also have to compete with businesses in other industries to minimize the relative effect of caps or taxes. IT will play a key role in that effort, since it is the most effective place to monitor (and possibly adjust) emissions to minimize these effects. That is, IT will provide the emission sensors; the business will decide what to do, based on the sensor data. Thus, IT should anticipate a greater need for green-specialized data-analysis software, energy-efficient data centers, and grid-based physical-plant monitoring software and hardware.
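To make “IT provides the emission sensors; the business decides” concrete, here is a minimal sketch of turning metered energy readings into per-site emission figures (all site names and emission factors are illustrative assumptions, not real data):

```python
# Minimal sketch: aggregate metered energy into per-site carbon figures
# that the business can act on. Names and factors are illustrative only.

from collections import defaultdict

# Assumed kg CO2 per kWh by grid region
EMISSION_FACTORS = {"hydro-heavy": 0.1, "coal-heavy": 0.9, "mixed": 0.5}

def site_emissions(readings: list[tuple[str, str, float]]) -> dict[str, float]:
    """Aggregate (site, grid_region, kwh) readings into kg CO2 per site."""
    totals: dict[str, float] = defaultdict(float)
    for site, region, kwh in readings:
        totals[site] += kwh * EMISSION_FACTORS[region]
    return dict(totals)

meter_data = [
    ("hq-datacenter", "mixed", 12000.0),
    ("overseas-facility", "coal-heavy", 8000.0),
    ("oregon-dc", "hydro-heavy", 10000.0),
]
for site, kg in site_emissions(meter_data).items():
    print(f"{site}: {kg:.0f} kg CO2")
```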

 

In the “big bang” approach, there is much less need to prepare for climate change, but much greater need to prepare for changes in the industry pecking order that cannot be avoided. IT should be prepared for much more rapid transition of the business into less energy-intensive or carbon-dioxide-releasing fields than we have ever seen before. Increased agility, even as a coal business transforms itself into a water-power one or an oil company into a solar/wind-power colossus, is demanded of IT. That means applicability of business-critical software to new accounting methods and decreases in the cost of merging with other companies’ IT.

 

The flip side of what business requires of IT is what IT can supply to the business in order to aid carbon-dioxide-emission reduction. As noted above, there is no immediate need to carry out these “reduction actions,” and the “tragedy of the commons” ensures that the business will bear few of the long-term costs. However, failure to carry the reduction actions out now will mean some additional costs later, as well as a clear handicap in the marketplace compared to smarter rivals.

 

The key fact to keep in mind in this area is that while industries are nominally not the source of increasing carbon emissions – at least not lately (according to some figures, emissions from companies’ internal processes, overall, have been increasing by about 0.6-1% per year, far less than “consumer” emissions) – they are nonetheless judged by consumer usage of their services; and that varies by industry. An oil company’s manufacturing processes may be far more energy-efficient than 10 years ago; but lower relative oil costs and economic growth translate into rapidly increasing use of gas in transportation, which in turn leads to a strong political focus on automotive gas mileage and on the role oil companies play in auto emissions. Internal IT, by contrast, has seen a much faster ramp-up in energy use in the typical company; but, so far, it remains a far less visible symbol of consumer excess.

 

IT’s role in being part of the solution, not part of the problem, is therefore the same no matter what political approach is adopted. The future holds increasing carbon-dioxide concentrations in the atmosphere, whatever the approach; and, no matter what the approach, every business with consumer-emission-affecting products and services needs to do everything possible in mitigating the effects of these products.

 

IT’s approach to helping a business do something about climate change, therefore, is (a) focused on consumer products and (b) industry-specific. If an industry such as aerospace, travel, or utilities has a large impact on consumer emission-affecting behavior, IT needs to help the business help the customer to reduce emissions, as part of the products and services. Thus, IT needs to help set up customer-emission-monitoring software, ways of making the product more energy-efficient (rationalizing and fine-tuning the electrical grid, or figuring out more energy-efficient vacations that are still satisfactory to the consumer), and measurements of energy emissions that enable both internal and external improvements.  Moreover, these software solutions need to be far more global and extra-organizational than ever before, because it is very easy to “flow emissions to the least regulated spot”, which accomplishes nothing while seeming to the business to solve its problem.

 

One final point: even during the recession, total computing-related emissions appear to have continued to climb at a rapid rate – somewhere between 20% and 50% yearly worldwide. While some of this climb can be excused as a byproduct of moves to decrease other/larger emissions, somewhere around 10 years down the line IT itself will see more stringent energy limits that cannot be outsourced to developing countries or traded for advances in other areas such as transportation. A global, cross-organizational and cross-market approach to a business’ emission profile may not pay dividends until then; but it will indeed pay dividends.
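For scale, here is what compounding the endpoints of that growth range over the 10 years mentioned looks like (the rates are the post’s; the arithmetic is mine):

```python
# Where 20-50% yearly growth in computing emissions leads over a decade.

def growth_multiple(annual_rate: float, years: int = 10) -> float:
    """Multiple of today's emissions after compounding annual growth."""
    return (1.0 + annual_rate) ** years

print(f"{growth_multiple(0.20):.1f}x")  # ~6.2x at 20%/year
print(f"{growth_multiple(0.50):.1f}x")  # ~57.7x at 50%/year
# Either way, within a decade the absolute numbers force the issue --
# which is why stringent limits "10 years down the line" are plausible.
```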

 

Overall, then, the economic “perfect storm” that IT and the business are now seeing as a reason to downplay green efforts is, in fact, more like the fable of the frog placed in water that is gradually heated, which fails to notice the increase at any point until it is dead and cooked. Whatever the political approach, climate change pain is already baked in, and quick IT adjustment is better than failure to notice the increasing heat over the next 5 years. What the IT strategy should be depends on the political approach, industry, and consumer; that there should be an IT green strategy does not.

 
Wayne Kernochan