Recent Updates

  • greenclouddc 8:02 am on August 3, 2012 Permalink | Reply  

    IT As Electricity: We need to start thinking differently 

    Cloud computing is the next step on our IT journey. It has tremendous potential, certainly when you consider the budgetary impact of the shift to the cloud. Companies are quickly discovering the benefits and increased profits offered by the cloud and, whether it is part of a conscious plan or not, they are taking advantage of this.

    The next step is to supply IT just like electricity. More effort will be needed to bring IT up to an equivalent degree of reliability: 100% available, not 99.99%. IT as electricity also means that it’s going to take more than just the cloud; you will have to verify each element of the finished product. That means redesigning the current IT concept, in which we keep building further on the same basic components.

    Every time there is an issue, a painkiller is provided that treats the symptoms but not the causes, and this painkiller approach is the most widespread method. People continue to build on the same basic technology even when it has become woefully insufficient. But tackling the true cause of a problem will solve far more trouble spots and lead to greater innovation. To make an analogy with the world of medicine, we’re talking about a single remedy that can cure all viral infections instead of numerous medications that treat the symptoms of each individual infection.

    This may have a huge impact on our daily ICT. Companies will be able to work faster and more efficiently. Activating IT for new clients or employees will become as simple as turning on the lights. We will save time each day because we will no longer have to face IT obstacles, which will allow us to concentrate better on our core activities. The IT used by small companies will become just as powerful as that of large corporations, just as the electricity we use today is the same for everyone.

    As with any (r)evolution, it is going to take a new way of thinking. The first pioneers are already paving the way. In practice, this mindshift may happen more quickly than we currently expect because the speed at which ICT evolves is constantly increasing.

     
  • greenclouddc 6:19 pm on February 24, 2012 Permalink | Reply
    Tags: Arctic, Arctic Circle, Iceland, Raytheon

    Hot and cold: new approaches to heat management 

     

    Facebook building an Arctic datacenter

    Facebook is the latest technology company to announce plans for an Arctic datacenter. Specifically, Facebook will build a datacenter in Lulea (Sweden), just south of the Arctic Circle, where the average low in January is 3 degrees Fahrenheit (about -16 °C). This will be Facebook’s first non-USA datacenter and it’s scheduled to open by 2012.

    Could Iceland power the Internet?

    Iceland, which is conveniently located between North America and Europe, is hoping that electricity from renewable resources can power the data servers that make the Internet work.

    Iceland as the “silicon geyser”

    In addition, ZDNet’s Back Office blog has recently profiled Iceland, where a cool temperate climate means that it’s not so cold that a huge amount of humidification is needed for a datacenter, but it’s also cool enough for fresh-air cooling to be used throughout the year. In other words, IT may have Silicon Valley, but Iceland is the “silicon geyser” for datacenters.

    Pictures of one of the world’s “coolest” datacenters

    Silicon.com also has a photo gallery of Verne Global’s datacenter campus in Keflavik (Iceland) – just outside the Arctic Circle. Iceland’s cold climate helps to keep the datacenter cool while the country’s geothermal and hydroelectric energy provides a clean and renewable power supply.

    Waste heat from Disney datacenter to provide heating

    Disneyland Paris is partnering with French energy provider Dalkia to turn waste heat from its datacenters into heating and hot water at a business park. Eventually, the district-wide heating network will supply green energy to buildings with a surface area of 600,000 square metres.

    Raytheon: Raising datacenter temperatures can cut energy use by 30%

    Defense and aerospace company Raytheon has found that raising datacenter temperatures can cut energy usage by as much as 30%. Moreover, Raytheon has been building up its cloud computing capabilities, which has also led to a reduction in hardware requirements and energy needs, plus the company has replaced older equipment with newer and more energy-efficient models.

     
  • greenclouddc 6:20 pm on February 21, 2012 Permalink | Reply
    Tags: AFCOM, Disaster recovery, Hurricane Irene, Information technology management

    Is your disaster recovery plan good enough? 

    Jill Yaoz, the CEO of AFCOM, a non-partisan association for datacenter management with more than 4,500 members worldwide, has recently written an article for Forbes’ CIO Central blog that mentioned some rather shocking statistics about the state of datacenter disaster recovery planning. Apparently, AFCOM had surveyed its members and came up with these startling results:

    • More than 15% of datacenters still do not have a plan for business continuity or disaster recovery.
    • Fifty percent of datacenters have no plan to replace damaged equipment after a disaster.
    • Two-thirds of datacenters still do not have a plan or procedures in place to deal with cybercrime.

    Given the rise of new cybercrime threats (see Operation Ghost Click) to datacenters and computer users alike along with increasingly violent natural disasters (e.g. Hurricane Irene and the earthquake that recently hit the eastern half of the USA), it comes as a surprise that so many datacenters are not prepared with a disaster recovery plan.

    Jill then went on to write about how important it is to have an adequate disaster recovery plan that actually takes into account how critical infrastructure and systems would be replaced if they are damaged in a disaster. And while this may sound easy to consider, Jill also noted that many organizations and their datacenters have been building upon their existing IT infrastructure on an ad hoc basis, using different vendors and equipment, for many years. This means that fixing any mess in the event of a disaster will require detailed plans, as well as consideration of how a big disaster might impact a vendor and hence replacement and installation lead times.

    Likewise, virtualization, cloud computing and the rise of social networking sites (and their usage at work) have also complicated datacenter security. In other words, there are more potential holes for cybercriminals to exploit in order to target your datacenter and its data.

    In other words, if you don’t have a disaster recovery plan that takes into consideration how critical infrastructure will be replaced, along with the threat posed by cybercriminals, it’s time to go back to the planning board while there is still time.

     
  • greenclouddc 6:17 pm on February 17, 2012 Permalink | Reply  

    Best of the Web: Green datacenter and cloud computing news 

    Citigroup saves big on datacenter energy usage

    Despite the sluggish economy in many parts of the world, banks are investing in datacenters in order to save money. In fact, Citigroup’s datacenter program has resulted in savings of $6 million a year on energy costs alone plus a 3% reduction in its carbon footprint.

    Digital Realty now has 19 LEED-certified datacenters

    Datacenter solutions provider Digital Realty’s total number of LEED-certified datacenters has risen to nineteen after it was awarded three new LEED certifications for locations in Virginia and Georgia.

    Natural cooling is used by half of all data centers

    A study of 115 datacenters by non-profit organisation Green Grid reveals that nearly half of all datacenters now use so-called "economiser" natural air cooling units in order to cut costs and save energy.

    Green IT issues are clouded by Google’s sheer size

    The CTO of cloud provider Cirrus Stratus has recently written a thought-provoking article pointing out that Google’s huge energy bills are actually masking just how green cloud computing can be.

    The UK leads in private cloud adoption

    According to VMware, the UK is now one of the most advanced markets for datacenter virtualisation in Europe. In fact, some British enterprises now have 70% to 80% of their x86 estates virtualised.

    Sandia’s unique approach to cooling

    In order to cool the Red Sky supercomputer at Sandia National Laboratories, David Martinez developed a unique cooling system that combined the use of traditional raised-floor air cooling with refrigerant-cooled rear door heat exchangers. He then shared his experience at the AFCOM Datacenter World conference.

     
  • greenclouddc 6:10 pm on February 14, 2012 Permalink | Reply
    Tags: Carbon, Evaporative cooler, Green Grid, Hewlett-Packard, Power usage effectiveness, Sustainability

    Time for new sustainability metrics? 

    Nicolas Dube, a datacenter efficiency strategist at HP, was recently the subject of a lengthy video interview where he gave an overview of new datacenter sustainability metrics in addition to Power Usage Effectiveness (PUE). These new sustainability metrics include the following:

    • Energy Reuse Effectiveness (ERE). A metric that is focused on heat reclamation.
    • Water Usage Effectiveness (WUE). A site-specific metric that covers water usage in evaporative cooling.
    • Carbon Usage Effectiveness (CUE). A metric that tracks total carbon impact, including the carbon emissions associated with the electricity supplied to the datacenter.

    Nicolas pointed out that the datacenter industry is increasingly moving towards looking at the big picture when it comes to measuring environmental impact. Hence, the new metrics he discusses do a much better job of assessing this total impact.
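
    To make the definitions above concrete, here is a minimal sketch of how these metrics are calculated from annual facility figures. It follows the commonly used Green Grid formulas; all of the numbers (and variable names) below are made up for illustration, not taken from the interview.

        # Illustrative calculation of datacenter sustainability metrics.
        # All input figures are invented for the example; substitute measured values.

        it_energy_kwh = 10_000_000        # annual energy consumed by the IT equipment (kWh)
        facility_energy_kwh = 15_000_000  # total annual facility energy (IT + cooling, lighting, losses)
        reused_energy_kwh = 2_000_000     # energy reclaimed as heat and reused outside the datacenter
        water_litres = 30_000_000         # annual water consumption (e.g. evaporative cooling)
        co2_kg = 7_500_000                # annual carbon emissions attributable to the facility

        pue = facility_energy_kwh / it_energy_kwh                        # Power Usage Effectiveness
        ere = (facility_energy_kwh - reused_energy_kwh) / it_energy_kwh  # Energy Reuse Effectiveness
        wue = water_litres / it_energy_kwh                               # Water Usage Effectiveness (L/kWh)
        cue = co2_kg / it_energy_kwh                                     # Carbon Usage Effectiveness (kgCO2/kWh)

        print(f"PUE = {pue:.2f}, ERE = {ere:.2f}, WUE = {wue:.2f} L/kWh, CUE = {cue:.2f} kgCO2/kWh")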

     

     
  • greenclouddc 6:06 pm on February 7, 2012 Permalink | Reply
    Tags: Carbon footprint, Energy, Renewable, Technology, Urban area

    Can your datacenter be green if it’s in a city? 


    You might be wondering whether or not to move your datacenter out of the city to the suburbs or to a location well away from any urban area – especially if you are trying to lower your carbon footprint and be as green as possible. After all, there comes a point where having a datacenter in a big urban area becomes a serious disadvantage, and there is a limit to just how green you can be when located in a heavily populated area.

    Just consider some of the following issues:

    • Construction Costs. Permits, construction and labour costs in a major urban area will inevitably be higher than they would be in a non-urban area. Likewise, once your datacenter is operational, it will probably cost significantly more to hire and retain employees in a major urban area than it would elsewhere.
    • Space Issues. In urban areas where space is at a premium, your ability to expand will likely be limited by space constraints or the high cost of additional space. Moreover, you can probably forget about most green initiatives like having large solar panels or a small wind farm on site.
    • Power Issues. If your datacenter is in a major urban area, chances are you are paying more per kilowatt-hour than you would in an area that is less urbanized. The reason is simple: while the power grid in an urban area is huge, your datacenter is probably putting significant stress on it. Moreover, your ability to expand a datacenter in an urban area may be capped because you simply will not be able to get enough electric power for an expanded facility, while options for green or renewable power might be limited.
    • Disaster Concerns. Since your datacenter is already taxing the power grid that may be operating at near capacity, just one minor hiccup like a summer heat wave or a winter storm could mean that you would need to go on backup power for a lengthy period of time. Moreover, residential households will usually have priority over businesses when it comes to having their power restored.

    In other words, if you want the greenest and most cost-efficient datacenter possible, it might be time to consider relocating outside the city to a more remote location. (Hat tip: Green Data Center News.)

     
  • greenclouddc 6:16 pm on February 3, 2012 Permalink | Reply  

    Best of the Web: Green datacenter and cloud computing news 

    Apple plans a solar array for its $1 billion datacenter

    Last spring, Apple’s 500,000-square-foot (46,500 sqm) Project Dolphin datacenter, which is five times the size of its datacenter in Newark (California), was opened to support the Apple iCloud service. And now Apple is planning a solar farm to provide green renewable power to that datacenter.

    Apple’s green plans do not prove popular with the neighbors

    However, Apple recently burned and cleared 171 acres of green space to make way for its solar power array, which incensed local residents. According to the Hickory Daily Record, residents were given no advance warning before being surrounded by clouds of thick, black smoke along with fine floating ash.

    Apple’s new solar farm in Maiden (North Carolina)

    http://youtu.be/pkKc30vNvsA

    Kaiser Permanente a top ranked green IT organization thanks to a green datacenter

    Kaiser Permanente was among Computerworld’s Top 12 Green-IT Organizations, in large part due to its Napa (California) datacenter. Through cooling efficiency measures, Kaiser Permanente saved $450,000 in electricity costs at the Napa datacenter and earned a $300,000 incentive from the local utility.

    New algorithm might help datacenters control power costs

    Scientific Computing has recently reported that a three-person international research team has developed a straightforward algorithm for the optimization of server operations by balancing power with performance – thus helping datacenters control power costs.
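
    The report does not describe the algorithm itself, but the general idea of balancing power against performance can be sketched as a simple cost trade-off: choose how many servers to keep powered on so that the combined cost of energy and slow response times is minimised. Everything below (the cost model, the parameters, the numbers) is a hypothetical illustration of that idea, not the researchers’ method.

        # Toy illustration of a power/performance trade-off (not the published algorithm).
        # Cost per hour = energy cost of the active servers + a penalty that grows as
        # utilisation approaches 100% (a crude stand-in for response-time degradation).

        def total_cost(n_servers, load_rps, power_per_server_w=300, energy_price=0.12,
                       server_capacity_rps=100, latency_penalty=0.5):
            capacity = n_servers * server_capacity_rps
            if capacity <= load_rps:
                return float("inf")            # overloaded: queues grow without bound
            energy_cost = n_servers * power_per_server_w / 1000 * energy_price  # $/hour
            utilisation = load_rps / capacity
            latency_cost = latency_penalty * utilisation / (1 - utilisation)
            return energy_cost + latency_cost

        def best_server_count(load_rps, max_servers=50):
            """Number of active servers with the lowest combined cost for a given load."""
            return min(range(1, max_servers + 1), key=lambda n: total_cost(n, load_rps))

        print(best_server_count(load_rps=850))   # how many servers to keep powered on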

    Allstate eliminates nearly 3,000 servers or devices in just 18 months

    For several years, Allstate Insurance has been pursuing various energy-saving initiatives that have led to a cumulative energy reduction of about 40%, thanks to more efficient datacenter operations and virtualization. In fact, over the past 18 months, Allstate Insurance has eliminated nearly 3,000 servers or devices.

    Singapore’s datacenter edge

    Google and Softlayer Technologies are the latest MNCs to announce datacenter plans or setups in Singapore, citing reliable and stable energy pricing, strong network connectivity, an educated population, business-centric intellectual property (IP) laws and a pro-business government.

     
  • greenclouddc 5:59 pm on January 31, 2012 Permalink | Reply
    Tags: Apple

    Just how big is a Facebook datacenter? 

    Facebook is fast approaching the 1 billion users mark and perhaps the best way to appreciate just how big Facebook has become is to take a look at the company’s data center campus in Rutherford County (North Carolina) from the air. That’s exactly what North Carolina realtor Bill Wagenseller did after creating similar flyover videos of Apple’s data center in Maiden (North Carolina).

    His video shows one of the two planned 300,000-square-foot (28,000 sqm) data center buildings as it nears completion, plus the cleared space for the second building:

     
  • greenclouddc 7:26 pm on December 15, 2011 Permalink | Reply

    Is IT’s future really in the cloud? 

    It’s been said that the future of IT is in software and cloud computing, but it is often forgotten that cloud computing software must run on physical hardware, and ultimately on physical hardware located in datacenters. In fact, Gartner’s latest datacenter forecast (Forecast: Data Centres, Worldwide, 2010-2015) has new figures showing that IT datacenter sales are heading in just one direction: up.

    Just consider some of the following and latest projections from Gartner:

    • Worldwide datacenter hardware spending will rise 12.7% from $87.8 billion in 2010 to hit the $98.9 billion (£62 billion) level by the end of 2011. Moreover, datacenter hardware spending is forecasted to reach $106.4 billion in 2012 and $126.2 billion in 2015.
    • Datacenter spending growth in emerging economies such as the so-called BRIC countries (Brazil, Russia, India and China) will be balanced by continued weakness in both Japan and Western Europe.
    • Datacenter storage will be the major driver for growth. In fact, although only a quarter of datacenter hardware spending goes to storage, approximately half of the spending growth will come from the storage market.
    • The largest datacenter category (those with more than 500 racks) will see its share of spending rise from 20% in 2010 to 26% in 2015. This growth will be driven by cloud computing along with a move away from internal datacenters to external datacenters.

    In other words, the future of IT might very well be in datacenters and datacenter hardware to run all of those cloud computing applications.
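
    For context, the growth implied by the figures above is easy to check with a quick back-of-the-envelope calculation (this arithmetic is mine, not Gartner’s):

        # Back-of-the-envelope check of the growth implied by the quoted Gartner figures.
        spend = {2010: 87.8, 2011: 98.9, 2012: 106.4, 2015: 126.2}   # $ billions

        growth_2011 = (spend[2011] / spend[2010] - 1) * 100                   # ~12.6%, in line with the quoted 12.7%
        cagr_2010_2015 = ((spend[2015] / spend[2010]) ** (1 / 5) - 1) * 100   # compound annual growth, ~7.5%

        print(f"2010 -> 2011 growth: {growth_2011:.1f}%")
        print(f"2010 -> 2015 CAGR:   {cagr_2010_2015:.1f}%")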

    Sources: eWeek and Gartner

     
  • greenclouddc 6:55 pm on December 13, 2011 Permalink | Reply

    R.A.I.D. is B.A.D. 


    If you think your data is protected against bit rot because it’s on a RAID array, think again.

    Bit rot is a real problem. All storage media are at risk. Magnetic media like discs and tapes can lose data integrity over time, and even RAM is susceptible to cosmic rays. You may not even know about it, because the disc keeps working while a bit or a byte here and there has changed.

    A typical home user with a 2TB disk has an almost 100% chance of bit rot. If it just changes a pixel in a video, it’s not going to be a problem. But if it’s in your database or operating system software, that failure could be much more significant.
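
    To get a feel for where claims like this come from, here is a rough way to estimate the odds, assuming a fixed per-bit error rate. The rate used below (one error per 10^14 bits read) is a common manufacturer figure for consumer drives and is an assumption for illustration; the point is the shape of the calculation, not the exact percentage.

        import math

        # Rough estimate of the chance of at least one silent bit error when reading
        # a full disk, assuming independent errors at a fixed per-bit rate.
        bit_error_rate = 1e-14      # assumed: ~1 unrecoverable error per 10^14 bits read
        disk_bytes = 2e12           # 2 TB disk
        bits_read = disk_bytes * 8

        # P(at least one error) = 1 - (1 - p)^n  ~=  1 - exp(-p * n)
        per_read = 1 - math.exp(-bit_error_rate * bits_read)
        print(f"Chance of at least one bit error per full read: {per_read:.1%}")

        # Over many full reads (years of use) the probability keeps climbing:
        reads = 20
        print(f"Over {reads} full reads: {1 - math.exp(-bit_error_rate * bits_read * reads):.1%}")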

    Today there is no protection: the problem goes undetected, and once it has happened there is no means of recovery.

    You might think that RAID is the answer but it’s not. An Amplidata white paper, The RAID Catastrophe, explains why:

    • It doesn’t do hashing or CRC checks to verify the data (see the sketch after this list).
    • It doesn’t track bit rot or correct it (nor do most operating systems).
    • There are no ‘previous versions’ to restore; just the current version of each file.
    • Storage volumes are increasing dramatically so a rebuild in the event of a total drive failure can be very lengthy (even a couple of days on 2TB HDD arrays).
    • Increasing the number of drives or the size of each drive increases the overall risk of failure, even when the MTBF of each individual drive is high.
    • If one drive in an array fails, it is quite likely that a second one will also fail soon, because RAID writes to all drives in the array at the same time and they therefore accumulate similar wear.
    • RAID-6 double parity is a partial answer to this but it imposes a big performance overhead.
    • Forensic data recovery from RAIDs is very, very difficult, time-consuming and expensive. It’s like trying to turn an omelette back into the original eggs.
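
    The white paper is only summarised above, but the first point – the lack of hashing or CRC checks – is easy to address at the application layer. Below is a minimal sketch that records a SHA-256 digest per file and re-verifies the digests later to detect silent corruption; the paths and manifest layout are illustrative assumptions, and detection is all it does (repair still needs a separate backup).

        import hashlib
        import json
        from pathlib import Path

        def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
            """Hash a file in chunks so large files don't need to fit in memory."""
            digest = hashlib.sha256()
            with path.open("rb") as handle:
                for chunk in iter(lambda: handle.read(chunk_size), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        def build_manifest(root: Path, manifest: Path) -> None:
            """Record a digest for every file under root."""
            hashes = {str(p): sha256_of(p) for p in root.rglob("*") if p.is_file()}
            manifest.write_text(json.dumps(hashes, indent=2))

        def verify_manifest(manifest: Path) -> list:
            """Return files whose current digest no longer matches the stored one."""
            stored = json.loads(manifest.read_text())
            return [name for name, digest in stored.items()
                    if not Path(name).is_file() or sha256_of(Path(name)) != digest]

        # Example usage (paths are illustrative):
        # build_manifest(Path("/data/archive"), Path("/data/archive.manifest.json"))
        # print(verify_manifest(Path("/data/archive.manifest.json")))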

    There is an additional problem with RAID storage: if you have a problem with a drive, you replace it, and if the rebuild then fails for some reason, you have no way of recovering the data on the array. At least with a single disk, you can recover all the data except what is on the failed sector(s). This is another example of the IT industry focusing on MTBF rather than on complexity or recovery time.

    The IT industry needs a radical rethink about storage. The old orthodoxy isn’t good enough. It may be heresy to admit it but perhaps we need to move beyond RAID and towards new, unbreakable storage technology like Amplidata’s.

     