A recent study, “Cloud Computing and Sustainability: The Environmental Benefits of Moving to the Cloud,” which was commissioned by Microsoft and conducted by Accenture and WSP, has found that companies running applications in the cloud can reduce their carbon emissions by 30% or more versus running the same applications on their own infrastructure.
This raises some important questions for cloud customers. They need to pay close attention to where they host their infrastructure and services, and they should talk to their suppliers and understand exactly what is going on. It isn’t enough just to run a virtual machine and assume everything is safe.
Here are some of the things they should consider:
- Are virtual machines or data storage volumes encrypted and/or dispersed to prevent unauthorised access?
- Does each instance have its own private firewall (not just a perimeter firewall around the whole data center)?
- Are network interfaces monitored for suspicious behaviour? Again, this should happen on an instance-by-instance basis, not just for the whole data centre. Is there auto-shutdown or some other automated response in case of a problem?
- How are switches managed – through the same network (bad) or through serial out-of-band connections (good)? If hackers can get into the switches, they can do a lot of harm.
- Does the hosting company use virtual LANs? This is not a good idea. Every customer should be on their own physical LAN connection, or on something like InfiniBand where traffic is physically separated.
- Who monitors logs and other reports for suspicious activities? How often is this information checked?
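On that last point, even a very simple automated scan beats relying on occasional manual checks. Here is a minimal sketch in Python – the log lines and the threshold are illustrative, not drawn from any particular provider – that counts failed SSH logins per source address and flags repeat offenders:

```python
import re
from collections import Counter

# Hypothetical sample; a real deployment would tail e.g. /var/log/auth.log
SAMPLE_LOG = """\
Jan 10 03:11:02 vm1 sshd[812]: Failed password for root from 198.51.100.7
Jan 10 03:11:04 vm1 sshd[812]: Failed password for root from 198.51.100.7
Jan 10 03:11:06 vm1 sshd[812]: Failed password for admin from 198.51.100.7
Jan 10 03:12:30 vm1 sshd[913]: Accepted password for alice from 203.0.113.9
"""

# Capture the source IP of each failed password attempt
FAILED = re.compile(r"Failed password for \S+ from (\S+)")

def suspicious_ips(log_text, threshold=3):
    # Count failures per source address and flag anything at or
    # above the threshold
    failures = Counter(m.group(1) for m in FAILED.finditer(log_text))
    return [ip for ip, n in failures.items() if n >= threshold]

print(suspicious_ips(SAMPLE_LOG))  # ['198.51.100.7']
```

A real deployment would run this continuously against the live log and feed flagged addresses into the kind of automated responses mentioned above – firewall rules or instance shutdown.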
It is possible to build secure infrastructure in the cloud but it takes work and very few providers are doing it today. This makes it all the more important for customers to check for themselves.
I was talking to a master hacker recently. These guys seem to be able to break almost any security system. For example, the Swiss government were working on an e-government project which they thought was unbreakable but a bunch of hackers got into it in a couple of weeks.
Everything can be hacked today. That’s just a fact. It’s a question of how much money you want to spend and whether you know the right people.
This hacker told me something big and scary. He reckons that 40 per cent of all internet routers are compromised. This means that hackers and online criminals potentially have access to a huge amount of the traffic that goes across the internet.
In turn, this means that they could break SSL encryption by becoming ‘the man in the middle’. They just have to use their compromised routers, fake the other side of the SSL key exchange and, bingo, everything you thought was encrypted isn’t any more. That includes credit card details, bank account details and so on.
I’m not saying it’s easy or commonplace but it does seem to be possible. What does this mean for security? It’s a big question.
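One practical defence is to make sure clients actually verify the server’s certificate rather than trusting whatever the network delivers. A compromised router can sit in the middle of the connection, but it cannot easily forge a certificate that chains to a trusted CA for the right hostname. A minimal sketch in Python (the connection helper is illustrative):

```python
import socket
import ssl

# Python's default SSL context refuses unverifiable peers, which is the
# basic client-side defence against a man-in-the-middle:
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED  # reject unverified certificates
assert ctx.check_hostname                    # certificate must match the host

def open_verified(host, port=443):
    # A forged certificate presented by a compromised router would make
    # wrap_socket raise ssl.SSLCertVerificationError instead of connecting.
    sock = socket.create_connection((host, port), timeout=10)
    return ctx.wrap_socket(sock, server_hostname=host)
```

This assumes the client’s CA trust store itself hasn’t been tampered with – if attackers control your trust anchors, verification no longer helps.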
The latest round in the green wars probably goes to Facebook, which has just announced two new green datacenter strategies on two different tracks. However, Google also continues to hold its own after announcing a couple of Google Earth projects that will map out rooftop solar power potential in California plus geothermal energy potential across the US.
The Green (Low Carbon) Data Center Blog has recently plotted the European data centers of Google and Facebook on a map of Europe. At the moment, Google has datacenters in Dublin, Hamina (Finland) and Belgium while Facebook will soon have a datacenter in Lulea (Sweden). Incidentally, Google also has three future sites in Asia (Singapore, Hong Kong, and Taiwan) planned – and apparently a fascination with the number three.
Microsoft has recently teamed up with the city of Quincy, Washington in order to retool the city’s water treatment infrastructure. As part of this partnership, a multi-million dollar water treatment plant that was built by Microsoft to support its datacenter will be leased to the city of Quincy for just $10 a year – a great deal for local taxpayers.
Datacenter Knowledge has noted that during the third quarter, Google spent $680 million on its datacenter infrastructure – less than what was spent in recent quarters. Datacenter Knowledge then pointed out that Google has just completed the first phase of its datacenter in Pryor (Oklahoma) and has brought its new facility in Hamina (Finland) online – meaning it probably will not need to spend as much in future quarters.
MTBF (mean time between failures) is a very misleading guide for companies who want maximum up time for servers, storage and network connections.
Most companies focus on MTBF exclusively but I think they’re wrong. The problem isn’t the measurement itself but the lessons people learn from it.
They think that if only they make everything redundant they can make everything failsafe. Not so. By making things more complex, they make the problem worse.
For example, if you use multipath networking, you get a lot of complex wiring. Sure, the aggregate MTBF may be better, but if there’s a problem it’s much harder to resolve. In other words, improving MTBF can actually increase repair time.
Instead, especially in the cloud, they need to think about simplicity first and foremost. The simpler you keep things, the higher you keep uptime. If something does go wrong, you need procedures in place to identify problems very quickly and replace faulty parts – but the job is easier.
For example, instead of a multipath, multi-redundant network in my data centre, I can have a primary switch and if there’s a problem, I can just cut over to my standby backup switch and carry on. It takes a few minutes and then I can focus on diagnosing the original problem.
It’s the difference between repairing a jet engine on the ground and repairing it in mid-flight.
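The trade-off can be made concrete with the standard steady-state availability formula, availability = MTBF / (MTBF + MTTR). The numbers below are purely illustrative:

```python
def availability(mtbf_hours, mttr_hours):
    # Steady-state availability: the fraction of time the system is up.
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical numbers for illustration:
complex_net = availability(100_000, 8)    # multipath: rare failures, slow diagnosis
simple_net  = availability(50_000, 0.25)  # single switch + quick standby cutover

print(f"{complex_net:.6f}")  # 0.999920
print(f"{simple_net:.6f}")   # 0.999995
```

Even with half the MTBF, the simple setup wins here because its repair time – a quick cutover to the standby switch – is so much shorter.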
Of course, the big vendors prefer to make systems more complex because they’re more expensive. But sshhh! They don’t want you to know that.
There was a very interesting interview with EMC co-founder, Joe Tucci in Forbes recently. He was talking about disruptive technology and its power to turn whole industries on their head. Here’s a telling excerpt:
Waves of change come. They are massively disruptive. Those that miss the wave become irrelevant or worse. The minicomputer people barely saw Sun, Dell, Microsoft et al. coming, and when they did they didn’t believe it. And now Sun is gone. I really believe in my heart and soul that the post-PC era, along with the cloud, is the biggest wave yet. It will create more devastation and opportunity than all the others. I have no doubts that some big companies–I mean, really big companies–will come away not looking so powerful.
This is what green, unbreakable, cloud computing is all about.
Google has recently revealed a closely guarded secret: how much electricity its datacenters use. In fact, Google has revealed that its entire operation:
- Generated 1.46 million metric tons of carbon dioxide and consumed a total of 2,259,998 MWh in 2010.
- Continuously uses enough electricity to power as many as 200,000 homes.
- Continuously draws nearly 260 million watts, or roughly a quarter of a nuclear power plant’s output, with about 12.5 million watts going to power more than a billion searches a day.
- Drew 25% of its energy from renewable fuels in 2010 and this figure should reach the 30% level this year and 35% next year.
- Uses less than 1% of the total amount of electricity used by datacenters globally, which in turn accounts for 1.3% of total worldwide electricity usage.
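Those figures are internally consistent: dividing the annual consumption by the number of hours in a year recovers the continuous draw Google quotes.

```python
# Cross-checking Google's disclosed figures: 2,259,998 MWh consumed in 2010
# should correspond to the "nearly 260 million watts" continuous draw.
annual_mwh = 2_259_998
hours_per_year = 365 * 24  # 8,760

avg_megawatts = annual_mwh / hours_per_year
print(round(avg_megawatts))  # 258 -- i.e. roughly 260 MW, as stated
```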
It’s worth noting that Google had received some flak from environmentalists and Greenpeace in particular – who previously gave the company an “F” for transparency on its Cool IT leaderboard. On the other hand and as the New York Times recently noted, some analysts had speculated that Google did not want to reveal information about electricity usage because it was embarrassing or it might allow competitors to decode just how efficient (and big) the company’s datacentre operations are.
Nevertheless, Google’s disclosures about electricity usage could put pressure on other IT companies like Amazon, Apple, Facebook, Microsoft and Yahoo to reveal just how much electricity their datacentres are using along with the amount of carbon they produce.
Atos thinks it can, according to Data Center Knowledge, which reports that Helsinki Energy and IT services provider Academica have teamed up to build a unique datacentre for the global IT outsourcer Atos. The facility will use cold sea water for cooling, while waste heat will be piped via a heat pump to warm Helsinki buildings and provide residents with domestic hot water.