The dark side of the Internet

Published in Telecom Asia 26 Sep 2012 – The dark side of the internet

Imagine a life without the internet. You can’t! That’s how inextricably enmeshed the internet is in our lives. Kids learn to play Angry Birds on the PC before they learn to say “duh”, school children hobnob on Facebook, and many of us regularly browse, upload photos, watch videos and do a dozen other things on the internet.

So on one side of the internet is the user with his laptop, smartphone or iPad. But what is on the other side, and what exactly is the internet? The internet is a global system of interconnected computer networks that use the TCP/IP protocol suite – a network of networks made up of hundreds of millions of computers.

In its early days the internet was used mainly for document retrieval, email and browsing. But with the passage of time the internet and its uses have assumed gigantic proportions. Nowadays we use the internet to search billions of documents, share photographs with our online communities, blog and stream video. So, while the early internet was populated with a modest number of large computers, the computations of today’s internet require a substantially larger infrastructure. The internet is now powered by datacenters, which contain anywhere between hundreds and hundreds of thousands of servers. A server is a beefed-up computer designed for high performance, sans screen and keyboard. In a datacenter, servers are stacked one over another on racks.

These datacenters are capable of handling thousands of simultaneous users and delivering results in a split second. In this age of exploding data and information overload, where split-second responses and blazing throughput are the need of the hour, datacenters really fill the need. But there is a dark side to these datacenters: they consume a lot of energy and are extremely power hungry. In fact, of the utility power supplied to a datacenter, only 6–12% is used for actual computation. The rest is either used for air conditioning or lost in power distribution.

In fact a recent article, “Power, Pollution and the Internet” in the New York Times, claims that “Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants.” Further, the article states that “it is estimated that Google’s data centers consume nearly 300 million watts and Facebook’s about 60 million watts or 60 MW”.

For example, it is claimed that Facebook annually draws 509 million kilowatt-hours of power for its data centers (see “Estimate: Facebook running 180,000 servers”). That article further concludes “that the social network is delivering 54.27 megawatts (MW) to servers”, or approximately 60 MW to its datacenters. The other behemoths in this domain, including Google, Yahoo, Twitter, Amazon, Microsoft and Apple, all have equally large or larger data centers consuming similar amounts of energy. Recent guesstimates have placed Google’s server count at more than 1 million, consuming approximately 220 MW. Looking at the power generation capacities of power plants in India, 60 MW is between 20% and 50% of the capacity of many power plants, while 220 MW is the entire capacity of a medium-sized power plant (see “List of power stations in India”).
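To see how an annual energy figure relates to an average power draw, the arithmetic is simply energy divided by hours in a year. Here is a small sketch of that conversion, using the published 509 million kWh estimate quoted above (the function name is my own):

```python
# Convert annual energy consumption (kWh) into an average power draw (MW).
HOURS_PER_YEAR = 365 * 24  # 8760 hours

def avg_power_mw(annual_kwh: float) -> float:
    """Average power in megawatts for a given annual energy in kWh."""
    return annual_kwh / HOURS_PER_YEAR / 1000.0

facebook_annual_kwh = 509_000_000
print(f"Average draw: {avg_power_mw(facebook_annual_kwh):.1f} MW")
# -> Average draw: 58.1 MW
```

That ~58 MW average sits between the 54.27 MW said to reach the servers and the ~60 MW total quoted for the datacenter, which is roughly what one would expect.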

One of the challenges these organizations face is the need to make the datacenter efficient. New techniques are constantly being applied in the ongoing battle to reduce energy consumption in a data center. These techniques are also designed to boost a data center’s Power Usage Effectiveness (PUE) rating. Google, Facebook, Yahoo and Microsoft compete to get the lowest possible PUE in their newest data centers. Earlier datacenters used to average a PUE of 2.0, while advanced data centers these days aim for ratings of the order of 1.22 or 1.16, or lower.
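PUE is defined as total facility power divided by the power actually delivered to the IT equipment, so a PUE of 1.0 would mean every watt goes to the servers. A minimal sketch, with illustrative numbers rather than measurements from any particular facility:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

# An older facility: 1000 kW drawn, only 500 kW reaching the IT load.
print(round(pue(1000, 500), 2))   # -> 2.0

# A modern facility: most of the power reaches the servers.
print(round(pue(1000, 820), 2))   # -> 1.22
```

The difference between the two ratios is exactly the cooling and distribution overhead the datacenter operators are fighting to eliminate.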

In the early days of datacenter technology, air-conditioning systems cooled by brute force. Later designs segregated the aisles into hot and cold aisles to improve efficiency. Other techniques use water as a coolant along with heat exchangers. A novel technique was used by Intel recently, in which servers were dipped in oil. While Intel claimed that this improved the PUE rating, there are questions about the viability of this method considering the messiness of removing or inserting circuit boards from the servers.

Datacenters are going to proliferate in the coming days as information continues to explode. The hot new technology “cloud computing” is, at its core, built on datacenters that use virtualization, the ability to run different operating systems on the same hardware, which improves server utilization.
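The utilization gain from virtualization comes from consolidation: many lightly loaded physical servers can be packed, as virtual machines, onto far fewer hosts. A back-of-the-envelope estimate of that effect (all the numbers here are illustrative assumptions):

```python
# Estimate how many physical hosts remain after consolidating workloads
# as virtual machines onto hosts run at a higher target utilization.
import math

def hosts_needed(workloads: int, avg_util: float, target_util: float) -> int:
    """Hosts needed if total demand is spread at the target utilization."""
    return math.ceil(workloads * avg_util / target_util)

# 1000 servers each ~10% busy, consolidated onto hosts run at ~70%:
print(hosts_needed(1000, 0.10, 0.70))  # -> 143
```

Shrinking a thousand idle boxes to a hundred-odd busy ones is precisely why virtualization translates directly into energy savings.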

Clearly the thrust of technology in the days to come will be on identifying renewable sources of energy and making datacenters more efficient.

Datacenters, and the technologies that make them efficient, will become more and more prevalent as we move to a more data-driven world.

Find me on Google+

Re-imagining the Web portal

Published in Telecom Asia, Mar 16, 2012 – Re-imagining the web portal

Web portals had their heyday in the mid-1990s. Remember Lycos, AltaVista, Yahoo and Excite – portals which had neatly partitioned the web into compartments, e.g. Autos, Beauty, Health, Games etc.? Enter Google, with a webpage holding a single search bar. With a single stroke Google pushed all the portals into virtual oblivion.

It became obvious to the user that all information was just a “search away”. There was no longer the need for neat categorization of all the information on the web. There was no need to work your way through links only to find your information at the “bottom of the heap”. The user was content to search their way to needed information.

That was then, in the mid-1990s. But much has changed since. Billions of pages have been uploaded to the millions of servers that make up the internet. There is so much more information on the worldwide web: news articles, wikis, blogs, tweets, webinars, podcasts, photos, YouTube content, social networks and more.

Here are some fun facts about the internet: it contains 8.11 billion pages (Worldwidewebsize) and has more than 1.97 billion users and 266 million websites (State of the Internet). We can expect it to keep growing as the rate of information generation and our thirst for information keep increasing.

In this world of exploding information the “humble search” will no longer be sufficient. As users we would like to browse the web in a much more efficient, effective and personalized way. Nor will site aggregators like StumbleUpon, Digg, Reddit and the like be enough. We need a smart way to navigate through this information deluge.

It is here, I think, that there is a great opportunity for re-imagining the web portal. As a user of the web, it would be great to be shown a view of the web personalized to one’s own tastes and interests. What I am proposing is a web portal that metamorphoses dynamically, with the user’s click stream, browsing preferences, interests and inclinations as its focal center. Besides the user’s own interests, the web portal would also analyze the click streams of the user’s close friends, colleagues and associates. Finally, the portal would include inputs from what the world at large is interested in and following. The web portal would analyze these preferences and then construct a page based on what the user would most like to see.
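One way to make this concrete is to score every candidate item by blending the three signals just described: the user’s own interests, his friends’ interests, and global trends. The sketch below is purely hypothetical; the weights, function name and interest scores are my own illustrative assumptions, not any portal’s actual algorithm:

```python
# Score a content item by blending three interest signals:
# the user's own, the user's friends', and the world at large.
def portal_score(item, user_interest, friends_interest, global_interest,
                 weights=(0.6, 0.25, 0.15)):
    w_user, w_friends, w_global = weights
    return (w_user * user_interest.get(item, 0.0)
            + w_friends * friends_interest.get(item, 0.0)
            + w_global * global_interest.get(item, 0.0))

# Hypothetical interest scores in [0, 1] derived from click streams:
user = {"cricket": 0.9, "telecom": 0.7}
friends = {"cricket": 0.4, "movies": 0.8}
world = {"elections": 1.0, "cricket": 0.6}

items = {"cricket", "telecom", "movies", "elections"}
ranked = sorted(items, key=lambda i: portal_score(i, user, friends, world),
                reverse=True)
print(ranked[0])  # -> cricket
```

The portal’s front page would then simply be the top-ranked items re-computed as the click streams evolve.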

This can be represented in the diagram below

We have all heard of Google Zeitgeist, a massive database of the world’s inclinations and tendencies. Similar databases are probably also held by Yahoo, Microsoft, Facebook, Twitter and others.

The web portal in its new incarnation would present content tailored specifically to each user’s browsing patterns. A single page would include all the news, status updates, latest YouTube videos, tweets and more that the user would like to see.

In fact, this whole functionality could be integrated into the web browser. In its new avatar the web portal would have content that is dynamic, current and personalized to each individual user. Besides, every user would also be able to view what his friends, colleagues and the world at large are browsing.

A few years down the line we may see “the return of the dynamic, re-invented Web Portal”.


The emergence of Social Software as a Service (SSaaS)

Published in Telecom Asia-17 Feb 2012 as – The dawn of Social Software as a Service

We are in the midst of a social networking revolution as we progress into the next decade. As technology becomes more complex in a flatter world, cooperating and collaborating will be not just necessary but imperative. McKinsey in its recent report “Wiring the Open Source Enterprise” talks of the future “networked enterprise”, which will require the enterprise to integrate Web 2.0 technologies into its enterprise computing fabric.

Another McKinsey report  “The rise of the networked enterprise: Web 2.0 finds its payday” states “that Web 2.0 payday could be arriving faster than expected”. It goes on to add that “a new class of company is emerging—one that uses collaborative Web 2.0 technologies intensively to connect the internal efforts of employees and to extend the organization’s reach to customers, partners, and suppliers”

Social Software utilizing Web 2.0 technologies will soon become the new reality if organizations want to stay agile. Social Software includes those technologies that enable the enterprise to collaborate through blogs, wikis, podcasts, and communities. A collaborative environment will unleash greater fusion of ideas and can trigger enormous creative processes in the organization.

According to Prof. Clay Shirky of New York University, the underused human potential at companies represents an immense “cognitive surplus” which can be tapped by participatory tools such as Social Software.

A fully operational social network in the organization will enable quicker decision making, trigger creative collaboration and bring together a faster ROI for the enterprise. A shared knowledge pool enables easier access to key information from across the enterprise and facilitates faster decision making.

Enterprise Social Software enables access to a shared knowledge pool across the organization. Employees can share ideas, seek out expert opinion and arrive at solutions much faster. Social collaboration tools can truly unleash a profusion of creative ideas and thought across the organization and enable better problem solving.

Clearly the social network paradigm is a new concept which needs to be adopted by any organization that wants greater market share and a faster time to market. In today’s knowledge-intensive world, the need for an enterprise strategy focused on enabling collaboration through Web 2.0 becomes obvious.

However, enterprises that would like to embrace social technologies face the twin challenges of i) developing the application and ii) deploying it in their own data centers.

Enterprises would be faced with the typical “build-vs.-buy” quandary. Organizations that want to benefit quickly from Web 2.0 technologies would prefer a buy rather than a build option.

Besides, the deployment of a social computing platform would require the commissioning of large data centers to allow simultaneous access by the platform’s users. But the attendant problems of maintaining a large data center can be very intimidating. The top 3 challenges of large data centers typically center around:

a)      the problem of data growth
b)      the challenges of performance and scalability
c)      the sticky issue of network congestion and connectivity

It is against this backdrop of relevance of Social Software vis-à-vis the enterprises’ need for collaboration tools that Social Software as a Service (SSaaS) makes eminent sense.

If SSaaS could be provided as a service to enterprises with the option of either deploying it on a public or a private cloud it would make the service very attractive.

Enterprises would not have to go through the software development lifecycle of building the social collaboration tools, and would also be saved the upfront capital expenditure of creating the associated data centers. In addition, the enterprise would not have to face the technical challenges of maintaining those data centers.

Enterprises could either license the SSaaS tools only for the organization’s internal use among its employees or it could open it to its employees, suppliers and partners enabling a greater collaboration of ideas and thoughts.

The SSaaS and cloud service provider would charge the enterprise on a pay-per-use basis, based on the number of users and their usage of compute, storage and network resources.
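A pay-per-use bill of this kind is just metered usage multiplied by a unit rate, summed across resources. A minimal sketch, where the rates and usage figures are illustrative assumptions and not any provider’s actual price list:

```python
# Hypothetical per-unit rates for an SSaaS tenant (illustrative only).
RATES = {
    "compute_hours": 0.10,   # $ per server-hour
    "storage_gb":    0.05,   # $ per GB-month
    "network_gb":    0.08,   # $ per GB transferred
}

def monthly_bill(usage: dict) -> float:
    """Sum metered usage x unit rate across all billed resources."""
    return sum(RATES[resource] * qty for resource, qty in usage.items())

usage = {"compute_hours": 2000, "storage_gb": 500, "network_gb": 1000}
print(f"${monthly_bill(usage):.2f}")  # -> $305.00
```

Because the enterprise pays only for what its users actually consume, the model avoids the upfront capital outlay discussed above.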

An SSaaS service would be a win-win for both the service provider and also the enterprise which can tap the creative potential of its employees.

Social Software as a Service (SSaaS) will be extremely attractive as we move to a flatter and a more knowledge intensive world.
