GREEN-IT MODULE NOTES FOR CSE AND ISE ENGINEERS


Green ICT: History, Agenda, and Challenges Ahead

Industrial Revolution

The Industrial Revolution also fundamentally changed Earth’s ecology and humans’ relationship with their environment. One of the most immediate and drastic repercussions of the Industrial Revolution was the explosive growth of the world’s population. According to Eric McLamb (2011), with the dawn of the Industrial Revolution in the mid-1700s, the world’s population grew by about 57% to 700 million; it reached 1 billion by 1800 and, within another 100 years, grew to around 1.6 billion. A hundred years later, the human population surpassed the 6 billion mark. This phenomenal growth in population put enormous pressure on the planet, forcing it to cope with a continuously expanding deficit of resources.

Causes

The transformation from cottage industry and agricultural production to mass factory-based production led to the depletion of certain natural resources, large-scale deforestation, the depletion of gas and oil reserves, and the ever-growing problem of carbon emissions, mainly the result of our reckless use of fossil fuels and their secondary products. The pollution that followed the Industrial Revolution damaged the planet’s ozone layer and polluted its air, land, and water. The two world wars of the early twentieth century brought with them catastrophic human and natural disasters as well as the rapid development of military technologies. These developments laid the groundwork for the emergence of what some like to call the Second Industrial Revolution.

The Second Industrial Revolution: The Emergence of Information and Communication Technologies

In a visionary paper entitled “As We May Think” (published in The Atlantic Monthly, July 1945), Vannevar Bush, the scientific adviser to President Franklin D. Roosevelt, predicted a “bold” and exciting future world based on “memory extended” machines. In 1946, the first of a new generation of computers emerged from US military research. Financed by the United States Army Ordnance Corps, Research and Development Command, the Electronic Numerical Integrator and Computer (ENIAC) was announced and nicknamed the “giant brain” by the press.

In reality, ENIAC, compared with the smartphones of today, had very limited functionality and capabilities. According to Martin Weik (December 1955), ENIAC contained 17,468 vacuum tubes, 7200 crystal diodes, 1500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints. It weighed more than 27 tons, took up roughly 1800 ft² (167 m²), and consumed 150 kW of power. This led to the rumor that whenever the computer was switched on, the lights in Philadelphia dimmed.

In fact, all ENIAC could do was to “discriminate the sign of a number, compare quantities for equality, add, subtract, multiply, divide, and extract square roots. ENIAC stored a maximum of twenty 10-digit decimal numbers. Its accumulators combined the functions of an adding machine and storage unit. No central memory unit existed per se. Storage was localized within the functioning units of the computer” (Weik, 1961).

The Integrated Circuit (IC) Revolution

Although there was much excitement about the development of ENIAC, its sheer size and physical requirements, coupled with the limited functionality it offered for such a huge cost, meant that the use of such systems was limited to large, mainly military-based research labs. At the same time that ENIAC was being developed and launched, there were other significant developments in the field of information technology.

Werner Jacobi, a German scientist working with Siemens AG, filed a patent for cheap hearing aids based on an “integrated-circuit-like semiconductor amplifying device.” Further development of a similar idea was pioneered by Geoffrey Dummer, working for the British Royal Radar Establishment. Dummer used the 1952 Symposium on Progress in Quality Electronic Components to present his revolutionary idea but was unfortunately unsuccessful in realizing his vision and building such a circuit.

It was not until 1958 that a young scientist working for Texas Instruments, Jack Kilby, came up with a solution for a true IC and successfully demonstrated a working version of it on September 12, 1958. The idea of being able to create much smaller systems, coupled with computer applications developed to offer greater functionality, soon changed the face of computing and led to the march of the “machines.” Jack Kilby won the 2000 Nobel Prize in Physics for his part in the invention of the IC, and his revolutionary work on the development of the IC was named an IEEE Milestone in 2009.

New Age of Computer Technology

According to the Historical Museum of Computers online, there were only around 250 computers in use in the world in the 1950s. The development of the IC and the microprocessor manufacturing industry, and the emergence of corporations such as Intel, meant that the world of computer technology changed significantly. Intel’s co-founder, Gordon Moore, observed that as the hardware industry grew rapidly, the number of transistors in a dense IC doubled approximately every two years. This estimate proved to be accurate, and by the 1980s more than 1 million computers were in use throughout the world. According to Gartner Inc., the number of installed PCs worldwide has surpassed 1 billion units, and the worldwide installed base of PCs is estimated to be growing at just under 12% annually. At this pace, it has been estimated that the number of installed PCs will surpass 2 billion units by early 2014. To this staggering figure one must add the number of tablet and smartphone devices to understand the scale of this new technological age and its potential environmental cost.
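As a rough check on the growth figures quoted above, the short Python sketch below compounds Moore’s two-year doubling period and the just-under-12% annual growth of the installed PC base; the starting values are illustrative assumptions, apart from the roughly 1 billion installed PCs and 12% growth rate the text cites.

# Back-of-envelope check of the growth figures quoted above (Python).
# Starting values are illustrative assumptions, except the ~1 billion
# installed PCs and ~12% annual growth cited in the text.

def moores_law(transistors_now: float, years: float) -> float:
    """Transistor count after `years`, doubling roughly every two years."""
    return transistors_now * 2 ** (years / 2)

def installed_pcs(base_units: float, annual_growth: float, years: int) -> float:
    """Installed PC base after `years` of compound annual growth."""
    return base_units * (1 + annual_growth) ** years

if __name__ == "__main__":
    # Doubling every two years multiplies a transistor count ~32x per decade.
    print(moores_law(1.0, 10))  # -> 32.0
    # ~12% annual growth takes 1 billion installed PCs past 2 billion in ~6-7 years.
    for year in range(8):
        print(year, round(installed_pcs(1e9, 0.12, year) / 1e9, 2), "billion PCs")

On these assumptions, just-under-12% compound growth takes an installed base of 1 billion units past 2 billion in roughly six to seven years, which is broadly consistent with the “2 billion by early 2014” estimate quoted above.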

It is important to note that it is not just the scale of hardware development and the emergence of ever more powerful PCs that have changed our way of life, but also the phenomenal development of the applications available on our PCs and mobile devices. I recall that in early 1992 I had my first taste of using the Internet and managed to log in to the Library of Congress information page, which showed me its opening and closing times. This was incredibly impressive, and I was thrilled that I could connect to the other side of the Atlantic at such speed; nevertheless, sitting in Kingston University’s research lab, I also wondered how useful this information was to me if I could not access the rich content that the library had to offer. But it was only a matter of time before this changed and the doors were opened to the “information superhighway.” The clumsy text-based interface I had used was replaced by a much more interesting and exciting Mosaic (graphics-capable) interface, then the wonderful World Wide Web (WWW) arrived, and the rest, as they say, is history. I do not think many of us living through the rapid development of this exciting yet unknown technology could have imagined that within a couple of decades our lives would be so dominated by it, with frightening and somewhat crippling effects.

In reality, we have reached a stage where estimating the size and growth of the Internet and the WWW is extremely difficult, if not impossible. Nevertheless, one can find some terrifying figures online. According to www.factshunt.com, the size of the Internet and the WWW is something like:
• 48 billion webpages indexed by Google Inc.
• 14 billion webpages indexed by Microsoft’s Bing
• 672 exabytes (672,000,000,000 gigabytes) of accessible data
• 43,639 petabytes of total worldwide Internet traffic (2013)
• More than 900,000 servers owned by Google Inc. (the largest in the world)
• More than 1 yottabyte of total data stored on the Internet (includes almost everything; 1 yotta = 1000^8)
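The prefixes in the list above each differ by factors of 1000; a minimal Python sketch, using only the figures quoted above, keeps the unit conversions straight.

# Decimal (SI) data-unit sizes, in bytes: each prefix step is a factor of 1000.
GB = 1000 ** 3   # gigabyte
PB = 1000 ** 5   # petabyte
EB = 1000 ** 6   # exabyte
YB = 1000 ** 8   # yottabyte ("1 yotta = 1000^8")

print(672 * EB // GB)    # 672 exabytes = 672,000,000,000 gigabytes, as quoted
print(43_639 * PB / EB)  # 2013 worldwide traffic: ~43.6 exabytes
print(YB // EB)          # 1 yottabyte = 1,000,000 exabytes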

Global Mobile Computing and Its Environmental Impact

Clearly, the next question one has to ask is what the overall cost and energy footprint of the new digital age are. Although there have been a number of attempts to identify a definitive cost model for the new digital age, the scale and complexity of such calculations and the constantly changing data set mean that the best we can come up with are intelligent guesses rather than definitive answers.

To answer the question of what the energy footprint of the new technology age is, we must consider the electricity consumed by the world’s laptops, desktops, tablets, smartphones, servers, routers, and other networking equipment (Gills, 2011). The energy required to manufacture these machines also needs to be included, as, more importantly, does the energy required to keep such a wide range of devices and systems operational (smartphones, for example, typically require charging on a daily basis).
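As an illustration of how quickly the operational side alone adds up, the sketch below estimates the electricity used to charge the world’s smartphones daily; the battery capacity, charger efficiency, and device count are assumptions made purely for the example, not figures from the text.

# Rough estimate of daily smartphone charging (all inputs are assumptions).
battery_wh = 11.0        # assumed typical smartphone battery capacity (~11 Wh)
charge_efficiency = 0.8  # assumed ~20% loss in charging and conversion
devices = 2e9            # assumed number of smartphones charged daily

per_device_kwh_per_year = (battery_wh / charge_efficiency) * 365 / 1000
total_twh_per_year = per_device_kwh_per_year * devices / 1e9

print(f"~{per_device_kwh_per_year:.1f} kWh per device per year")
print(f"~{total_twh_per_year:.0f} TWh per year across all devices")

On these assumptions the total is on the order of 10 TWh per year, a small fraction of the ecosystem-wide figure quoted below, which underlines the point that charging the handset is only one component of the footprint; manufacturing and the network and data-center infrastructure behind it must also be counted.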

Another key issue considered by researchers is the enormous volume of data transferred across our global network of mobile devices and the cost associated with maintaining and transferring these data. In an interesting report sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity, produced in August 2013, Mark Mills (CEO of the Digital Power Group) states: “The information economy is a blue whale economy with its energy uses mostly out of sight. Based on a mid-range estimate, the world’s Information Communications Technologies (ICT) ecosystem uses about 1500 TWh of electricity annually, equal to all the electric generation of Japan and Germany combined, and as much electricity as was used for global illumination in 1985. The ICT ecosystem now approaches 10% of world electricity generation. Or, in other energy terms, the zettabyte era already uses about 50% more energy than global aviation.”
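Using only the numbers in the quote, a quick Python calculation makes the implied comparisons explicit:

# Arithmetic implied by the Mills quote above (no external data used).
ict_twh = 1500                   # quoted annual ICT ecosystem consumption, in TWh
world_share = 0.10               # "approaches 10% of world electricity generation"
aviation_ratio = 1.5             # "about 50% more energy than global aviation"

print(ict_twh / world_share)     # implied world electricity generation: ~15,000 TWh
print(ict_twh / aviation_ratio)  # implied global aviation energy use: ~1,000 TWh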

The ever-growing demand for data as a commodity that rules our lives, along with the culture of “never switching off” and “contact on demand” and the requirement to keep powered the devices necessary to sustain this “new digital age,” has brought with it further serious challenges that require immediate attention. Cloud computing was an incredibly promising concept that was perhaps hyped beyond reasonable measure to convince consumers to migrate en masse. The matter of cost was never discussed or scrutinized as a factor before multinational corporations “migrated” us into the cloud. The US Environmental Protection Agency estimates that centralized computing infrastructures (data centers) currently use 7 GW of electricity during peak periods, which translates to about 61 billion kilowatt hours of electricity used per year. As already indicated, we are moving into the age of zetta and yotta data (1000^7 and 1000^8 bytes), but pressure from environmentally conscious consumers has forced computer and Internet giants such as Microsoft, Google, and Yahoo to build their new data centers on the Columbia River, where there is access to both hydroelectric power and a ready-made source of cooling.
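The two EPA figures quoted above line up under simple unit arithmetic: a 7 GW load sustained for a full year corresponds to roughly 61 billion kWh. A minimal check (assuming, as a simplification, that the peak load runs year-round):

# 7 GW of demand sustained over one year, expressed in kilowatt hours.
peak_gw = 7
hours_per_year = 24 * 365                   # 8760 hours

kwh = peak_gw * 1_000_000 * hours_per_year  # 1 GW = 1,000,000 kW
print(f"{kwh / 1e9:.1f} billion kWh")       # ~61.3 billion kWh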

The Agenda and Challenges Ahead

What are the key agenda items for the “Green Information and Communication Technology” debate, and what are the key challenges facing researchers and developers in this area? While there is a focus on big infrastructure and big data computing, the debate quite often tends to overlook the large number of existing “legacy” systems, which are neither green nor efficient. Even our current laptops can be viewed as “old legacy” systems. While the statistics show that there are around 2 billion PCs in operation currently, most of these systems suffer from “old” (i.e., power-hungry) hardware designs. It is not surprising that manufacturers such as Intel have spent billions of dollars designing the next generation of microprocessors (“Haswell”) and moving toward fanless, less power-hungry systems. Another key challenge is how we measure ICT performance and sustainability and what tools we can use to provide reliable data. It is clear that unless we have a reliable methodology to measure ICT sustainability, we cannot meaningfully judge whether our systems are actually becoming greener.

One of the most critical challenges facing green Information Technology is the legal framework within which system developers and providers need to work. Although the European Union recently (January 2015) introduced rules that oblige new devices such as modems and internet-connected televisions to switch themselves off when not in use, we are still far from having robust sets of enforceable regulations for green IT at the national or international level. While cloud computing was hailed as the “green way” of moving away from device-dependent, clunky, power-hungry applications and data storage approaches, and although it can be claimed that the use of virtualized resources saves energy, a typical cloud data center still consumes an enormous amount of energy. Also, because the cloud comprises many hardware and software elements placed in a distributed fashion, it is very difficult to precisely identify a single area for energy optimization.

Some of the most interesting areas of research in green Information Technology concern “energy harvesting” (or “energy scavenging”) and the concept of the Internet of Things (IoT). Energy harvesting explores how we can take advantage of the various ambient power sources scattered all around us. The Internet of Things encompasses a series of technologies that enable machine-to-machine communication and machine-to-human interaction via internet protocols. Being able to use the World Wide Web and the global connectivity of billions of machines, smartphones, and tablets to monitor and control energy usage can have a tremendous impact on developing a much more environmentally friendly and sustainable computing world.
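As a concrete, deliberately simplified illustration of the machine-to-machine monitoring idea described above, the Python sketch below polls a hypothetical smart plug over HTTP and switches it off once the attached device appears idle; the endpoint URL, JSON field, and 5 W threshold are all invented for the example and do not correspond to any particular product or API.

# Hypothetical IoT energy-monitoring loop: poll a smart plug's power reading
# over HTTP and cut power when the attached device looks idle.
# The endpoint, JSON schema, and threshold below are illustrative assumptions.
import json
import time
import urllib.request

PLUG_URL = "http://192.168.1.50/api/power"  # hypothetical smart-plug endpoint
IDLE_WATTS = 5.0                            # assumed standby-power threshold

def read_power_watts() -> float:
    """Fetch the plug's current power draw, assumed to be returned as {"watts": <number>}."""
    with urllib.request.urlopen(PLUG_URL, timeout=5) as resp:
        return float(json.load(resp)["watts"])

def switch_off() -> None:
    """Ask the plug to cut power (hypothetical control endpoint)."""
    req = urllib.request.Request(PLUG_URL.replace("/power", "/off"), method="POST")
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    while True:
        if read_power_watts() < IDLE_WATTS:
            switch_off()      # device is idle: stop paying for standby power
            break
        time.sleep(60)        # re-check once a minute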

Figure 2. In 1962, four women programmers hold parts of the first four Army computers. From left are the ENIAC board, the EDVAC board, the ORDVAC board, and the BRLESC-I board. Image courtesy of the US Army.