User Journal

Journal Journal: Is Cloud computing catching on?

Using computers to gather, organize, and present information in a variety of ways is something software developers have worked on since the invention of the modern computer. In the early days, the focus was on making computers easier to program. We started by programming with buttons and switches, toggling settings that generated a 1 or a 0 (representing a change in voltage), and using those values to "program" various kinds of information into the machine. From there we moved to various forms of "Assembly Language" - sets of mnemonic machine instructions standing in for those numbers and switches. We needed to move far beyond that, and a rapid expansion of programming languages followed, each making it easier to "tell" the computer what to do.

All of that was useful, but in limited ways. As computers grew more powerful, it became useful to exchange information between them, and the earliest networks were created. That, of course, created the need for additional, faster, more capable networks. As networks grew in speed and function, the ability to transfer files between them was developed, and then the first of many important changes took place. The ability to send notes, files, letters, and other information evolved into what we call Electronic Mail, or email. News groups also developed, allowing people to share information not only one-to-one or one-to-many, but many-to-many as well. Even that was not enough. When the World Wide Web (WWW) was created, it further exploded the use of the network, email, news groups, and other forms of communication. Instant messaging also became popular.

By this time, it was clear that using networks of computers to communicate - both machine to machine and person to person - could create a whole new range of possibilities. More than once, the creation of "network computer" systems was attempted. Many early attempts failed or had only limited success, but even those assisted in the development of later technologies, as we learned what worked, what could work, and what failed to work as planned.

The term "Cloud" was used in the eighties, often referring to one of the early types of networks: packet-switched networks based on the X.25 protocol, an international standard defined by the CCITT (now the ITU-T), whose diagrams traditionally drew the network itself as a cloud. The term, therefore, is not a new one in the context of network computing. It has become one of the most popular buzzwords in the industry as developers and businesses look for more effective ways to utilize and capitalize on the available technology.

A question worth asking, then, is whether Cloud Computing is really catching on: is it just today's popular buzzword, or does it offer something tangible and useful, even in its current state of development?

I would argue that Cloud Computing is certainly not as mature as we would like, and it is not the answer to every possible issue or problem, but it is a technology that promises to deliver a greater level of network access to an increasingly mobile-centric, data-hungry economy. If you question that premise, just look at the number of people carrying some kind of device nearly everywhere they go - a cell phone, a smart phone, an iPod, an iPad, a GPS; turn around and someone has one. At church, they now have to announce each week that electronic devices should be muted or set to vibrate so they do not interrupt the service. Could you have imagined such a thing even a decade ago?

As for me, a decade ago I did not even seriously consider having a cell phone, and though I had a laptop computer, I never carried it with me. I am still somewhat resistant to carrying electronic devices wherever I go, though I frequently do. At work they are an absolute must, but even in everyday life there are people who want to be able to reach me. My resistance comes from the fact that sometimes I do not want ANYONE to be able to reach me; I want peace and quiet. That is one thing to consider in this information age: how much information is TOO MUCH? Still, given the overall thirst for information and entertainment, who can argue that a Cloud Computing model - one where not just email and Web browsing but all kinds of information and entertainment are available at any time, in any location - is anything but a hot topic, and a reality that nearly everyone has to face?

Is Cloud Computing an idea whose time has come? I say it is. The thirst for information and entertainment, and a desire for constant change, are among the factors driving it. Fast, broadband mobile networks are making it possible, and small, fast, inexpensive equipment is calling for it.

Google, a company that makes its money by selling information, is betting on it. They have every reason to do whatever they can to advance the cause - it certainly benefits their bottom line. To me, though, the responsibility rests with each person. Use this information wisely; don't allow it to USE YOU!

User Journal

Journal Journal: Reviewing the latest speed wars in Web Browsers

Google raised quite a stir when they claimed to have the fastest Web browser, and Apple did the same when they claimed that title for Safari. Clearly we can't have TWO "fastest Web browsers". Where do things stand today?
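For context on what these "fastest" claims usually measure: benchmark suites such as SunSpider and Google's V8 suite time tight JavaScript workloads and compare elapsed times across engines. Here is a minimal sketch of the idea (the workload and iteration count are arbitrary illustrations, not taken from any real suite):

```typescript
// Minimal sketch of a JavaScript micro-benchmark: time a tight
// computational loop. Real suites (SunSpider, V8 benchmarks) run many
// varied workloads and combine the results into a score.
function timeWorkload(iterations: number): number {
  const start = Date.now();
  let sum = 0;
  for (let i = 0; i < iterations; i++) {
    sum += Math.sqrt(i); // arbitrary arithmetic workload
  }
  // Reference `sum` so the engine cannot discard the loop entirely.
  if (sum < 0) throw new Error("unreachable");
  return Date.now() - start; // elapsed milliseconds
}

console.log(`10 million iterations: ${timeWorkload(10_000_000)} ms`);
```

A faster engine finishes the same workload in fewer milliseconds; that difference, aggregated over many workloads, is essentially what the competing "fastest browser" claims are about.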

Well, the good news in all of this is that browser developers are taking a closer look at their browsers' resource utilization. For many years, as Web browsers added more and more features, they became more and more bloated.

I've been around a long time, and I can vividly remember when GNU Emacs was soundly criticized for being so large and using so much memory. Those complaints went away a long time ago. Before we even had 1 GHz desktop systems, I found that GNU Emacs would load in just a few seconds on a machine with a 200 MHz processor and 32 MB of memory, and it's even better today. The typical Web browser is four or five times larger just to download, and the virtual memory it uses can easily exceed, by a factor of ten, what even a heavily loaded Emacs would consume.

Where does that leave the typical Web browser, then? Two years ago, Web browsers were probably near their all-time low in efficiency and performance. Yes, they were offering more and more features, but the cost of those features was becoming prohibitive.

When Apple's Safari came on the scene, Apple took the KHTML browsing engine from Konqueror, the file manager and Web browser that is part of the KDE project. They released their improvements as a Web rendering engine called WebKit.

A little over two years ago, Google took that technology and used it as the basis of their own browser project, calling the browser Chrome and the open source project Chromium. They made numerous performance improvements on top of what Apple had done, and the results were staggering.

When Google made Chrome available, it shocked many people with how much faster it was, especially on Google's own sites. At first the technology was immature, and features that people commonly used were noticeably missing. Many of those features are now included, yet Chrome still maintains a deliberately modest appearance. The approach has worked: Chrome has moved into the number three spot behind Internet Explorer and Firefox in several usage reports.

This competition has proven very helpful. Microsoft has done a lot to improve Internet Explorer's reliability, performance, security, and overall usefulness. It is so much better in Versions 8 and 9 that it is hard to fathom why so many people still use Version 6 (from XP) and Version 7 (from Vista).

Mozilla, which has lost some ground to Chrome, has been vigorously working to improve performance on several fronts. They had already been developing a new generation of browser technology intended to be portable to various handheld devices, and much of what they learned there has been applied to the Firefox project. Version 3.6.6, the current release, is already much improved, but just wait until you see Version 4.0. Its nightly builds, code-named "Minefield" because they undergo nightly build testing, are nevertheless in their second test release, and we ought to see 4.0 released later this year. I've tested it a number of times, and it looks really good.

I already wrote the other day about another Mozilla Labs project, Mozilla Prism, which provides Web applications. That work is also bringing new ideas to the table, and it has been effective.

Opera, which has been around a very long time, was once known as the fastest browser, but it had become large and sloppy, much like the others. Version 10.60, recently released, includes many positive changes that reverse that trend; like the other browsers, it is much faster than its predecessor.

What Google has done to awaken the industry has been a positive thing. It's worth giving Google Chrome a try, and it's also worth investigating the test versions of the other new browsers. If you can spare the time to periodically test a few of them and provide feedback and defect reports, it definitely helps improve the browser landscape. I try to do so at least periodically.
