Comment Let's not let his work be lost (Score 1) 1

A month of profound losses, indeed.

The contributions of Steve Jobs are well known, even by people who are not interested in computers.
Dennis Ritchie's work is well known within the computer industry, and Slashdot itself is an example of how pervasively influential C and Unix became.

John McCarthy's contributions might not be as well known; and that would be a shame.

In 1961, McCarthy suggested that computing could become a reliable, affordable utility service, like the supplies of water or electricity. Within a few years, this vision became the guiding inspiration of MIT's Project MAC, which joined with GE to create the first multi-user, time-sharing, real-time, interactive, secure, reliable computing utility: Multics.

Bell Labs later joined the project, but pulled out before Multics was complete. After Bell Labs ended their involvement, Multics was finished a few years later and became a commercial success. Its wealth of documentation about every aspect of the system's philosophy and implementation provided inspiration and guidance to other system designers. And, as had been the goal all along, the source code was fully published. Some Unix enthusiasts are ignorant of these parts of history, or misrepresent and denigrate them. That's a way to ignore some valuable lessons from the past.

Ritchie and his colleagues at Bell Labs were frustrated that Multics development was behind the projected timeline, and felt that excessively elaborate up-front specification may have been part of the problem. The name Unix is a pun on Multics. C and Unix are, famously, the classic examples of starting with a very small, "good enough," easy-to-understand, modular and expandable system. Many of the design goals and specific features of Multics were carried through into Unix.

If there had not been a Multics project to react to, Ritchie might never have had his frustrations turn into inspiration for his bottom-up, "small is beautiful" approach (or, if you prefer Richard Gabriel's wording, "worse is better").

Some of McCarthy's other ideas were more obvious successes. He brought a mathematical-logic approach to software engineering. He developed a collection of software axioms, purely for reasoning about software, which became implemented as the Lisp language. Lisp pioneered functional programming, garbage collection, consistent and minimal syntax, and self-modifying software. Although its origins precede C by more than a decade, we are still in the midst of an industry migration to bring these concepts into the mainstream.
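To make the migration concrete: the ideas credited to Lisp above, such as first-class functions and higher-order composition, are now ordinary in mainstream languages. A minimal sketch in Python (all names here are my own, purely illustrative):

```python
# Functional style that Lisp pioneered, sketched in Python.
from functools import reduce

def compose(f, g):
    """Return a new function that applies g, then f."""
    return lambda x: f(g(x))

# Functions are values: they can be built, stored, and passed around.
square = lambda n: n * n
increment = lambda n: n + 1
square_then_increment = compose(increment, square)

# Higher-order reduction, a staple of Lisp-family code.
total = reduce(lambda acc, n: acc + n, [1, 2, 3, 4], 0)

print(square_then_increment(3))  # prints 10
print(total)                     # prints 10
```

Garbage collection is likewise invisible here: the anonymous function objects are reclaimed automatically, another habit the industry absorbed from Lisp.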

I never met McCarthy, but I am saddened by his passing. A suitable tribute would be to carry forward his passion for fresh-thinking, forward-looking, scientifically-precise innovation in computing.

Submission + - John McCarthy, Pioneer of AI, Dies 1

Genrou writes: It is a sad month for computer science. After Jobs and Ritchie, John McCarthy died yesterday, October 23rd. He was a pioneer in the field of Artificial Intelligence and the creator of the LISP language.

Comment Just remembered concern # 4... (Score 1) 519

Will the company stick with the system? Facebook added and changed features through the years, but they never discarded the underlying platform. Google suddenly had their whole Wave and Buzz things and then just as suddenly abandoned them. Who can be expected to be around for the long haul?

Comment 3 concerns... (Score 1) 519

I saw three concerns from people who wanted to see if Google+ could be better than Facebook.

1. It's new and cool and has a buzz about it. Does that also mean easier to use, faster, and more fun?
2. Is it better at honoring privacy concerns?
3. Is it from a more accountable organization than Facebook?

The answer to all three questions: no, not particularly. Therefore, there is no compelling reason to abandon the existing enormous user base on Facebook.

Let's look a little closer at the privacy concerns. G+ "gave" with the introduction of Circles, groups of people who could be shown some updates but not others. G+ "took away" with talk of requiring government documents to authenticate the ID of all users.

Comment What he should have posted... (Score 1) 1452

I'm appalled, but not surprised. There are two possibilities here. Either Stallman is so socially incompetent that he does not realize how profoundly offensive his comments are, on so many levels, and he has nobody to inform him how to be considerate and gracious towards others; or he is aware of the offensiveness of his remarks, and does not give a damn about how petty, childish, trite, and irresponsible they show him to be, as he pisses away the opportunity of a lifetime to win support and positive regard for his movement.

Either possibility - the clueless lack of empathy, or the intentional hostility towards those who do not think identically to him - disqualifies him as a legitimate moral leader of anything, let alone of a revolution to change the world into following his ethical high ground.

I've been sympathetic to his cause for decades, but I've now had it with him. I would now no longer even be willing to join his parade to honor the local dogcatcher.

Some might say, if you criticize, let's see you do better. All right. Here is the statement Stallman should have made, as the official position of the FSF. I retain copyright to this statement, and explicitly forbid any use of my words to benefit the FSF.

What Stallman should have said:

Steve Jobs died at an age at which many expect, and receive, further decades of opportunity to make their mark on the world. Let us share our sympathy with his family, friends, and colleagues, as they mourn someone close and dear to them. Despite his life being cut short by tragic illness, Steve made a mark on the world that has profoundly affected and inspired millions, whether or not they are in the computer technology field. He combined his own ideas with many of the best, most original, creative ideas, discoveries and inventions of many others, starting with Steve Wozniak in the 1970s and continuing through to leadership of what became the world's highest-valued company.

Because he passionately felt certain about his visions, Jobs was relentless and sometimes confrontational in driving himself and others towards their fulfillment. As a consequence, many technological developments were commercialized, brought to market, and promoted in a way that appealed to millions of customers worldwide.

The original successes of Apple Computer were based on marvelous wonders of technical efficiency that were just starting to become widely known and widely affordable: more highly integrated computer chips, and more user-friendly software. Wozniak combined these in an ingenious way to make a little machine that delighted the Homebrew Computer Club. Woz continued these developments with a machine more accessible to the masses, the Apple II, complete with its own self-contained keyboard, case, power supply, and programming language.

A major part of this machine's success was that both hardware and software were completely documented and customizable. Hardware was available for others to customize through building accessory hardware that plugged into the open slots of the machine, without any need to pay a royalty fee, work around a patent, or send a portion of the revenue to Apple after signing a non-disclosure agreement. Software was also available to be understood and built upon. Hardware schematics and source code were both published as part of the standard package of manuals that came with the initial generations of the Apple II.

Jobs had the opportunity to learn about leading research being done in a large corporate setting, at Xerox, where many ingenious, visionary, inventive people integrated existing research ideas from industry and from higher education research into computers. The Xerox team then went beyond these past ideas to new concepts about how computers could be user friendly, fun, interactive, collaborative, and understandable.

A key ingredient of the Xerox research was complete publication of the source code, accessible to any user to read, modify, and extend. In this way, the Xerox researchers worked with some of the same spirit of collaborative discovery that made MIT's Artificial Intelligence Lab such an appealing, delightful place of teamwork. Both Smalltalk, at Xerox, and Lisp, at MIT, grew into large, dynamic systems that had no hidden, locked-down, private, exclusive areas limited to priests or kings of the data center. Both worked towards a computer literacy that included the ability for everyone, not just math and engineering nerds, to write their own software to suit their own needs.

These ideals are a tremendous part of the emotional appeal people feel as they reflect on the work of Steve Jobs. Unfortunately, Steve drifted away from some of these ideals in his later years. Without taking anything away from the significance of Steve's many successful achievements, we can also note that he made a few large and costly mistakes along his path. Commentators have already pointed out some of those goofs, such as the overheating issue of the Apple III and the infamous handwriting recognition of the Newton. My own focus here is software. What concerns me is that, in the rush to celebrate the good that Steve did, people might overlook a more subtle, yet perhaps more important mistake: his move away from the spirit of open, collaborative sharing that was behind his initial successes.

With the Lisa, and then the Mac, and on to today's phones and gadgets and tablets, Apple has increasingly locked down a portion of their system as reserved, holy, sacred, untouchable and unquestionable. Full blueprints and full source code are no longer available. The possibility of the user putting their own choice of software, at all levels down to the operating system, on the hardware that they purchased and own, is increasingly cut off. The ability to read the source code, learn from ingenious successes, and observe and work around mistakes, is gone. In addition, commerce using these machines now passes through a central and unaccountable gateway in the hands of a corporation whose highest legal purpose is to increase its own profits and market share, not to serve the commonwealth of all users and researchers.

Apple is not alone in this transition. We see it with Apple's main rivals, such as Microsoft, and even with Google's Android, alleged to be open source but with just as much lock-down at the core as in MacOS and iOS.

As we reflect on how Steve Jobs changed the world, I encourage, invite and exhort everyone to not discard some of the best ideas that amazed Steve and helped him succeed: free, open source software that can be redistributed, read, understood, modified, improved, shared, upgraded and replaced, for the benefit of all. Let us not lose this visionary ideal at the same time that we have lost the visionary.

Now, what if Stallman had published a manifesto such as this? Wouldn't it have been better than a snippy little fart of an insult that all of Apple's customers are "fools?"

iPhone

Submission + - Is global distribution behind iPhone 4S "demand"? (delimiter.com.au)

daria42 writes: Despite a disappointed response from pundits, it appears as though the iPhone 4S is the most hotly in-demand handset in Apple's history, with the company announcing overnight that more than a million pre-orders for the phone had been placed. But is Apple's stronger global distribution behind the increased stats? The iPhone 4S will launch in more countries, faster, than the iPhone 4, leading to speculation of less organic demand for the iPhone 4S, but broader geographical reach. Do the statistics lie?

Submission + - Standard Software Development Environments 3

sftwrdev97 writes: I have only been doing software development for about 5 years, and worked most of that time at one company. I recently switched to a new company and am amazed at the lack of technology used in their development process. In my previous position, we used continuous integration, unit testing, automated regression testing, and an industry-standard (not open source) version control system, and tried to keep up with the latest tools, Java releases, etc.
In the new position, there is no unit or regression testing, no continuous integration, compiled files are moved to the production environment basically by hand, and there is no version control on them. The tools we are using have been unsupported for 5-7 years, and we are still on an old Java release.
I am just wondering, since this is only my second job in the industry: is this the norm for most development environments? Or do most development environments try to keep up with technology, or just use whatever gets them by?
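For anyone at a shop like the poster's new one, the entry cost for unit testing is tiny. A minimal sketch using Python's standard unittest module (the apply_discount function is invented purely for illustration):

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTest(unittest.TestCase):
    def test_basic_discount(self):
        # 25% off 100.00 should be 75.00.
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_rejects_bad_percent(self):
        # Invalid input should raise, not silently return garbage.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Run locally with `python -m unittest`; a continuous integration server is, at its simplest, just a machine that runs that same command on every commit and complains when it fails.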

Submission + - Man with quadriplegia controls robot arm with mind (post-gazette.com)

awtbfb writes: Tim Hemmes, with the help of University of Pittsburgh researchers, successfully controlled a robot arm in three dimensions. He's had quadriplegia for seven years. The feat was accomplished using implanted ECoG electrodes and weeks of computer training. From the Pittsburgh Post-Gazette, "Ever since his accident, Mr. Hemmes said, he's had the goal of hugging his daughter Jaylei." Next up are six more 30-day participants, followed by a year-long study.
Medicine

Submission + - Stroke Victim Stranded at Amundsen-Scott Base

Hugh Pickens writes: "Renee-Nicole Douceur, the winter manager at the Amundsen-Scott research station at the South Pole, was sitting at her desk on August 27 when she suffered a stroke. “I looked at the screen and was like, ‘Oh my God, half the screen is missing.’ ” But both the National Science Foundation and contractor Raytheon say that it would be too dangerous to send a rescue plane to the South Pole now and that Douceur’s condition is not life-threatening. Douceur's niece Sydney Raines has set up a Web site that urges people to call officials at Raytheon and the National Science Foundation. However, temperatures must be higher than minus 50 degrees F for most planes to land at Amundsen-Scott or the fuel will turn to jelly, and while that threshold has been crossed at the South Pole recently, the temperature still regularly dips to 70 degrees below zero. “It’s like no other airfield in the U.S.,” says Ronnie Smith, a former Air Force navigator who has flown there about 300 times. A pilot landing a plane there in winter, when it is dark 24 hours a day, would be flying blind “because you can’t install lights under the ice." The most famous instance of a person being airlifted from the South Pole for medical reasons was that of Jerri Nielsen FitzGerald, a doctor who diagnosed and treated her own breast cancer and, using only ice and a local anesthetic, performed her own biopsy with the help of a resident welder. When she departed, on October 16, 1999, it was the earliest in the Antarctic spring that a plane had taken off."
Cloud

Submission + - Beware of the iCloud (networkworld.com)

mvar writes: Network World has an article about Apple's iCloud service claiming that, as experts say, it could be an IT security professional's nightmare.

  iCloud's functionality will be very tightly integrated with both Apple devices and third-party applications. For example, app developers could use the iCloud to store data such as high scores and in-game credits, without having to set up their own Web services. Users would be automatically signed in the minute they opened the app — no need to create new user accounts for each game or application.

Then there's this scenario. You're at your office Mac, working on a sensitive company document. Now there's a copy of the document automatically pushed to your iPad, which a family member borrowed and took to Starbucks. There's a copy on your home MacBook, which your teenager is using. Oh, and there's a copy on your iPhone, which you just left in a cab. iCloud raises serious questions in terms of what Apple plans to do to deliver a secure experience, and what enterprises need to do to protect sensitive corporate data.

Submission + - Brown vetoes citizen rights (wired.com)

kodiaktau writes: In probably the most important decision Gov. Brown of California will make this year, he has vetoed the bill that would have required officers to get a search warrant before searching the cellular phones of arrested citizens. The veto further enables the police to carry out warrantless searches of private property, extending into contacts, email, photos, banking activity, GPS, and other functions that are controlled by modern phones.
Privacy

Submission + - Illegal to take a photo in a shopping centre? (bbc.co.uk) 3

Kyrall writes: A man was questioned by security guards and then police after taking a photo of his own child in a shopping centre.

The centre apparently has a 'no photography' policy "to protect the privacy of staff and shoppers and to have a legitimate opportunity to challenge suspicious behaviour"

He was told by a security guard that taking a photo was illegal. He also said that a police officer claimed "he was within his rights to confiscate the mobile phone on which the photos were taken".

Businesses

Submission + - HP should sell its PC business to save it (arstechnica.com)

packetrat writes: Hewlett-Packard may not be in danger as a company, but its future in the PC business is in doubt, thanks to former CEO Leo Apotheker's maneuvers to turn HP into IBM. This article at Ars says that Meg Whitman should go ahead and sell off the PC business, mostly because HP's management is so inept that the PC division would likely do better without them. Agilent seems to be doing okay since it was spun off in 1999, but HP may have spun off its soul in the process.
United Kingdom

Submission + - Inside the London 2012 Olympics IT Control Centre (itpro.co.uk)

twoheadedboy writes: "With the London Olympics less than a year away, IT preparations are fully underway. IT Pro got to have a poke around the testing facilities as well as the shiny Technology Operations Centre. It's an IT project like no other, and not just because of the technological tasks. The seriously strict deadlines and immovable budgets have made this a truly strenuous initiative — one that we'll be able to judge by this time next year."
