
Comment Re:The programming language for the next 20 years. (Score 3, Insightful) 315

Entire operating systems are written in C -- as they should be.

But C is a low level language. Not the best tool for writing applications.

Higher level languages and managed runtimes have gained so much traction for a reason: they are very productive to use. They protect you from simple mistakes. They relieve the burden of memory management. GC simplifies library APIs by making the question of who should dispose of what irrelevant. We could still be programming in assembly language instead of C. Why aren't we? Why aren't OSes written in assembly? Because C is more productive and higher level. Similarly, there are languages higher level than C, and they have their place. C is not the be-all and end-all of abstraction.

Comment Re:So much Fail. Ignore. (Score 4, Insightful) 315

So much fail about Garbage Collection.

GC is not about forgetting to free memory. It's about a higher level of abstraction removing the need for the programmer to do bookkeeping that the machine can do. Why don't we still program in assembler? Because it's less productive. GC is likewise about productivity. As data structures become extremely complex and get modified over time, keeping track of who owns what and who is supposed to dispose of it becomes difficult to impossible, and is a source of memory leak bugs. In a complex enough program, you end up reinventing a poor GC when you could have used one that is the product of decades of research.

The article fails to understand that you can also run out of memory in a program that uses GC. Just keep allocating and keep references to everything you allocate.
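To make that concrete, here is a minimal Java sketch (the class and field names are just illustrative) that exhausts the heap despite running under a GC, because every allocation stays reachable:

    import java.util.ArrayList;
    import java.util.List;

    public class LeakDemo {
        // The static list keeps every allocation reachable,
        // so the GC is never allowed to reclaim any of it.
        private static final List<byte[]> retained = new ArrayList<>();

        public static void main(String[] args) {
            while (true) {
                retained.add(new byte[1024 * 1024]); // 1 MB per iteration
            }
            // Eventually dies with java.lang.OutOfMemoryError: Java heap space
        }
    }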

Reference Counting is not real GC. Cyclical data structures will never get freed using reference counting.
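If it helps to see why, here is a hand-rolled reference-counting sketch in Java (RcNode, retain, and release are made-up names for this example, not any real library) where a two-node cycle keeps both counts above zero forever:

    public class RcNode {
        int refCount = 0;
        RcNode next;

        void retain() { refCount++; }

        void release() {
            if (--refCount == 0) {
                System.out.println("freed"); // a real system would free memory here
                if (next != null) next.release();
            }
        }

        public static void main(String[] args) {
            RcNode a = new RcNode();
            RcNode b = new RcNode();
            a.retain(); b.retain();   // the two local references
            a.next = b; b.retain();   // a references b
            b.next = a; a.retain();   // b references a: now a cycle

            a.release(); b.release(); // drop the local references
            // Each node still holds the other, so both counts sit at 1
            // and neither is ever freed. Prints "1 1":
            System.out.println(a.refCount + " " + b.refCount);
        }
    }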

One of the major but under-recognized benefits of GC, which the article fails to mention, is that GC allows much simpler 'contracts' in APIs. Memory management is no longer part of the 'contract' of an API. It doesn't matter which library or function created an object; nobody needs to worry about who is responsible for disposing of it. When nobody references the object any more, the GC can gobble it up.
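As a tiny illustration (the Tokenizer class below is made up for this example), notice how nothing in the method's contract says anything about disposal; in C, the equivalent documentation would have to spell out who must free the returned array and each element, and when:

    public class Tokenizer {
        // The caller gets the result with no corresponding "free" obligation.
        // The contract is just: pass a string in, get tokens out.
        public static String[] tokenize(String input) {
            return input.trim().split("\\s+");
        }

        public static void main(String[] args) {
            String[] tokens = tokenize("  who  frees  this  ");
            for (String t : tokens) {
                System.out.println(t);
            }
            // No cleanup code: the array and its strings become garbage
            // whenever the last reference to them disappears.
        }
    }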

On the subject of virtual machines, the article could have mentioned some of the highly aggressive compilation techniques used in JIT compilers. Every ordinary instance method in Java is a virtual call. But a JIT compiler knows when only one loaded class implements a particular method, and it can make every call to that method non-virtual. If another subclass is loaded (or dynamically created on the fly), the JIT can recompile all methods that call the method so that they make virtual calls again. Even then, the JIT may be able to prove that certain call sites always and only see a specific subclass, so those can remain non-virtual.
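A rough sketch of the situation (Shape, Circle, and Devirt are illustrative names, not from any particular codebase): while Circle is the only loaded implementor of area(), HotSpot can turn s.area() into a direct call and usually inline it; loading a second subclass later triggers deoptimization and recompilation with real virtual dispatch.

    abstract class Shape {
        abstract double area();
    }

    class Circle extends Shape {
        final double r;
        Circle(double r) { this.r = r; }
        @Override double area() { return Math.PI * r * r; }
    }

    public class Devirt {
        static double total(Shape[] shapes) {
            double sum = 0;
            for (Shape s : shapes) {
                sum += s.area(); // virtual in the bytecode; direct (and
                                 // likely inlined) once the JIT sees that
                                 // only Circle is loaded
            }
            return sum;
        }

        public static void main(String[] args) {
            Shape[] shapes = new Shape[100_000];
            for (int i = 0; i < shapes.length; i++) {
                shapes[i] = new Circle(i);
            }
            double sum = 0;
            for (int i = 0; i < 100; i++) {
                sum = total(shapes); // enough iterations to warm up the JIT
            }
            System.out.println(sum);
        }
    }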

The JIT compiler in a JVM can aggressively inline small methods. But if a class gets reloaded on the fly such that the body of an inlined method changes, the JIT knows to recompile every other method that inlined the changed method. Depending on the new body, it may or may not still make sense to inline it, so the inlining decision can change based on actual need.
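The classic candidates are tiny accessors like the one below (Point is an illustrative name); at a hot call site the JIT typically replaces the getX() call with the field load itself, and if Point were redefined on the fly, every compiled caller that had inlined it would be thrown away and recompiled:

    class Point {
        private final int x;
        Point(int x) { this.x = x; }
        int getX() { return x; } // hot callers typically get the bare
                                 // field load, not a call
    }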

The HotSpot JVM dynamically profiles code and doesn't waste time and memory compiling methods that have no significant effect on the system's overall performance. The profile depends on factors that differ from system to system and could never be predicted in advance by a static compiler. The JIT can also compile your method using instructions that happen to exist on the current microprocessor at runtime, something a static compiler could not assume in advance.
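If you want to watch this happen on a HotSpot JVM, there are standard flags for it (the exact output format varies by JVM version; Devirt is the illustrative class from above):

    # Log which methods HotSpot decides are hot enough to compile:
    java -XX:+PrintCompilation Devirt

    # Show the JIT's inlining decisions (diagnostic flags):
    java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining Devirt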

All of this may seem very complex, but it's why big Java systems run so darn fast. Not many runtimes can handle tens or even hundreds of gigabytes (yes, GB) of heap with GC pause times of 10 ms. Yes, it may need six times the memory, but for the overall benefit in speed, memory is cheap.

Comment What about a mutual spying arrangement? (Score 1) 109

If we pay you to spy on our citizens, because we're not allowed to do it ourselves, then will you pay us to spy on your citizens because you cannot do that yourself?

It's good for the global economy because money changes hands. (Never mind that no actual goods or benefits to society are produced.)

Everyone is happy. (Never mind the citizens in the global police state.)

Comment Re: Not usable (Score 3, Insightful) 125

And Microsoft seems to just love, and perhaps even encourage, this particular confusion. In fact, their poor branding seems deliberately designed to cause confusion, leading people to buy an RT device and then discover that it doesn't really run Windows apps. It only runs the tiny library of Windows RT apps.

Comment Re:Miami Vice (Score 1) 125

Your own argument about western culture arriving slightly delayed in the developing world should have caused you to conclude that the developing world would think something like:
1. If westerners aren't buying this Windows 8 crap, then why are they sending it to us?
2. If westerners are using non-Windows tablets, then we should be too (but perhaps just a bit delayed)

Yes, it's hammer time. For Microsoft. And it's about time.

Comment Re:Who couldn't see this coming? (Score 5, Informative) 300

> And yet, they are still making gobs of money. In fact, they are more profitable than ever.

I remember in the late 1970s when IBM people were laughing at these 'toy' microcomputers. HA HA! Those toys will never be like real computers. Certainly not a threat to IBM, which is making gobs of money. In fact, IBM is more profitable than ever.

IBM introduced its PC in 1981, thinking they might sell up to two million. By the mid-1990s IBM had lost the PC market, abandoned the PS/2 attempt to re-monopolize it, and eventually got out of the PC business completely. Before the end of the 1990s, IBM had reinvented itself. Think the same thing won't happen to Microsoft? You may be too young to remember, but in the 1980s, even in the late 1980s, it was completely laughable to suggest that IBM might fall on hard times. But it happened. And just a few years ago it was laughable to suggest that Microsoft might lose its industry dominance. Not so laughable anymore.


> Moves like this don't really help anything.. not even the bottom line, since the massive cuts crush morale and limit the ability of the company to innovate to keep ahead of the competition.

Moves far more radical than this may be the only way Microsoft stays around in the long term. We'll see what Microsoft looks like in a decade.

Comment Re:Who couldn't see this coming? (Score 1) 300

You lack faith in Microsoft, sir.

Windows 8 is going wonderfully! Everyone loves the new UI. That is why Windows 9 will never bring back the Start menu. Just watch, you'll see!

PC sales are not being affected by the new mobile device trend. Microsoft will dominate mobile devices and phones any day now! You'll see!

Oh, and some day Microsoft will make a dent in data centers, and Microsoft Azure Cloud will become important. Of course, Azure Cloud is the only cloud service that was built for Windows instead of Linux workloads. And Windows is used by some large* computing cluster users, um, somewhere. And businesses using Linux workloads would be happy to trust their business built on Linux to Microsoft, a company that to this very day is working to destroy Linux.

(*and by large, I mean much larger scale than you are thinking if you are thinking of Windows)

Comment Re:Cry Me A River (Score 2) 608

I learned BASIC in 1977, about the same way, and about as quickly.

And I was writing a few BASIC programs shortly thereafter. But they were what I would today call TRIVIAL: things I would now do in a single method of a modern language, with much better style, correctness, comprehensibility, and maintainability.

Having just learned programming didn't mean I was by any means an expert, ready to work on big commercial problems worth lots of money. It took years more to learn a lot of important things. Structured programming (a.k.a. giving up GOTO). Encapsulation. Information hiding. Data structures and dynamic memory. Algorithms, and how to classify their performance. How the machine works at the low level. Writing toy or elementary compilers. Learning a LISP language (pick any one; they all teach the same important and valuable lessons). Learning databases, both how they work and how to use them. Reading a few good books on human interface design before building a complex GUI program. I could go on and on.


> You can't learn how to build a highly optimised, always available, secure e-commerce trading platform in 8 hours.

Correct. The point here, I think, is that having all of the valuable skills that make you good at something, and fast at it, and apparently able to recognize solutions to problems very quickly takes lots and lots of study and practice. Years of learning. Failures (hopefully on some of your own toy problems first rather than on commercial ones). Figuring out how to debug complex systems, without, or prior to the existence of, source-level debuggers.

I don't have a lot of sympathy for those who cry because employers want skilled programmers. Well, professional sports teams want skilled players. And modelling agencies want beautiful people. These things come with some combination of luck of the draw and effort to take advantage of it. (Those models don't eat donuts, for example.) I also think computer geeks should be able to cry and whine that humanities studies are unfair.

Comment In the old days . . . (Score 1) 608

From TFA (the friendly article, or whatever other F-word you prefer) . . .
> In the old days there was a respected profession of application programming.
> There was a minority of elite system programmers who built infrastructure and tools
> that empowered the majority of application programmers.


I think it is still that way. But now there is a third class who think that breaking into application programming demands some kind of godlike elite skill because it requires you to actually know more than the mere syntax of a language. Programming is racist and sexist because it requires you to even learn the syntax of a programming language. Why can't the computer just do what they say? Why do they need a special language? Why should it be necessary to learn to design complex databases, and to understand in-memory data structures and algorithms? Why focus on gaining lots of insight in order to come up with vastly superior algorithms?

In short, from what I see on some programming boards, what some people seem to want is a high paying position where an untrained monkey could get a computer to do what the boss wants, and then collect a paycheck -- um, no. Direct deposit.
