Comment Re:that's irrelevant (Score 1) 503
That's not even a bad idea: since they already published FUD about Linux's patent hygiene, putting the opposite in writing would finally put it to rest.
Real men use aptitude.
The JVM has far more languages than the CLR, several of them far better than C#. Scala for example makes C# look like QBASIC. Java as a language may be stalled, but the JVM as an open platform is currently unbeatable, especially by Mono.
When did "master of the house" ever mean a woman? The master/mistress word divide has been there since before electricity.
That's Python-ish but it's obviously not Python. The Python would be "res = [i for i in aList if i.isFnord()]". Scala would be more like res = aList.filter(_.isFnord), and Ruby res = aList.select { |i| i.fnord? }.
Scala, just as type-safe as C++ with good performance and excellent JVM integration:
scala> val list = List(3, 4, 2, 1)
list: List[Int] = List(3, 4, 2, 1)
scala> list foreach println
3
4
2
1
You know very well this is a big deal for the future of programming.
People are infinitely more likely to spend long stretches of time watching television with others than alone. Having someone else there makes it feel more social and less pathetic, even if you're not saying a word to each other.
Online gaming takes this to the extreme, where there are always plenty of other people there to make players feel validated in their choice of activity, and so players stay on until the "real world" forces them out.
The social element is critical to immersion and addiction. There's nothing like tribe mentality, peer pressure and dependence upon external validation to continually fuel destructive behavior.
The kind of people who get consumed by online engagement usually aren't very successful in real life anyway. If they become successful in real life following a WoW addiction, very often it's specifically because they now value the real world so much more after being essentially isolated from it.
My belief is that technology, like all advancements, helps separate people further into their "natures". If someone is susceptible to addiction, avoidance and escapism, they'll have more advanced ways of doing that in the future, while well-adjusted people will just be the same well-adjusted people with fancier phones and whatever else fits into their lifestyle. They will be largely unaffected by the growth of MMOs, except that some of the people they might have hung out with before will now play games instead.
I really don't think it'd be "every other Wednesday". If your code has a genuine problem with aliasing, that is a bug, and for the most part a deterministic one. You can easily confirm the problem by turning off optimisation, for example. It shouldn't even have passed unit tests.
Besides, what do you propose as an alternative? Fortran, lacking real pointers? Anything safer is slower, and anything faster is too restricted to be a general programming language. C and C++ are an almost unique sweet spot in the programming language universe, for good reasons.
See -fstrict-aliasing, and the restrict keyword. You're welcome.
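For what it's worth, here's the classic way that bug bites (my own minimal sketch, not from anyone's real code): reading a float's bits through an int pointer. Under -fstrict-aliasing (on by default at -O2) the compiler assumes differently-typed pointers never alias, so the cast version is undefined behaviour, while the memcpy version is well-defined and compiles down to the same single load:

```c
#include <string.h>

/* UB: an unsigned* is assumed never to alias a float*, so under
 * -fstrict-aliasing the compiler may reorder or drop this read. */
static unsigned bits_bad(float f)
{
    return *(unsigned *)&f;
}

/* Well-defined: memcpy is the sanctioned way to reinterpret bytes,
 * and any decent compiler turns it into a plain register move. */
static unsigned bits_ok(float f)
{
    unsigned u;
    memcpy(&u, &f, sizeof u);
    return u;
}
```

Build both at -O0 and -O2 and the bad version is exactly the sort of code that "works every other Wednesday".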
C also lacks several important features for optimization, such as static typing,
Surely you jest. C has weak static typing, but it's static typing all the same, and any " + " you see in C code becomes a specific instruction once compiled. Just because that + could be for pointers, doubles or ints doesn't mean it's not static once read in context.
The weakness comes from standard C accepting almost any implicit conversion and cast, which is trivially changed to somewhat strong (but not runtime-enforced) typing by using compiler warnings and errors.
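To illustrate (toy function, name is mine): standard C silently accepts the narrowing below, but compiling with -Wconversion -Werror on GCC or Clang turns it into a hard error, which is all the "somewhat strong" typing costs you:

```c
/* Accepted silently by standard C; flagged by -Wconversion, and fatal
 * with -Werror.  The conversion itself is well-defined: narrowing to
 * unsigned char wraps modulo 256. */
static unsigned char narrow(int x)
{
    return x;   /* int -> unsigned char: this is the line the warning hits */
}
```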
or general reasoning about memory and parallelism.
Parallelism remains fastest in C, especially in OS kernels where the cost of synchronization primitives is close to a bare minimum. If you have a modern compiler that can distinguish vectorisation from its own ass, you'll get healthy use of parallel code pipelines too.
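As a concrete example (my own toy loop, not lifted from any kernel): with restrict promising the arrays don't overlap, GCC and Clang will auto-vectorise this at -O3 (or -O2 -ftree-vectorize on GCC) into packed SIMD multiply-adds:

```c
/* Independent iterations, restrict-qualified pointers, a simple trip
 * count: exactly the shape auto-vectorisers look for.  Inspect the
 * generated asm for packed SSE/AVX instructions. */
static void saxpy(float *restrict y, const float *restrict x, float a, int n)
{
    for (int i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
```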
The CPU executes instructions, oftentimes in parallel pipelines, using an instruction cache and branch prediction - none of which are modelled in the C language.
None of which has to be. If you need that kind of performance, you have two options, both with free software:
a) Embed simple non-standard statements to communicate your branch prediction beliefs
b) Use profile-guided optimisation to automatically sample real branching statistics, and recompile based on those
Either way you end up with superior branch prediction performance. Certainly far far superior to what you'd get with LISP or Python.
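Concretely, option (a) is usually spelled with GCC's __builtin_expect; the likely/unlikely macro names below are the conventional ones (the Linux kernel uses them), not anything in the standard. Option (b) needs no source changes at all: build with -fprofile-generate, run a representative workload, then rebuild with -fprofile-use.

```c
/* Non-standard branch-prediction hints, the usual way (option a).
 * On non-GNU compilers they degrade to no-ops. */
#if defined(__GNUC__)
#define likely(x)   __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)
#else
#define likely(x)   (x)
#define unlikely(x) (x)
#endif

static int checked_div(int a, int b)
{
    if (unlikely(b == 0))   /* error path: hint that it is rarely taken */
        return 0;
    return a / b;           /* hot path laid out fall-through by the compiler */
}
```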
If you knew even half as much about language implementations as you claim to, you'd know that the C language holds its speed crown simply because it has attracted an _enormous_ amount of research into optimizing compilers, largely because the way C works _isn't_ the way the CPU works.
Ok, so what non-assembly language do you propose that does work the way a CPU works? C is the closest we have, and with modern compilers it's way faster than any other usable language. The effort of writing C is far lower than that of writing assembly, and you generally get better performance unless you know specific SIMD/MIMD instructions to replace a loop or two.
Right, so once Microsoft catches up to their own OS and supports proper device-agnostic rendering by default in Office, there'll be some point to what you said. But whether or not something is necessary, doing it wrong is still the Microsoft Way (TM), and that's the whole point of my post.
Actually it makes sense, in a Microsoft sort of way. In the effort to make its WYSIWYG editors as WYSIWYG as possible, it offloads some rendering to printer drivers so as to mimic the printed copy as closely as possible. XPS has no rendering standard so it uses sane (but not good) defaults.
Of course they've never heard of PostScript. This kind of brain-damage is just a tiny part of the failure that is Microsoft WYSIWYG and typesetting "technology".
With a proper typesetter like LaTeX you get a PDF that's a dot-for-dot match with what you'd get in a calibrated printer, without ever having to assume any particular printer. It's the printer's responsibility to implement PostScript properly, not the typesetter's responsibility to tune its PostScript to the printer!
You realise Pidgin is written in C, right?
I'd rather keep memcpy and drop memmove. But because the standard requires memmove, not memcpy, to work even when the regions overlap, dropping it now would break backwards compatibility (for very few programs..) unless that overlap hack were added to memcpy.
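For reference, the "hack" in question is just a direction check; here's a sketch of what memmove guarantees and memcpy doesn't (my own illustration, not glibc's actual code):

```c
#include <stddef.h>

/* When dst starts inside src, a naive forward copy would clobber
 * source bytes before reading them; copying backwards avoids that.
 * memmove must get this right; memcpy is undefined on overlap. */
static void *my_memmove(void *dst, const void *src, size_t n)
{
    unsigned char *d = dst;
    const unsigned char *s = src;
    if (d > s && d < s + n) {
        while (n--)                     /* copy from the end */
            d[n] = s[n];
    } else {
        for (size_t i = 0; i < n; i++)  /* plain forward copy */
            d[i] = s[i];
    }
    return dst;
}
```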
"I've seen it. It's rubbish." -- Marvin the Paranoid Android