One needs to look beyond the first graph, which shows all sites surveyed, at the graph of actually active sites - there Apache appears to have more *active* deployments than the rest combined. Counting inactive, parked domains is not really indicative of a particular server's popularity.
In fact, the order is not as bad as some of the similar ones from the past. The original article is here (in French):
The court ruled that the ISPs and search engines have 15 days to block the sites listed in the article and the order is in force for 12 months afterwards.
However, here is the kicker: the court ruled that the rights holders are to pay the bill for implementing the blocks; the ISPs are not being asked to do it on their own dime. So carpet-bombing the courts with poorly researched URLs to block could get really expensive.
Microsoft seems to operate using a very simple scheme whenever they encounter a market they are not familiar with:
1) Deny it, ignore it
2) Belittle it
3) OK, 1) and 2) don't seem to work, let's turn it into what we know already - PC! With some creative branding the customer won't see a difference!
This has been going on for decades already. Basically, they are trying to turn everything into the only thing they know, where they have a strong market position and which they can leverage to conquer the new market for themselves: the PC and DOS/Windows. They don't really have a profitable market share elsewhere.
Good examples of this were Sun's Network Computer (basically a thin client) vs Microsoft's NetPC (a regular Windows PC booting over the network - none of the advantages of the thin client, but all the disadvantages of a Windows PC), the XBox (basically a PC with proprietary/non-standard hw), PDAs with Windows CE (transplanting a desktop UI onto a stylus-driven PDA really didn't work too well), tablets with a desktop OS (even Windows RT is regular Windows, just crippled and running on ARM), etc.
Unfortunately, the above strategy probably worked only for the XBox, where the customer didn't really care and it made development simpler. The rest were/are flops, because people don't want a crappy PC instead of a phone or a tablet. Unfortunately, Microsoft doesn't get that; they have an uncanny knack for taking exactly the one feature that makes a device actually attractive to the user (usability, speed, battery life, portability, etc.) and removing it or completely hosing it up - but you do get the "Windows experience" instead! That, combined with their frequently brain-dead UI implementations designed by someone who has likely never had to use their own products, is a deadly combination for any product.
Well, that is just a cover-your-ass excuse for the manager, so that he or she doesn't have to go to the under-performer and tell them face to face: "Oh, see, I am sorry that we have to fire you, but you are below this (arbitrary) line."
With that style of "management" you will always be firing someone just for the sake of firing them, regardless of their actual performance. If you want your best people to leave in a hurry and the rest of the team to waste time trying to backstab each other, it is a wonderful way to achieve that. It makes no difference whether you set the cut-off at 1% or 10% - it is still an arbitrary bullshit number pulled out of thin air, with some piss-poor application of statistics to make it look legit.
Someone wrote that grading on a curve works in academia but not in industry. Why should it work for grading exams when it doesn't work for ranking workers? The academics who use it, of all people, should know better.
Grading on a curve (or MS stack ranking, which is the same thing) is one of the most unfair and vile ranking/grading systems ever invented. Why? Because your actual skills don't matter. What matters is how many better (or worse) colleagues you have. If you are in a large team (or class) of good performers, you are screwed, even if you are good - someone will be given the short end of the stick only because there are only so many "good marks" available. An extreme example is students "hacking" their exams by handing in blank sheets. Even if they all (or sufficiently many) do that, with curve grading they are guaranteed roughly an 80% chance of passing - by doing nothing, because only the bottom 15-20% fails. Shouldn't we be marking their skills and knowledge instead?
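Just to make the blank-sheet point concrete, here is a toy sketch (the class size and the 20% fail quota are made-up numbers for illustration, not any real grading policy):

    /* Toy illustration: every student scores 0, yet a fixed-quota curve
     * (bottom 20% fails) still passes the rest. Class size and quota are
     * assumptions for the example, nothing more. */
    #include <stdio.h>

    int main(void)
    {
        const int students = 100;
        const double fail_quota = 0.20;  /* "the bottom 20% must fail"      */
        int score = 0;                   /* everyone hands in a blank sheet */

        /* With identical scores there is no meaningful ranking, but the
         * quota still gets filled - 20 arbitrarily chosen students fail. */
        int failed = (int)(students * fail_quota);
        int passed = students - failed;

        printf("All scores are %d, yet %d of %d students pass.\n",
               score, passed, students);
        return 0;
    }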
This system also demotivates the good learners/workers - what is the point of working hard when you won't get the good mark anyway, simply because only a limited number are given out and there are too many comparably good candidates? Essentially the system forces (undeserved) bad marks on people even though they performed as well as the best ones. This sort of thing does wonders for morale.
Finally, the second reason why this is fundamentally broken is the assumption that the skill distribution in a work team or class is normal (follows a bell curve). There is absolutely no guarantee of that, because, heck, you aren't hiring idiots, are you? I am sure the company is hiring only "rock star" developers. Same with the students - they have to pass stringent exams and fulfill admission criteria that the majority of the population couldn't meet. So you have a sample that isn't representative of the entire population (where the bell curve would be valid) and all bets are off, because the system was built on an invalid assumption. The most extreme case is a constant distribution - all students turn in blank sheets of paper (identical "skill" level) for their exam and still pass. You would have to pick the students or hire the employees randomly from the entire population if you wanted a normal distribution of skill. Not very practical, though.
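A quick simulation of that selection effect (the hiring bar and the bin widths below are invented; the random-number quality doesn't matter for the point):

    /* Skill in the whole population IS normally distributed here, but we
     * "hire" only candidates above a bar - the resulting team is a
     * one-sided, skewed sample, not a bell curve, so a fixed "bottom X%
     * must fail" quota no longer measures anything meaningful.
     * Build with: gcc curve.c -lm */
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define PI 3.14159265358979323846

    /* Box-Muller transform: one standard-normal sample from two uniforms. */
    static double gauss(void)
    {
        double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
        return sqrt(-2.0 * log(u1)) * cos(2.0 * PI * u2);
    }

    int main(void)
    {
        const double hiring_bar = 1.0;   /* keep only the top ~16%      */
        int hist[8] = {0};               /* crude histogram of the team */
        long hired = 0;

        for (long i = 0; i < 1000000; i++) {
            double skill = gauss();
            if (skill < hiring_bar)
                continue;                /* rejected at the interview   */
            int bin = (int)((skill - hiring_bar) * 2.0);
            if (bin > 7)
                bin = 7;
            hist[bin]++;
            hired++;
        }

        for (int b = 0; b < 8; b++)
            printf("skill >= %.1f: %5.1f%% of the hired team\n",
                   hiring_bar + b / 2.0, 100.0 * hist[b] / hired);
        return 0;
    }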
To conclude, if you are responsible for examining students or for evaluating employees, for the love of God, stop using relative ranking schemes like this. Comparing people to each other is certainly easier than evaluating their "absolute" skill, but it isn't fair, it doesn't represent what you think it does, and it creates a toxic environment for everyone.
It is a pity that the poster never actually read the description of the auction; otherwise they would have found that:
"The pictures depicted from this auction show some of the early prototypes from the project; however, it should be noted that none of that hardware will be included in this auction as I had a non-intentionally set fire
" 1) Microhard Spectra 910 900MHz serial line radio with power supply (this was a prototype 900MHz radio that I believe went on to become the current generation of ZigBee-based XBee radios; 2) A collection of PC104-based enclosures and motherboards, with various interfacing such as serial ports, USB ports, etc; 3) A Mobile Wireless Technologies RM1000g AVS vehicle transponder with WWAN and GPS tracking support; 4) Novate wireless prototyping board; 5) GNU X-Tools cross compilation software; and 6) A CD filled with backup materials during several years of the company (the most valuable part of this auction obviously)"
So still some nice hw and docs, but certainly no "spy rocks" included. RTFA, guys!
The device is nothing but two cellphone cameras with a USB interface and 3 infrared LEDs behind an IR filter. It tracks the infrared reflections off your fingertips (or a pencil, or whatever) in 3D using stereoscopic vision. It does work, as well as that technology allows (nothing really revolutionary there), but the device and especially its software have some serious issues:
- The device is way too small - the consequence is that the cameras are too close together and thus the tracked working volume is tiny (a rough stereo-geometry sketch after this list shows why the small camera baseline hurts). If you use both hands at the same time, you can barely move before you run out of space and the camera stops tracking your hand.
- It is very sensitive to occlusions - if the fingers (or some reflection) aren't visible, there is no tracking. E.g. making a grabbing gesture is really hard, because closing the hand into a fist hides the fingertips and the software gets confused. Rotating a closed hand is outright impossible - the software can't distinguish the top of the hand from the bottom unless it can see the fingers (i.e. you are holding the palm open with the fingers spread out).
- The API is targeted more towards emulating a mouse or a multitouch desktop than towards actual 3D interaction. Unfortunately, a real mouse or a proper touchscreen is way more comfortable, accurate and robust for such tasks than the Leap, with its glitchy tracking and buggy software. There is no way to get the raw video out of the SDK, so no custom algorithms are possible.
- The available software is crap. There is no way around it. The very buggy "Airspace store" is full of paid apps that are often little more than a buggy demo you would throw away after 5 minutes, once the novelty of wiggling your fingers at the screen wears off. Yay for drawing with your finger in the air...
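As a rough illustration of the first point above, the basic stereo relation Z = f*B/d shows how quickly depth accuracy falls apart when the two cameras sit only a couple of centimetres apart. The focal length and baseline below are guesses for illustration, not actual Leap specs:

    /* Back-of-the-envelope stereo geometry: depth Z = f * B / d, where f is
     * the focal length in pixels, B the camera baseline and d the disparity.
     * A one-pixel disparity error then translates into a depth error of
     * roughly Z^2 / (f * B) - and with a tiny baseline that blows up fast.
     * The numbers below are guesses for illustration, not device specs. */
    #include <stdio.h>

    int main(void)
    {
        const double f_px     = 700.0;   /* assumed focal length, pixels  */
        const double baseline = 0.02;    /* assumed ~2 cm between cameras */

        for (double z = 0.1; z <= 0.61; z += 0.1) {
            double disparity = f_px * baseline / z;       /* in pixels    */
            double dz = z * z / (f_px * baseline);        /* per px error */
            printf("range %.1f m: disparity %5.1f px, ~%4.1f mm error per pixel\n",
                   z, disparity, dz * 1000.0);
        }
        return 0;
    }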
So all in all - unless you are the type of person who wants to show off at the next PowerPoint presentation by changing slides with a wave of the hand (and be the laughing stock when the device doesn't work or skips several slides instead), there isn't much to be excited about. It is really a solution looking for a problem.
The project above is actually being done by University College London - it is even in that damn article!
This is the second time in two days I have seen this sort of idiocy - the first article, on Gizmodo, was about another project. I didn't know that stupidity is contagious.
There are plenty of other choices - both Linux distros and desktops, many specifically targeted at old hardware. Furthermore, if you are running hw so old that it still has AGP, or some ARM device, you probably don't want to run a full-blown Gnome/Unity on it anyway.
Yes, it was an Elsevier journal, but this is not specific to them, others do this as well.
Researchers get stuck between a rock and a hard place - we have to publish in high-impact journals (otherwise our funding is cut; low-impact-factor publications don't count), but ideally open access (few high-impact journals are Open Access) to save the library expenses. And you can bet that nobody will give me the 3k to pay that extortionist fee above, especially not if I am to publish at least twice a year in such journals. So what am I to do?
Honestly, this does suck. Wearing my engineering hat: as a private company, it is next to impossible to pay for all the IEEE, ACM and what-not subscriptions I would need to access papers in my field - that's why there is so much reinventing the wheel and patenting the obvious. We had the ACM and IEEE memberships and there was always a journal or a conference that was not covered. With outfits like Elsevier, Taylor & Francis, etc. it gets even worse, because the subscriptions are per journal. It is a completely impossible situation for a small company to deal with.
Regarding development - development for Linux on ARM is exactly the same as development for Linux on x86, and very similar to any other Unix. Most people do not write in assembler anymore, and from the point of view of a business application writer the platform differences are negligible.
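As a trivial sketch of that - the cross-compiler name below is just the stock Debian/Ubuntu one, substitute whatever toolchain your board vendor ships - the very same source builds for both architectures:

    /* The same plain POSIX program, built natively for x86 or cross-built
     * for ARM; nothing in the source changes:
     *
     *   native x86:  gcc -Wall -o hello hello.c
     *   ARM cross:   arm-linux-gnueabihf-gcc -Wall -o hello hello.c
     */
    #include <stdio.h>
    #include <sys/utsname.h>

    int main(void)
    {
        struct utsname u;

        if (uname(&u) == 0)
            printf("Hello from %s on %s\n", u.sysname, u.machine);
        return 0;
    }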
Easy - ARM doesn't have 64-bit cores available yet; they were only recently announced. It will take a while until the manufacturers license them and integrate them into their products, and only then can HP buy them and build a server around them.
From the looks of it, this prototype machine is unlikely to be meant for databases (4GB of RAM per chip is not a lot for something like Oracle), so the 32-bit limit is not really an issue. On the other hand, it screams HPC cluster/supercomputing or some other well-parallelizable load, such as web serving. A 32-bit CPU is plenty for that; 64-bit on a server mostly buys you more addressable RAM, not much else.
It would be *very* interesting to see a performance comparison between this solution and the traditional Intel one. Even if it is only 50% as fast, it should give Intel a lot to worry about - the higher installation density and the power savings will easily outweigh whatever raw performance advantage Intel may have.
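To put some numbers on that comparison - purely invented ones, not benchmarks, just to show how the per-watt and per-rack-unit math works out:

    /* Illustrative only: assume an ARM node with half the throughput of a
     * Xeon node, a quarter of the power draw, and four nodes per rack unit.
     * Perf/watt and perf/rack-unit both end up in ARM's favour even though
     * each individual node is "slower". All numbers are invented. */
    #include <stdio.h>

    int main(void)
    {
        const double xeon_perf = 100.0, xeon_watts = 200.0, xeon_per_u = 1.0;
        const double arm_perf  =  50.0, arm_watts  =  50.0, arm_per_u  = 4.0;

        printf("Xeon: %.2f perf/W, %3.0f perf/U\n",
               xeon_perf / xeon_watts, xeon_perf * xeon_per_u);
        printf("ARM : %.2f perf/W, %3.0f perf/U\n",
               arm_perf / arm_watts, arm_perf * arm_per_u);
        return 0;
    }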
On the other hand, you must balance theory with practice, because otherwise the students will a) leave, or b) not be able to do practical projects while studying the theory. So teaching only logic programming (which is great, IMO - it helped me a lot!) is not practical.