Comment: Re:Why bother? (Score 1) 330

You are presenting a straw man. Obviously no one is going to do a rewrite. The question is simply this: is it viable for a project? The answer is yes, because it's already been proven in several large-scale deployments, many of which are in governments and Fortune 500 corporations.

Because this is Slashdot, the answer is going to be 'no', and the reason given is going to be 'Microsoft', because those who disagree are automatically just 'shills'.

Microsoft

Ask Slashdot: Is an Open Source .NET Up To the Job? 330

Posted by Soulskill
from the good-steps-or-irrelevant-steps dept.
Rob Y. writes: The discussion on Slashdot about Microsoft's move to open source .NET Core has centered on:

1. whether this means Microsoft is no longer the enemy of the open source movement;
2. if not, whether it means Microsoft has lost so badly in the web server arena that it's resorting to desperate moves;
3. or whether it's simply standard Microsoft operating procedure: embrace, extend, extinguish.

What I'd like to ask is whether anybody who isn't currently a .NET fan actually wants to use it, open source or not. What is the competition? Java? PHP? Ruby? Node.js? All of the above? Anything but Microsoft? Because as an OSS advocate, I see only one serious reason to even consider using it: standardization. Any of those competing platforms could be as good or better, but the problem is how to get a job in this industry when there are so many massively complex platforms out there. I'm still coding in C, and at 62 I'll probably live out my working days doing that. But I can still remember when learning a new programming language was no big deal. Even C required learning a fairly large library to make it useful, but that's nothing compared to what's out there today. And worse, jobs (and technologies) don't last like they used to. Odds are, in a few years you'll be starting over in yet another job where they use something else.

Employers love standardization. Choosing a standard means you can't be blamed for your choice. Choosing a standard means you can recruit young, cheap developers and actually get some output from them before they move on. Or you can outsource with some hope of success (because that's what outsourcing firms do: recruit young, cheap devs and rotate them around). To me, those are red flags, not pluses at all. But they're undeniable pluses to greedy employers. Of course, there's much more to being an effective developer than knowing the platform well enough to be easily slotted into a project. But try telling that to the private equity guys running too much of the show these days.

So, assuming Microsoft is sincere about this open source move:
1. Is .NET up to the job?
2. Is there an open source choice today that's popular enough to be considered the standard that employers would like?
3. If the answer to 1 is yes and the answer to 2 is no, what's the argument for avoiding .NET?
Apple

Apple A8X iPad Air 2 Processor Packs Triple-Core CPU, Hefty Graphics Punch 130

Posted by samzenpus
from the give-me-the-numbers dept.
MojoKid writes: When Apple debuted its A8 SoC, it proved to be a modest tweak of the A7. Despite packing double the transistors and an improved GPU, the heart of the A8 is the same dual-core Apple "Cyclone" processor, tweaked to run at higher clock speeds and paired with stronger total GPU performance. Given this, many expected the Apple A8X to be cut from similar cloth: a higher clock speed, perhaps, and a larger GPU, but not much more than that. It appears those projections were wrong. The Apple A8X is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB), and 2GB of external DDR3. It also uses an internal metal heatspreader, which the A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance. The new A8X is a significant powerhouse across multiple types of workloads; in fact, it's the top-performing mobile device on Geekbench by a wide margin. Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4 fps to 31 fps. The onscreen test favors the Nvidia solution thanks to its lower-resolution screen, but the Nvidia device does take 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659.
Microsoft

Test Version of Windows 10 Includes Keylogger 367

Posted by samzenpus
from the all-the-better-to-track-you-with dept.
wabrandsma writes, from WinBeta: "One of the more interesting bits of data the company is collecting is text entered. Some are calling this a keylogger within the Windows 10 Technical Preview, which isn't good news. Taking a closer look at the Privacy Policy for the Windows Insider Program, it looks like Microsoft may be collecting a lot more feedback from you behind the scenes: 'Microsoft collects information about you, your devices, applications and networks, and your use of those devices, applications and networks. Examples of data we collect include your name, email address, preferences and interests; browsing, search and file history; phone call and SMS data; device configuration and sensor data; and application usage.'" This isn't the only thing Microsoft is collecting from Insider Program participants. According to the Privacy Policy, the company is collecting things like text entered into the operating system, the details of any and all files on your system, voice input, and program information.

Comment: Re: Here's the solution (Score 1) 577

by atlasdropperofworlds (#48044165) Attached to: Will Windows 10 Finally Address OS Decay?

It seems to me there isn't anything special about any OS: every single one degrades in performance as applications come and go and time passes. OS X is definitely not immune, nor is Linux.

It's a phenomenon we've all been living with forever, but we're only really starting to notice it now that you don't need to upgrade every two years.

Comment: Re:Apple REULEZ! (Score 1) 408

by atlasdropperofworlds (#47955703) Attached to: Why You Can't Manufacture Like Apple

I guess it's also worth mentioning that, as a 20-year veteran developer, I prefer a Linux/Windows combination. It's been vastly more cost effective to use virtualized Linux on custom-built desktops than it ever has been to use OS X. Apple hardware is expensive and underpowered, and the "looks" and "fit'n'finish" have zero bearing on me doing my job well. OS X itself is fine, if a bit heavy, but it doesn't help me work in any particular way: the global menu is utterly idiotic and archaic, though at least they finally implemented proper dual-screen support a few years back. Apple is stronger in the smartphone and tablet department, but I've been seeing really strong stuff come from the Android side of the camp lately. Apple seems to spend more time on transitions, effects and the occasional bit of API polish, while Android seems to spend more time on features.

Comment: Re:Apple REULEZ! (Score 0) 408

by atlasdropperofworlds (#47955669) Attached to: Why You Can't Manufacture Like Apple

Do you spend hours in line to buy the latest and greatest Apple product every release cycle? If so, you are most definitely sheep. If you are capable of living your life until it's easier to get, then you are not sheep.

However, your response has nothing to do with the OP's topic. The point is, Apple likes to stomp out competition by making it physically impossible to compete. This is even worse than Microsoft's behavior in the past.

Comment: Re:Where are these photos? (Score 4, Insightful) 336

While not strictly true, if you follow the standard setup "workflow" as 95% of all computer users do, you end up with iCloud enabled.

I'd put $100 on all these celebrities just following the setup instructions and ending up with iCloud enabled, because they simply don't know any better.

Businesses

Apple Confirms Purchase of Beats For $3 Billion 188

Posted by Soulskill
from the throwing-down-the-big-money dept.
SimonTheSoundMan writes: "Apple has confirmed it will buy Beats Electronics and Beats Music for $3 billion. Apple will make the purchase using $2.6 billion in cash and $400 million in stock. An important part of the acquisition for Apple is absorbing the Beats subscription streaming service, even though it only has about 110,000 users. The Beats brand will remain intact, and Beats will continue to sell headphones."
Science

Nat Geo Writer: Science Is Running Out of "Great" Things To Discover 292

Posted by samzenpus
from the nothing-new-under-the-sun dept.
Hugh Pickens DOT Com (2995471) writes "John Horgan writes in National Geographic that scientists have become victims of their own success and that 'further research may yield no more great revelations or revolutions, but only incremental, diminishing returns.' The latest evidence is a 'Correspondence' published in the journal Nature pointing out that it is taking longer and longer for scientists to receive Nobel Prizes for their work. The trend is strongest in physics: prior to 1940, only 11 percent of physics prizes were awarded for work more than 20 years old, but since 1985 the percentage has risen to 60 percent. If these trends continue, the Nature authors note, by the end of this century no one will live long enough to win a Nobel Prize, which cannot be awarded posthumously. They suggest that the Nobel time lag 'seems to confirm the common feeling of an increasing time needed to achieve new discoveries in basic natural sciences—a somewhat worrisome trend.' One explanation for the time lag might be the nature of scientific discoveries in general: as we learn more, it takes more time for new discoveries to prove themselves.

Researchers recently announced that observations of gravitational waves provide evidence of inflation, a dramatic theory of cosmic creation. But there are so many different versions of 'inflation' theory that it can 'predict' practically any observation, meaning that it doesn't really predict anything at all. String theory suffers from the same problem. As for multiverse theories, all those hypothetical universes out there are unobservable by definition so it's hard to imagine a better reason to think we may be running out of new things to discover than the fascination of physicists with these highly speculative ideas. According to Keith Simonton of the University of California, 'the core disciplines have accumulated not so much anomalies as mere loose ends that will be tidied up one way or another.'"
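
As a rough illustration of the kind of trend extrapolation the summary describes, here is a minimal Python sketch. It uses only the two percentages quoted above (11 percent before 1940, 60 percent since 1985); the anchor years, the straight-line model, and all names in the code are assumptions made for illustration, not the Nature correspondence's actual method or data.

# Minimal sketch, not the Nature authors' analysis: fit a straight line through
# the two figures quoted in the summary -- 11% of physics prizes honored work
# more than 20 years old before 1940, 60% since 1985 -- and see where that
# trend points. The anchor years (1940, 1985) and the linear form are
# assumptions made purely for illustration.

def linear_trend(x1: float, y1: float, x2: float, y2: float):
    """Return the straight line through (x1, y1) and (x2, y2) as a function of x."""
    slope = (y2 - y1) / (x2 - x1)
    return lambda x: y1 + slope * (x - x1)

share_old_work = linear_trend(1940, 0.11, 1985, 0.60)

# Year at which this toy trend would have every prize honoring 20+-year-old work.
slope = (0.60 - 0.11) / (1985 - 1940)
saturation_year = 1940 + (1.0 - 0.11) / slope

print(f"Projected share of 'old work' prizes in 2020: {share_old_work(2020):.0%}")
print(f"Toy trend reaches 100% around {saturation_year:.0f}")

Under these assumptions the toy trend saturates decades earlier than the authors' end-of-century figure, which presumably comes from modeling the waiting time between discovery and award rather than these two percentages; the sketch only shows the mechanics of extrapolating such a trend.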
