Comment Re:Why bother? (Score 1) 421

> You're the one presenting the straw man, not me.

A straw man is when you misrepresent an argument and then attack the misrepresentation. In your case, you are saying "no one will do a rewrite of code if they are using a working stack," but the question is "is .NET viable?" No one is going to reply to your assertion because it is irrelevant in this context.

Comment Re:Why bother? (Score 1) 421

You are presenting a straw man. Obviously no one is going to do a rewrite. The question is simply this: is it viable for a project? The answer is yes, because it's already been proven in several large-scale deployments, many of which are in governments and Fortune 500 corporations.

Because this is Slashdot, the answer is going to be 'no', and the reason given is going to be 'Microsoft', because those who disagree are automatically just 'shills'.

Microsoft

Ask Slashdot: Is an Open Source .NET Up To the Job? 421

Rob Y. writes: The discussion on Slashdot about Microsoft's move to open source .NET core has centered on:

1. whether this means Microsoft is no longer the enemy of the open source movement
2. if not, whether it means Microsoft has lost so badly in the web server arena that it's resorting to desperate moves.
3. or nah — it's standard Microsoft operating procedure. Embrace, extend, extinguish.

What I'd like to ask is whether anybody who's not currently a .NET fan actually wants to use it, open source or not. What is the competition? Java? PHP? Ruby? Node.js? All of the above? Anything but Microsoft? Because as an OSS advocate, I see only one serious reason to even consider using it: standardization. Any of those competing platforms could be as good or better, but the problem is this: how do you get a job in this industry when there are so many massively complex platforms out there? I'm still coding in C, and at 62, will probably live out my working days doing that. But I can still remember when learning a new programming language was no big deal. Even C required learning a fairly large library to make it useful, but that's nothing compared to what's out there today. And worse, jobs (and technologies) don't last like they used to. Odds are, in a few years, you'll be starting over in yet another job where they use something else.

Employers love standardization. Choosing a standard means you can't be blamed for your choice. Choosing a standard means you can recruit young, cheap developers and actually get some output from them before they move on. Or you can outsource with some hope of success (because that's what outsourcing firms do: recruit young, cheap devs and rotate them around). To me, those are red flags, not pluses at all. But they're undeniable pluses to greedy employers. Of course, there's much more to being an effective developer than knowing the platform so you can be easily slotted into a project. But try telling that to the private equity guys running too much of the show these days.

So, assuming Microsoft is sincere about this open source move,
1. Is .NET up to the job?
2. Is there an open source choice today that's popular enough to be considered the standard that employers would like?
3. If the answer to 1 is yes and 2 is no, make the argument for avoiding .NET.
Apple

Apple A8X iPad Air 2 Processor Packs Triple-Core CPU, Hefty Graphics Punch 130

MojoKid writes When Apple debuted its A8 SoC, it proved to be a modest tweak of the original A7. Despite packing double the transistors and an improved GPU, the heart of the A8 SoC is the same dual-core Apple "Cyclone" processor, tweaked to run at higher clock speeds and with stronger total GPU performance. Given this, many expected that the Apple A8X would be cut from similar cloth: a higher clock speed, perhaps, and a larger GPU, but not much more than that. It appears those projections were wrong. The Apple A8X chip is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB) and 2GB of external DDR3. It also uses an internal metal heatspreader, which the Apple A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance. The new A8X is a significant powerhouse in multiple types of workloads; in fact, it's the top-performing mobile device on Geekbench by a wide margin. Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4 fps to 31 fps. The onscreen test favors the Nvidia solution thanks to its lower-resolution screen, and the Nvidia device does take 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659.
Microsoft

Test Version Windows 10 Includes Keylogger 367

wabrandsma writes From WinBeta: "One of the more interesting bits of data the company is collecting is text entered. Some are calling this a keylogger within the Windows 10 Technical Preview, which isn't good news. Taking a closer look at the Privacy Policy for the Windows Insider Program, it looks like Microsoft may be collecting a lot more feedback from you behind the scenes. Microsoft collects information about you, your devices, applications and networks, and your use of those devices, applications and networks. Examples of data we collect include your name, email address, preferences and interests; browsing, search and file history; phone call and SMS data; device configuration and sensor data; and application usage." This isn't the only thing Microsoft is collecting from Insider Program participants. According to the Privacy Policy, the company is collecting things like text inputted into the operating system, the details of any/all files on your system, voice input and program information.

Comment Re: Here's the solution (Score 1) 577

It seems to me there isn't anything special about any OS: every single one degrades in performance as applications come and go and time passes. OS X is definitely not immune, nor is Linux.

It's a phenomenon we've all been living with forever, but we're only really starting to notice it now that we don't need to upgrade every two years.

Comment Re:Apple REULEZ! (Score 1) 408

I guess it's also worth mentioning that, as a 20-year veteran developer, I prefer a Linux/Windows combination. It's been vastly more cost-effective to use virtualized Linux on custom-built desktops than it ever has been to use OS X. Apple hardware is expensive and underpowered, and the "looks" and "fit'n'finish" have zero bearing on me doing my job well. OS X itself is fine, if heavy, but it doesn't help me work in any particular way: the global menu is utterly idiotic and archaic, though at least they finally implemented proper dual-screen support a few years back. Apple is stronger in the smartphone/tablet department, but I've been seeing really strong stuff come from the Android side of the camp lately. Apple seems to spend more time on transitions, effects, and occasional API polish; Android seems to spend more time on features.

Comment Re:Apple REULEZ! (Score 0) 408

Do you spend hours in line to buy the latest and greatest Apple product every release cycle? If so, you are most definitely sheep. If you are capable of living your life until it's easier to get, then you are not sheep.

However, your response has nothing to do with the OP's topic: the point is that Apple likes to stomp out competition by making it physically impossible to compete. This is even worse than Microsoft's behavior in the past.

Comment Re:Where are these photos? (Score 4, Insightful) 336

While not strictly true, if you follow the standard setup "workflow," as 95% of all computer users do, you end up with iCloud enabled.

I'd put $100 on all these celebrities having just followed the setup instructions and ended up with iCloud enabled, because they simply don't know better.
