
Comment Re:25 years, still garbage for the mainstream (Score 1) 306

since the example I gave edits images whether you like it or not

Sigh, your definition is at odds with the standard nomenclature of the entire English-speaking population of the world. In the development community there are many who would agree with you, but that is a special case. If a user would like to remove a light-pole from an image, lift the shadows of a vacation photo of the kids, etc., he can NOT use the methods you describe. In any way. Stop being facetious.

Your very strong reaction was very odd

Because you are acting like a Linux fanboy by being intentionally obtuse.

Comment Re:This is why (Score 1) 241

Depends on the company. I've been with my current company for about five and a half years; we came in through an acquisition, where the old company had been lowballing everyone. I got a 10% raise at the time of the acquisition, and over the last five years I've gotten about a 115% raise beyond that. (No, that's not a typo, I'm making more than double what I was when we were acquired.)

Comment Re:"More Professional Than Ever" (Score 1) 306

that's been KDE's driving philosophy for years

Yeah, but KDE is not Linux, and Linux doesn't know where that driving philosophy hides. Linux is KDE/GNOME/Whatever.

it's a good thing, that proprietory vendors have a cost increase with every additional distro supported while free software developers get pretty much all distros for free

Please explain how proprietary vendors have to abide by different rules than open source developers. Also, your statement is quite obviously false, since the vast majority of software vendors are entirely uninterested in developing for Linux.

The latter part of your infantile rant is also proven wrong by the fact that statistically ZERO vendors develop for Linux. I have zero problems developing for, on and under Linux, and I have been doing so for years. In fact, I was part of a team that delivered one of the first commercial applications written in Java, running mainly on Linux (and SunOS and HP-UX) way back in the late 1990s. The fact that there are (statistically) no commercial apps for Linux shows two things, one of them mirrored on Android: 1/ Developing good UI apps for the Linux platform is hard, and it is difficult to choose a technology that will be consistent and functional over time. 2/ Linux users don't buy desktop apps (Android users are far more reluctant than iOS users to pay for apps).

Consider someone like Adobe: they have a vast array of industry-leading apps that are functionally identical on Windows and on OSX. You can be quite sure that Adobe has a clear separation between application logic and display logic; otherwise the apps would end up having far more platform-specific issues than they do. This means that the majority of something like After Effects is already mostly platform independent. Porting from OSX to Linux would be anything but trivial, but a reasonably cost-efficient project. If there was a customer base. However, they would have to choose between (for example) KDE and GNOME. The problem is that one year KDE is the hottest thing in Linux land, the next GNOME is. Who knows which is next. This drives up maintenance cost for a platform where nobody buys software. Business-wise it would be idiotic for Adobe to go that route with anything marginally more complex than what they have on mobile.
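The logic/display separation that makes such a port thinkable can be sketched in a few lines. This is a hypothetical toy, not Adobe's actual architecture: the core class does the real work and knows nothing about toolkits, while each platform gets a thin shim that only forwards UI events.

```python
# Hypothetical sketch of separating application logic from display logic.
# None of these names come from any real product; they only illustrate
# why a port mostly means rewriting the thin shim, not the core.

class ImageCore:
    """Application logic: knows nothing about windows or widgets."""
    def __init__(self, pixels):
        self.pixels = list(pixels)

    def adjust_exposure(self, stops):
        # One stop up doubles brightness; clamp to 8-bit range.
        factor = 2 ** stops
        self.pixels = [min(255, int(p * factor)) for p in self.pixels]
        return self.pixels


class GtkFrontend:
    """Display logic for one platform; delegates all real work to the core."""
    def __init__(self, core):
        self.core = core

    def on_exposure_slider(self, stops):
        return self.core.adjust_exposure(stops)


class CocoaFrontend:
    """A second platform shim; identical behaviour, different toolkit."""
    def __init__(self, core):
        self.core = core

    def slider_changed(self, stops):
        return self.core.adjust_exposure(stops)


core = ImageCore([10, 100, 200])
print(GtkFrontend(core).on_exposure_slider(1))  # → [20, 200, 255]
```

Swapping `GtkFrontend` for `CocoaFrontend` changes nothing in the output, which is the whole point: the expensive, bug-prone part lives once in the core.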

If anybody in Linux land ever cared about the desktop, there would be a single UI platform with a consistent and long-lived API. Nobody cares about that though. This is why Linux is a (great) server OS, a mediocre desktop OS, and will never conquer the mainstream desktop platform. Ever. OSX won the Unix-on-the-desktop war, and if you need a decent Unix on the desktop for running something that is not developer tools, you'd be insane to choose anything BUT OSX.

End words: I have been using Linux to develop cross-platform apps since some time in the 1990s. After moving into the mobile space I was "forced" to get a Mac since mobile today == iOS (Android users don't pay for apps) and you are basically required to have a Mac for iOS development (has recently changed a little, but that's another matter). OSX is what Linux could have been 10 years ago if someone had ever cared. Nobody ever did, and today there is only one rational solution for Unix on the Desktop.

Comment Re:User friendly (Score 1) 306

is this related to projects that use C++, .NET and other forms of object oriented programming languages where you have tons of classes, members and files?


Or for the visual type of programming where you design GUIs by drag and drop?

If I said that I have never done that, it would be a tiny bit of a lie; I once used Delphi working like that. I only do server-side and web stuff, and for that Visual Studio is amazing. It's of course best for C#, since the entire C# compiler system is built into the text editor (it compiles the code as you write it, making things like IntelliSense exceedingly accurate and lightning fast). Though it is not as configurable as, for example, Eclipse, it is more than configurable enough, and for speed it blows Eclipse out of the water. IntelliJ a little less so, but still, it's not even close. I would not (obviously) use VS for Java, but for C++, C# and cross-platform mobile development (for example) it can't be beat.

Comment Re:25 years, still garbage for the mainstream (Score 1) 306

Yeah because just that I believe that most IT stuff can be done faster using a CLI than with a GUI

A tiny tip to clue you in. Just a little bit. 99.999% of the world's population never uses their computer for "most IT stuff". Do you know what the word "mainstream" means? I use Unix variations for development, and grep, awk and all of that is great. It's not "mainstream" though. Not by a mile.

Comment Re:25 years, still garbage for the mainstream (Score 1) 306

closer to moronic is pretending that it is not possible without even knowing

I do photography. I do software development. I use ImageMagick in a few places to alter photographs. Photo editing is not, and never was, something you can do with ImageMagick. Photo editing is not making global changes to images or cropping a bunch of them in the same way. Photo editing is what you do in tools like Lightroom, Photoshop (or Elements) and GIMP. It is not something you do in ImageMagick. "Without even knowing"? I know that ImageMagick is not photo editing software. Attempting to present it as such is moronic by definition. It's like me suggesting Neil Armstrong should have taken his bicycle to go to the moon in 1969. Moronic. By definition. So, no, it's not my opinion that he's a moron; he is by definition.

calling me a liar because I have used it to do batch processing of images on many occasions

ImageMagick is very good for this. It's not photo editing though.
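To make the distinction concrete, here is a sketch of the kind of job ImageMagick shines at: the same global operation applied mechanically to every file. The helper only builds the `convert` command lines (the file names, percentage and output directory are made up for illustration); you'd feed them to a shell or `subprocess.run` on a machine with ImageMagick installed. What it can never express is the selective, local work of the thread's light-pole example.

```python
# Batch processing in the ImageMagick sense: one global operation,
# applied identically to a whole set of files. Commands are built,
# not executed, so this sketch needs nothing installed.
import os

def batch_resize_cmds(files, percent=50, outdir="resized"):
    """Build one ImageMagick `convert` command per input file."""
    cmds = []
    for f in files:
        out = f"{outdir}/{os.path.basename(f)}"
        cmds.append(f"convert {f} -resize {percent}% {out}")
    return cmds

for cmd in batch_resize_cmds(["a.jpg", "b.jpg"]):
    print(cmd)
# → convert a.jpg -resize 50% resized/a.jpg
# → convert b.jpg -resize 50% resized/b.jpg
```

Note there is no per-image judgement anywhere in that loop, which is exactly why it is batch processing and not photo editing.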

Comment Re:25 years, still garbage for the mainstream (Score 1) 306

A good terminal (like bash) lets you do stuff faster and easier than any GUI.

Cool, I just shot 10 images in succession while panning. Can you please stitch them together to make a panorama? Did I mention they were RAW images? You need to read the raw, stitch them, add 10% contrast, take exposure down about .75 of a stop, add some micro contrast, adjust some curves, export in aRGB for printing and sRGB for the web. When you're done, I've got the 4K video I'd like you to edit. It consists of 25 clips, you need to...

Here is a clue for you: the average person can do basically none of the work they regularly do on a computer from the command line, and if you could cobble together stuff to do some of the above, it would be insanely difficult compared to firing up Lightroom or Capture One.

Comment Re:User friendly (Score 1) 306

the project files for Visual Studio is a complete nightmare

And now they are no longer even XML, but JSON. That'll change every single release too :-) - the joy of "could not migrate project" messages. Still, Visual Studio blows every other IDE or development environment out of the water. Particularly the nightmarish shit put out by Apple.

Comment Re:User friendly (Score 1) 306

it's easier to tell someone how to do something from a command line than to direct them to do something via the gui

Really? Wow! Great! Now, can you please tell me how to do this. I have a Sony A7R2 camera and I shoot raw. I would like to open the file, reduce exposure by 1.3 stops, adjust the R and G curves slightly, add some sharpening, set a white point, set a black point, lift the shadows a little, take down the highlights and publish it in aRGB for printing and sRGB for the web. Using the command line if you please.

For the vast majority (like 99.9%) of the work people do on their computers, the command line is utterly useless. Sure, when I set up a new Node project it is great, but then again, 99.9% of computer users don't know what Node is.

Comment Re: More professional than ever (Score 1) 306

Well, there are a few errors here, but...

The Registry

Not really part of the OS

NTFS - designed to fragment

Yeah, this is true, but there is a reason for it. A large multi-user system running on slow disks will (statistically) benefit from a somewhat fragmented file system. Do the maths. Of course, this hurts WNT on the desktop. The general idea is that one user task will not process an entire file in the time slice allotted for it to run, so when it is pre-empted, the next task will need to read a different file somewhere else on the disk, in other words moving the read head; and before it is finished with its file it will be pre-empted, and the read head will be moved again, to another place on the disk, etc. In such a scenario a fragmented file system will have higher performance than a non-fragmented one due to the read head moving shorter (on average) distances each time.

Imagine two files on a one-platter spindle. One file is on the "inner" side of the platter, the other on the "outer". Two processes are reading one file each, but are being pre-empted multiple times during the reads. For each task switch, the read head has to move from the outer to the inner part of the platter, or inner to outer. In other words, for each time slice, the read head moves across the entire disk. If the files are fragmented and the fragments are spread across the disk randomly, the disk head will, on average, only move across half the platter each time. So, at the time of design and implementation, based on the purpose of the OS (both big server and desktop were imagined), an intentionally fragmented file system made sense. The problem is that one has to live with decisions like that for a long time :-)
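The intuition above can be checked with a tiny simulation, under heavily idealized assumptions: seek cost is modelled as linear head travel over a platter normalized to [0, 1], the contiguous case is the worst case (the two files at opposite edges), and fragments land at uniformly random positions. Nothing here models real NTFS allocation; it only tests the geometry of the argument.

```python
# Idealized seek-distance check for the fragmentation argument.
# Head position is a number in [0, 1]; each interleaved read forces
# a seek, and we average the travel distance per seek.
import random

random.seed(1)
N = 100_000

# Contiguous worst case: file A at the inner edge (0.0), file B at the
# outer edge (1.0), so every task switch crosses the whole platter.
contiguous_seek = sum(abs(1.0 - 0.0) for _ in range(N)) / N

# Fragmented case: each read lands on a fragment at a uniformly random
# position, so each seek is the distance between two random points.
pos = random.random()
total = 0.0
for _ in range(N):
    nxt = random.random()
    total += abs(nxt - pos)
    pos = nxt
fragmented_seek = total / N

print(f"contiguous: {contiguous_seek:.2f}, fragmented: {fragmented_seek:.2f}")
# The fragmented average comes out near 1/3 of the platter width, i.e.
# even less than the "half" estimated above.
```

The analytic result for the random case is E|U - V| = 1/3 for two independent uniforms, so under these assumptions random fragmentation beats the contiguous worst case by roughly a factor of three, not just two.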

A fundamentally broken and insecure security model

Again, in the OS, no, the model is not broken, but the way Microsoft configured it, it did become broken. Mostly because of the elevated privileges needed, for the first few years, to do just about anything. But still, not a bad feature of the OS. In fact, again in the OS implementation, it beats the woefully simplistic and inadequate Unix security model of the time (and for many systems, still at this point in time).
