
Comment I think that cookie-cutter approaches are useless, (Score 1) 927

not just for the question of how a contributor should be treated, but also for the question of how a leader should act.

If a leader is able to get world-beating results by being an asshole, then so be it. That leader has beaten the world, and I am not going to quibble with success. If a leader is an asshole and subpar output is the result, then by all means, tell them to treat their team differently.

Team dynamics are a complicated thing. You just don't fuck with a winning team. If they are using four letter words all the time and sacrificing live chickens at midnight, but the results are running circles around everyone else, I for one do not want them to stop, even if it would save a chicken's life.

At the same time, if they are doing all of these things and the results are uneven or poor, then by all means, change the behavior.

In this case, I'd say that the results of Linux kernel development speak for themselves. And if you just don't belong in the culture, then go somewhere else. If the culture starts to be counterproductive, give the world a great big "I told you so!" and collect your profits on the book deal. But otherwise, to expect people to fuck up a successful operation for your feelings, for manners, or for high-minded ethical concerns is just bad juju. It's not lawyering or doctoring; ethical concerns are not front and center. It's software. The goal is that it works and works well, and in fact that's the highest ethical aspiration *of* software, given the many critical ways in which it gets used in today's world.

The value to the users is first. The comfort of the developers is second. If the culture and development process are working well, get the hell out of the way if you don't like them. As this person has done. So—problem solved.

Comment advice != information (Score 2) 138

The first thing I wanted after installing El Capitan was information on how to disable rootless mode, not advice about the soundness of this idea. Thankfully, I found an informative post or two by searching Google. Any advice would not have been the information I was looking for.

And for those who are interested: yes, there is actually a .conf file that controls the rootless mode protections. I forget the path, but if you Google, you'll find it. The catch, of course, is that you have to disable rootless mode in order to edit it, and you have to do that every time you want to edit it, which means multiple reboots per edit.
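For what it's worth, you can at least check whether a given path is covered by the rootless protections before going through the reboot dance: protected files carry a "restricted" file flag that shows up in ls -lO output. Here's a rough sketch of that check (my own illustration, not anything Apple ships; /System/Library is just an example path, not the .conf file I mentioned):

    #!/usr/bin/env python
    # Rough check: is a given path covered by El Capitan's rootless (SIP) protections?
    # Protected files carry the "restricted" file flag, which BSD ls shows with -O.
    # Sketch only; /System/Library is just an example path.
    import subprocess
    import sys

    def is_restricted(path):
        # ls -ldO prints the file-flags column, e.g. "restricted" for protected paths
        out = subprocess.check_output(["ls", "-ldO", path], universal_newlines=True)
        return "restricted" in out

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "/System/Library"
        print(path, "is" if is_restricted(path) else "is not", "protected by rootless")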

Given that I use software from across the 'net on my Mac, much of it not Apple developer signed and some of it development oriented, I figured I'd likely run into problems along the way and would have to keep going back to edit that .conf file over time. And I've been using Macs and Mac OS for years already without rootless mode, so I don't feel too catastrophically bad about not having it now.

Comment Rootless is a problem, and Office 2016 > 2011 (Score 4, Informative) 138

Just upgraded to El Capitan last night.

Problem (1): I found out very quickly that root has been neutered; you can't make any changes to "system" files (in this case, meaning files that were included in the OS distribution, such as the OS-installed folders and binaries). You get a message about not having permission, despite being root and despite no extended attributes being set on the files. It turns out that El Capitan uses a new "rootless" model in which root is no longer root and many parts of the system are off limits to any human user. Solution: boot into recovery mode, start a terminal, enter the command "csrutil disable", then reboot. You'll get root back and will be able to change files again.
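If you want to double-check after that reboot that the protections really are off, csrutil can also be queried from a normal terminal with "csrutil status". A quick sketch of wrapping that check (my own illustration; only the csrutil call itself is Apple's):

    #!/usr/bin/env python
    # Confirm whether SIP ("rootless") is still enabled after rebooting out of recovery.
    # Minimal sketch wrapped around Apple's csrutil tool; nothing here is setup-specific.
    import subprocess

    def sip_enabled():
        # Typical output: "System Integrity Protection status: enabled." (or "disabled.")
        out = subprocess.check_output(["csrutil", "status"], universal_newlines=True)
        return "status: enabled" in out.lower()

    if __name__ == "__main__":
        if sip_enabled():
            print("Rootless is still ON; root cannot touch protected system files.")
        else:
            print("Rootless is OFF; root is root again.")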

Problem (2): Parts of Office 2011 didn't work at all—they just beachballed on startup. I tried to figure this out for a while but didn't see anyone else talking about solutions online, so I installed Office Mac 2016 (I'm already paying for Office 365 anyway so that I can use it on my tablet and phone). I've been using the Office Mac 2016 applications (Outlook, Word, and Excel for work) heavily all day without any trouble, so as a sample of one I can say that in my case 2016 is definitely a better bet on El Capitan than 2011, since Word and Outlook 2011 didn't work at all.

Comment Re:Doesn't sound like malware to me. (Score 4, Informative) 123

I guess I tie the idea of "malware" to two concepts:

1) Mal, as in harmful to the user.
2) Ware, as in software.

To me, "malware" as a concept is basically about end users. It is software that is installed by endusers that does something contrary to what they expect, possibly without their knowledge, that is harmful to them. Malware is inherently deceptive, and the method of its deception is posing as something else. It is directed from bad actors toward strangers that these actors wish to exploit. It is a numbers game, a volume game.

This was not installed by end users, it did not pose as something else, and the harm was directed at an organization by individuals within the organization. It was not distributed widely, but was a single instance. I'd call this a "hack" or a "sabotage" or an "embezzlement" of some kind before I'd call it malware. Maybe a new term is needed.

But it seems a big jump from the widespread distribution of a Windows wizard to millions of hapless end users all the way to the willful and direct modification of company equipment by employees for gain.

Comment The myopia here is pretty bizarre. (Score 1) 440

All of the following hardware from Apple has been absolutely groundbreaking/pathbreaking. When it came to market, there was nothing else like it:

- The original Mac 128k
- The Apple Newton
- The iMac
- The iPod
- The iPhone
- The iPad

Complaining that the Mac couldn't be expanded is like complaining that you can't rebore the cylinders on your Tesla to get more horsepower. These were products that changed users' understandings of the product space in question.

As a young geek, I cut my teeth on multiple computing systems. Three were old 8-bit systems: a C64, a TRS-80 CoCo2, and an Apple II. One was a Mac 128k. They were not even the same kinds of products. To call them all simply "computers" is ridiculous.

The same thing goes for:

Newton vs. other embedded "tablets" of the era (Fujitsu, IBM, GRiD, and others)
iMac vs. white-box PC
iPod vs. previous MP3 players or digital MiniDisc players
iPhone vs. previous smartphones like Treo
iPad vs. Windows CE "Handheld PC Pro" tablets

I can remember when everyone was making fun of the iPad for not including a stylus, a CF card slot, or a removable battery. And some geeks here continue to try to pretend that it was only a matter of advertising and image that caused consumers to gobble iPads up, and eventually, other companies (the entire Android field, for example) to essentially throw away any previous work and design to iPad specs and form factor.

But go ahead. Try it. Get someone a Viewsonic or Fujitsu tablet from 2009 and then hand them an iPad 1 and ask them which they prefer. Consumers aren't as stupid as people here make them out to be. They care about their hard-earned dollars just like everyone else. And they had no problem deciding that stylus-based resistive tablets with two-hour batteries that were an inch and a half thick and two pounds (to accommodate replaceable batteries and removable storage) and that ran Windows... were not something they wanted to spend their money on.

Everyone here is quick to call it all "bullshit." Yet how many here own and use a tablet and/or a smartphone with a multitouch display and a purpose-specific operating system with a touch-oriented user interface? A lot, I'd bet, including many who mock such things as they peck out their mockings on an on-screen keyboard (which they also mock). Who brought these things from the halls of CERN to consumer electronics, not thinking that it was an impossibility?

Apple under Jobs. Mock away.

Comment The geeks on Slashdot don't seem to realize (Score 1) 440

that there is a difference between "marketing," "advertising," and "sales."

Marketing is tremendously important *at the stage of product design.* Marketing, when done profitably, means *understanding your market* (i.e. users) and what they need (which may or may not be what they think they want) and then ensuring that your engineers get wind of that need and design to it.

Good marketing is an integral part of good tech, and happens well before any advertising takes place. The best products may or may not sell themselves (I think there's a good argument to be made that people need to actually know about a product, and it needs to be available in channels, before they are able to realize that it exists and buy it), but the only way to *get* to the "best products" for a large audience is to have a very good marketing division helping engineering to understand just what "best" means for a large, diverse userbase.

Yes, Apple has been very good at marketing over the last two decades. This skill is inseparable from their ability to design, and from the fact that they have designed, and taken to mass production, rather good hardware that is in high demand.

Comment I think geeks miss the point (Score 2) 440

because they're geeks and (understandably, self-servingly) want to point out how central and important geeks are to, say, computing and technology hardware and software development, design, and production.

But it is one of the rarest geniuses on earth to be able to conduct a group of people so that they produce to their maximum potential: to somehow lead talent to actually deliver what it is capable of as a group, and to do the things that everyone else wants to do but falls short of, time and time again.

The founding of Apple was really far less miraculous than the turnaround, when Jobs was able to get a huge bureaucracy to start making really high-quality, completely realized products without the significant compromises that everyone else took for granted. I'm typing this on a MacBook Pro right now. For many years I used ThinkPads. There is a difference in the aesthetic, as is so often pointed out, but it's a difference in the aesthetic of functionalism, one that has to be realized through design, logistics, manufacturing, etc., involving teams of many very smart people. The MacBook Pro isn't perfect, but it's a far superior machine to the ThinkPads I used to use, not because it's faster or has more features but because it has fewer flaws and compromises; it represents something far closer to a fully realized idea and goal.

The same thing goes for smartphones and tablets. I used to carry around Treos in the early 2000s. I used them heavily. They were my go-to tools. I wrote a book on a Treo, no kidding, riding the subway every morning, and it's still generating about $20 in royalties a year (big money, heh). But I used them. The same goes for my Windows CE tablets, first a Vadem Clio and later a ViewSonic something-or-other. But they were exercises in taken-for-granted compromises. They were "as good as it gets": it takes a big company to design and make such things, and the end products, though flawed, were the best that could be accomplished. They were "hard problems" and "best-case solutions" as products. They worked well.

Or so everyone thought.

And then? iPhone. And iPad. And they set an entirely new bar and benchmark for their respective industries. The previous products were obsolete in a moment and everyone has struggled to catch up. Tim Cook has not been able to replicate this precisely because he does not have the particular genius that Steve Jobs had. That's not to say that other people inside Apple don't also have genius of many varieties. Half of the people on Slashdot (okay, not half, but some) are probably geniuses in their own right, in algorithms, or some area of hardware engineering, or whatever.

That doesn't take away from the fact that Steve Jobs was a rare genius in management and leadership. He was the opposite of the pointy-haired boss. We make fun of the pointy-haired boss precisely because we realize that it is the norm. Jobs was not the clueless leader; he was the leader that always somehow managed to get it right and squeeze more great, historic, memorable, and compromise-free stuff out of the geniuses at his company, by far, than the vast majority of other leaders—even the highly regarded, very well paid ones—are able to ever come close to getting out of the geniuses at their own companies.

That's not nothing. And given the multiplier effect of getting the best out of many geniuses, it's quite a lot.

Comment I was thinking of "high end" in terms of (Score 1) 152

what consumers had access to by walking into a retail computer dealership (there were many independent white box makers at the time) and saying "give me your best."

You're probably right about me underestimating the graphics, though it's hard to remember back that far. I'm thinking 800x600 was much more common. If you could get 1024x768, it was usually interlaced (i.e. "auto-headache"), and if I remember correctly it was rare to get it with 24-bit color; S3's first 16-bit-capable chips didn't come out until late 1991, though I could be off.

SCSI was possible, but almost unheard of as stock; you either had to buy an add-on card and deal with driver/compatibility questions, or use one of the ESDI-to-SCSI bridge boards or similar. Same thing with Ethernet, Token Ring, or any other dedicated networking hardware and stack. Most systems shipped with a dial-up "faxmodem" at the time, and users were stuck using Winsock on Windows 3.1, which was nontrivial to get working. Most of the time, there was no real networking support in the delivered hardware/software platform; faxmodems were largely used for dumb point-to-point connections using dial-up terminal emulator software.

And in the PC space, the higher-end you went, the less you were able to actually use the hardware for anything typical. Unless you were a corporate buyer, you bought your base platform as a whitebox, then added specialized hardware matched with specialized software in a kind of 1:1 correspondence—if you needed to perform task X, you'd buy hardware Y and software Z, and they'd essentially be useful only for task X, or maybe for tasks X1, X2, and X3, but certainly not much else—and the same is even true for memory itself. Don't forget this is pre-Windows 95, when most everyone was using Win16 on DOS. We can discuss OS/2, etc., but that again starts to get into the realm of purpose-specific and exotic computing in the PC space.

There were, as I understand it, a few verrry exotic 486 multiprocessors produced, but I've never even heard of a manufacturer and make/model for these—only the rumor that it was possible—so I doubt they ever made it into sales channels of any kind. My suspicion (correct me if I'm wrong) is that they were engineered for particular clients and particular roles by just one or two organizations, and delivered in very small quantities; I'm not aware of any PC software in the 1992 timeframe that was even multiprocessor-aware, or any standard to which it could have been coded. The Pentium processor wasn't introduced until '93, and the Pentium Pro with GTL+ and SMP capabilities didn't arrive until 1995. Even in 1995, most everything was either Win16 or 8- or 16-bit code backward compatible to the PC/XT or earlier, and would remain that way until around the Win98 era.

The UNIX platforms were standardized around SCSI, ethernet, big memory access, high-resolution graphics, and multiprocessing and presented an integrated environment in which a regular developer with a readily available compiler could take advantage of it all without particularly unusual or exotic (for that space) tactics.

Comment Wow, end of an era. (Score 4, Interesting) 152

For more than just a couple of us here, I suspect, there was a time when "Sparc," "UNIX," "graphics," "Internet," and "science" were all nearly synonymous terms.

Simpler times. Boy did that hardware last and last and last in comparison to the hardware of today.

Well, I suppose it can finally no longer be said that the Sparcstation 10 I keep here just for old times' sake can still run "current Linux distributions." But it's still fun to pull it out for people, show them hundreds of megabytes of RAM, 1152x900 24-bit graphics, gigabytes of storage, multiple ethernet channels, and multiple processors, running Firefox happily, and tell them it dates to 1992, when high-end PCs were shipping with mayyybe 16-32MB of RAM, a single 486 processor, 640x480x16 graphics, a few dozen megabytes of storage, and no networking.

It helps people to get a handle on how it was possible to develop the internet and do so much of the science that came out of that period—and why, even though I don't know every latest hot language, the late '80s/early '90s computer science program that I went to (entirely UNIX-based, all homework done using the CLI, vi, and gcc, emphasis on theory, classic data structures, and variously networked/parallelized environments, with labs of Sparc and 88k hardware all on a massive campus network) seems to have prepared me for today's real-world needs better than the programs many younger colleagues went to, with lots of Dell boxes running Windows-based Java IDEs.

"Stupidity, like virtue, is its own reward" -- William E. Davidsen