The next time a similar story comes out for BeOS, I'll probably be interested.
I guess operating systems acquiring HiDPI support is one of the reasons for going flat.
My own guess is that someone has a color e-ink display in the works.
The last time I checked, cheap color e-ink displays simply couldn't show enough colors for photos. They were fine for charts and graphs, and fine for color-coding text, but if you tried to display something photorealistic on them, well, it looked worse than old 90s-era 16-bit displays.
Flat icons with few colors would work spectacularly well on such displays.
If these icon changes are actually in support of color displays that draw almost no power and are completely readable in full sunlight, then bring on the ugly icons please!
Incorrect. Buying a *single* ticket is worth it, since it puts you on the playing field at least.
I do not agree, because everyone is already on the playing field.
There is always a nonzero chance that you'll find a winning ticket, or receive one as a gift. That's a true thing that many people haven't internalized.
If you can internalize it, then you're always playing, and the question is whether the increase in odds from your zeroth purchase to your first purchase is worth the cost. (I have never decided that it was, so far.)
Is Linux really aiming that hard at becoming a toy OS for thirteen-year-olds, like Windows?
Speaking as an old-time Unix neckbeard, the best evidence I've got is that the answer is "yes". (cf. "systemd", "Network Manager")
You mean like Devuan?
I'm currently trying to decide between that, Debian/kFreeBSD, and stock Debian with systemd purged and locked/held (so that I can't accidentally install something that requires it).
Why Debian/kFreeBSD? Because systemd is not portable and won't run on the FreeBSD kernel, so Debian/kFreeBSD literally cannot make the switch. I do not care very much about the kernel or about Linux-specific features, so I don't really see much of a downside to it.
(For me one of the core tenets of the Unix philosophy has always been portability. I do not want everyone running on the same kernel, or the same CPU architecture, or the same byte order, or the same word size, or anything. Code should be portable across all of that. Write pure ANSI C if you can, and add POSIX if you must, and if that won't do the trick, then compromise but feel ashamed of it.)
At the moment I'm leaning towards "stock Debian with sysvinit, and further with systemd explicitly blacklisted". Even with all the bullcrap that's been going on, it'll be a while before Debian truly requires systemd for much, and I can hope that they'll just change course before then. I can always switch to one of the other approaches later.
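One way to express "systemd explicitly blacklisted" on Debian is a negative apt pin. This is a sketch: the file path is just the conventional location, and the package list may need adjusting for a given release.

```
# /etc/apt/preferences.d/no-systemd
# (path and package list are illustrative; adjust for your release)
Package: systemd systemd-sysv
Pin: release *
Pin-Priority: -1
```

With a negative pin priority, apt refuses to install the named packages, so anything that hard-depends on them fails loudly at install time instead of quietly pulling systemd in.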
C is still my second-favorite language.
It was my favorite from... around 1983 to 1989, at which time Objective-C became my favorite.
(Apple is working towards destroying Objective-C, so my favorite may very well go back to C in a couple of years.)
Nifty! If this plays out the right way, I may be able to drop my plans to abandon Debian on my servers.
I suppose if the new tech amounts to "live interactive chat session with a webcam musician", that might provide an experience that some people are willing to pay for that's difficult to pirate.
It's hard to imagine much else that would actually work.
The headline made me do a double-take. It's like asking "why is it taking so long to develop an invisibility cloak?" or "why is it taking so long to develop flying cars?".
Or can you give me one good, solid reason why an ordinary person would want to use a non-Google XMPP server?
Some employers provide on-site, supported XMPP servers. Until recently, I was able to use ours to collaborate with external partners on GTalk, via federation.
Some vendors provide built-in XMPP servers as part of other products. I'm aware of one telephony platform that does so and one IT helpdesk service that does so. Using their servers enables certain useful features, like "they look like text messages to phone users" and "customers can open issues with the help desk by sending them IMs". Those are more useful when federation works.
Some web service providers have XMPP support in their service platforms. I used to be able to have IFTTT send messages to me, due to federation. Google's announcement about turning off the service has caused them to remove the channel from their service entirely. Now I can't use it anymore.
(Those are just the ones that have been impacting me personally as of late. I'm sure others could think of more. No, they're not mainstream. Yes, they're real.)
As an analytics manager you want to see Excel? Why? If someone on your team can do everything else that you list, plus maybe Python, Octave or MATLAB then why would you want to dirty yourself with Excel?
At a guess: because a ton of the data you receive will arrive in Excel format by default, and because a ton of the recipients you're going to be asked to deliver data to will want it in Excel format.
I've begun doing some analytics in my current job (which is pretty much all Unix, all server-side), and I'm finding this to be the case surprisingly often.
And perhaps we can agree that a level 2/10 would not likely get hired anywhere.
I'm not prepared to concede that. I'm also not prepared to concede that it should be the case.
If there are a large number of 2/10 programmers out there not getting hired, then more value can be extracted from the workforce if someone can come up with a sandbox or somesuch in which they can actually be used productively.
Maybe you get four of them and have them keep swapping off in pair-programming pairs. Maybe you only let them write code that goes into a continuous-integration server. Maybe you only let them write social games. (Heck, maybe you use them as living, breathing "fuzzing" tools for toolchain developers to use in debugging.) I don't know.
But the economy is better off if we can figure out how to extract value from them. (They should be paid very poorly compared to better programmers, however.)
it's not the act of recollection that causes the memory to decay.
What's your basis for saying so?
(I mean, it's trivially true that it's not the only thing that causes memory to decay. I'm not asking about that. Do you have a basis for saying that it's not a thing that causes memory to decay?)
The act of recollection might very well cause the memory to decay. Our brains may "wrap" it in a "macro" that "re-writes" it as we recollect it, so that it does not seem to decay as a side-effect of recollection. I'm not aware of data we have that would let us rule this out.
Based on that, I find the whole article suspect.
So you're uninterested in turning things on and off, and adjusting the volume?
I'm uninterested in "just one remote for everything". (Volume is not a problem.) I have seen that work out so rarely that I prefer to avoid situations where people attempt it.
Archaic. None of the "remotes" that I use in my living room are keyboards.
When I hear "remote," I think "something simple and dedicated that I can hold in one hand to easily control remotely-located things." I don't think "something with at least 60 buttons, some of which are actually useful, that takes up too much room on the coffee table, and functions only as a basic input for a single device."
Huh? I'm not talking about the remote being a keyboard, I'm talking about the remote identifying itself as a keyboard. It's the equivalent of bar-code scanners that you plug into a keyboard port and that "type" whatever you scan with them.
Keyboards have some buttons that are very good for remote control functions, like "up" and "down" and "left" and "right" and "enter" and "escape" and "pause/play" and "fast forward". Make a handheld stick with just those buttons, and have it pair over bluetooth as a keyboard, and that remote would then work with an Apple TV, an Ouya, a Fire TV, a Linux box running MythTV, a Windows box running Steam in big picture mode, et cetera, et cetera. That's what I'm talking about.
Neat. Now how easily does it switch between presentations, AppleTV and Ouya? Does it change inputs on the TV and/or AVR? Turn things on and back off again? Turn the volume up and down?
No? Oh. I'd consider that a lousy remote, then.
I see. There are features in a remote that I'm so uninterested in that I don't even think of them, that you consider absolutely essential. (Though a subset of those are easy. They could all be easy given specific device choices which I'm not going to assume.)
You and I will not like the same remotes.
The Wright Bothers weren't the first to fly. They were just the first not to crash.