I can afford it. I'll be back at the karma kap before you can say "drinkypoo huh huh huh you drink poop"
From a tablet user's perspective, Windows 8.1 had a pretty good version of IE. It was full screen (to see the URL bar/tabs/bookmarks, you had to actually affirmatively ask for them by swiping from the bottom), they made good use of gestures (swipe left and right to move through history, etc), and the browser was... well, IE, not the world's best, but it's fairly efficient, fast, and compatible.
They removed that IE interface in Windows 10 (only the desktop IE remains.) The alternative is supposed to be Edge, but it has no gestures, and is never full screen in the same way.
Worse, Edge seems to kill performance on my tablet. The browser itself only ever seems to take up single-digit percentages of CPU, but whenever I start it or leave it running, the entire tablet grinds to a halt. Close it, and performance goes back to normal. I have no idea why. Given the low CPU usage, I wonder if it's just the way it uses the graphics drivers or something similar, but it makes the browser unusable.
I've switched to Chrome in the meantime, which, contrary to early reports and Mozilla's outburst, is actually very easy. Chrome has the same problem as Edge in terms of not being really full screen, but it doesn't have the performance issues, and it does have the intuitive gesture-based UI that IE had (which beats trying to hit buttons with a finger).
Tablet mode in general seems a step down in Windows 10 from the Windows 8.1 approach. Oh well.
Not feeling it. Deeply suspicious. That doesn't mean I'll vote for Hillary - who has electability problems given the vast hordes of people who loathe her - but I'm...
Part of it is Obama. Sure, Obama's kinda, in the last few months, turned back into the guy who ran for President in 2008, but he's still not really that person. Obama's job as candidate and President was to teach those uppity liberals that they can whine and/or get as hopeful as they want, the next guy will always be as bad - as terrible even - as the last guy. He succeeded beyond his wildest dreams.
Part of it is Ron Paul. Ron Paul - from the right - got the same kind of "genuine", "honest", "non-establishment", "heartfelt" plaudits as Sanders gets from the left. People supposedly knew him from the beginning; he's always been the real thing according to them. The Ron Paul Newsletter fiasco gave cause for concern on that. Then my professional life intersected with groups that Ron Paul is associated with indirectly, and in one case directly, and it became obvious the man's a huckster, someone who's very carefully cultivated an image designed to appeal to certain groups who'll donate money, subscribe to paid newsletters and podcasts, and so on en masse. He's actually better at it than, say, Huckabee, who needed to run for President, or Limbaugh, who probably couldn't get it to work without the backing of a radio syndicate.
So I'm kinda cynical these days. He might get my vote in the end anyway, but it may well be a reluctant one, given on the day of the primaries and then forgotten about.
Feeling a little nostalgic at the moment, but also beginning to sense a serious part of why I feel like a dunce today when it comes to computing when once I felt like a genius.
Quick wall of text on the Nostalgia bit
That article on Vector Graphics the other day reminded me a little of the S-100 bus, and the whole move to the PC ISA that came just before I really got into computing. The first computer I really touched was our school's RM 380Z, which was a proprietary CP/M based system, but exposure to that at school was mostly a "You can book 15 minutes to use it at lunchtime but otherwise the school maths teacher will use it to demonstrate things now and then." So the first computer I learned anything from was a friend's VIC 20. I then used a variety of cheap single-board-computers until my Amiga 500+, the most powerful of which was a Sinclair QL.
So... I never touched S-100. And I didn't really touch the PC until there was literally no other choice that was viable. S-100 was never an option for two major reasons: it was expensive, and it was crap. I mean, seriously, awful. S-100 survived because the home computing establishment's equivalent of the Very Serious People decreed it was Serious, and it was Serious because it was "standard".
A typical S-100 system consisted of the S-100 box itself - a dumb motherboard (very dumb: the only components on it were the edge connectors and a few capacitors and resistors to do all that magic EE specialists understand and I could never get my head around) enclosed in a card cage - plus a CPU card, a completely separate memory card or three, a completely separate disk controller, and a completely separate serial I/O card. The disk controller would be hooked up to a disk drive it was designed to control (yes, proprietary), which would be unlike around 90% of other disk drives out there - that is, if you were lucky. And the I/O card would be hooked up to a terminal that frequently was more powerful than the S-100 computer it was hooked up to.
Each combination of I/O and disk controller cards required a custom BIOS so you could run CP/M with it.
The bus itself was essentially the pins of an 8080 turned into a 100 line bus. So you were essentially wiring each card to an 8080, or something pretending to be an 8080, in parallel. This required quite a bit of hardware in each bus to make sure each didn't conflict with other S-100 cards.
Now, technically, you could get graphics (and maybe sound) cards, but that was unusual. Likewise, you could get more exotic CPUs - though getting software for them was a problem. But the typical S-100 system was text only with a Z80, and the typical S-100 system owner spent rather a lot of time trying to figure out how to order a "standard" CP/M application in a form that would run on their "standard" S-100 system, taking into account their disk drive that only 10% of the market used and their terminal that used VT-52 codes rather than VT-100 codes or (insert one of the other popular terminals here.)
Did I mention this was expensive? While the original Altair 8800 was $500 or so, it came with nothing but the card cage and motherboard, the CPU card, and a little bit of memory. And even on this, the makers barely broke even, expecting to make their profits on after-sales. Useful memory, a terminal, an I/O card, a disk controller, and a disk drive pushed up the price considerably. Realistically, a typical "useful" S-100 system cost somewhere around $4,000.
Given all of that, it's not really surprising it got supplanted by the PC. Much is made of the fact IBM was taken more seriously by people outside of the personal computer industry in 1981, and that undoubtedly helped, but I can't help but feel that S-100 couldn't have survived for much longer regardless. You could buy a complete system from Commodore or Apple that was more capable for a third of the price even in 1981. The PC didn't need to be cheap, it had IBM's name behind it, but it was obviously more capable than S-100, and it was obvious that if the architecture was adopted by the industry, machines based upon it would be more standardized.
The "Feeling like a dunce" bit
So anyway, that was my train of thought. And it occurred to me that the fact I even have opinions on this suggests my mindset is still stuck there. Back then, even when you programmed in BASIC, you were exerting almost direct control over the hardware. You had a broad idea of what the machine did, what memory locations were mapped onto what functions, and every command you typed affected the computer in a predictable way. The computers themselves were (mostly) predictable too.
As time wore on, especially with the advent of multitasking (which I welcomed, don't get me wrong) you learned to understand your software would be only one party to how the computer behaved, but you understood that if you followed the rules, and the other programmers did too, you could kinda get your head around what was happening to it.
And you felt like a genius if you understood this. And I say "if", because it was possible.
At some point that stopped being possible. Part of it was the PC ISA - the fact that an architecture from 1981 was still in use in the mid-nineties, by which time it was long in the tooth and needed serious work. Its deficiencies were addressed in software and hardware. Intel essentially replaced the CPU, leaving a compatible stub there to start older applications, and the industry - after a few false starts - threw out most of the PC design and replaced it with the PCI architecture, again, like Intel, leaving compatible stubs here and there to ensure older stuff would work. And Microsoft worked on making Windows the real interface software would use to access the hardware.
Logically the right thing to do under the circumstances is to take back control, to use lower level APIs and simpler sets of rules, but in practice that's just not practical, and doing so means that my tools no longer fit inside the ecosystem with everyone else's. So it's not the right thing - it's actually the worst thing I can do, and if I tried to do it, I'd be shunned as a developer.
I was a genius once because I (mostly) understood the computers I was programming. I feel like a dunce today because that's just not possible any more.
The following is my prepared answer for anyone who asks me this stupid fucking question in any interview in the future.
extension Int {
    func modBool(_ modulus: Int) -> Bool {
        return self % modulus != 0
    }
}

for x in 1...100 {
    print((x.modBool(3) ? "" : "Fuck ") +
          (x.modBool(5) ? "" : "You") +
          ((x.modBool(3) && x.modBool(5)) ? "\(x)" : ""))
}
I think the whole mobile operating system thing has screwed up GUI design to a certain degree. Microsoft, Ubuntu, and GNOME have all been brave and tried something new, but what they ended up with was highly unpopular on the desktop. And to be honest, I think only Microsoft ended up with something truly good on a touch interface, though I admit to not using Ubuntu or GNOME in those contexts - I'm just aware that they've not really encouraged an ecosystem in which applications work well in a tablet environment, leaving users with only the main shell being touch-friendly. So the loss of optimization for the desktop led to no significant gains elsewhere.
The way I'm seeing it, Windows 10 seems to be genuinely exciting, and a decent modern desktop, that also encourages cross interface design. Microsoft has learned from the mistakes it made with Windows 8, kept the good parts, and put together something truly great and modern.
I don't really want to be stuck with Windows though as my primary OS. I'm hoping Ubuntu et al actually learn from it.
This is something you'll never normally hear from me, but perhaps they need a Miguel type figure to take a lead in either GNOME or Ubuntu. At this point, at least to me, it looks like Microsoft is the one with the good ideas about how a UI should work and the relationship of an application to the UI frameworks of the underlying OS. I don't want anyone to clone Windows, but it would be nice to learn from it, at least.
Back in the 1990s, nerds like me put together our own "desktops", running random window managers, app launchers, and file managers (if that) that seemed to go together. I'm feeling like the FOSS "desktop" is heading back to that era, of stuff that doesn't really go together, being shoehorned to fit, with no real philosophy binding the system together.
I really want to like libressl. But it pretends to be openssl badly. They refused a patch that would have mitigated this whole RAND_egd problem by simply returning that it doesn't work when someone tries to use it, which means that you commonly need a patch to use it at all. If it's not going to work like openssl, then it shouldn't occupy the same space in the filesystem.
This is not news to most people, but I just tried it for the first time on my first-ever normal Debian Wheezy install (I've always done minimal, netinst etc. and built it up from there for a purpose) and wow, GNOME3 is amazingly horrible. It makes Unity look usable. If that was the idea, mission accomplished, I guess.
When Jobs unveiled the iPod in October 2001, the first comment on a gadget site was that it had less storage than existing players, and no Wi-Fi connectivity, making it "lame". More than 400m have been sold.
How the Apple Watch could create a $1tn company - The Guardian
Welp, I can use Slashdot in Chrome and not in Firefox, which implies that something I'm blocking in Firefox is preventing the new improved Slashdot from working. What new spyware bullshit do I have to enable to use Slashdot now? Thanks, DICE! You'll run this place the rest of the way into the ground any day now.
Then they modded down five of my comments in a row. Why doesn't the system catch this kind of obviously abusive moderation? Oh right, because this is slashdot, not someplace with competent employees.
If moderation on slashdot were intelligently designed, this person's abusive moderation would have been autodetected and they would have been banned from moderation permanently.
Seen rather a lot of the "Parents are evil because they did something wrong because they believed that something was right" meme that's going around at the moment.
Worst case: massive harassment and threats against the parents of a trans teenager who killed herself blaming their insistence on "Christian" therapy. Horrible case, entirely the wrong approach by the parents, but at the same time if the parents hadn't cared, there wouldn't have been any therapy to begin with, bogus or not. The parents were convinced by people they trusted that the wrong thing was the right thing. Screaming at them, particularly at a time when they are mourning, that they are evil and heartless is evil and heartless.
Now seeing it in the vaccine "debate". Not the only problem I'm having with the pro-vax side (Reminder: yes, I'm pro-vax, and yes, I'm in favor of it being mandatory for the obvious deadly common diseases), but there's a world of difference between a lazy parent not having their kid vaccinated because they can't be bothered, and a parent being too scared to vaccinate their child because they've heard from convincing sources that vaccinations can cause terrible things.
"You know, we've won awards for this crap." -- David Letterman