User Journal

Journal Journal: The Clown Car 9

Jeb Bush - he's not so bad. I disagree with him, but he's basically another establishment figure and, actually, if truth be told, I suspect he's slightly less insane than most in the establishment. He seems smarter than W. On the other hand, he hates trains, so f--- him. President Bush? "OHWELL".

Rubio - has said nothing particularly impressive thus far. Seems to be mostly an empty suit. President Rubio? "OHWELL"

Trump - amused that the candidate who seems to be pandering and flip-flopping the most is the one Republicans think is unusually honest. Other than that, if he's actually genuinely running for election at this point, it's because he's suddenly realized he has support he never expected at the beginning. President Trump? "OHMYHELL"

Carson - I'm sure he's a nice guy, and he's got to be pretty intelligent on some level, but he seems out of his league when it comes to politics, and genuinely unclear about how to balance the need to look like a raving lunatic in front of his base, with the need to not look like a raving lunatic to everyone else. President Carson? "OHDEAR"

Chris Christie - There's really not a lot right with this guy. Insanely Machiavellian and happy to do the wrong thing if it means looking good in front of the right people. Plus holds grudges. He's essentially the next Nixon. President Christie? "OHMYHELL".

Carly Fiorina - The only people I know in tech who support her are the kinds of idiots who rave about how much they hate government employees simply by virtue of the fact that they're employed by a government. The chances of Fiorina merging the US with Canada, Mexico, and the UK, and reducing the combined GDP of the four to only slightly more than what the US produces today, are pretty slim, though. While she showed the usual sociopathic instincts of any CEO at HP, her excesses could possibly have been due to a misguided belief that she was saving the company. Outside of HP, she seems to be a Rubio-like empty suit, mumbling platitudes to whip up the base while revealing nothing credible about her own views. On that basis, despite HP, I must rate President Fiorina an "OHWELL"

The rest: All are either religious nuts, which rules them out of the running (despite everything, the Republicans never go with those), no-names, or people with the surname "Paul". They're not going to win the nomination. And if any of them somehow do - other than one of the no-names winning, I guess - it's pretty much an OHMYHELL all the way down.

Ratings explained:

OHWELL - Hey I voted Democratic, but I don't see this guy as destroying the country, so I'm not going to behave like a Republican does during a Democratic administration. Previous candidates qualifying as OHWELLs: Ford, Bush Sr, Dole, Mittens
OHDEAR - Suspect this guy won't be able to hold it together; fairly uncomfortable with him winning. Previous OHDEARs: St Reagan, McCain
OHMYHELL - This guy will probably ruin the country in some shape or form, either through complete incompetence, ideological nuttery, or sheer evil: Nixon, Bush/Cheney

User Journal

Journal Journal: My prediction, but it has an "If" in it 6

If it looks like Sanders may defeat Clinton, Biden will throw his hat in the ring.

If Clinton gets defeated by Sanders - and perhaps even if Sanders merely comes close - in the first few primaries, Biden will campaign very seriously, and the establishment will swing behind him. Biden will probably win the nomination under these circumstances.

It's an "If", but I'd put the chances of the above happening at around 30% right now. Sanders is doing well, and there have been polls showing slight (within the margin of error) leads in a couple of States. But I doubt the Democratic establishment are convinced Clinton will lose... yet.

Can Biden win the election? I know racists who voted for Bush in 2000 and 2004 who voted for Obama in '08 and '12 because Biden was on the ticket. Don't underestimate him. He's almost certainly a better bet than Clinton, but I suspect there's some deal making going on behind the scenes that's preventing him from jumping in the race at this stage. If Clinton starts to look vulnerable to Sanders, the pressure on him to run will be immense, backroom deals or no backroom deals.

BTW the fact I'm predicting this means it'll never happen.

User Journal

Journal Journal: Edge kinda sucks 2

From a tablet user's perspective, Windows 8.1 had a pretty good version of IE. It was full screen (to see the URL bar/tabs/bookmarks, you had to affirmatively ask for them by swiping from the bottom), it made good use of gestures (swipe left and right to move through history, etc), and the browser was... well, IE, not the world's best, but it was fairly efficient, fast, and compatible.

They removed that IE interface in Windows 10 (only the desktop IE remains). The alternative is supposed to be Edge, but it has no gestures and is never full screen in the same way.

Worse, Edge seems to kill performance on my tablet. The browser itself only ever seems to take up single-digit percentages of CPU, but regardless, when I start it or have it running, the entire tablet grinds to a halt. Close it, and performance goes back to normal. I have no idea why. Given the low CPU usage, I wonder if it's just the way it uses the graphics drivers or something similar, but it makes the browser unusable.

I've switched to Chrome in the meantime, which, contrary to early reports and Mozilla's outburst, is actually very easy. Chrome has the same problem as Edge in terms of not being really full screen, but it doesn't have the performance issues, and it does have the intuitive gesture-based UI that IE had (which beats trying to hit buttons with a finger).

Tablet mode in general seems a step down in Windows 10 from the Windows 8.1 approach. Oh well.

User Journal

Journal Journal: Bernie Sanders 48

Not feeling it. Deeply suspicious. That doesn't mean I'll vote for Hillary - who has electability problems given the vast hordes of people who loathe her - but I'm...

Part of it is Obama. Sure, Obama's kinda, in the last few months, turned back into the guy who ran for President in 2008, but he's still not really that person. Obama's job as candidate and President was to teach those uppity liberals that they can whine and/or get as hopeful as they want; the next guy will always be as bad - as terrible, even - as the last guy. He succeeded beyond his wildest dreams.

Part of it is Ron Paul. Ron Paul - from the right - got the same kind of "genuine", "honest", "non-establishment", "heartfelt" plaudits that Sanders gets from the left. People supposedly knew him from the beginning; he's always been the real thing, according to them. The Ron Paul Newsletter fiasco gave cause for concern on that. Then my professional life intersected with groups that Ron Paul is associated with indirectly, and in one case directly, and it became obvious the man's a huckster, someone who's very carefully cultivated an image designed to appeal to certain groups who'll donate money, subscribe to paid newsletters and podcasts, and so on en masse. He's actually better at it than, say, Huckabee, who needed to run for President, or Limbaugh, who probably couldn't get it to work without the backing of a radio syndicate.

So I'm kinda cynical these days. He might get my vote in the end anyway, but it may well be a reluctant one, given on the day of the primaries and then forgotten about.

User Journal

Journal Journal: Belonging to a different era 2

Feeling a little nostalgic at the moment, but also beginning to sense a serious part of why, when it comes to computing, I feel like a dunce today when I once felt like a genius.

Quick wall of text on the Nostalgia bit

That article on Vector Graphics the other day reminded me a little of the S-100 bus, and the whole move to the PC ISA that came just before I really got into computing. The first computer I really touched was our school's RM 380Z, a proprietary CP/M-based system, but exposure to that at school was mostly a "You can book 15 minutes to use it at lunchtime, but otherwise the school maths teacher will use it to demonstrate things now and then." So the first computer I learned anything from was a friend's VIC-20. I then used a variety of cheap single-board computers until my Amiga 500+, the most powerful of which was a Sinclair QL.

So... I never touched S-100. And I didn't really touch the PC until there was literally no other choice that was viable. S-100 was never an option for two major reasons: it was expensive, and it was crap. I mean, seriously, awful. S-100 survived because the home computing establishment's equivalent of the Very Serious People decreed it was Serious, and it was Serious because it was "standard".

A typical S-100 system consisted of the S-100 box itself - a dumb motherboard (very dumb: the only components on it were the edge connectors and a few capacitors and resistors doing all that magic EE specialists understand and I could never get my head around) enclosed in a card cage - plus a CPU card, a completely separate memory card or three, a completely separate disk controller, and a completely separate serial I/O card. The disk controller would be hooked up to a disk drive it was designed to control (yes, proprietary), which would be unlike around 90% of the other disk drives out there - that is, if you were lucky. And the I/O card would be hooked up to a terminal that was frequently more powerful than the S-100 computer it was attached to.

Each combination of I/O and disk controller cards required a custom BIOS so you could run CP/M with it.

The bus itself was essentially the pins of an 8080 turned into a 100-line bus. So you were essentially wiring each card to an 8080, or something pretending to be an 8080, in parallel. This required quite a bit of hardware on each card to make sure it didn't conflict with the other S-100 cards.

Now, technically, you could get graphics (and maybe sound) cards, but that was unusual. Likewise, you could get more exotic CPUs - though getting software for them was a problem. But the typical S-100 system was text-only with a Z80, and the typical S-100 system owner spent rather a lot of time trying to figure out how to order a "standard" CP/M application in a form that would run on their "standard" S-100 system, taking into account their disk drive that only 10% of the market used and their terminal that used VT-52 codes rather than VT-100 codes or (insert one of the other popular terminals here).

Did I mention this was expensive? While the original Altair 8800 was $500 or so, it came with nothing but the card cage and motherboard, the CPU card, and a little bit of memory. And even at that price the makers barely broke even, expecting to make their profits on after-sales. Useful memory, a terminal, an I/O card, a disk controller, and a disk drive pushed the price up considerably. Realistically, a typical "useful" S-100 system cost somewhere around $4,000.

Given all of that, it's not really surprising it got supplanted by the PC. Much is made of the fact that IBM was taken more seriously by people outside the personal computer industry in 1981, and that undoubtedly helped, but I can't help feeling that S-100 couldn't have survived much longer regardless. You could buy a complete system from Commodore or Apple that was more capable for a third of the price, even in 1981. The PC didn't need to be cheap - it had IBM's name behind it - but it was obviously more capable than S-100, and it was obvious that if the architecture was adopted by the industry, machines based upon it would be more standardized.

The "Feeling like a dunce" bit

So anyway, that was my train of thought. And it occurred to me that the fact I even have opinions on this suggests my mindset is still stuck there. Back then, even when you programmed in BASIC, you were exerting almost direct control over the hardware. You had a broad idea of what the machine did, what memory locations were mapped onto what functions, and every command you typed affected the computer in a predictable way. The computers themselves were (mostly) predictable too.

As time wore on, especially with the advent of multitasking (which I welcomed, don't get me wrong), you learned that your software would be only one party to how the computer behaved, but you understood that if you followed the rules, and the other programmers did too, you could kinda get your head around what was happening.

And you felt like a genius if you understood this. And I say "if", because it was possible.

At some point that stopped being possible. Part of it was the PC ISA - the fact that an architecture from 1981 was still in use in the mid-nineties, by which time it was long in the tooth and needed serious work. Its deficiencies were addressed in software and hardware. Intel essentially replaced the CPU, leaving a compatible stub there to start older applications, and the industry - after a few false starts - threw out most of the PC design and replaced it with the PCI architecture, again, like Intel, leaving compatible stubs here and there to ensure older stuff would work. And Microsoft worked on making Windows the real interface software would use to access the hardware.

After a while, there were so many abstractions between your software and the underlying system that it became really hard to determine what was going on underneath. If I program, I now know there are rules I can follow that will reduce the chance of my application being a problem... today. But I don't know if that's the case for the next version of Windows, and all I know is how to reduce the chances, not how to eliminate them. I don't know if the Java I'm writing will generate a webpage containing Javascript that will contain a memory leak that'll cause the part of the process managing the tab it's in to bloat up an additional 100M or so. I can hope it won't, and use mitigation strategies to avoid things that might cause problems, but there are so many things outside of my control that I have to trust now, it's just not practical to keep track of them all.

Logically the right thing to do under the circumstances is to take back control, to use lower-level APIs and simpler sets of rules, but in practice that just isn't feasible, and doing so means my tools no longer fit inside the ecosystem with everyone else's. So it's not the right thing - it's actually the worst thing I could do, and if I tried to do it, I'd be shunned as a developer.

I was a genius once because I (mostly) understood the computers I was programming. I feel like a dunce today because that's just not possible any more.

User Journal

Journal Journal: Winduhs

I think the whole mobile operating system thing has screwed up GUI design to a certain degree. Microsoft, Ubuntu, and GNOME have all been brave and tried something new, but what they ended up with proved highly unpopular on the desktop. And to be honest, I think only Microsoft ended up with something truly good on a touch interface, though I admit to not using Ubuntu or GNOME in those contexts, just being aware that they've not really encouraged an ecosystem for applications to work well in a tablet environment, leaving users with only the main shell being friendly. So the loss of optimization for the desktop led to no significant gains elsewhere.

The way I'm seeing it, Windows 10 seems to be genuinely exciting and a decent, modern desktop that also encourages cross-interface design. Microsoft has learned from the mistakes it made with Windows 8, kept the good parts, and put together something truly great and modern.

I don't really want to be stuck with Windows though as my primary OS. I'm hoping Ubuntu et al actually learn from it.

This is something you'll never normally hear from me, but perhaps they need a Miguel-type figure to take a lead in either GNOME or Ubuntu. At this point, at least to me, it looks like Microsoft is the one with the good ideas about how a UI should work and about the relationship of an application to the UI frameworks of the underlying OS. I don't want anyone to clone Windows, but it would be nice to learn from it, at least.

Back in the 1990s, nerds like me put together our own "desktops", running random window managers, app launchers, and file managers (if that) that seemed to go together. I'm feeling like the FOSS "desktop" is heading back to that era, of stuff that doesn't really go together, being shoehorned to fit, with no real philosophy binding the system together.

User Journal

Journal Journal: Why libressl is stupid 2

I really want to like libressl, but it pretends to be openssl badly. They refused a patch that would have mitigated this whole RAND_egd problem by simply having the function report that it doesn't work when someone tries to use it, which means you commonly need a patch to use libressl at all. If it's not going to work like openssl, then it shouldn't occupy the same space in the filesystem.
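
For what it's worth, the mitigation is tiny. I don't have the refused patch in front of me, so this is only a sketch of the idea, using the standard openssl prototype for RAND_egd - keep the entry point, but have it always report failure instead of vanishing from the library:

    /* Hypothetical compatibility stub (not LibreSSL's actual code): keep the
     * OpenSSL-era RAND_egd() entry point, but always report failure, so that
     * software probing for EGD support degrades gracefully instead of
     * failing to build or link against libressl. */
    int
    RAND_egd(const char *path)
    {
        (void)path;   /* EGD sockets aren't supported; the path is ignored */
        return -1;    /* OpenSSL convention: -1 means no entropy was gathered */
    }

Any caller that already checks the return value just falls back to its other entropy sources, the same way it would if the EGD socket simply weren't there.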

User Journal

Journal Journal: OMFG GNOME3 is asstacular

This is not news to most people, but I just tried it for the first time on my first-ever normal Debian Wheezy install (I've always done minimal, netinst etc. and built it up from there for a purpose) and wow, GNOME3 is amazingly horrible. It makes Unity look usable. If that was the idea, mission accomplished, I guess.

User Journal

Journal Journal: The famous Debian ctte vote

Aide memoire

> Bdale Garbee writes:
> > - - - start ballot - - -
> > We exercise our power to decide in cases of overlapping jurisdiction
> > (6.1.2) by asserting that the default init system for Linux
> > architectures in jessie should be
> > D systemd
> > U upstart
> > O openrc
> > V sysvinit (no change)
> > F requires further discussion
> > Should the project pass a General Resolution before the release of
> > "jessie" asserting a "position statement about issues of the day" on
> > init systems, that position replaces the outcome of this vote and is
> > adopted by the Technical Committee as its own decision.
> > - - - end ballot - - -
> I vote D U O V F.

On Sat, Feb 08, 2014 at 12:16:51PM -0800, Russ Allbery wrote:
> I vote:
> D U O V F

On Sat, Feb 08, 2014 at 02:18:39PM -0800, Steve Langasek wrote:
> I vote F U D O V

On Sat, Feb 08, 2014 at 02:51:13PM -0800, Don Armstrong wrote:
> I vote D > U > O > V > F.

On Sat, Feb 08, 2014 at 02:57:52PM -0800, Keith Packard wrote:
> I vote:
> 1. D
> 2. U
> 3. O
> 4. V
> 5. F

On Sun, Feb 09, 2014 at 01:04:31PM +0000, Colin Watson wrote:
> I vote UDOFV.

On Sun, Feb 09, 2014 at 07:15:58PM +0000, Ian Jackson wrote:
> I vote F, V, O, U, D.

On Tue, Feb 11, 2014 at 09:07:11AM +0100, Andreas Barth wrote:
> Thus voting U, F, D, O, V.

So that's all the votes in, by my count. Summary is:

    4x D U O V F (bdale, russ, keith, don)
       F U D O V (steve)
       U D O F V (colin)
       F V O U D (ian)
       U F D O V (andi)

Note that only Ian ranked sysvinit above upstart or systemd.
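
While I'm at it, here's a quick-and-dirty tally of the pairwise preferences in those eight ballots. It's just the raw matrix - Debian's actual Standard Resolution Procedure layers default-option and casting-vote rules on top of this - but it does show, for example, that systemd versus upstart splits 4-4 on these ballots alone.

    /* Rough sketch: count, for each pair of options, how many of the eight
     * ballots above rank the first option over the second. Raw pairwise
     * matrix only - no quorum, supermajority, or casting-vote handling. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        const char *options = "DUOVF"; /* systemd, upstart, openrc, sysvinit, further discussion */
        const char *ballots[] = {
            "DUOVF", /* bdale */  "DUOVF", /* russ  */
            "FUDOV", /* steve */  "DUOVF", /* don   */
            "DUOVF", /* keith */  "UDOFV", /* colin */
            "FVOUD", /* ian   */  "UFDOV", /* andi  */
        };
        int wins[5][5] = {{0}}; /* wins[a][b] = ballots ranking option a above option b */

        for (size_t v = 0; v < sizeof ballots / sizeof ballots[0]; v++)
            for (int i = 0; ballots[v][i]; i++)
                for (int j = i + 1; ballots[v][j]; j++) {
                    int a = (int)(strchr(options, ballots[v][i]) - options);
                    int b = (int)(strchr(options, ballots[v][j]) - options);
                    wins[a][b]++;
                }

        for (int a = 0; a < 5; a++)
            for (int b = a + 1; b < 5; b++)
                printf("%c over %c: %d    %c over %c: %d\n",
                       options[a], options[b], wins[a][b],
                       options[b], options[a], wins[b][a]);
        return 0;
    }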

User Journal

Journal Journal: What do I have to enable now? Fucking DICE. 5

Welp, I can use Slashdot in Chrome and not in Firefox, which implies that something I'm blocking in Firefox is preventing the new improved Slashdot from working. What new spyware bullshit do I have to enable to use Slashdot now? Thanks, DICE! You'll run this place the rest of the way into the ground any day now.
