User Journal

Journal Journal: Edge kinda sucks

From a tablet user's perspective, Windows 8.1 had a pretty good version of IE. It was full screen (to see the URL bar/tabs/bookmarks, you had to actually affirmatively ask for them by swiping from the bottom), they made good use of gestures (swipe left and right to move through history, etc), and the browser was... well, IE, not the world's best, but it's fairly efficient, fast, and compatible.

They removed that IE interface in Windows 10 (only the desktop IE remains.) The alternative is supposed to be Edge, but it has no gestures, and is never full screen in the same way.

Worse, Edge seems to kill performance on my tablet. The browser itself only ever seems to take up single-digit percentages of CPU, but regardless, whenever I start it or leave it running, the entire tablet grinds to a halt. Close it, and performance goes back to normal. I have no idea why. Given the low CPU usage, I wonder if it's just the way it uses the graphics drivers or something similar, but it makes the browser unusable.

I've switched to Chrome in the meantime, which contrary to early reports and Mozilla's outburst, is actually very easy. Chrome also has the same problems as Edge in terms of not being really full screen, but it doesn't have the performance issues, and it does have the intuitive (and better than trying to hit buttons with a finger) gesture based UI that IE had.

Tablet mode in general seems a step down in Windows 10 from the Windows 8.1 approach. Oh well.

User Journal

Journal Journal: Bernie Sanders

Not feeling it. Deeply suspicious. That doesn't mean I'll vote for Hillary - who has electability problems given the vast hordes of people who loathe her - but I'm...

Part of it is Obama. Sure, Obama's kinda, in the last few months, turned back into the guy who ran for President in 2008, but he's still not really that person. Obama's job as candidate and President was to teach those uppity liberals that they can whine and/or get as hopeful as they want, the next guy will always be as bad - as terrible even - as the last guy. He succeeded beyond his wildest dreams.

Part of it is Ron Paul. Ron Paul - from the right - got the same kind of "genuine", "honest", "non-establishment", "heartfelt" plaudits as Sanders gets from the left. People supposedly knew him from the beginning; he'd always been the real thing, according to them. The Ron Paul Newsletter fiasco gave cause for concern on that. Then my professional life intersected with groups that Ron Paul is associated with indirectly, and in one case directly, and it became obvious the man's a huckster, someone who has very carefully cultivated an image designed to appeal to certain groups who'll donate money, subscribe to paid newsletters and podcasts, and so on en masse. He's actually better at it than, say, Huckabee, who needed to run for President, or Limbaugh, who probably couldn't get it to work without the backing of a radio syndicate.

So I'm kinda cynical these days. He might get my vote in the end anyway, but it may well be a reluctant one, given on the day of the primaries and then forgotten about.

User Journal

Journal Journal: Belonging to a different era

Feeling a little nostalgic at the moment, but also beginning to sense a serious part of why I feel like a dunce today when it comes to computing when once I felt like a genius.

Quick wall of text on the Nostalgia bit

That article on Vector Graphics the other day reminded me a little of the S-100 bus, and the whole move to the PC ISA that came just before I really got into computing. The first computer I really touched was our school's RM 380Z, which was a proprietary CP/M based system, but exposure to that at school was mostly a "You can book 15 minutes to use it at lunchtime but otherwise the school maths teacher will use it to demonstrate things now and then." So the first computer I learned anything from was a friend's VIC 20. I then used a variety of cheap single-board-computers until my Amiga 500+, the most powerful of which was a Sinclair QL.

So... I never touched S-100. And I didn't really touch the PC until there was literally no other choice that was viable. S-100 was never an option for two major reasons: it was expensive, and it was crap. I mean, seriously, awful. S-100 survived because the home computing establishment's equivalent of the Very Serious People decreed it was Serious, and it was Serious because it was "standard".

A typical S-100 system consisted of the S-100 box itself - a dumb motherboard (very dumb: the only components on it were the edge connectors and a few capacitors and resistors doing all that magic EE specialists understand and I could never get my head around) enclosed in a card cage - plus a CPU card, a completely separate memory card or three, a completely separate disk controller, and a completely separate serial I/O card. The disk controller would be hooked up to a disk drive it was designed to control (yes, proprietary), which would be unlike around 90% of the other disk drives out there - that is, if you were lucky. And the I/O card would be hooked up to a terminal that was frequently more powerful than the S-100 computer itself.

Each combination of I/O and disk controller cards required a custom BIOS so you could run CP/M with it.
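For context, CP/M managed this with a fixed jump table: the hardware-independent BDOS called hardware only through standard BIOS entry points (CONOUT, READ, and so on), and each machine supplied its own routines behind them - which is exactly why every I/O-card/disk-controller combination meant a new BIOS. A rough sketch of the scheme; the machine details here are hypothetical:

```python
# Sketch of CP/M's portability scheme: the BDOS calls hardware only
# through a fixed BIOS "jump table"; each machine supplies its own entries.

def make_bios(console_out, disk_read):
    """Bundle machine-specific routines behind the standard entry points."""
    return {"CONOUT": console_out, "READ": disk_read}

# Two hypothetical machines with different terminals and disk controllers.
vt52_bios = make_bios(
    console_out=lambda ch: f"VT-52 writes {ch!r}",
    disk_read=lambda trk, sec: f"8-inch drive reads track {trk}, sector {sec}",
)
adm3a_bios = make_bios(
    console_out=lambda ch: f"ADM-3A writes {ch!r}",
    disk_read=lambda trk, sec: f"5.25-inch drive reads track {trk}, sector {sec}",
)

def bdos_type_char(bios, ch):
    # The "portable" part: this logic never knows which hardware is below.
    return bios["CONOUT"](ch)

print(bdos_type_char(vt52_bios, "A"))   # same call, different machines
print(bdos_type_char(adm3a_bios, "A"))
```

Swap the table and the same BDOS runs on the next machine - which is also why you couldn't just buy a disk and run it: you needed the BIOS built for your exact combination of cards.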

The bus itself was essentially the pins of an 8080 turned into a 100-line bus. So you were essentially wiring each card to an 8080, or something pretending to be an 8080, in parallel. This required quite a bit of hardware on each card to make sure it didn't conflict with the other S-100 cards.

Now, technically, you could get graphics (and maybe sound) cards, but that was unusual. Likewise, you could get more exotic CPUs - though getting software for them was a problem. But the typical S-100 system was text-only with a Z80, and the typical S-100 system owner spent rather a lot of time trying to figure out how to order a "standard" CP/M application in a form that would run on their "standard" S-100 system, taking into account their disk drive that only 10% of the market used and their terminal that used VT-52 codes rather than VT-100 codes or those of (insert one of the other popular terminals here).

Did I mention this was expensive? While the original Altair 8800 was $500 or so, it came with nothing but the card cage and motherboard, the CPU card, and a little bit of memory. And even at that price, the makers barely broke even, expecting to make their profits on after-sales. Useful memory, a terminal, an I/O card, a disk controller, and a disk drive pushed the price up considerably. Realistically, a typical "useful" S-100 system cost somewhere around $4,000.

Given all of that, it's not really surprising it got supplanted by the PC. Much is made of the fact IBM was taken more seriously by people outside of the personal computer industry in 1981, and that undoubtedly helped, but I can't help but feel that S-100 couldn't have survived for much longer regardless. You could buy a complete system from Commodore or Apple that was more capable for a third of the price even in 1981. The PC didn't need to be cheap, it had IBM's name behind it, but it was obviously more capable than S-100, and it was obvious that if the architecture was adopted by the industry, machines based upon it would be more standardized.

The "Feeling like a dunce" bit

So anyway, that was my train of thought. And it occurred to me that the fact I even have opinions on this suggests my mindset is still stuck there. Back then, even when you programmed in BASIC, you were exerting almost direct control over the hardware. You had a broad idea of what the machine did, what memory locations were mapped onto what functions, and every command you typed affected the computer in a predictable way. The computers themselves were (mostly) predictable too.
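A toy illustration of that mental model: memory as one flat array where a region of it simply is the display, so a POKE has a direct, visible effect. The 7680 screen base and 22-column width match the unexpanded VIC-20; treat the rest as a sketch rather than an emulator.

```python
# Toy model of the POKE-era mental map: memory is one flat array, and a
# range of it *is* the screen. 7680 is the VIC-20's screen base; 22 is
# its column count. No OS, no driver: the write is the effect.

SCREEN_BASE = 7680
memory = [32] * 65536              # 32 = screen code for a blank space

def poke(addr, value):
    memory[addr] = value

def screen_char(row, col, width=22):
    return memory[SCREEN_BASE + row * width + col]

poke(SCREEN_BASE, 1)               # screen code 1 is 'A' on Commodore machines
print(screen_char(0, 0))           # the byte you wrote is what the screen shows
```

That one-to-one mapping between "command typed" and "thing the machine does" is exactly what the rest of this entry is mourning.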

As time wore on, especially with the advent of multitasking (which I welcomed, don't get me wrong) you learned to understand your software would be only one party to how the computer behaved, but you understood that if you followed the rules, and the other programmers did too, you could kinda get your head around what was happening to it.

And you felt like a genius if you understood this. And I say "if", because it was possible.

At some point that stopped being possible. Part of it was the PC ISA: an architecture from 1981 was still in use in the mid-nineties, by which time it was long in the tooth and needed serious work. Its deficiencies were addressed in software and hardware. Intel essentially replaced the CPU, leaving a compatible stub there to start older applications, and the industry - after a few false starts - threw out most of the PC design and replaced it with the PCI architecture, again, like Intel, leaving compatible stubs here and there to ensure older stuff would work. And Microsoft worked on making Windows the real interface software would use to access the hardware.

After a while, there were so many abstractions between your software and the underlying system that it became really hard to determine what was going on underneath. If I program, I now know there are rules I can follow that will reduce the chance of my application being a problem... today. But I don't know if that's the case for the next version of Windows, and all I know is how to reduce the chances, not how to eliminate them. I don't know if the Java I'm writing will generate a webpage containing Javascript that contains a memory leak that'll cause the part of the process managing the tab it's in to bloat up by an additional 100M or so. I can hope it won't, and use mitigation strategies to avoid things that might cause problems, but there are so many things outside of my control that I have to trust now, it's just not practical.

Logically the right thing to do under the circumstances is to take back control - to use lower-level APIs and simpler sets of rules - but in practice that's just not feasible, and doing so would mean my tools no longer fit inside the ecosystem with everyone else's. So it's not the right thing - it's actually the worst thing I could do, and if I tried it, I'd be shunned as a developer.

I was a genius once because I (mostly) understood the computers I was programming. I feel like a dunce today because that's just not possible any more.

User Journal

Journal Journal: Slashdot could recover top spot from Reddit

So Reddit - where most veteran Slashdotters have been hanging out these days - is melting down, and for good reason.

I've been coming back here more lately.

But man, there are things that Reddit does better. No limit on mod points, for one. A better story-queue mechanism, for another.

There is a window here, if Slashdot's admins have the balls to try. Implement Reddit's upvote system and subreddits. Maybe limit the latter to departments more traditional for Slashdot, but allow all users to submit stories in the Reddit manner. Hell, just clone the thing! You'd get a huge amount of your readership back.
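The upvote half, at least, is not much code. Reddit's open-sourced "hot" ranking is basically a log-scaled vote score plus a time bonus, so newer stories with fewer votes can outrank older, bigger ones. Roughly, with constants per the published version (check the actual source before cloning anything):

```python
# Sketch of Reddit's open-sourced "hot" ranking: log-scaled score plus a
# time bonus measured from Reddit's own epoch.
from datetime import datetime, timezone
from math import log10

EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups, downs, date):
    s = ups - downs
    order = log10(max(abs(s), 1))          # each 10x in votes adds 1 point
    sign = 1 if s > 0 else -1 if s < 0 else 0
    seconds = (date - EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)

# A day of age is worth about 1.9 points, so a newer story with a tenth
# of the votes still outranks an older one.
old = hot(1000, 0, datetime(2015, 7, 1, tzinfo=timezone.utc))
new = hot(100, 0, datetime(2015, 7, 2, tzinfo=timezone.utc))
print(new > old)
```

The hard part of cloning Reddit was never the formula; it's the moderation and community mechanics around it.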

Maybe the Slashdot front page is curated a la /r/bestof to get that moderator filtered quality for the front page, but subslashes should be open season.

Is the spirit of Rob Malda still alive in /. HQ? Can a tiger team code this in a hurry? You should.

Ah, nobody will ever read this....

User Journal

Journal Journal: Winduhs

I think the whole mobile operating system thing has screwed up GUI design to a certain degree. Microsoft, Ubuntu, and GNOME have all been brave and tried something new, but what they came up with ended up being highly unpopular on the desktop. And to be honest, I think only Microsoft ended up with something truly good on a touch interface, though I admit to not having used Ubuntu or GNOME in those contexts - I'm just aware that they've not really encouraged an ecosystem in which applications work well in a tablet environment, leaving users with only the main shell being touch-friendly. So the loss of optimization for the desktop led to no significant gains elsewhere.

The way I'm seeing it, Windows 10 seems to be genuinely exciting, and a decent modern desktop, that also encourages cross interface design. Microsoft has learned from the mistakes it made with Windows 8, kept the good parts, and put together something truly great and modern.

I don't really want to be stuck with Windows though as my primary OS. I'm hoping Ubuntu et al actually learn from it.

This is something you'll never normally hear from me, but perhaps they need a Miguel-type figure to take a lead in either GNOME or Ubuntu. At this point, at least to me, it looks like Microsoft is the one with the good ideas about how a UI should work and about the relationship of an application to the UI frameworks of the underlying OS. I don't want anyone to clone Windows, but it would be nice to learn from it, at least.

Back in the 1990s, nerds like me put together our own "desktops", running random window managers, app launchers, and file managers (if that) that seemed to go together. I'm feeling like the FOSS "desktop" is heading back to that era, of stuff that doesn't really go together, being shoehorned to fit, with no real philosophy binding the system together.

User Journal

Journal Journal: Parents aren't perfect

Seen rather a lot of the "parents are evil because they did something harmful that they believed was right" meme that's going around at the moment.

Worst case: massive harassment and threats against the parents of a trans teenager who killed herself, blaming their insistence on "Christian" therapy. A horrible case, and entirely the wrong approach by the parents, but at the same time, if the parents hadn't cared, there wouldn't have been any therapy to begin with, bogus or not. The parents were convinced by people they trusted that the wrong thing was the right thing. Screaming at them, particularly at a time when they are mourning, that they are evil and heartless is itself evil and heartless.

Now seeing it in the vaccine "debate". Not the only problem I'm having with the pro-vax side (Reminder: yes, I'm pro-vax, and yes, I'm in favor of it being mandatory for the obvious deadly common diseases), but there's a world of difference between a lazy parent not having their kid vaccinated because they can't be bothered, and a parent being too scared to vaccinate their child because they've heard from convincing sources that vaccinations can cause terrible things.

User Journal

Journal Journal: Is the Touch UI irredeemable?

Thoughts related to the Windows 10 "the desktop is a desktop, no Start screen" thing:

From 1984 to 1990, there was a serious debate as to which was better: the command line or the WIMP (Windows/Icons/Menus/Pointer) UI. Why? Well, because Mac OS Systems 1-4 were user-friendly in the sense that people knew how to use them, but user-unfriendly in the sense that they got in the way, were kludgy and awkward to use, and offered zero advantages - beyond a lack of training for users - over the command line. At best you could say some applications needed a mouse, but some, such as word processors, were actually harder to use in the prehistoric era of WIMP user interfaces than the keyboard-based versions.

What changed? Microsoft Windows. From Windows 1.0 onwards, Microsoft offered a vision, initially a very, very, ugly vision, as to how a computer could be more, not less useful with a WIMP UI. The critical feature was multitasking. Windows offered a better way to multitask than command line based systems, because each Window, representing an application or document, could co-exist in the same "world", the desktop.

Windows wasn't anything like the best implementation, but it was the only implementation of the concept available on standard PCs.

When Microsoft pretty much forced manufacturers to provide Windows and a mouse with all MS-DOS based computers, users had a straight choice between one UI and the other, and they overwhelmingly chose Windows. By comparison, when GEM was bundled with many PCs in the late eighties, it was a nice-to-have that was ignored by most users (anecdotally, outside of stores, I never saw an Amstrad PC1512 running GEM in the wild, despite the machine shipping with it and its being a major advertised feature). GEM, a Mac OS UI clone, did not offer multitasking.

So: timeline:

1. Mac OS released in 1984. Causes a schism between WIMP and command-line users.
2. Windows 1.0 released in 1985. Most users recognize it's a very powerful system, but are put off by the user interface and memory requirements.
3. DOS vs WIMP rages for the next five years, largely because Windows is crippled by other factors.
4. By 1990, PCs are finally powerful enough to run Windows, and the Windows UI has improved enough to be "good enough" compared to Mac OS. Everyone jumps to Windows. End of the DOS vs WIMP debate.

Touch UIs? Where is the touch UI that is more powerful, as opposed to merely easier to use, than the WIMP UI? Having seen WIMP, it took Microsoft (and Commodore too) less than a year to come up with something that was actually an improvement on the command line. It's been nearly a decade now - who has come up with a touch UI that is more versatile than a WIMP desktop?

User Journal

Journal Journal: Classifications

Apropos of nothing, just some thoughts in the shower this morning: I see people getting very upset when they hear Doom being described as "3D". "It's 2.5D!" they scream, pointing out that the maps are two dimensional albeit augmented with a height map.

The thing is, while I kinda see their point, it essentially puts Doom in the same category as, say, isometric games, while Quake is in the same category as numerous 1980s flight simulators. And then there's "first person" vs "third person", where, again, the latter is so overly broad that it puts, uhm, a lot of isometric games in the same category as modern 3D games that are clearly "nearly" FPS but with a view of the protagonist.

Me, I'm kind of wondering if any of it is ever going to be anything but misleading anyway. 3D Monster Maze (for the ZX81), Hired Guns, the various flight simulators, Quake, Doom, Wolfenstein... all with slightly different takes on technologies that were ultimately trying to converge on the idea that you could see something broadly real, rather than an abstraction. The classifying makes it harder, not easier, to see the leaps forward each type of game engine made.

User Journal

Journal Journal: Wikipedia is fucked

GamerGate targeted the most active editors on the Gamergate Controversy article for abuse for several months. They also abused the article itself, inserting blatant violations of WP:BLP (the policy that stops the Wikimedia Foundation from being sued for libel every five minutes). During this time the trolls, in parallel, continually leveled complaints at the relevant Wikipedia admin authorities.

Finally, the combination of forum shopping and driving well-meaning editors into the ground has paid off: the vast majority of the editors in question are to be banned not just from editing the GamerGate Controversy article but from even discussing gender-related issues on Wikipedia. Some token throwaway accounts on the GG side are being banned too.

What good faith editor in their right mind will want to touch any article covering an issue affected by well organized trolls after this?

Oh, and don't expect Jimbo to step in. He's actually been telling editors being harassed to step away from the article for several months now.

The backdoor password to the constitution is "terrorism". The backdoor password to Wikipedia is "Civility".

User Journal

Journal Journal: Nuts vs Nuttiers

It's kind of annoying that when there's an active hate campaign against a group of people you're largely sympathetic to, it becomes harder to call out abuse and extremism by individuals within that group lest you play into the agenda of the hate campaign.

Another way of saying the same thing: GamerGate and similar mobs make it hard to have rational discussions about anything.

(If you're after specifics, no, I won't give any directly, the nearest I'd mention is that I thought Pax Dickinson was treated abysmally back when he was essentially fired for alleged over-enthusiastic dudebroism.)

User Journal

Journal Journal: Supporting extremism

Setting aside the legal right to be offensive (and likewise the right to be offensive without suffering death or severe violence) - which is an entirely different issue, and one I wholeheartedly support - I'm not going to promote punching down and reinforcing hatred simply because terrorists brutally attacked and murdered some people who were doing that.

And the fact such an act has been perpetrated may mean condemnation from me, but it doesn't mean I'm going to lionize the victims or even worse promote their rotten cartoons.

You cannot attack extremism with extremism. It doesn't work that way.

Also as a former resident of Britain, which had plenty of Christian terrorism while I was living there, and which was subject to, albeit overseas, Jewish terrorism a mere 35ish years before I was born (interestingly by groups so nutty that they even, on occasion, sided with Nazi Germany seeing it as "less terrible" than the colonial British Empire), can we cut out the "Islam has a special problem" crap?

(Not that I'm saying religion can't be peaceful - Buddhist terrorists are fairly rare, for example, though not non-existent - but Islam doesn't seem to be historically worse than any other Judeo-Christian movement. It's just large right now, and over-represented in areas currently ruled by corrupt dictatorships propped up by the West, and in countries that are former examples thereof.)

User Journal

Journal Journal: Windows 8.1 is a great tablet operating system and is better than Android

Unfortunately, third-party support for it sucks. It's the AmigaOS of tablet operating systems, kinda sorta. Hey, Microsoft, have you heard of this new, 30-year-old technology called MVC? Developers love it, and it makes it relatively easy to produce frameworks that allow completely different user interfaces, using entirely different paradigms, to be targeted by the same application. There's another company that makes both desktop and tablet operating systems (ironically, currently not merged, though apparently from the same code base) that supports MVC quite heavily. Can't remember their name though...
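For anyone who hasn't run into it, the MVC idea in one toy sketch: the model holds the state and notifies whatever views observe it, so a desktop view and a touch view can sit on the same application code unchanged. Nothing here is a real framework's API; it's the pattern boiled down.

```python
# Minimal MVC sketch: one model, two interchangeable "views" for
# different interface paradigms. Purely illustrative, no real framework.

class CounterModel:
    def __init__(self):
        self.value, self.observers = 0, []

    def increment(self):
        self.value += 1
        for view in self.observers:      # the model notifies whoever renders it
            view.render(self.value)

class DesktopView:
    def render(self, value):
        self.last = f"[menu bar] Count: {value}"

class TouchView:
    def render(self, value):
        self.last = f"(big finger-sized button) {value}"

model = CounterModel()
desktop, touch = DesktopView(), TouchView()
model.observers += [desktop, touch]
model.increment()                        # same model drives both paradigms
print(desktop.last, "|", touch.last)
```

The application logic lives entirely in the model; adding a third paradigm means writing one more view, not rewriting the app.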

It'd be nice to have a FOSS equivalent of the "tablet + desktop" system Microsoft is doing, versus the "let's try to create a merged interface that sucks" approach of Ubuntu and GNOME. I would have been very happy with an Ubuntu for Android system, but Ubuntu and Google never seemed to go anywhere with that one.

User Journal

Journal Journal: Controversy


systemd - I think it's a good idea. init sucks. init scripts suck. I mean, have you ever written one? Something that uses cgroups to track and manage daemons seems an unbelievably great idea.
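The cgroups point deserves a concrete illustration. An init script tracking a service by PID loses any daemon that double-forks (the grandchild gets reparented to init), but a process can't fork its way out of its cgroup, so cgroup membership finds everything the service started. A toy process table makes the difference visible; the names and PIDs are invented:

```python
# Why cgroup tracking beats PID tracking: forked children inherit their
# cgroup and can't leave it, so a double-forking daemon can't escape.

procs = [
    # (pid, parent_pid, cgroup)
    (100, 1,   "system/httpd"),   # the service's main process
    (101, 100, "system/httpd"),   # a worker it forked
    (102, 1,   "system/httpd"),   # double-forked: reparented to init (ppid 1)
    (200, 1,   "system/sshd"),    # an unrelated service
]

def by_ancestry(root_pid):
    """Classic init-script view: follow parent links from the main PID."""
    found = {root_pid}
    for pid, ppid, _ in procs:
        if ppid in found:
            found.add(pid)
    return found

def by_cgroup(cgroup):
    """systemd-style view: membership is inherited and inescapable."""
    return {pid for pid, _, cg in procs if cg == cgroup}

print(by_ancestry(100))           # misses the double-forked daemon (102)
print(by_cgroup("system/httpd"))  # finds all three
```

That's also why "stop" under systemd can reliably kill everything a service spawned, where an init script killing a stale PID file's process often can't.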

Slashdot Beta - For as long as I can remember, Slashdot's commenting system has been broken. Now they're trying to fix it. Not perfect, but seems in line with what others are doing successfully. If they can get it feature complete, it'll definitely be an improvement.

xfinitywifi - What a great idea! Comcast, you guys need to provide those of us who don't rent your routers with a free box that, without interrupting our networks, provides an xfinitywifi connection. Costs nothing, provides a huge amount of roaming Wifi coverage, it's a great idea.

Eich? He was a dick. People had concerns about his ability to work with a diverse group, his response was to insult everyone with concerns rather than address them. He was not CEO material.

Pax Dickinson? Honestly, I think he was stitched up and shouldn't have been fired/pushed/whatever. Buuuttt.... he's now associating himself with GG, so screw him.

Not trolling (mostly) but I do seem to be at odds with most of Slashdot these days.

User Journal

Journal Journal: Saints Row 4

Really enjoying it. I got it for $15 on Steam a few days ago, a day or two before they dropped the price to $5. Yeah. $5. *sigh* Well $15 was a good price.

Anyway, if you liked 3 (loved it myself), you'll almost certainly like 4. However there's some controversy over SR1/2 and SR3, as the game changed significantly between 2 and 3. My friend who doesn't care for 3 for that reason still loves 4, so read into that whatever you want.

What is it? Well, it's a big open-world thing. SR3 was a "take over the city from hostile gangs" thing that was incredibly over the top and funny. SR4 has many of the same concepts, but you're now in a simulation of a city, and you're also the President, but still a gang leader, but you have superpowers, and it's still over the top and hilarious. There are a lot of references to other games/media, and there's some meta stuff in there too. On reading the synopsis I thought it wouldn't work, but it really does. Said friend who doesn't like SR3 described it as the best superhero game out there.

Probably worth playing SR3 before SR4, but otherwise a big thumbs up.
