User Journal

Journal Journal: MSS Code Factory 2.3.12994 Service Pack 1

This release refactors and rearchitects the core objects of the system using diamond inheritance of the interfaces so that you can cast between the defining schema names for the objects and the current project's schema names. All of the methods of the new interfaces rely on the defining schema names, but you can safely cast the returned objects to your current schema names.

The goal of this effort was to make it easier to write custom GUI components that rely on the defining schema names for object interfaces, so that you can just reference the jars that provide those custom GUI components, wire them to your project's GUI factories, and automatically use the customized screen components.

It isn't perfect, but it's the best I could do to address what I saw as a glaring flaw with the code base. Of course, the downside is that you may have to do more typecasting in some of your code. There are always trade-offs when designing an architecture.
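
To make this concrete, here is a rough Java sketch of the kind of diamond the generated interfaces can form; the names are invented purely for illustration and are not MSS Code Factory's actual generated identifiers.

// Hypothetical names only: a sketch of diamond interface inheritance
// between a defining schema ("Sec") and a referencing project ("Acc").

// Read interface as generated for the defining schema.
interface ICFSecUserObj {
    String getLoginId();
}

// Edit interface as generated for the defining schema.
interface ICFSecUserEditObj extends ICFSecUserObj {
    void setLoginId(String loginId);
}

// Read interface re-declared under the current project's schema name.
interface ICFAccUserObj extends ICFSecUserObj {
}

// The project's edit interface closes the diamond: it extends both the
// project's read interface and the defining schema's edit interface.
interface ICFAccUserEditObj extends ICFAccUserObj, ICFSecUserEditObj {
}

// One concrete object satisfies every corner of the diamond.
class CFAccUserObjImpl implements ICFAccUserEditObj {
    private String loginId;
    public String getLoginId() { return loginId; }
    public void setLoginId(String loginId) { this.loginId = loginId; }
}

public class DiamondCastSketch {
    public static void main(String[] args) {
        // Generic GUI code only needs the defining schema's interface...
        ICFSecUserEditObj generic = new CFAccUserObjImpl();
        generic.setLoginId("admin");

        // ...and project code can safely cast back to its own schema name.
        ICFAccUserEditObj projectView = (ICFAccUserEditObj) generic;
        System.out.println(projectView.getLoginId());
    }
}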

Another benefit of the new approach is that it makes it very clear when you are using objects and interfaces from a referenced schema/project. This had been a concern of mine, as some people might have ignored the license requirements of the referenced schemas on the grounds that the code gets re-manufactured into their project. Now there is a clear delineation between the referenced objects and the current project, preventing anyone from pretending they "didn't know" they had to abide by the licenses of referenced projects.

http://msscodefactory.sourceforge.net/

User Journal

Journal Journal: Edge kinda sucks

From a tablet user's perspective, Windows 8.1 had a pretty good version of IE. It was full screen (to see the URL bar, tabs, or bookmarks, you had to explicitly ask for them by swiping from the bottom), it made good use of gestures (swipe left and right to move through history, and so on), and the browser was... well, IE: not the world's best, but fairly efficient, fast, and compatible.

They removed that IE interface in Windows 10 (only the desktop IE remains.) The alternative is supposed to be Edge, but it has no gestures, and is never full screen in the same way.

Worse, Edge seems to kill performance on my tablet. The browser itself only ever seems to use single-digit percentages of CPU, but whenever I start it or have it running, the entire tablet grinds to a halt. Close it, and performance goes back to normal. I have no idea why. Given the low CPU usage, I wonder if it's just the way it uses the graphics drivers or something similar, but it makes Edge unusable for me.

I've switched to Chrome in the meantime, which, contrary to early reports and Mozilla's outburst, is actually very easy to do. Chrome also has the same problem as Edge in that it's never really full screen, but it doesn't have the performance issues, and it does have the intuitive (and better than trying to hit buttons with a finger) gesture-based UI that IE had.

Tablet mode in general seems a step down in Windows 10 from the Windows 8.1 approach. Oh well.

User Journal

Journal Journal: Bernie Sanders

Not feeling it. Deeply suspicious. That doesn't mean I'll vote for Hillary - who has electability problems given the vast hordes of people who loathe her - but I'm...

Part of it is Obama. Sure, in the last few months Obama's kinda turned back into the guy who ran for President in 2008, but he's still not really that person. Obama's job as candidate and President was to teach those uppity liberals that they can whine and/or get as hopeful as they want: the next guy will always be as bad - as terrible, even - as the last guy. He succeeded beyond his wildest dreams.

Part of it is Ron Paul. Ron Paul - from the right - got the same kind of "genuine", "honest", "non-establishment", "heartfelt" plaudits as Sanders gets from the left. People supposedly knew him from the beginning; according to them, he's always been the real thing. The Ron Paul Newsletter fiasco gave cause for concern on that. Then my professional life intersected with groups that Ron Paul is associated with indirectly, and in one case directly, and it became obvious the man's a huckster, someone who has very carefully cultivated an image designed to appeal to certain groups who'll donate money, subscribe to paid newsletters and podcasts, and so on en masse. He's actually better at it than, say, Huckabee, who needed to run for President, or Limbaugh, who probably couldn't get it to work without the backing of a radio syndicate.

So I'm kinda cynical these days. He might get my vote in the end anyway, but it may well be a reluctant one, given on the day of the primaries and then forgotten about.

User Journal

Journal Journal: Belonging to a different era

Feeling a little nostalgic at the moment, but also beginning to sense a serious part of why, when it comes to computing, I feel like a dunce today when I once felt like a genius.

Quick wall of text on the Nostalgia bit

That article on Vector Graphics the other day reminded me a little of the S-100 bus, and the whole move to the PC ISA that came just before I really got into computing. The first computer I really touched was our school's RM 380Z, which was a proprietary CP/M based system, but exposure to that at school was mostly a matter of "You can book 15 minutes to use it at lunchtime, but otherwise the school maths teacher will use it to demonstrate things now and then." So the first computer I learned anything from was a friend's VIC 20. I then used a variety of cheap single-board computers until my Amiga 500+, the most powerful of which was a Sinclair QL.

So... I never touched S-100. And I didn't really touch the PC until there was literally no other choice that was viable. S-100 was never an option for two major reasons: it was expensive, and it was crap. I mean, seriously, awful. S-100 survived because the home computing establishment's equivalent of the Very Serious People decreed it was Serious, and it was Serious because it was "standard".

A typical S-100 system consisted of the S-100 box itself - a dumb motherboard (very dumb: the only components on it were the edge connectors and a few capacitors and resistors to do all that magic EE specialists understand and I could never get my head around) enclosed in a card cage - plus a CPU card, a completely separate memory card or three, a completely separate disk controller, and a completely separate serial I/O card. The disk controller would be hooked up to a disk drive it was designed to control (yes, proprietary), which would be unlike around 90% of other disk drives out there - that is, if you were lucky. And the I/O card would be hooked up to a terminal that frequently was more powerful than the S-100 computer it was hooked up to.

Each combination of I/O and disk controller cards required a custom BIOS so you could run CP/M with it.

The bus itself was essentially the pins of an 8080 turned into a 100-line bus. So you were wiring each card to an 8080, or something pretending to be an 8080, in parallel. This required quite a bit of hardware on each card to make sure it didn't conflict with the other S-100 cards.

Now, technically, you could get graphics (and maybe sound) cards, but that was unusual. Likewise, you could get more exotic CPUs - though getting software for them was a problem. But the typical S-100 system was text only with a Z80, and the typical S-100 system owner spent rather a lot of time trying to figure out how to order a "standard" CP/M application in a form that would run on their "standard" S-100 system, taking into account their disk drive that only 10% of the market used and their terminal that used VT-52 codes rather than VT-101 codes or (insert one of the other popular terminals here.)

Did I mention this was expensive? While the original Altair 8800 was $500 or so, it came with nothing but the card cage and motherboard, the CPU card, and a little bit of memory. And even at that price, the makers barely broke even, expecting to make their profits on after-sales. Useful memory, a terminal, an I/O card, a disk controller, and a disk drive pushed up the price considerably. Realistically, typical "useful" S-100 systems cost somewhere around $4,000.

Given all of that, it's not really surprising it got supplanted by the PC. Much is made of the fact that IBM was taken more seriously by people outside of the personal computer industry in 1981, and that undoubtedly helped, but I can't help but feel that S-100 couldn't have survived much longer regardless. You could buy a complete system from Commodore or Apple that was more capable for a third of the price even in 1981. The PC didn't need to be cheap - it had IBM's name behind it - but it was obviously more capable than S-100, and it was obvious that if the architecture was adopted by the industry, machines based upon it would be more standardized.

The "Feeling like a dunce" bit

So anyway, that was my train of thought. And it occurred to me that the fact I even have opinions on this suggests my mindset is still stuck there. Back then, even when you programmed in BASIC, you were exerting almost direct control over the hardware. You had a broad idea of what the machine did, what memory locations were mapped onto what functions, and every command you typed affected the computer in a predictable way. The computers themselves were (mostly) predictable too.

As time wore on, especially with the advent of multitasking (which I welcomed, don't get me wrong), you learned that your software would be only one party to how the computer behaved, but if you followed the rules, and the other programmers did too, you could kinda get your head around what was happening to it.

And you felt like a genius if you understood this. And I say "if", because it was possible.

At some point that stopped being possible. Part of it was the PC ISA, the fact that an architecture from 1981 was still in use in the mid-nineties, by which time it was long in the tooth and needed serious work. Its deficiencies were addressed in software and hardware. Intel essentially replaced the CPU, leaving a compatible stub there to start older applications, and the industry - after a few false starts - threw out most of the PC design and replaced it with the PCI architecture, again, like Intel, leaving compatible stubs here and there to ensure older stuff would work. And Microsoft worked on making Windows the real interface software would use to access the hardware.

After a while, there were so many abstractions between your software and the underlying system that it became really hard to determine what was going on underneath. If I program, I now know there are rules I can follow that will reduce the chance of my application being a problem... today. But I don't know if that's the case for the next version of Windows, and all I know is how to reduce the chances, not how to eliminate them. I don't know if the Java I'm writing will generate a webpage that contains Javascript that will contain a memory leak that'll cause the part of the process managing the tab it's in to bloat up an additional 100M or so. I can hope it won't, and use mitigation strategies to avoid things that might cause problems, but there are so many things outside of my control that I have to trust now that really understanding it all just isn't practical.

Logically, the right thing to do under the circumstances is to take back control, to use lower-level APIs and simpler sets of rules, but that's just not practical, and doing so means my tools no longer fit inside the ecosystem with everyone else's. So it's not the right thing - it's actually the worst thing I could do, and if I tried to do it, I'd be shunned as a developer.

I was a genius once because I (mostly) understood the computers I was programming. I feel like a dunce today because that's just not possible any more.

User Journal

Journal Journal: FizzBuzz in Swift

The following is my prepared answer for anyone who asks me this stupid fucking question in any interview in the future.

extension Int
{
  // True when self is NOT evenly divisible by modulus.
  func modBool(_ modulus: Int) -> Bool
  {
    return (self % modulus) != 0
  }
}

for x in 1...100
{
  print((x.modBool(3) ? "" : "Fuck ") +
    (x.modBool(5) ? "" : "You") +
    ((x.modBool(3) && x.modBool(5)) ? "\(x)" : ""))
}

-jcr

User Journal

Journal Journal: MSS Code Factory 2.3.12932 Released to production!

MSS Code Factory 2.3 adds support for the specification of ServerProc, ServerObjFunc, and ServerListFunc methods that are performed at the server end and atomically committed when invoked by an XMsgClient application.
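
Roughly, the intended shape is a function whose body runs entirely at the server end inside a single transaction. The sketch below is a hypothetical Java illustration of that pattern only; the names and the transaction interface are invented and are not MSS Code Factory's actual generated API.

// Illustrative sketch, not generated code: a server-side function invoked
// by a client message, with its work committed atomically or rolled back.
public class ServerFuncSketch {
    // Hypothetical transaction handle.
    interface SchemaTransaction {
        void begin();
        void commit();
        void rollback();
    }

    // The general shape of a ServerObjFunc-style call.
    static String invokeServerFunc(SchemaTransaction tx, String arg) {
        tx.begin();
        try {
            String result = "processed:" + arg;   // server-side work goes here
            tx.commit();                          // atomic commit on success
            return result;
        } catch (RuntimeException e) {
            tx.rollback();                        // nothing partial persists
            throw e;
        }
    }

    public static void main(String[] args) {
        // Trivial in-memory transaction so the sketch runs end to end.
        SchemaTransaction tx = new SchemaTransaction() {
            public void begin() { System.out.println("begin"); }
            public void commit() { System.out.println("commit"); }
            public void rollback() { System.out.println("rollback"); }
        };
        System.out.println(invokeServerFunc(tx, "order-123"));
    }
}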

The 2.3 series also reworks the way that licensing and copyright information are tracked by the manufactured code. Now the license and copyright information of the originating project is used, instead of that of the project that is being manufactured. Just because you included a model someone else designed does not mean you get to take credit for that work.

There will be additional service packs adding new functionality to the 2.3 series, but those enhancements will not require further changes to the GEL engine, only the rule base.

http://msscodefactory.sourceforge.net/

User Journal

Journal Journal: Winduhs

I think the whole mobile operating system thing has screwed up GUI design to a certain degree. Microsoft, Ubuntu, and GNOME have all been brave and tried something new, but what they ended up with proved highly unpopular on the desktop. And to be honest, I think only Microsoft ended up with something truly good on a touch interface, though I admit to not using Ubuntu or GNOME in those contexts, just being aware that they've not really encouraged an ecosystem for applications to work well in a tablet environment, leaving users with only the main shell being friendly. So the loss of optimization for the desktop led to no significant gains elsewhere.

The way I'm seeing it, Windows 10 seems genuinely exciting and a decent modern desktop, one that also encourages cross-interface design. Microsoft has learned from the mistakes it made with Windows 8, kept the good parts, and put together something truly great and modern.

I don't really want to be stuck with Windows as my primary OS, though. I'm hoping Ubuntu et al actually learn from it.

This is something you'll never normally hear from me, but perhaps they need a Miguel-type figure to take a lead in either GNOME or Ubuntu. At this point, at least to me, it looks like Microsoft is the one with the good ideas about how a UI should work and the relationship of an application to the UI frameworks of the underlying OS. I don't want anyone to clone Windows, but it would be nice to learn from it, at least.

Back in the 1990s, nerds like me put together our own "desktops", running random window managers, app launchers, and file managers (if that) that seemed to go together. I'm feeling like the FOSS "desktop" is heading back to that era, of stuff that doesn't really go together, being shoehorned to fit, with no real philosophy binding the system together.

User Journal

Journal Journal: I don't understand this Google vs. Oracle thing

For the life of me, I do not understand why Google doesn't just ship an ARM64 build of OpenJDK instead of futzing around with this fight against Oracle. From a pure developer's perspective, the whole thing is flat out STUPID.

Oracle is not preventing Google from shipping their own build of the JDK. They're just stopping them from breaking Java's portability requirement by shipping non-Java bytecode.

Java is not a compiler kit.

User Journal

Journal Journal: Why libressl is stupid

I really want to like libressl. But it pretends to be openssl, badly. They refused a patch that would have mitigated this whole RAND_egd problem by simply returning an error when someone tries to use it, which means that you commonly need a patch just to use libressl at all. If it's not going to work like openssl, then it shouldn't occupy the same space in the filesystem.

User Journal

Journal Journal: OMFG GNOME3 is asstacular

This is not news to most people, but I just tried it for the first time on my first-ever normal Debian Wheezy install (I've always done minimal netinst installs and the like, then built them up from there for a purpose), and wow, GNOME3 is amazingly horrible. It makes Unity look usable. If that was the idea, mission accomplished, I guess.

User Journal

Journal Journal: April 2015 has been a busy month

April 2015 has been a busier month than I'm used to.

I got MSS Code Factory 2.1 Service Pack 1 out the door after over two months of work.

I packed up and moved to a new apartment.

Last but not least, I installed Ubuntu 14.04.2 LTS on my failing Debian box. I still haven't done any more work on the Windows 7 laptop to get it into a development-usable state, but since I did all that performance tuning on MSS Code Factory 2.1, I really don't need to use the Windows laptop. In fact, the poor beast is so I/O bound when running 2.1 that it sounds like the hard drive is about to rupture and spew its guts out the keyboard.

The shift to Ubuntu 14.04.2 from Debian 7 was a last-ditch attempt to resolve an X-Server crash issue (a white-out screen on NVidia 8600-series hardware with NVidia drivers). Although I did see one such crash on Ubuntu 14.04.2 after installing it in the first week of April, I have not seen it in the ten days since Ubuntu released some X-Server input patches.

So it wasn't entirely the NVidia driver's fault that my X-Server was crashing; there seem to have been some bugs in the input stream processing as well.

I'm still not 100% confident that the X-Server bug has been resolved, but it's looking like it has. Which is a good thing -- I can't afford to buy a new computer at this time (nor in any reasonably near future, as I'm on disability and get less than $17,000/year to live on here in Saskatchewan, Canada.)

User Journal

Journal Journal: Reason 431 that I don't bother with Slashdot any more - 5 minute comment timer.

I can type at over 100 wpm. Slashdot's comment timer was set to 5 minutes a few years back. So if there is a particularly interesting article with interesting comments, I can only comment or reply once every 5 minutes.

If I'm going at 100 wpm, I could write a 500-word essay as a comment in that time. Or, what happens more frequently: I type out a nice constructive reply to someone and am greeted by the text telling me I'm going too fast.

So I close the window and go elsewhere.

5 minutes between messages in a good conversation isn't conversing. I've had FidoNet conversations go faster than that. I could type up and send faxes faster than that. With a bit of practice, I could send messages over shortwave radio in Morse code faster than that.

Even if my comment were "you sir are a moron", that leaves well over 4 minutes waiting for the timer to run out.

If anyone wants to have an intellectual conversation with me on an old Slashdot topic (like, appropriate for the genre "News For Nerds. Stuff That Matters"), find me elsewhere. Even this comment would still have a 3-minute timeout before I could post it.

User Journal

Journal Journal: What do I have to enable now? Fucking DICE.

Welp, I can use Slashdot in Chrome and not in Firefox, which implies that something I'm blocking in Firefox is preventing the new improved Slashdot from working. What new spyware bullshit do I have to enable to use Slashdot now? Thanks, DICE! You'll run this place the rest of the way into the ground any day now.
