
Comment Re:But the keyboard... (Score 1) 433

Except... we don't use a "French" (AZERTY) keyboard either. We use a French Canadian keyboard, and Apple doesn't sell that layout at all. Apple is the only PC maker that ships only the stupid Canadian Multilingual keyboard.

Does anyone anywhere use the Canadian Multilingual keyboard layout on any OS, anywhere?

I've worked in industry and in government, in both English and French Canada, and I can't recall ever seeing anyone anywhere using the Canadian Multilingual layout. Ever. I've seen more Dvorak users than Canadian Multilingual.

Why does it even exist in the first place? (and FWIW, it's a terrible layout for coding IMO).


Comment Re:OSX in 2013. (Score 1) 231

Actually, that clarifies that the zram feature did not make it into the Linux kernel until 2014, meaning that OS X had it before Linux. Yes, I understand some systems had a feature like it earlier, but the first full-blown, reliable-enough implementation to make it into the RTM release of an OS was OS X 10.9 (2013), followed by Linux kernel 3.18 (2014), and finally Windows 10 in what appears to be late 2015, or maybe 2016 given their track record. I guess better late than never.

One further note: in OS X 10.9, the memory compression subsystem was on by default (and can only be turned off from the command line). I know of no Linux distro which ships with zram or zswap enabled by default (although if there is one, hopefully someone will jump in and let me know!). I doubt that either is widely used on Linux, even though the facility is extremely useful and can improve performance by quite a bit.


Comment Re:OSX in 2013. (Score 2) 231

...proceeds to Google "zswap linux ubuntu"

No. What you want is zram, not zswap.

zram compresses pages in RAM, without swapping them to disk. I've only recently enabled it on one of my Debian Jessie boxes (an Intel Core 2 Duo with a motherboard whose weird memory configuration limits it, in practical terms, to 4GB of RAM), but my experience with the equivalent subsystem on OS X has been fantastic. Pages may still be swapped to disk later, but on OS X at least the system aims for a 2:1 compression ratio, holding successfully compressed pairs of pages in a single page without swapping them to disk. Think of it as an intermediate state for swapped pages, between sitting in RAM as-is and being paged out to disk.
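To make the 2:1 pairing idea concrete, here's a toy Python sketch. This is an illustration of the concept, not how any kernel actually does it: zlib stands in for the compressor, and the page contents are invented.

```python
import zlib

PAGE = 4096  # a typical page size in bytes

# Two hypothetical 4 KiB pages with the kind of repetitive content
# real process memory often has (contents are made up for illustration)
page_a = (b"some repetitive application data " * 200)[:PAGE]
page_b = (b"\x00" * 2048 + b"heap metadata " * 150)[:PAGE]

ca, cb = zlib.compress(page_a), zlib.compress(page_b)

# The 2:1 idea: if two compressed pages fit in one physical page,
# the pair can be parked in RAM instead of being written out to swap.
fits = len(ca) + len(cb) <= PAGE
print(len(ca), len(cb), fits)
```

With compressible data like this, both pages shrink far below half a page, so the pair stays in RAM; pages that don't compress well would fall through to ordinary paging.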

zswap is about compressing pages on their way to the swap file. This can have benefits as well (especially if you have low CPU load; the Jessie box I mentioned in the previous paragraph fits into this category of use, and I'd probably use zswap too if the system didn't have a large and quite fast RAID array in it), as the stuff getting paged out will hopefully take up less space, requiring less I/O time. It is, however, different from what the article is talking about, which is the territory of the zram module in Linux.


Comment Am I really that old? (Score 1) 78

Am I really getting so old that people find this to be a legitimate question?

Have I really been doing this for so long that there are now people who don't remember a time when disconnected machines were the norm?

Am I the only one left here who remembers (for example) dialling for hours to get into the local IBM-run BBS to download the 21 diskette images needed to update OS/2 2.1 to the latest patch level, digging into the cabinet for several boxes of diskettes, de-imaging each and every one of those diskettes (my machine had two 3.5" floppies -- I could manage two at once!), rebooting off the first disk, and then feeding diskettes into the machine one at a time as prompted to update it?

Honestly -- this is a long-solved problem. Some of the technologies have changed (you probably don't need 21 USB thumb drives to hold your patch), but the basic idea remains: provide a downloadable patch image suitable for your application, have customers download it to a USB drive of sufficient size, and then have them boot from the drive or run a script or application from it to apply the patches.
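The customer-side integrity check for that downloadable-image workflow is a few lines in Python. This is a hedged sketch (the helper names and the idea of a vendor-published checksum are my illustration, not anything from the post):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a (possibly multi-gigabyte) patch image through SHA-256
    in 1 MiB chunks, so the whole file never has to sit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def verify_image(path, published_hex):
    """Compare the downloaded image against the vendor's published hash
    before writing it to the USB drive."""
    return sha256_of(path) == published_hex.lower()
```

For the truly paranoid case the post mentions, where hashes aren't good enough, a detached signature over the image would replace this check rather than supplement it.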

And if you're in some industry where you worry about your patches getting out into the wild or need to ensure patch security/validity (where hashes aren't good enough) or something odd like that, put your images onto USB thumb drives yourself and ship them to your customers physically (encrypted, if you have some reason to be uber-paranoid).

Now get off my lawn!


Comment Re:The Microsoft key!!!! I've never used it...ever (Score 1) 698

I don't know how it works on a Mac, but entering alternate characters is still easy with Windows, even 8.1 (I haven't tried it on 10). Or at least, I consider it pretty easy. :-) Just hold down the ALT key, type the four-digit code for the character you want, then release the ALT key and your character will show up. A Euro sign is ALT+0128, for example.

On OS X, you hit Shift-Option-2. Option-3 gives a pound symbol, Option-4 cents.

You still have to learn some of the Shift-Option combinations, but that's WAY easier than having to know all of the character codes for whatever you want to type. Fewer keystrokes, too. Accented characters are somewhat easier, in that you type the accent first and then the character you want to accent (so if you want a circumflex over a character, you type Option-i followed by the character to accent). Again, remembering one two-key stroke for the accent you want, followed by the letter to be accented, beats memorizing half a dozen four-digit numeric codes. OS X does this way better than Windows.
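For the curious, the reason ALT+0128 produces a Euro sign is that the four-digit ALT codes (with a leading zero) index into the Windows-1252 code page, where byte 128 maps to U+20AC. A quick Python check:

```python
# ALT codes with a leading zero are Windows-1252 byte values:
# byte 128 (0x80) maps to U+20AC, the Euro sign.
euro = bytes([128]).decode("cp1252")
print(euro, hex(ord(euro)))  # € 0x20ac
```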


Comment Two people of the same major marrying??? (Score 2) 90

This cannot stand. Two people with the same major marrying each other is completely against my just-now-made-up religion. It holds that Frank (my just-made-up religion's version of God) specifically wrote: "Thou shalt not lie with a fellow computer science major as you would with a psychology major".

The government needs to pass a constitutional amendment to prevent people of the same major from marrying each other. After all, if we let two people with the same major marry, we're on a slippery slope to marrying dolphins to snack cakes. And then where will it end?


Comment Re:If you're using GPL code, you have no choice (Score 4, Informative) 171

However, the summary also mentions iOS, and I was under the impression that GPL apps on the Apple AppStore are a no go?

FWIW, the situation is a bit more nuanced than that.

If the GPL licensed code is entirely your own work, you can relicense it any way you want, including to Apple for distribution on the App Store.

Where you can get into trouble with the App Store is if you take someone else's GPL'd code and release it there. This could be by including third-party GPL routines, or by publishing code that was developed by multiple parties, without their permission, where copyright has not been reassigned. This was the case for the VLC player: as the article you linked alludes, Apple took the old VLC player app out of their App Store due to a copyright complaint from one of the VLC developers. That was back in 2011 -- the VideoLAN organization has since released its own VLC for iOS, while still retaining the GPL license (albeit in part by dual-licensing it as MPL/GPL).


Submission + - Is our universe ringing like a crystal glass?->

TaleSlinger writes: Two physicists at the University of Southern Mississippi – Lawrence Mead and Harry Ringermacher – announced that our universe might not only be expanding outward from the Big Bang, but also oscillating or “ringing” at the same time. The Astronomical Journal published their paper on this topic in April.

As many know, scientists today believe our universe – all space, time and matter – began with the Big Bang some 13 billion years ago. Since then, the universe has been expanding to the size it is today. Yet, the universe as a whole has self-gravity, which tries to pull all the matter – all the stars, gas, galaxies, and mysterious dark matter – back together. This internal gravitational pull slows down the universe’s expansion. Mead said in a statement from Southern Miss:

The new finding suggests that the universe has slowed down and speeded up, not just once, but 7 times in the last 13.8 billion years, on average emulating dark matter in the process.

The ringing has been decaying and is now very small – much like striking a crystal glass and hearing it ring down.


Comment Re:Dues it matter? (Score 1) 98

I'm not a PS4 (or any other console) fanboy, but I read this and can't help wondering: is there anything that stops users from replacing the hard drive in a PS4 with a larger drive themselves (wonky interfaces? cases that self-destruct when opened? magic formatting of the drive that can't readily be duplicated?)? Is it a typical 3.5-inch drive or a smaller one?

Sony pretty actively advertises that the PS4 HDD is completely user-upgradable. IIRC the PS4 manual contains instructions, and they also have them online here.

I've read articles that have tested magnetic, SSD, and hybrid (SSHD) drives, and they all work just fine. The main limitation appears to be physical size (a 2.5" drive, 9.5mm or less in height). Word has it you can use up to a 6TB drive, although the people doing so are using 3.5" drives in external enclosures (and I've read some reports of weird issues with powering such systems up).


Comment Re:The root cause : poor unit testing (Score 2) 130

Wait, what?

If you write code, part of the documentation before you start should be a "risks" statement, where you state that a dependency on external, third-party library X exists, and that any vulnerability in it could cause issues in your application; also, that substantial upgrades to the library will affect maintainability if any interfaces are changed or deprecated.

When someone throws a pile of libraries at a problem, that risk statement gets lengthy.

Which is all well and good if you're doing greenfield development. It's not so good when you've inherited a codebase where none of this was done in the first place, and you're tasked with keeping it going. As I said, the real world can be a messy place. In my case, the previous lead architect just threw immature libraries at every problem willy-nilly, at a time before I worked for the company. I get to inherit the problems this lack of foresight caused, without the benefit of going back to fix it.

Your problem isn't with using external libraries, it's using ones without service contracts, or immature ones. And you're reacting by throwing out the baby, bathwater, bathtub, house, plumbing infrastructure, and electrical grid.

I don't disagree -- I'm hardly anti-library (you won't get far in development without them, unless you're doing low-level embedded work). I like a good, stable, well-designed library that has been around for a long time and where breaking changes are rare. Unfortunately, the previous architect would throw a library at every problem that might have required eight lines of code, without care for anything other than ensuring the library license permitted us to ship it.

And FWIW, I have brought these issues up with management. Their stance is they don't want to see any backend changes of any sort (where the bulk of our library woes lie) -- their focus is on the front end only. They're more than happy to defer the risks to some unspecified future.

As I said, the real world of development can be messy. I'm stuck with a codebase that is full of immature libraries, libraries that are no longer maintained, and libraries with bugs where getting newer versions introduces major changes requiring major refactors. If I were permitted to start fresh, I'd be doing things the way you described, and we wouldn't rely so much on "randomLibrary.jar" that was someone's pet project seven years ago with 'LGPL' attached to it, but which has since been rewritten or abandoned. It's the willy-nilly application of such libraries to every problem that crops up, without any analysis, that I'm against -- not the use of libraries in any and all situations.


Comment Re:The root cause : poor unit testing (Score 5, Informative) 130

What should be happening : when you're planning a new release, raise the component versions to the latest and run your test suite. If it passes, good job, release it.

What is actually happening : the version numbers never get edited, because that version worked, and if you change it, OMG, it might stop working.

Part of the problem I run into with this is that sometimes projects stick with old dependencies because at some point a major version came along that changed the organization of the API so significantly that the latest component version can't just be dropped in, but instead requires significant resources to refactor your code. Getting management buy-in for that when there aren't any big customers breathing down their neck to get a flaw fixed can be nigh on impossible.

I ran into this recently myself. During internal testing, I discovered a flaw in our product when accessing any of our web resources using an IPv6 destination IP in the URL (i.e., a URL of the form http://[&lt;IPv6 address&gt;]:18080/). A quick bit of debugging showed that an external library we had been using for several years was doing some brain-dead parsing of the URL to pull out the port number: it was just doing a string split after the first colon it found and presuming the rest was the port number.
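The failure mode is easy to reproduce against any parser that understands the RFC 3986 bracket syntax for IPv6 hosts. A Python sketch (the address, port, and path below are made up for illustration; the buggy library in the post was a Java one, not this code):

```python
from urllib.parse import urlsplit

url = "http://[2001:db8::1]:18080/status"  # illustrative IPv6 URL

# Naive parsing in the spirit of the buggy library: split on the first
# colon after the scheme and call the remainder the port.
naive_port = url.split("//", 1)[1].split(":", 1)[1]
print(repr(naive_port))  # 'db8::1]:18080/status' -- garbage, not a port

# A proper parser handles the [bracketed] IPv6 host form correctly.
parts = urlsplit(url)
print(parts.hostname, parts.port)  # 2001:db8::1 18080
```

The first colon a naive splitter finds is inside the address itself, which is exactly why colon-splitting URLs has never been safe.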

Modifying the Maven POM to use a newer version of the API in question was initially difficult because the project had since reorganized their own library structure, breaking things into multiple smaller JARs. Except that some of the functionality was actually _removed_, and isn't available at the latest API revision (functionality we had been using, naturally). Classes had moved around to different packages than where they were previously, and various interfaces appear to have been completely rewritten.

Upgrading to a version of the library that actually fixed the flaw was going to be akin to opening Pandora's Box. Unfortunately, our former architect (from whom I inherited this code) was the type of guy who just liked to throw external libraries at every problem. In the end we had to document the fault for all current versions of the product, and now I'm trying to get management buy-in to do the work necessary to upgrade the library in question for the next version of our product. And this is for just one library out of over 100 that need similar attention.

Suffice it to say, I'm not happy about this state of affairs. Unlike the previous architect, I push back against using third-party libraries as our solution to everything. If I were allowed to rewrite everything from scratch, we could avoid these problems. Things are unfortunately messy out here in the real world, and when libraries significantly change the interfaces your program uses to access their functionality, no amount of unit testing is going to make upgrading those libraries any easier.

