Comment: Re:COBOL (Score 1) 382

by Yaztromo (#48866009) Attached to: Is D an Underrated Programming Language?

That one is about stomping out all the world's other languages and making us all speak the same. What a loss of culture.

Esperanto has only ever been promoted as a universal second language. There has never been a push to make it the native language for anyone, anywhere. Hence, no loss of culture needed nor required.

Yaz

Comment: Windows 10, the bad car analogy. (Score 1) 489

by Yaztromo (#48852271) Attached to: Windows 10: Can Microsoft Get It Right This Time?

Windows has always been like a cheaply made off road vehicle made in a former Soviet-bloc country. The controls were a little weird, and it broke down a lot, but otherwise it could drive on a lot of really sketchy roads, and you probably knew a guy who knew how to fix it for you when it broke down.

Then Windows 8 came along. To continue the analogy, it was as if a new model year of that same cheaply made Eastern European off-road vehicle suddenly came with a few much-needed under-the-hood improvements so that it wouldn't break down as readily, along with a big 8" spike sticking out of the centre of the driver's seat. Aficionados who have never driven another car in their lives rave about the spike (it's painted some very nice colours), and continue to flood forums trying to convince people who have stayed away from the newer model because of the spike that if they just tried it long enough, they'd get used to having a giant spike up their asses.

Now Microsoft is coming out with Windows 10, the biggest benefit of which is that it now features a slightly shorter spike. And Windows zealots will try to convince everyone else that it's a major improvement. But you're still taking it in the ass every time you get in for a ride.

Yaz

Comment: Re:They're assholes. (Score 1) 336

by Yaztromo (#48677603) Attached to: Why Lizard Squad Took Down PSN and Xbox Live On Christmas Day

This is true, but the issue is that this is dumb! You really should be able to unbox a toy on Christmas morning and have it work without going out to the Internet and connecting to some account.

Maybe not all the functionality can be there, but functions that don't naturally require network access should not require network access.

As it happens, my wife bought me a PS4 for Xmas -- a massive upgrade over my 15 year old original PS2. It came in the box with GTA5 (on disc), and a coupon for a free digital download of another game.

It's been a PITA that PSN has been offline. There are a lot of features and functions built into the system that rely on online functionality, including for some dumb reason accessing the built-in web browser. However, playing GTA5 hasn't been an issue -- I just popped the disc in, waited what felt like an eternity while it installed itself (it didn't give me a choice, and warned me it could take up to an hour), and I was off and playing. All without having been signed into PSN.

In essence, the system worked exactly as you described it should. A single-player game on disc loaded and ran just fine while PSN was offline. Not all the functionality was there, but the major function that doesn't require network access (playing GTA5 in this case) has worked flawlessly.

Yaz

Comment: Re:Apple Pushing All Mobile CPU Vendors (Score 1) 114

by Yaztromo (#48605177) Attached to: Apple and Samsung Already Working On A9 Processor

Uh, a dual-core 1.3 GHz CPU is "marginally superior" to phones running quad and octa cores at twice the clock speed?!

Cores and clock speed are hardly the only determinants of performance. They set a hard upper bound, but that bound can be readily squandered by software.

In the case of Android phones, Google pushed for extra cores early on to avoid UI stutter during garbage collection cycles. iOS has never provided garbage collection; you either have to set up your own retain/release calls to keep or relinquish objects, or you use ARC (Automatic Reference Counting) to do more or less the same thing automatically at compile time.

In effect, it's a trade-off. Google decided to simplify memory management for developers, and keep the barrier to entry low by appealing to existing Java developers, with the trade-off being that they require more parallel processing power for garbage collection. Apple avoided the need for the additional processing power and battery capacity (and in turn device size) by not implementing garbage collection in iOS, and thus can squeeze more performance out of fewer cores, with the trade-off being that you can't just pull Java developers off the street and have them start writing iOS apps. ARC is so slick that IMO Apple has an overall edge with their design; others are of course free to disagree.
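The retain/release model described above can be sketched as a toy reference-counted handle. This is an illustrative simplification, not Apple's implementation; under ARC the compiler inserts the equivalent retain/release calls for you:

```python
class RefCounted:
    """Toy manual reference counting, in the spirit of Objective-C's
    retain/release model (illustrative only, not Apple's implementation)."""

    def __init__(self, name):
        self.name = name
        self._count = 1      # creating the object implies one owner
        self.freed = False

    def retain(self):
        """Another owner takes a reference; bump the count."""
        self._count += 1
        return self

    def release(self):
        """An owner lets go; free the instant the count hits zero."""
        self._count -= 1
        if self._count == 0:
            # Deterministic cleanup: no GC pause, no extra core needed
            # to sweep the heap in the background.
            self.freed = True


obj = RefCounted("texture")
obj.retain()        # a second owner takes a reference
obj.release()       # first owner done: count drops back to 1
assert not obj.freed
obj.release()       # last owner done: freed immediately
assert obj.freed
```

The design trade-off shows up here: the bookkeeping cost is paid incrementally on every ownership change instead of in periodic collection pauses, which is why no extra cores are needed to hide GC stutter.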

Yaz

Comment: Re:Open Source not a silver bullet (Score 2) 73

by Yaztromo (#48570385) Attached to: Why Open Source Matters For Sensitive Email

It seems a bit foolish to worry about purely theoretical security issues when we've got so many real ones to deal with. Ken Thompson's compiler infection demonstration was an interesting experiment designed to make a particular point, but I don't think it's wise to consider tool-chain hacking a legitimate threat, as we've never seen anything remotely like it in the wild, as far as I'm aware. And frankly, I question whether it's even realistically possible beyond a very simplistic demonstration.

First off, naturally the level of security I'm talking about would probably be reserved for national government agencies protecting ultra-sensitive data. For them, that level of security is necessary, and they will spend the money and resources to audit and verify everything if necessary (which is why we have SELinux).

Additionally, the build chain comprises not only the compiler, but the standard libraries and any third-party libraries as well. If not verified, these could easily have unexpected code inserted into them that compromises your product once linked against them. You wouldn't expect to see such compromised libraries "in the wild", as they would probably be part of a targeted attack. This is hardly unprecedented; while not done at build time, Stuxnet used DLL replacement on Windows to add extra routines to the operating system, which were used to inject code being uploaded into a PLC.

Again, most organizations don't care to undertake the kind of expense required to protect against such attacks; they use the chain-of-trust you describe. However, national security organizations do work at this level, and if you need that level of security, pre-compiled binaries, whether they come with source or not, are insufficient.

Yaz

Comment: Re:Open Source not a silver bullet (Score 1) 73

by Yaztromo (#48570317) Attached to: Why Open Source Matters For Sensitive Email

With a verified compiler no less. We have seen ever more sophisticated malware these days, certainly a malicious compiler could easily slip vulnerabilities into the binary.

Yup -- I intended that to be considered part of the build chain. Compiler, standard libs, any 3rd party library dependencies, the build tools themselves (have to make sure they're using the libs you expect them to...), the OS kernel...right on down the chain.

Yaz

Comment: Re:Open Source not a silver bullet (Score 1) 73

by Yaztromo (#48569943) Attached to: Why Open Source Matters For Sensitive Email

I use many open source tools, but I've never inspected the code myself. Even if I did, I'm not going to be finding these hard-to-find defects that the people in the project can't find.

From a security perspective, even having and being able to inspect the code is insufficient if you need top-notch security: you had better also be compiling that code yourself. It is nearly impossible to verify that a binary blob doesn't contain additional or modified code beyond what the sources describe without compiling it yourself. And even when you compile everything yourself, you're still at the mercy of the build chain and all of its dependencies (unless you audit/build them yourself too).
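One small piece of that compile-it-yourself verification can be sketched as comparing digests of your own build against a vendor binary. This is a minimal illustration with scratch files standing in for real binaries; in practice it only works if the build is reproducible (bit-identical output from the same sources):

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk=1 << 16):
    """Stream a file through SHA-256 so large binaries
    don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Demo: two scratch files stand in for "my build" and "the vendor's
# binary". Identical bytes match; a single flipped byte does not.
with tempfile.TemporaryDirectory() as d:
    mine = os.path.join(d, "mine.bin")
    theirs = os.path.join(d, "theirs.bin")
    payload = b"\x7fELF" + b"\x00" * 64   # fake binary contents

    with open(mine, "wb") as f:
        f.write(payload)
    with open(theirs, "wb") as f:
        f.write(payload)
    assert sha256_of(mine) == sha256_of(theirs)

    with open(theirs, "wb") as f:
        f.write(payload[:-1] + b"\x01")   # tamper with one byte
    assert sha256_of(mine) != sha256_of(theirs)
```

Note the limitation: a matching digest only proves the blob matches your build; it says nothing about whether the compiler itself was trustworthy, which is exactly the Thompson problem discussed above.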

Open Source is still better in this regard than closed source, of course -- at least you have the ability to compile it yourself if security is that critical. I think the problem for a lot of organizations is that security isn't critical enough for them to hire people to a) audit the code and b) build, test, and verify it for their own internal use. In which case, it would (at least from outward appearances) be cheaper/easier to go with a closed-source solution, with someone behind it whom you can blame/sue if things go sideways.

Yaz

Comment: Re:What what WHAT? (Score 4, Informative) 100

by Yaztromo (#48382757) Attached to: Ask Slashdot: Getting Around Terrible Geolocation?

Note, in the first link, everything except W3C is listed as correct, which is even more baffling for me, because somewhere the wrong information is being received, and it happened everywhere in the shop at once, across platforms.

You've got it all wrong as to where the problem lies.

First, two different methods are being used to calculate your geolocation. One of them uses online providers with databases mapping IP addresses to locations. This is what you're seeing in the "Provider X" columns, which you state are indeed showing your correct location.

W3C doesn't provide a geolocation service. Instead, the results in this (admittedly badly named) column indicate what YOUR COMPUTER reports its location as being, using the W3C Geolocation API. The first link you provided above describes this succinctly in the text immediately above the map, where it states "The W3C Geolocation service determins location by the browser providing GPS location (if available) and signal strengths of visible WiFi annoucements" [sic]. Thus, the web page is asking your browser to report where it is located, and your browser is responding that you're somewhere in Ireland.

The question for you then becomes: where is my browser getting this bad data from? On Mac OS X, browsers get this from the Core Location Framework. While Core Location Framework can conceivably use a number of different factors to determine your location, typically it uses the detectable WiFi beacons in your area, mapping their SSIDs and MAC addresses, and their relative strengths to triangulate your location. On Windows it uses the Sensor and Location Platform to do much the same thing.
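The signal-strength triangulation described above can be approximated with a signal-weighted centroid over access points whose positions are already in the database. This is a rough sketch of the idea only; the coordinates and the RSSI-to-weight conversion are illustrative assumptions, not what Core Location or the Sensor and Location Platform actually do:

```python
def weighted_centroid(aps):
    """Estimate a position from (lat, lon, rssi_dbm) tuples for visible
    access points with known database locations. A stronger signal
    (less negative dBm) pulls the estimate toward that access point."""
    # Convert RSSI (e.g. -40 dBm strong, -90 dBm weak) into a
    # positive weight; the exponent choice here is arbitrary.
    weighted = [(lat, lon, 10 ** (rssi / 20.0)) for lat, lon, rssi in aps]
    total = sum(w for _, _, w in weighted)
    est_lat = sum(lat * w for lat, _, w in weighted) / total
    est_lon = sum(lon * w for _, lon, w in weighted) / total
    return est_lat, est_lon

# Two visible APs (hypothetical coordinates): the estimate should
# land much nearer the one with the stronger signal.
aps = [
    (48.4284, -123.3656, -40),   # strong signal
    (48.4290, -123.3700, -80),   # weak signal
]
lat, lon = weighted_centroid(aps)
assert abs(lat - 48.4284) < abs(lat - 48.4290)
```

This also makes the failure mode in the question obvious: if one visible access point's SSID/MAC maps to stale or wrong coordinates in the database, the estimate is dragged toward a place the hardware has never been.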

I don't know much in the way of details of the databases Apple and Microsoft are using on the backend to map your triangulated location based on SSIDs/MACs of visible WiFi access points, however there are a few ways the system can go wrong:

- The SSID/MAC of your access point matches that of another access point somewhere on the globe (and for some reason, all the other access points in your vicinity aren't in the database), or
- You've moved the access point in question from one location to another, and the database hasn't been updated yet. This could occur if, for example, you bought the access point used off eBay, it's a refurbished unit, you've moved your physical location, or your corporate IT issued you a previously used access point from another office.

The fact that all your systems had this problem at the same time indicates it's probably one of the above. You can try to fix the situation by changing the SSID of your access point. Depending on the size of your facility, this may be more or less difficult, but it should hopefully make your OSs' location services either report the correct location, or simply report that your location is unknown. You may also need to change the MAC address of your access point(s), but I'd save that as a last resort. Note that making these changes should fix the issue of your systems reporting themselves as being in Ireland, but it may not result in them reporting the correct location (they might report that they don't know their location at all). That's okay -- for Apple devices at least, you can fix this by simply having someone with an iPhone with Location Services enabled in the vicinity, as Apple's data is crowd-sourced automatically using GPS co-ordinates and relative WiFi access point signal strengths. (I'm not sure how Microsoft collects the information for their database, so I can't help you there -- a Google search might provide some answers.)

HTH!

Yaz

Comment: Re:Unfortunate, but not surprising (Score 1) 450

by Yaztromo (#48343877) Attached to: Joey Hess Resigns From Debian

At this point, it seems that a fork of Debian is almost inevitable, though that effort appears to me to be more likely to simply dilute the overall effort than bring any resolution.

I'm pretty sure Debian is already the most-forked Linux distro out there. Wikipedia lists 117 distros (by my count) based on Debian.

Yaz
