
Comment: Re:Hasn't this been known? (Score 4, Insightful) 159

by maccodemonkey (#48662523) Attached to: Thunderbolt Rootkit Vector

Well, now I'm reading specs on USB 3.0 controllers. Ugh. There's a lot on mapping a bus address to a memory address for DMA, but nothing addressing the security implications of doing so, or what devices are allowed to do. There are just broad hints, like that the buffer has to exist in a DMA-able part of memory, without saying whether that's a security constraint or a hardware one.

It would be nice to see a follow-up article on whether/how USB 3.0 protects against these things, because I'm not a kernel USB developer sort of guy, so while I know DMA is there, I don't feel like I'd be able to dissect these implementation specs.

Comment: Re:Hasn't this been known? (Score 1) 159

by maccodemonkey (#48662417) Attached to: Thunderbolt Rootkit Vector

Same thing as a PCI-e / PCI / CardBus / ExpressCard device with a boot ROM or flash. They load pre-boot. At least on non-Mac systems you can go into the BIOS and turn off option ROMs / set it to EFI-only mode.

Apple exposes a bunch of pre-boot options for the firmware on the command line, but I'm not sure if you can disable pre-boot EFI drivers from there.

Comment: Re:Hasn't this been known? (Score 4, Interesting) 159

by maccodemonkey (#48662393) Attached to: Thunderbolt Rootkit Vector

I'm pretty sure in the case of USB 3 that DMA is a function of the host controller. A device by itself cannot inject into arbitrary memory. This Thunderbolt "vulnerability" is the equivalent of the Windows autorun-on-insertion function that was disabled years ago. Only this one functions above the level of the current user (i.e., much worse).

I'm looking up DMA for USB 3. Although there are some ways to secure DMA (like a whitelist of addresses/sizes that are safe to write to), all of the advertised functionality of USB 3, such as the sustained data rates, would be very hard to achieve if you didn't have direct access to memory. That's why FireWire ruled for live streaming of data for so long: DMA made its rates reliable, whereas USB's dependence on the controller and CPU for memory transfers made the throughput flakier.
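To make the "whitelist of addresses/sizes" idea concrete, here's a rough sketch (in C++, with made-up names; this is not the actual xHCI or IOMMU interface) of the kind of check an IOMMU or host controller could apply before letting a device write to memory:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical IOMMU-style whitelist: a device may only DMA into
// ranges the OS has explicitly mapped for it.
struct DmaWindow {
    uint64_t base;
    uint64_t size;
};

class DmaWhitelist {
    std::vector<DmaWindow> windows_;
public:
    void allow(uint64_t base, uint64_t size) {
        windows_.push_back({base, size});
    }

    // A write is permitted only if it fits entirely inside one window.
    bool check(uint64_t addr, uint64_t len) const {
        for (const auto& w : windows_)
            if (addr >= w.base && addr + len <= w.base + w.size)
                return true;
        return false;  // everything else (kernel text, page tables...) is rejected
    }
};
```

The per-transfer check is cheap, which is why an IOMMU can in principle give you both the sustained data rates and protection; the Thunderbolt issue is that no such check is enforced.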

Comment: Re:uh - by design? (Score 3, Informative) 159

by maccodemonkey (#48662191) Attached to: Thunderbolt Rootkit Vector

Thunderbolt is more like USB to the user - it's a thing you use to connect untrusted devices to your system. You wouldn't expect that plugging in a USB thumbdrive would magically own your system (well, maybe you should, because it's happened in the past, but I think it's fair to say that it shouldn't). You'd think that plugging in a random Thunderbolt device would be designed to be safe. Apparently not: Thunderbolt is unsafe by design.

USB 3.0 has this exact same feature (DMA), so yes, yes you should expect a USB thumb drive to be able to do this.

Comment: Re: Ethics? (Score 3, Informative) 553

by maccodemonkey (#48631961) Attached to: FBI Confirms Open Investigation Into Gamergate

Dude, the entire industry is dirty. Here's a tip: if you're worried about ethics start boycotting every video game.

It's funny how, when it comes out that a gaming company acted unethically, Gamergaters suddenly lower their standards by a few notches rather than give up their favorite toys.

Comment: Re:Huh? (Score 1) 191

by maccodemonkey (#48612017) Attached to: Apple Wins iTunes DRM Case

Wait, what? People no longer use MP3s? They don't buy iPods?

iTunes, the iPod, and the iPhone (which are either the default software player or the default hardware for most people, especially inside of the US) have been using MP4/AAC for years.

Strangely, Google still seems to be using MP3 (AAC compresses much better for the same audio quality, and you'd think they would like to save on bandwidth costs), but they could be doing that because they have to support a wider range of devices. Amazon falls into the same category.

So yeah, MP3 is still around, but with 63% of all digital music sold in the MP4/AAC format, it's hard to argue it's the universal standard it once was.

Comment: Re: STEM is for suckers.. at least now. (Score 1) 454

by maccodemonkey (#48467211) Attached to: Researchers Say the Tech Worker Shortage Doesn't Really Exist

You forgot 1812? Or the Civil War? You apparently don't like either side of the Civil War, but there was an entire group of people whose freedom was won at the end of a rifle.

The same holds true of World War II, one of the last cleanly justifiable wars. They weren't US citizens, but there was a large group of people being shoved into ovens whose freedom was won at the end of a rifle.

Normally I'm a liberal against unnecessary war, but the military also has its place.

Comment: Re: Embrace has started (Score 1) 192

by maccodemonkey (#48395787) Attached to: Visual Studio 2015 Supports CLANG and Android (Emulator Included)

The iOS support I've seen so far requires you to rewrite any API-facing code against the Cocoa APIs. You'll get to do it in C# instead of Swift or Obj-C, but you do have to rewrite.

Not that I'm complaining. I'd hate to see all the Java style train wrecks that would come to the platform from developers blindly hitting recompile buttons.

Comment: Re:Not mysterious. Just lousy. (Score 3, Interesting) 229

by maccodemonkey (#48145769) Attached to: The Subtle Developer Exodus From the Mac App Store

The sad thing is I really like the OS, and I'd be happy to develop for it if they made development accessible and quit leaving trails of unfixed bugs behind them.

How exactly is development not accessible?

- Apps do not have to be distributed through the Mac App Store.
- Xcode is provided for free along with all documentation. There are tons of other IDEs and languages as well.
- Yes, there are bugs, but all platforms have bugs. Surely as an OS X user you can see bugs as well.

I'm not sure what you're looking for to make development more accessible.

Comment: Swift != Interface Builder (Score 1) 69

by maccodemonkey (#48023903) Attached to: Building Apps In Swift With Storyboards

"But is Swift really so easy (or at least as easy as anything else in a developer's workflow)? This new walkthrough of Interface Builder (via Dice) shows that it's indeed simple to build an app with these custom tools... so long as the app itself is simple."

What? Seriously Slashdot, if you're going to have Apple articles, at least have submitters that have half a clue what they're talking about. "How good is Swift? Let's find out by using Interface Builder which is not Swift at all!"

Swift and Interface Builder can be used together, but they're not strongly related components. They're related like a WYSIWYG web tool, like Dreamweaver, and JavaScript: they're both helpful to get what you need done, but they don't replace each other. To give you an idea of how true that is, Interface Builder first shipped in 1986. It's advanced a lot since then, but it's almost 30 years older than Swift, so obviously it's had a long life apart from Swift.

Duh, you can't create a big complicated app with only Interface Builder, just like you can't create a big fancy web app with just the visual components of Dreamweaver. You've got to get down and actually write some code, which, you know, is what Swift is, Swift being a programming language and all. So I find it really odd that this post talks about reviewing a programming language in the context of using a completely different tool that is not that programming language.

Comment: Re:Obj-C (Score 1) 316

by maccodemonkey (#48009965) Attached to: Ask Slashdot: Swift Or Objective-C As New iOS Developer's 1st Language?

Automatic reference counting means adding retain and release messages automatically; there most certainly is a runtime hit, and that's on top of the usual memory allocator costs, which can be quite high. A good compiler can eliminate some of those retain/release calls.

Sure, but I would assume any manual memory management system worth its salt is doing retain/release. Most of the big C++ libraries do it.

Furthermore, because of deallocation cascades, a release message in such schemes can have a very high latency (don't know whether Apple tried to add workarounds).

Two things:
- Deallocation cascades are inherent in memory management. Neither manual memory management nor garbage collection avoids them. So I'm not sure what the point here is.
- As far as latency on retain/release dispatches goes, Apple is using tagged pointers (http://en.wikipedia.org/wiki/Tagged_pointer), which use some spare room in the pointer to hold the reference count. In practice this means there is practically no latency for messing with the retain count.
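The "reference count in the pointer" trick is easy to sketch. This is illustrative only (in C++, with made-up names; Apple's actual scheme stores the count in spare isa-pointer bits and spills to a side table on overflow), but the core idea is the same: bumping the count is just integer arithmetic on the pointer word, with no separate allocation or lock.

```cpp
#include <cstdint>

// For a 16-byte-aligned object, the low bits of its address are always
// zero, so we can stash a small reference count there. Overflow would
// need a side table; this sketch ignores that.
constexpr uintptr_t kCountMask = 0x7;  // low 3 bits hold the count

inline uintptr_t make_tagged(void* p) {
    return reinterpret_cast<uintptr_t>(p);          // count starts at 0
}
inline uintptr_t retain(uintptr_t t)   { return t + 1; }  // ++count
inline uintptr_t release(uintptr_t t)  { return t - 1; }  // --count
inline uintptr_t count_of(uintptr_t t) { return t & kCountMask; }
inline void*     pointer_of(uintptr_t t) {
    return reinterpret_cast<void*>(t & ~kCountMask);       // strip the count
}
```

Retain and release compile down to an increment or decrement of a word that's already in hand, which is why the latency concern mostly evaporates.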

And, of course, ARC has the same problems with circular references that regular reference counting has.

Which is also a problem with Garbage Collection...
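The retain-cycle problem the parent mentions is easy to demonstrate outside of ARC too. Here's a C++ sketch using `shared_ptr` (strong, refcounted) and `weak_ptr` (the analogue of Obj-C's `weak` references), which is how both worlds break cycles:

```cpp
#include <memory>

// Two nodes holding strong references to each other keep each other
// alive forever: the classic retain cycle. Making the back edge weak
// breaks the cycle without keeping the target alive.
struct Node {
    std::shared_ptr<Node> next;    // strong forward edge
    std::weak_ptr<Node> parent;    // weak back edge
    static int alive;              // live-object counter, for demonstration
    Node()  { ++alive; }
    ~Node() { --alive; }
};
int Node::alive = 0;

// Correct version: weak back-reference, everything is freed on scope exit.
void make_pair_and_drop() {
    auto parent = std::make_shared<Node>();
    auto child  = std::make_shared<Node>();
    parent->next  = child;   // strong
    child->parent = parent;  // weak: no cycle
}   // both Nodes destroyed here, deterministically

// Broken version: strong references both ways. Neither Node is ever
// destroyed, even though no outside pointer remains.
void make_cycle_and_drop() {
    auto a = std::make_shared<Node>();
    auto b = std::make_shared<Node>();
    a->next = b;
    b->next = a;             // strong back edge: retain cycle, both leak
}
```

This is exactly why ARC code leans on `weak` properties for delegates and back-pointers.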

Reference counting is a mediocre memory management scheme at best; people use it in C-like languages because they don't have a choice. It is inferior in just about every way (runtime overhead, latency, memory utilization) to a good garbage collector.

I don't see how it is at all. Results are instant. Overhead is far less. Pretty much every claim here needs a citation. It's hard to see how an entire process constantly analyzing object references is less overhead than pre-handling those references. Memory utilization? With a garbage collector I have to burn a bunch of memory on the collection process, and then watch my memory climb and then drop off a cliff constantly while I wait for the collector to run, while a retain-counted app generally stays at a pretty stable allocation count because frees happen instantly. Puh-lease.

I saw your claim below that GC is identical to retain/release in behavior. It really is not. If I clear out a reference in Obj-C or any other retain/release language, the memory is instantly freed. In Java, I have to wait for another run of the garbage collector, which can take a while unless I manually trigger it. Yes, YOU don't have to wait for anything in the code. But ignorance is bliss.

Here's basically the comparison I'd use: retain/release is like having an incinerator you throw your garbage into. Garbage collection is like... well... having a garbage truck. In both cases when I'm writing code I can just throw things away in my trash bin and pretend it's not there. With retain/release/an incinerator, the memory is actually gone as soon as I throw it in the trash bin. With garbage collection, I've thrown it in the bin and forgotten about it, but that doesn't change that the trash will continue piling up until the trash guy comes, which may still be a bit away. The large amount of piled-up trash is also a problem, and actually makes the cascade problem you were concerned about with retain/release WORSE. Instead of a few cascade relationships being dealloc'd at once, all the dereferenced objects are going to pile up, wait for the garbage collector, and then the garbage collector is going to have to pore through thousands of relationships all in one go, causing the program to come to a halt while everything waits for the garbage collector to catch up. I've had to clean up a few Java messes that had that problem by manually firing the garbage collection to spread out the load.
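Both the "memory is gone instantly" point and the cascade behavior can be shown with refcounting in any language; here's a C++ sketch using `shared_ptr` as the stand-in for retain/release:

```cpp
#include <memory>

// With reference counting, dropping the last reference frees memory
// right now - and can cascade: releasing the head of a chain releases
// the next node, and so on, all before the release returns. (For very
// long chains this recursion is itself a cost - the cascade latency
// discussed above - and can even overflow the stack.)
struct Link {
    std::shared_ptr<Link> next;
    static int alive;              // live-object counter, for demonstration
    Link()  { ++alive; }
    ~Link() { --alive; }
};
int Link::alive = 0;

// Build a singly linked chain of n nodes and return its head.
std::shared_ptr<Link> make_chain(int n) {
    std::shared_ptr<Link> head;
    for (int i = 0; i < n; ++i) {
        auto node = std::make_shared<Link>();
        node->next = head;
        head = node;
    }
    return head;
}
```

In a tracing GC, dropping the head would leave all the nodes alive until the next collection; here the whole chain is reclaimed synchronously the moment the last reference goes away.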

Comment: Re:Obj-C (Score 1) 316

by maccodemonkey (#48007831) Attached to: Ask Slashdot: Swift Or Objective-C As New iOS Developer's 1st Language?

CLR in this context means a very large standardized library, which is not subject to fragmentation nor availability. It runs or it doesn't, and it behaves as documented (by google or stack overflow, not necessarily MSDN).

That's not what the term CLR actually means (http://en.wikipedia.org/wiki/Common_Language_Runtime), nor does that necessarily apply to Swift. Swift does indeed have a slightly larger core function base than Obj-C, but still not enough to build an entire app. For example, there is no file I/O, networking, or GUI support in the core implementation (beyond printing to the console). You won't be building much with the Swift core library by itself.

Here's a list of every single function present in the Swift standard library as of June. That every single function can be listed on one web page should tell you that it's nowhere near as expansive as the Java or .NET standard libraries:
http://practicalswift.com/2014...

Its primary intended-use API is Cocoa, but that is entirely detached from the language. As of posting time, Cocoa is still entirely written in Obj-C, so the primary library intended for use with Swift is not even written in Swift itself. But I digress: Swift itself definitely does not have a very large standard library. When Swift is ported to other platforms you won't see Cocoa anywhere with it. And Cocoa is different on iOS and Mac, so even if you're sticking to Apple platforms you don't have a common library between them. So right there, even if we decide Cocoa could be called Swift's standard library (which it isn't), it fails the fragmentation test you've put forward.

Not to mention, the R in CLR stands for runtime, and we're talking about the Swift standard library, not the runtime (which is the Obj-C runtime, not the Swift standard library anyway.)
