Comment Re:Maybe time to move to Chrome? (Score 1) 358

I'm using Chrome on Ubuntu 9.10. It's definitely fast -- I have an older system, and while Firefox is just painfully slow on it, Chrome is pretty usable. But when I get around to upgrading my hardware I'm probably switching back, for a few reasons:

  • Zoom/font size settings. Text on most web pages is so small I have to squint (I have this problem all the time: I crank up the minimum font size in Firefox, set my Windows fonts to 125%, etc.). Chrome will remember your zoom setting per domain, which is nice, but every time I visit a new site I have to set the zoom again.
  • Stability. Yeah, Chrome is supposed to be super-stable with its process-per-tab architecture. But I've crashed the whole app, and it took chunks of my profile with it (I had to reimport from Firefox). The Flash plugin crashes frequently on videos that play fine in FF. I can't remember the last time FF crashed, and it always recovers cleanly.
  • Extensions. There are a few extensions that I miss, and I find myself firing up Firefox now and then just for those.
  • Updates. Firefox is in the Ubuntu repositories, so it gets updated automatically. Chrome is a .deb that I have to download and install manually.

Chrome did have Flash support right after install -- I'm not sure if it was built in or if it just found the Flash plugin from Firefox that was already on my system. I don't see a way to disable it. There is a flashblock extension, but I haven't tried it yet.

Comment Re:How this works (Score 4, Informative) 178

I just tried this out. When I launch a test program by double-clicking an associated file, the current directory (as returned by GetCurrentDirectory()) is set to the directory where the file was located. It ignores the location of the .exe, and it also ignores the "Start In" directory from the shortcut file (if the association was to a shortcut and not directly to an exe). This is on Win7 64-bit. So I think my evilsmbshare example from above would work as described. Of course it's possible that other Windows systems exhibit completely different behavior. :)
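
For reference, the test program doesn't need to be anything more than this (a sketch, not my exact code):

    /* Minimal current-directory test: print what GetCurrentDirectory()
       returns at startup, then wait so the console stays visible when
       the program is launched by double-clicking an associated file. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        char buf[MAX_PATH];
        DWORD len = GetCurrentDirectoryA(MAX_PATH, buf);
        if (len > 0 && len < MAX_PATH)
            printf("Current directory: %s\n", buf);
        else
            printf("GetCurrentDirectory failed (error %lu)\n", GetLastError());
        getchar();  /* keep the console window open */
        return 0;
    }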

I agree that it's still hard to exploit, but not quite as hard as it would be if it required access to the user's local filesystem.

Comment How this works (Score 5, Informative) 178

It took me a while to figure out how this exploit works, but I think it goes like this:

I have an application, foo.exe, that can make use of an optional system component (or third-party DLL), bar.dll. I don't ship that DLL, and I can't guarantee that it will be present on every user's system. So to ensure that my program degrades gracefully, I load it with LoadLibrary("bar.dll"), and if it's not found I disable the features that depend on it. Since it's not my DLL, I can't predict where it's installed, so I use an unqualified path and let the loader do the searching (this is, after all, the job of the loader). This ensures that, as long as bar.dll is correctly installed on the system, my application will find and use it.
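
In code, the pattern is roughly this (a sketch -- foo.exe, bar.dll, and BarFrobnicate are the made-up names from this example):

    /* Graceful degradation via an optional DLL, as described above.
       BarFrobnicate is a hypothetical export of the hypothetical bar.dll. */
    #include <windows.h>

    typedef int (WINAPI *BarFrobnicateFn)(int);
    static BarFrobnicateFn g_bar_frobnicate = NULL;

    void init_optional_features(void)
    {
        /* Unqualified name: let the loader search its default path --
           which, as it turns out, includes the current directory. */
        HMODULE bar = LoadLibraryA("bar.dll");
        if (bar != NULL)
            g_bar_frobnicate =
                (BarFrobnicateFn)GetProcAddress(bar, "BarFrobnicate");
        /* If bar.dll is absent, g_bar_frobnicate stays NULL and the
           dependent features are simply disabled. */
    }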

From an application developer's point of view, this the right way to do things. If I did this on Linux or MacOS, it wouldn't be a problem. Unfortunately, Microsoft decided that the current directory (".") should be in the default search path (see http://msdn.microsoft.com/en-us/library/ms682586(VS.85).aspx ). It's even searched before $PATH!

Now the exploit goes like this:
1. On \\evilserver\evilsmbshare, I place a file foofile.foo, whose extension is associated with foo.exe. Right next to it, I create an evil version of bar.dll.
2. I convince the user to double-click on foofile.foo, causing Windows to launch foo.exe with a current directory of \\evilserver\evilsmbshare.
3. If the user's system doesn't have bar.dll installed, Windows will eventually find my evil version of it at .\bar.dll and load it into the unsuspecting foo.exe.
4. My evil code runs and does whatever evil deeds I want it to.

If this is correct, then the decision by Microsoft to put the current directory in the library search path seems pretty braindead, and it's hard to blame application developers for assuming that LoadLibrary() will load a library in a sane and secure way. But I'm having a hard time imagining an application that would break if the current directory were just removed from the search path. Shipping DLLs in the application directory is common practice, but expecting them in the current directory? Why would you do that?
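
As an aside, if I'm reading MSDN correctly, an individual application can at least opt itself out: calling SetDllDirectory with an empty string removes the current directory from that process's DLL search order (XP SP1 and later). A sketch:

    #include <windows.h>

    int main(void)
    {
        /* Passing "" removes the current directory from this process's
           DLL search order (per MSDN; requires XP SP1 or later). */
        SetDllDirectoryA("");

        /* From here on, LoadLibrary("bar.dll") will no longer look in
           ".", so the evilsmbshare trick above shouldn't work. */
        return 0;
    }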

It seems that this exploit requires you to trick the user into opening a file from a filesystem you have access to, at which point you could probably just as easily get them to open a trojan directly. I think local privilege-escalation attacks are more probable (e.g. tricking a system service into opening your evil DLL).

Comment Re:It depends? (Score 1) 129

The other big factor (the biggest in most of the GPU code I've written) is your pattern of memory access. Most GPUs have no cache, so memory access has very high latency even though the bandwidth is excellent. The card will hide this latency to some extent through clever scheduling, and if all your threads are accessing adjacent memory, it will coalesce those accesses into one big read/write. But GPUs do best on problems where the ratio of arithmetic to memory access is high and your data can hang around in registers for a while.
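
To make the coalescing point concrete, here's a sketch in CUDA of two kernels that move the same amount of data but with very different memory behavior:

    /* Sketch (CUDA): both kernels copy n floats, but only the first
       lets adjacent threads touch adjacent addresses, so the hardware
       can coalesce each warp's accesses into a few wide transactions. */
    __global__ void copy_coalesced(const float *in, float *out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            out[i] = in[i];        /* thread k reads element k: coalesced */
    }

    /* Same amount of work, but each thread reads from a scattered
       location, so a warp's loads can't be combined and every access
       pays the full latency. */
    __global__ void copy_strided(const float *in, float *out, int n, int stride)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            out[i] = in[(i * stride) % n];   /* not coalesced */
    }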

I've found that in general GPU code has to be written much more carefully if you want good performance. On a regular CPU, if you pick a decent algorithm and pay attention to cache locality, you can usually rely on the compiler for the low-level optimizations and get pretty close to peak performance. On the GPU you have to pay very close attention to memory access patterns, register usage, and other low-level hardware details -- screwing up any of those parts can easily cost you a factor of 10 in performance.

This is starting to change, though -- the newest chips from Nvidia have an L1 cache and relax some of the other restrictions.

Comment Re:"ostensibly qualified" is fuzzy (Score 1) 584

"I've thought for a long time that maybe there was a place for someone who's more than a nurse but less than a doctor. But the politics of that industry gives politics a bad name. It'd be the demarcation dispute to end them all."

Like a physician assistant? My primary care doctor is actually a PA, and she's probably the best doctor I've had. Great bedside manner, and even when I go two years between physicals, she knows who I am and seems to have my medical history memorized. For a primary care doctor who mostly handles routine physicals and minor problems, and hands the major stuff off to specialists, I think these kinds of skills are more important than sheer volume of medical knowledge.

I've also seen PAs in the emergency room on a couple of occasions. Maybe they're becoming more common?

Comment Re:Artificial virus (Score 3, Informative) 260

I didn't see any indication that the nano-particles are self-replicating, or capable of spreading from one person to another, so you'd need to inject each target individually. It's probably easier just to shoot them.

Plus, if I understand correctly, cultural conceptions of race don't map very well to genetic differences. So finding a race-specific gene to target might be harder than you'd think.

Comment Re:A debugger for C++ (Score 1) 310

Is there any debugger that works well with STL? Visual Studio seems to be equally useless. I can sometimes guess at what's going on by staring at the cryptically-named private variables of STL containers, but to get good information you need to call methods on the containers. I've never had that work reliably in any debugger -- if anything, GDB seems to do a slightly better job of it.
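
For what it's worth, the private variables are at least semi-predictable. With GCC's libstdc++, for example, the contents of a std::vector (say one named vec) can usually be dug out in GDB like this -- the _M_* names are internal and change between versions, so no guarantees:

    (gdb) print vec._M_impl._M_start[0]
    (gdb) print *vec._M_impl._M_start@(vec._M_impl._M_finish - vec._M_impl._M_start)

The second command prints the whole vector by computing its length from the begin/end pointers.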

We're using VS 2005 at work -- maybe 2008 is better?

Comment Re:US bullying and demanding other countries.. (Score 1) 457

Interesting read. It sounds like there's a lot more to the Israeli process than just singling out people who are (or appear to be) Muslim or Middle Eastern for extra scrutiny. Their security screeners interview every single passenger -- which is what allows them to catch someone like Anne-Marie Murphy, who at first glance showed no sign of being a likely terrorist.

They do single out Muslims as part of that process, and that makes sense from a security perspective, especially if you're Israel. But that's just one part of a security process that's vastly superior to anything I've seen in the US -- they would probably do better than us even without the racial profiling.

Unfortunately, that kind of security requires a lot of manpower, and it requires skill and intelligence on the part of the people doing the screening. I don't think the US is interested in making that kind of investment -- security theater is cheaper and easier.

If you just add profiling to a TSA-style security system, you might catch Abdulmutallab, but you still won't catch a terrorist like Richard Reid (a British citizen who didn't look Middle Eastern or have a Muslim name) -- never mind someone like Timothy McVeigh. So I'm still not convinced that profiling on its own is worth the political and social cost.

Comment Re:US bullying and demanding other countries.. (Score 1) 457

"Political correctness prevents us from using common sense. 80-year-old grannies traveling in wheelchairs should not undergo the same security checks as 18-year-old Middle Eastern men."

The problem with "common sense" is that it's often wrong. Case in point: the two attempted attacks on US flights since 9/11 were carried out by a British man and a Nigerian. Profiling Middle Eastern men wouldn't have helped in either case.

In theory, profiling could be useful if it's done right. But it's more likely to be guided by prejudice than evidence. There are many simpler and less controversial measures that would be equally effective, but they don't make good security theater so nobody's talking about them.

Comment Re:Alternative? (Score 2, Informative) 71

On top of that, the CUDA tools are still much better than OpenCL's. OpenCL is basically equivalent to CUDA's low-level "driver" interface, but it has no equivalent to the high-level interface that lets you combine host and device code in a single source file. CUDA also supports a subset of C++ for device code (e.g. templates), which I don't believe is the case for OpenCL, and it has a debugger (of sorts), a profiler, and, as of version 3, apparently a memory checker. But I haven't been following OpenCL that closely lately -- it may be catching up on the tool front.
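
To illustrate what I mean by the high-level interface, here's a minimal single-source sketch using the CUDA runtime API (version-3-era style):

    /* Single-source CUDA (runtime API): device and host code share one
       .cu file, and the kernel launch is a language extension instead
       of the module-loading boilerplate the driver API (or OpenCL)
       requires. */
    #include <cuda_runtime.h>

    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    int main(void)
    {
        const int n = 1024;
        float *d_data;
        cudaMalloc((void **)&d_data, n * sizeof(float));
        scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);  /* launch */
        cudaThreadSynchronize();   /* wait for the kernel to finish */
        cudaFree(d_data);
        return 0;
    }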

If you're developing an in-house project where you have control over the hardware you're going to run on, or you know that most of your customers have Nvidia cards anyway, there are still good reasons to go with CUDA.

Comment Re:X11 has never been a problem. (Score 5, Interesting) 542

I've never had performance issues running X11 over a LAN. VNC, on the other hand, is noticeably sluggish (RDP seems to work well, though). I don't run apps over a WAN very often, except for the occasional emacs session (which is a bit laggy but usable).

But more importantly, the X style of remote access is much, much more useful than VNC/RDP. Remote apps integrate seamlessly into my desktop, instead of being stuck in a separate window. And multiple people can run remote apps on the same machine, without interfering with each other or a user who's physically sitting at the machine.

VNC and RDP are useful hacks for systems that weren't designed for remote access, but they're no replacement for real network transparency.
