
Comment Re:Solving the wrong problem in the wrong way (Score 1) 142

That's a good point, but the thermite need not apply extreme heat to the entire device -- just the flash memory chip. A strip of magnesium with oxidizer might also do the job -- both could be adjusted so that the user sees just a bit of smoke coming out of the phone, without personal injury. See section 3.1.2 in this document about the relation of temperature to data retention.

Shorting out the battery through a coil around the memory chip is likely to make it hot, but not necessarily hot enough to truly erase the information (I may be mistaken -- I haven't done the calculations). You might even explode the battery and still leave the information intact. This also would not necessarily produce the electromagnetic pulse you're thinking of, and even if it did, EMP is unlikely to destroy the contents of flash memory. (Note that I am talking about actually destroying the information stored on the chip. Destroying the small gold wires on the package that connect the chip to the rest of the circuit is easier -- but an entity with enough resources can still recover the information.)

Here is another writeup on the effects of temperature and storage time on flash memory (it seems baking the chip at 125 C for 10 hours will still not necessarily erase everything on it). You will need much higher temperatures than you would get from a polymerization reaction to destroy memory in a span of several seconds. Both thermite and magnesium can produce thousands of degrees Celsius, which any data stored in flash memory is unlikely to survive.
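To see why temperature matters so much more than time, here's a back-of-the-envelope sketch (my own illustration, not from the linked writeups): charge loss in a floating-gate cell is roughly thermally activated, so retention time shrinks by an Arrhenius factor as temperature rises. The 1.1 eV activation energy below is an assumed placeholder -- real values vary by process:

```python
import math

def retention_scale(t_celsius, t_ref_celsius=25.0, ea_ev=1.1):
    """Rough Arrhenius scaling of flash retention time vs. temperature.

    ea_ev is an assumed activation energy (values around 1 eV are
    commonly quoted); this is an illustration, not a datasheet model.
    """
    k_b = 8.617e-5  # Boltzmann constant in eV/K
    t = t_celsius + 273.15
    t_ref = t_ref_celsius + 273.15
    # Retention time shrinks exponentially as temperature rises.
    return math.exp((ea_ev / k_b) * (1.0 / t - 1.0 / t_ref))

# How ten years of room-temperature retention shrinks with heat:
ten_years_h = 10 * 365 * 24
for temp in (25, 125, 250):
    print(f"{temp:4d} C -> ~{ten_years_h * retention_scale(temp):.2e} hours")
```

Under these assumed numbers, ten years at room temperature collapses to hours at 125 C and to seconds in the hundreds of degrees -- which is why a thermite-class temperature spike erases in moments what baking cannot.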

Comment Solving the wrong problem in the wrong way (Score 1) 142

The real problem is buying a phone or a SIM that's not registered in your name. Since most governments archive the communications anyway, destroying the device accomplishes nothing except to give you away.

Now, destroying the device in a visible way may have some value, but wouldn't it be more reliable to simply put some thermite around the memory chips so as to destroy them beyond recovery, without having the phone expand into an ugly wad of polymer?

Comment Do Not Want (Score 1) 83

Also, no thank you.

And I say this as a person who finds FB quite useful in real life.

"You forgot to check in at the conference yesterday." "I see that you did not 'like' my presentation. Might I remind you that this is a condition of employment?" "I TOLD you not to respond to John's project update with department specific information."

Having witnessed several commit comment wars, I can't see this going anywhere good...

Comment Gen X'er here (Score 5, Insightful) 219

We were better than our parents, who couldn't fix the flashing 12:00 on the VCR. Our cohort (plus those within 10 years of our age range) went from BASIC on the Apple II and ZX Spectrum, through QuickBasic, DOS, earlier versions of UNIX, and all the Windows-es, all the way to the abomination that is Windows 8.

To play Doom, I had to download 6 ZIP files over a 2400 baud modem for a week, unpack everything, and learn how to hack the Config.sys file on my 4 MB DOS machine to free up just the right amount of the right type of memory.

When I bought my first scanner, it took 2 days of resolving IRQ conflicts by flipping DIP switches whose meanings I did not understand at the time to make sure it didn't conflict with my sound card.

Mice required their own drivers.

The current generation is just as smart as we are, if not more so. But they always had UIs that made sense. They did not live through an entire two-decade-long information technology revolution. It shouldn't come as a surprise that they are surprised by alien (and to them, nonsensical and inconsistent) interfaces developed for a captive audience.

Add to this that enterprise software is always purchased during golf games by people who will never use it, and you have a world in which our skills of adapting to horrible and inconsistent interfaces are still useful.

I will now press Alt-H to disconnect :)

Comment The real issue is lack of transparency (Score 5, Insightful) 228

While the idea of using an algorithm to sentence a human being is bone-chilling, you might be able to justify this as a "formula" for sentencing -- which, of course, merits its own debate.

What is unconscionable about this is the fact that it's a SECRET algorithm. As in closed source. Essentially a secret law.

This has no place in democracy.

(Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code.)

Comment User-hostile and parasitic (Score 1) 771

I have to confess that I'm an Apple customer -- I own an iPhone, iPad, and a MacBook Air -- as well as some Windows-running hardware.

I LIKE Apple for the devices on which I am a consumer. I like that things work. When I have to customize things, I use Linux/Raspberry Pi/Arduino, etc.

I unhappily lived with the Lightning port because it's a small inconvenience and is superior in a minor way (you can't plug it in upside down).

This move, however, is PARASITIC. I wish it would be regulated away or something -- it reminds me of the ABSOLUTELY POINTLESS glut of phone chargers which were available in the late 1990s and early 2000s before USB became standard. It adds absolutely no value to the user and forces the user to buy new accessories or adapters.

Will I stop using my existing Apple devices? Probably not. Will I buy the new iPhone? Probably not. I'll just upgrade to the latest one which does not package idiocy and stop there.

Comment Yes. Automatically incapacitate active shooter. (Score 1) 1144

I'm going to leave the obvious solution of using metal detectors and millimeter wave scanners aside for the moment.

You can design a system to make sure nobody gets off more than one shot in a crowd.

It would be relatively easy to construct a system for installation in a crowded place (night-club, movie theater, museum, etc.) that locates a gunshot being fired (multiple microphones), identifies the person doing the shooting (computer vision), and covers them in a massive amount of sticky goo or some other non-lethal deterrent.

The non-lethality of the deterrent is important because 1) no system is foolproof, and nobody wants an automatic gun potentially aimed at them at all times, and 2) there may be instances where the first shooter is firing for a legitimate purpose (e.g., rescuing a hostage).
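The "multiple microphones" part is a standard time-difference-of-arrival problem. Here's a minimal sketch (the room size, microphone layout, and brute-force grid search are my own illustration; a real system would use a proper least-squares solver and handle noise):

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def locate_source(mics, arrival_times, step=0.1, extent=20.0):
    """Estimate a 2-D source position from arrival times at known mics.

    Brute-force grid search: pick the grid point whose predicted
    time-differences of arrival best match the measured ones.
    """
    def residual(x, y):
        # Predicted propagation delay from (x, y) to each microphone.
        d = [math.hypot(x - mx, y - my) / SPEED_OF_SOUND for mx, my in mics]
        # Compare differences relative to mic 0 (emission time is unknown).
        return sum(((d[i] - d[0]) - (arrival_times[i] - arrival_times[0])) ** 2
                   for i in range(1, len(mics)))
    grid = [i * step for i in range(int(extent / step) + 1)]
    return min(itertools.product(grid, grid), key=lambda p: residual(*p))

# Four mics in the corners of a 20 m room; a (noiseless) shot at (6, 4).
mics = [(0, 0), (20, 0), (0, 20), (20, 20)]
true_pos = (6.0, 4.0)
times = [math.hypot(true_pos[0] - mx, true_pos[1] - my) / SPEED_OF_SOUND
         for mx, my in mics]
print(locate_source(mics, times))  # close to (6.0, 4.0)
```

The same geometry generalizes to 3-D with more microphones; the hard engineering is echo rejection and microsecond-accurate time-stamping, not the math.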

Comment Migrations are costly and newer is not better (Score 4, Informative) 217

For the record -- IANACP -- I've never written or compiled a line of COBOL in my life.

But I did help migrate an old mainframe-based system to a new "client-server based three tier architecture system based on Linux and a Java thin client" back in the very early '00s. The old system was near perfect and did the job, even though it ran on the (then alien to me) mainframe.

The new system was written by 20-somethings like myself and would (even with WAY more computational resources) conk out at the worst times. This, I believe, is the story of EVERY migration. It's not necessarily that older is better, or "they don't make them like they used to", but that software development is a bug-prone and arduous process that you will not get right the first time.

So if you're the VA administrator with an established career, you might be forgiven for not taking the risk of jeopardizing employee and patient data and services to satisfy some vague desire for "modernization", especially if you know that the project will be full of errors and WAY over budget.

What's the solution? I don't know. Start teaching COBOL and commission Google to create "Google Mainframe Migration"???

Comment Re:1000 engineers (Score 5, Informative) 128

There are several factors. First of all, what they are building is a HUGE engineered system which would have taken up a couple of buildings a decade or two ago. The fact that the end product is small doesn't change the complexity. The second part is the fact that it IS so small, which brings its own complications. In addition, semiconductor manufacturing is a very tricky business where even making the simplest thing (e.g., a transistor) takes an enormous amount of planning, characterization, and tool design.

Part of it is the R&D -- nothing like this has been done before, so certain things have to be figured out (heat dissipation, how the proximity of components affects other components, stuff neither of us will understand, etc.). Another huge part is tooling and process -- someone has to design, test, and characterize the fabrication tools and processes (the "automation" you speak of has to be built by someone -- a device this complicated probably can't be built without the automation). The chip is divided into subsystems, each of which needs to be designed, simulated, and optimized. Someone has to integrate all the subsystems and simulate them together. The 1000 people probably include material scientists, process engineers, electrical engineers of various stripes, semiconductor physicists, mechanical engineers (heat dissipation, packaging, etc.), systems engineers, engineering project managers, etc.

Comment No. (Score 1) 437

C has probably been around longer than you have. Rust probably isn't even a blip on the radar.

Be very wary of embracing immature technology for anything that matters. There are good reasons to do it, but they have to be EXTREMELY good reasons.

Otherwise, even if the tech succeeds, you will be stuck with a code base full of immature and deprecated idioms. (Note the very old Java applets you sometimes see on the web that modern versions of Java make nearly impossible to run. So much for "write once, run anywhere". There are no such programs written in ANSI C -- unless they involve third-party libraries.)

You will also have to deal with buggy and immature developer tools and will spend weeks debugging problems that might be resolved with a simple Google search in C.

I have learned this the hard way, unfortunately.

Comment True boolean search, ability to vote on results (Score 1) 276

In addition to Google-like relevance (which is a must if you are going to survive in this field), it would be nice to have:

1) True boolean search, e.g., ((cat OR feline) AND NOT (catwoman OR cartoon OR dog))
2) A date range filter that actually works (e.g., I want to search for websites talking about Enron BEFORE the scandal).
3) If I see a result that's obviously irrelevant, I'd like to be able to down-vote it.
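Item 1 is just set algebra over an inverted index, which is why it's frustrating that engines don't expose it. A minimal sketch (the tiny corpus and whitespace tokenizer are my own illustration):

```python
def build_index(docs):
    """Map each word to the set of document ids containing it."""
    index = {}
    for doc_id, text in enumerate(docs):
        for word in text.lower().split():
            index.setdefault(word, set()).add(doc_id)
    return index

def term(index, word):
    """Documents matching a single term (empty set if absent)."""
    return index.get(word, set())

docs = [
    "my cat is a lazy feline",
    "catwoman is a cartoon character",
    "the dog chased a cat",
    "feline behavior explained",
]
index = build_index(docs)

# (cat OR feline) AND NOT (catwoman OR cartoon OR dog)
hits = (term(index, "cat") | term(index, "feline")) - \
       (term(index, "catwoman") | term(index, "cartoon") | term(index, "dog"))
print(sorted(hits))  # -> [0, 3]
```

OR is set union, AND is intersection, NOT is set difference -- the whole query language falls out of three operators the index already supports.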

Comment Do or Do Not... (Score 3, Interesting) 58

There is no halfway. There are no viable one-way missions. If you're going to send humans to Mars, then send humans to Mars and bring them back.

It's hard enough for any project to live more than 2 years at NASA -- "a second mission sometime in the mid 2030s" is likely to be just as canceled as the previous visions of getting to Mars which would have had us there last decade.

As NASA is publicly funded, and as the public is fickle, NOTHING less than a human walking on Mars within our lifetimes, with further trips to follow, is going to convince us, the taxpayers, not to begrudge the 50 cents a day we spend on NASA's budget.

Nothing short of an inspired public (and leaders brave enough to inspire the public) will get us funded to bootstrap ourselves into space, if this is to be done by a public agency.
