
Comment One of the most useful features (Score 1) 266

Honestly, this is one of the features that keeps me using Chrome -- especially "Close tabs to the right", which lets me drag a tab to where I want it and get rid of the stuff I only wanted to glance at.

I really, REALLY hate the trend towards full-screen, single-page browsing with videos enabled by default. It's beginning to feel a lot like "channel surfing".

Comment Harbinger of either Utopia or Dystopia (Score 1) 440

One thing is clear: the automation will not stop, and in the long run it will lead to less demand for unskilled labor -- perhaps less demand for labor overall. The societal change brought on by automation started more than a century ago, but it is likely to accelerate exponentially (the technology is finally getting cheap and widely deployable), much as the dot-com boom did.

I see this leading to two divergent societal paths, and I see us actually picking a perverse mixture of the two.

One path may be a near-Utopian end to drudgery, where production is mostly automated and people can have whatever they want without spending effort. This, however, requires a re-imagining of the predominant economic systems. We might end up in a Star Trek-like post-scarcity world, but the path there would require either Universal Basic Income, Universal Basic Subsistence (everyone is guaranteed food, housing, healthcare, and basic goods -- they work for the rest), or some form of neo-socialism. The problem is that some people (you see this in rich kids) take well to living a life of leisure, and may even become extremely productive in other kinds of endeavors (art, music, etc.). Other people _really_ don't take well to idleness and either become suicidally depressed or create trouble for themselves and society (once again, see another group of rich kids).

The second, and unfortunately more likely, path is that everyone who is not directly involved in exploiting automation will be squeezed out of the labor market. The only people making money will be the ones driving the automation -- with few people left to buy what is produced. You will, of course, see a market created for "artisanal labor", where rich hipsters buy hand-made goods and eat at non-automated restaurants. But this will not be enough to sustain a labor market. This scenario is likely to lead to resource wars, repression by the rich, and genocides.

Whether out of idleness or desperation, both scenarios are likely to breed religious and political fanaticism, fueled by a desperate search for purpose (or food) in a dangerously idle (or desperate) world. In either scenario, some people may reject the new world order and move to intentional agrarian communes -- but not enough of them to matter.

The elephant in the room of this extrapolation is that most (if not all) economic activity relies on exploiting natural resources -- which (whether you are a global-warming skeptic or not) are dwindling. Automated exploitation of resources is NOT going to help. Imagine what happens if EVERYONE in the world reaches American levels of consumption thanks to the ease of production. Even if you are not an environmentalist, you should still fear the scale of the resource conflicts that will cause.

If we want to remain in a livable world, we may need to take a _very sober_ look at what kind of society we want to have in 50 (or even 20) years. I'm not even talking about achieving anyone's idea of Utopia -- it will take SERIOUS WORK to maintain a world that is as livable as it is right now. I do not think any of the previously tried -isms is the answer. Finding what _may_ be the answer will itself be work.

Comment Re:Solving the wrong problem in the wrong way (Score 1) 142

That's a good point, but the thermite need not apply extreme heat to the entire device -- just to the flash memory chip. A strip of magnesium with an oxidizer might also do the job -- both could be adjusted so that the user sees just a bit of smoke coming out of the phone, without personal injury. See section 3.1.2 in this document on the relation of temperature to data retention.

Shorting out the battery through a coil around the memory chip is likely to make it hot, but not necessarily hot enough to truly erase the information (I may be mistaken -- I haven't done the calculations). You might even explode the battery and still leave the information intact. It also would not necessarily produce the electromagnetic pulse you're thinking of, and even if it did, an EMP is unlikely to destroy the contents of flash memory. (Note that I am talking about actually destroying the information stored on the chip. Destroying the small gold wires in the package that connect the chip to the rest of the circuit is easier -- but an entity with enough resources can still recover the information.)
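For a rough sense of scale, here is a back-of-the-envelope sketch in Python. Every number in it is an assumption I am making up for illustration (battery size, how much of the energy actually ends up in the chip, the package mass and specific heat), not a measurement:

```python
# Back-of-the-envelope estimate: how hot could a flash package get if some
# fraction of the battery's energy were dumped into it as heat?
# Every number below is an assumption for illustration, not a measurement.

battery_capacity_wh = 10.0        # assumed ~10 Wh smartphone battery
fraction_into_chip = 0.01         # assume only 1% of the energy heats the chip itself
chip_mass_kg = 0.002              # assumed ~2 g package (die + epoxy)
specific_heat_j_per_kg_k = 900.0  # rough ballpark for silicon/epoxy

energy_into_chip_j = battery_capacity_wh * 3600.0 * fraction_into_chip
temperature_rise_k = energy_into_chip_j / (chip_mass_kg * specific_heat_j_per_kg_k)

print(f"Energy into chip: {energy_into_chip_j:.0f} J")
print(f"Temperature rise (ignoring all losses): {temperature_rise_k:.0f} K")
```

With these made-up numbers you get a rise of a couple of hundred kelvin at best -- well short of the thousands of degrees thermite produces -- and in reality most of the energy would be dissipated in the battery's internal resistance and the coil rather than in the chip.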

Here is another write-up on the effects of temperature and storage time on flash memory (it seems baking the chip at 125 °C for 10 hours will still not necessarily erase everything on it). You will need much higher temperatures than a polymerization reaction can provide to destroy the memory in the span of a few seconds. Both thermite and magnesium can produce thousands of degrees Celsius, which no data stored in flash memory is likely to survive.

Comment Solving the wrong problem in the wrong way (Score 1) 142

The real problem is buying a phone or a SIM that's not registered in your name. Since most governments archive the communications anyway, destroying the device accomplishes nothing except to give you away.

Now, destroying the device in a visible way may have some value, but wouldn't it be more reliable to simply put some thermite around the memory modules, so as to destroy the memory chips beyond recovery without having the phone expand into an ugly wad of polymer?

Comment Do Not Want (Score 1) 83

Also, no thank you.

And I say this as a person who finds FB quite useful in real life.

"You forgot to check in at the conference yesterday." "I see that you did not 'like' my presentation. Might I remind you that this is a condition of employment?" "I TOLD you not to respond to John's project update with department specific information."

Having witnessed several commit comment wars, I can't see this going anywhere good...

Comment Gen X'er here (Score 5, Insightful) 219

We were better than our parents, who couldn't fix the flashing 12:00 on the VCR. Our cohort (plus those within 10 years of our age range) went from BASIC on the Apple II and ZX Spectrum, through QuickBasic, DOS, earlier versions of UNIX, and every version of Windows, all the way to the abomination that is Windows 8.

To play Doom, I had to spend a week downloading 6 ZIP files over a 2400 baud modem, unpack everything, and learn how to hack the CONFIG.SYS file on my 4 MB DOS machine to free up just the right amount of the right type of memory.

When I bought my first scanner, it took two days of flipping DIP switches whose meanings I did not understand at the time to resolve the IRQ conflicts with my sound card.

Mice required their own drivers.

The current generation is just as smart as we are, if not more so. But they have always had UIs that made sense. They did not live through an entire two-decade-long information technology revolution. It shouldn't come as a surprise that they are baffled by alien (and, to them, nonsensical and inconsistent) interfaces developed for a captive audience.

Add to this that enterprise software is always purchased during golf games by people who will never use it, and you have a world in which our skill at adapting to horrible and inconsistent interfaces is still useful.

I will now press Alt-H to disconnect :)

Comment The real issue is lack of transparency (Score 5, Insightful) 228

While the idea of using an algorithm to sentence a human being is bone-chilling, you might be able to justify this as a "formula" for sentencing -- which, of course, merits its own debate.

What is unconscionable about this is the fact that it's a SECRET algorithm. As in closed source. Essentially a secret law.

This has no place in democracy.

(Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations (since there is a causal link between poverty and crime and a historic link between race and poverty), which I would rather have society refrain from codifying -- in law or in actual computer code.)

Comment User-hostile and parasitic (Score 1) 771

I have to confess that I'm an Apple customer -- I own an iPhone, an iPad, and a MacBook Air -- as well as some Windows-running hardware.

I LIKE Apple for the devices on which I am a consumer. I like that things work. When I have to customize things, I use Linux/Raspberry Pi/Arduino, etc.

I unhappily lived with the Lightning port because it's a small inconvenience and is superior in a minor way (there is no wrong way to plug it in).

This move, however, is PARASITIC. I wish it would be regulated away or something -- it reminds me of the ABSOLUTELY POINTLESS glut of proprietary phone chargers available in the late 1990s and early 2000s, before USB became standard. Changes like this add absolutely no value to the user and force them to buy new accessories or adapters.

Will I stop using my existing Apple devices? Probably not. Will I buy the new iPhone? Probably not. I'll just upgrade to the latest model that does not package this idiocy, and stop there.

Comment Yes. Automatically incapacitate active shooter. (Score 1) 1144

I'm going to leave the obvious solution of using metal detectors and millimeter wave scanners aside for the moment.

You can design a system to make sure nobody gets off more than one shot in a crowd.

It would be relatively easy to construct a system for installation in a crowded place (night-club, movie theater, museum, etc.) which locates a gunshot as it is fired (multiple microphones), identifies the person doing the shooting (computer vision), and covers them in a massive amount of sticky goo or some other non-lethal deterrent.
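To make the "multiple microphones" part concrete, here is a minimal sketch of locating a loud impulse from time differences of arrival (TDOA). It is not a real product design; the room geometry, microphone positions, and timings are all made up for illustration, and a crude grid search stands in for a proper solver:

```python
import numpy as np

# Minimal sketch of acoustic source localization from time differences of
# arrival (TDOA) at several microphones. Positions, timings, and the grid
# are hypothetical illustration values.

SPEED_OF_SOUND = 343.0  # m/s in room-temperature air

# Assumed microphone positions in a 20 m x 20 m room (x, y in meters).
mics = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0], [20.0, 20.0]])

def arrival_times(source):
    """Time for an impulse emitted at `source` to reach each microphone."""
    return np.linalg.norm(mics - source, axis=1) / SPEED_OF_SOUND

def locate(measured_times, resolution=0.25):
    """Brute-force grid search: find the point whose predicted TDOAs
    (relative to microphone 0) best match the measured ones."""
    measured_tdoa = measured_times - measured_times[0]
    best_point, best_err = None, np.inf
    for x in np.arange(0.0, 20.0, resolution):
        for y in np.arange(0.0, 20.0, resolution):
            predicted = arrival_times(np.array([x, y]))
            err = np.sum((predicted - predicted[0] - measured_tdoa) ** 2)
            if err < best_err:
                best_point, best_err = np.array([x, y]), err
    return best_point

if __name__ == "__main__":
    true_source = np.array([6.0, 13.0])    # where the shot "happened"
    measured = arrival_times(true_source)  # ideal, noise-free arrivals
    print("Estimated shooter position:", locate(measured))
```

The geometry is the easy part; making it reliable (noise, echoes, false positives) is where the real engineering is -- which is exactly why the deterrent has to be non-lethal.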

The non-lethality of the deterrent is important because 1) no system is foolproof, and nobody wants an automatic gun potentially aimed at them at all times, and 2) there may be instances where the first shooter is shooting for a legitimate purpose (e.g., rescuing a hostage).

Comment Migrations are costly and newer is not better (Score 4, Informative) 217

For the record -- IANACP -- I've never written or compiled a line of COBOL in my life.

But I did help migrate an old mainframe-based system to a new "client-server, three-tier architecture system based on Linux and a Java thin client" back in the very early 2000s. The old system was near perfect and did the job, even though it ran on the (then alien to me) mainframe.

The new system was written by 20-somethings like myself and would (even with WAY more computational resources) conk out at the worst times. This, I believe, is the story of EVERY migration. It's not necessarily that older is better, or "they don't make them like they used to", but that software development is a bug-prone and arduous process that you will not get right the first time.

So if you're the VA administrator with an established career, you might be forgiven for not taking the risk of jeopardizing employee and patient data and services to satisfy some vague desire for "modernization", especially if you know that the project will be full of errors and WAY over budget.

What's the solution? I don't know. Start teaching COBOL and commission Google to create "Google Mainframe Migration"???

Comment Re:1000 engineers (Score 5, Informative) 128

There are several factors. First, what they are building is a HUGE engineered system that would have taken up a couple of buildings a decade or two ago. The fact that the end product is small doesn't change the complexity. Second, the fact that it IS so small brings its own complications. In addition, semiconductor manufacturing is a very tricky business where even making the simplest thing (e.g., a transistor) takes an enormous amount of planning, characterization, and tool design.

Part of it is R&D -- nothing like this has been done before, so certain things have to be figured out (heat dissipation, how the proximity of components affects other components, stuff neither of us will understand, etc., etc.). Another huge part is tooling and process -- someone has to design, test, and characterize the fabrication tools and processes (the "automation" you speak of has to be built by someone, and a device this complicated probably can't be built without it). The chip is divided into subsystems, each of which needs to be designed, simulated, and optimized, and someone has to integrate all the subsystems and simulate them together. The 1000 people probably include materials scientists, process engineers, electrical engineers of various stripes, semiconductor physicists, mechanical engineers (heat dissipation, packaging, etc.), systems engineers, engineering project managers, and so on.

Comment No. (Score 1) 437

C has probably been around longer than you have. Rust probably isn't even a blip on the radar.

Be very wary of embracing immature technology for anything that matters. There are good reasons to do it, but they have to be EXTREMELY good reasons.

Otherwise, even if the tech succeeds, you will be stuck with a code base full of immature and deprecated idioms. (Consider the very old Java applets you sometimes see on the web, which modern versions of Java make nearly impossible to run. So much for "write once, run anywhere". Programs written in ANSI C don't have that problem -- unless they rely on third-party libraries.)

You will also have to deal with buggy and immature developer tools, and you will spend weeks debugging problems that, in C, might be resolved with a simple Google search.

I have learned this the hard way, unfortunately.

Comment True boolean search, ability to vote on results (Score 1) 276

In addition to Google-like relevance (which is a must if you are going to survive in this field), it would be nice to have:

1) True Boolean search, e.g., ((cat or feline) and not (catwoman or cartoon or dog)) -- see the sketch after this list.
2) A date range that actually works (e.g., I want to search for websites talking about Enron BEFORE the scandal).
3) If I see a result that's obviously irrelevant, I'd like to be able to down-vote it.
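To illustrate what I mean by 1) and 2), here is a toy sketch over a made-up in-memory corpus. The documents, fields, and dates are all invented for illustration; a real engine would need a parser for arbitrary Boolean expressions and an index, not a linear scan:

```python
from datetime import date

# Toy corpus; documents and dates are made up for illustration.
docs = [
    {"url": "a.example", "text": "feline behavior study", "published": date(2000, 5, 1)},
    {"url": "b.example", "text": "catwoman movie review", "published": date(2004, 7, 9)},
    {"url": "c.example", "text": "enron quarterly results look great", "published": date(2000, 3, 15)},
    {"url": "d.example", "text": "enron scandal unfolds", "published": date(2001, 11, 2)},
]

def matches(doc, include_any, exclude_any):
    """True Boolean matching: at least one included term, no excluded terms."""
    words = set(doc["text"].split())
    return bool(words & include_any) and not (words & exclude_any)

def search(include_any, exclude_any=frozenset(), before=None):
    """Boolean search with a date cutoff that actually works."""
    for doc in docs:
        if before and doc["published"] >= before:
            continue
        if matches(doc, include_any, exclude_any):
            yield doc["url"]

# (cat or feline) and not (catwoman or cartoon or dog)
print(list(search({"cat", "feline"}, {"catwoman", "cartoon", "dog"})))
# Pages mentioning Enron published BEFORE the scandal broke (late 2001).
print(list(search({"enron"}, before=date(2001, 10, 1))))
```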
