Comment Do Not Want (Score 1) 83

Also, no thank you.

And I say this as a person who finds FB quite useful in real life.

"You forgot to check in at the conference yesterday." "I see that you did not 'like' my presentation. Might I remind you that this is a condition of employment?" "I TOLD you not to respond to John's project update with department specific information."

Having witnessed several commit comment wars, I can't see this going anywhere good...

Comment Gen X'er here (Score 5, Insightful) 219

We were better than our parents, who couldn't fix the flashing 12:00 on the VCR. Our cohort (plus those within 10 years of our age range) went from BASIC on the Apple II and ZX Spectrum, through QuickBasic, DOS, early versions of UNIX, and every version of Windows, all the way to the abomination that is Windows 8.

To play Doom, I had to download 6 ZIP files over a 2400 baud modem for a week, unpack everything, and learn how to hack the Config.sys file on my 4 MB DOS machine to free up just the right amount of the right type of memory.
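For anyone who never had the pleasure: this meant hand-ordering drivers in CONFIG.SYS to squeeze out conventional memory. A minimal sketch of the kind of thing involved (the paths and FILES/BUFFERS values are from memory and varied by setup):

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\SETVER.EXE
FILES=30
BUFFERS=20
```

HIMEM.SYS provides XMS, EMM386 with NOEMS opens upper memory blocks without emulating EMS, and DOS=HIGH,UMB plus DEVICEHIGH pushes DOS and drivers out of the first 640 KB -- which is where the game needed its room.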

When I bought my first scanner, it took two days of flipping DIP switches whose meanings I did not understand at the time, resolving IRQ conflicts so it wouldn't clash with my sound card.

Mice required their own drivers.

The current generation is just as smart as we are, if not more so. But they have always had UIs that made sense. They did not live through an entire two-decade-long information technology revolution. It shouldn't come as a surprise that they are baffled by alien (and, to them, nonsensical and inconsistent) interfaces developed for a captive audience.

Add to this that enterprise software is always purchased during golf games by people who will never use it, and you have a world in which our skills of adapting to horrible and inconsistent interfaces are still useful.

I will now press Alt-H to disconnect :)

Comment The real issue is lack of transparency (Score 5, Insightful) 228

While the idea of using an algorithm to sentence a human being is bone-chilling, you might be able to justify this as a "formula" for sentencing -- which, of course, merits its own debate.

What is unconscionable about this is the fact that it's a SECRET algorithm. As in closed source. Essentially a secret law.

This has no place in democracy.

(Also, any algorithm that ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations -- there is a causal link between poverty and crime, and a historic link between race and poverty -- which I would rather society refrain from codifying, whether in law or in actual computer code.)
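A toy simulation makes the mechanism concrete (all numbers invented; this has nothing to do with any real sentencing tool). The risk score below depends only on poverty, yet the scores split cleanly along group lines:

```python
import random

random.seed(0)

# Toy model (all numbers invented): risk depends ONLY on poverty,
# but poverty rates differ by group for historical reasons.
rows = []
for _ in range(100_000):
    group = random.choice(["A", "B"])
    poor = random.random() < (0.4 if group == "B" else 0.1)  # historic link
    risk = 0.3 if poor else 0.05                             # causal link
    rows.append((group, risk))

def mean_risk(g):
    scores = [risk for group, risk in rows if group == g]
    return sum(scores) / len(scores)

print(f"mean risk, group A: {mean_risk('A'):.3f}")  # ~0.075
print(f"mean risk, group B: {mean_risk('B'):.3f}")  # ~0.150
# Group membership never enters the risk formula, yet the averages
# differ: the historic correlation has been codified into the score.
```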

Comment User-hostile and parasitic (Score 1) 771

I have to confess that I'm an Apple customer -- I own an iPhone, iPad, and a MacBook air -- as well as some Windows-running hardware.

I LIKE Apple for the devices on which I am a consumer. I like that things work. When I have to customize things, I use Linux/Raspberry Pi/Arduino, etc.

I unhappily lived with the "lightning port" because it's a small inconvenience and is superior in a minor way (it's reversible -- there's no wrong way to plug it in).

This move, however, is PARASITIC. I wish it would be regulated away or something -- it reminds me of the ABSOLUTELY POINTLESS glut of proprietary phone chargers available in the late 1990s and early 2000s, before USB became standard. They add absolutely no value to the user and force the user to buy new accessories or adapters.

Will I stop using my existing Apple devices? Probably not. Will I buy the new iPhone? Probably not. I'll just upgrade to the latest one that does not package idiocy and stop there.

Comment Yes. Automatically incapacitate active shooter. (Score 1) 1144

I'm going to leave the obvious solution of using metal detectors and millimeter wave scanners aside for the moment.

You can design a system to make sure nobody gets off more than one shot in a crowd.

It would be relatively easy to construct a system for installation in a crowded place (night-club, movie theater, museum, etc.) that locates a gunshot being fired (multiple microphones), identifies the person doing the shooting (computer vision), and covers them in a massive amount of sticky goo or some other non-lethal deterrent.

The non-lethality of the deterrent is important because 1) no system is foolproof, and nobody wants an automatic gun potentially aimed at them at all times, and 2) there may be instances where the first shooter is shooting for a legitimate purpose (e.g., rescuing a hostage).
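The locating part is standard acoustics. As a rough illustrative sketch (idealized 2D room, exact arrival times, invented coordinates -- a real system deals with echoes and noise), a brute-force search over the floor plan can recover the shooter's position from the differences in when each microphone hears the shot:

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # m/s, roughly

def locate(mics, arrival_times, size=30.0, step=0.25):
    """Grid-search the floor plan for the point whose predicted pairwise
    arrival-time differences best match the measured ones (TDOA)."""
    pairs = list(itertools.combinations(range(len(mics)), 2))
    measured = {(i, j): arrival_times[i] - arrival_times[j] for i, j in pairs}
    best, best_err = None, float("inf")
    n = int(size / step) + 1
    for xi in range(n):
        for yi in range(n):
            p = (xi * step, yi * step)
            # Sum of squared mismatches between measured and predicted
            # time differences for this candidate point.
            err = sum(
                (measured[(i, j)]
                 - (math.dist(p, mics[i]) - math.dist(p, mics[j]))
                   / SPEED_OF_SOUND) ** 2
                for i, j in pairs)
            if err < best_err:
                best, best_err = p, err
    return best

# Simulated shot at (12, 7) in a 30 m x 30 m room, four corner mics:
mics = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0)]
shot = (12.0, 7.0)
times = [math.dist(shot, m) / SPEED_OF_SOUND for m in mics]
print(locate(mics, times))  # recovers roughly (12.0, 7.0)
```

Real deployments would use least-squares multilateration rather than a grid, but the principle is the same.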

Comment Migrations are costly and newer is not better (Score 4, Informative) 217

For the record -- IANACP -- I've never written or compiled a line of COBOL in my life.

But I did help migrate an old mainframe-based system to a new "client-server based three-tier architecture system based on Linux and a Java thin client" back in the very early 00's. The old system was near perfect and did the job, even though it ran on the (then alien to me) mainframe.

The new system was written by 20-somethings like myself and would (even with WAY more computational resources) conk out at the worst times. This, I believe, is the story of EVERY migration. It's not necessarily that older is better, or "they don't make them like they used to", but that software development is a bug-prone and arduous process that you will not get right the first time.

So if you're the VA administrator with an established career, you might be forgiven for not taking the risk of jeopardizing employee and patient data and services to satisfy some vague desire for "modernization", especially if you know that the project will be full of errors and WAY over budget.

What's the solution? I don't know. Start teaching COBOL and commission Google to create "Google Mainframe Migration"???

Comment Re:1000 engineers (Score 5, Informative) 128

There are several factors. First of all, what they are building is a HUGE engineered system which would have taken up a couple of buildings a decade or two ago. The fact that the end product is small doesn't change the complexity. The second part is the fact that it IS so small, which brings its own complications. In addition, semiconductor manufacturing is a very tricky business where even making the simplest thing (e.g., a transistor) takes an enormous amount of planning, characterization, and tool design.

Part of it is the R&D -- nothing like this has been done before, so certain things have to be figured out (heat dissipation, how the proximity of components affects other components, stuff neither of us will understand, etc., etc.). Another huge part is tooling and process -- someone has to design, test, and characterize the fabrication tools and processes (the "automation" you speak of has to be built by someone -- a device this complicated probably can't be built without the automation). The chip is divided into subsystems, each of which needs to be designed, simulated, and optimized. Someone has to integrate all the subsystems and simulate them together. The 1000 people probably include materials scientists, process engineers, electrical engineers of various stripes, semiconductor physicists, mechanical engineers (heat dissipation, packaging, etc.), systems engineers, engineering project managers, etc.

Comment No. (Score 1) 437

C has probably been around longer than you have. Rust probably isn't even a blip on the radar.

Be very wary of embracing immature technology for anything that matters. There are good reasons to do it, but they have to be EXTREMELY good reasons.

Otherwise, even if the tech succeeds, you will be stuck with a code base full of immature and deprecated idioms. (Note the very old Java applets you sometimes see on the web that modern versions of Java make nearly impossible to run. So much for "write once, run anywhere". There are no such stranded programs written in ANSI C -- unless they involve third-party libraries.)

You will also have to deal with buggy and immature developer tools and will spend weeks debugging problems that might be resolved with a simple Google search in C.

I have learned this the hard way, unfortunately.

Comment True boolean search, ability to vote on results (Score 1) 276

In addition to Google-like relevance (which is a must if you are going to survive in this field), it would be nice to have:

1) True boolean search, e.g., ((cat or feline) and not (catwoman or cartoon or dog))
2) Date-range filtering that works (e.g., I want to search for websites talking about Enron BEFORE the scandal).
3) If I see a result that's obviously irrelevant, I'd like to be able to down-vote it.
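Point 1 is the easy one to support. A minimal sketch (hypothetical documents, with each document reduced to a set of terms) of evaluating that exact query:

```python
# Hypothetical corpus: document name -> set of indexed terms.
docs = {
    "feline-care":  {"feline", "health", "care"},
    "cat-memes":    {"cat", "cartoon", "funny"},
    "cat-breeds":   {"cat", "breeds", "care"},
    "batman-recap": {"catwoman", "batman", "movie"},
}

def matches(terms, include, exclude):
    """(any include term present) and not (any exclude term present)."""
    return bool(terms & include) and not (terms & exclude)

# (cat or feline) and not (catwoman or cartoon or dog)
include = {"cat", "feline"}
exclude = {"catwoman", "cartoon", "dog"}
hits = sorted(name for name, terms in docs.items()
              if matches(terms, include, exclude))
print(hits)  # ['cat-breeds', 'feline-care']
```

A real engine would parse arbitrarily nested boolean expressions and run them against an inverted index, but set intersection and difference are the whole trick.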

Comment Do or Do Not... (Score 3, Interesting) 58

There is no halfway. There are no viable one-way missions. If you're going to send humans to Mars, then send humans to Mars and bring them back.

It's hard enough for any project to live more than 2 years at NASA -- "a second mission sometime in the mid 2030s" is likely to be just as canceled as the previous visions of getting to Mars which would have had us there last decade.

As NASA is publicly funded, and as the public is fickle, NOTHING less than a human walking on Mars within our lifetimes, with further trips to follow, is going to convince us, the taxpayers, not to begrudge the 50 cents a day we spend on NASA's budget.

Nothing short of an inspired public (and leaders brave enough to inspire the public) will get us funded to bootstrap ourselves into space, if this is to be done by a public agency.

Comment The Big Reveal (Score 4, Interesting) 239

This is coming. I don't want it to come, and neither do you.

But one day, there will be such a security breach that regular people -- for whom monitoring is something that happens to other people -- will find every phone call they've made, every email/text/IM they've sent, every street-camera picture that's been taken of them, every URL they've visited, and every nude airport scan available for searching, downloading, and scrutinizing, going back at least a decade.

Some will find surprisingly more.

This will hurt you, me, the super-paranoid dude with the encrypted hard drive, the boring grandma, and the powerful politician.

After the dust settles from the several million ruined marriages, the inevitable political scandals, and the rampant identity theft, things will change. For a while.

New politicians will get elected. Privacy laws will be enacted. Watchers will be appointed to watch the other watchers. Whatever government surveillance exists will go further underground. Everyone will encrypt everything.

And then people will relax, and things will go back to some version of what we have now.

Comment I've done this (Score 1) 175

Taking apart a computer and programming are two orthogonal skills -- someone might be great at one and terrible at the other.

I've successfully used Basic4GL to teach basic programming, graphics, and algebra concepts to underperforming 6th graders. They really loved the exercise of drawing a spaceship first on graph paper and then on the computer using simple graphics commands. Basic4GL is great because it has built in sound, graphics, etc.

I will suggest 3 other things:

1) Teach the Processing computer language. It's graphical, easy to start with, and mature.
2) Teach the Arduino. Build a simple circuit or a very simple robot with two servos. Any Arduino workshop devolves pleasantly into students tinkering with stuff.
3) Teach Python.

With Python, though, I'll caution that the text-only interaction may bore them (even though I've taught the language before).


Researchers Direct Growth of Neurons With Silicon Nitride Microtubes 23

MTorrice writes: Bioengineers want to connect electronics and neurons to make devices such as new cochlear implants or prosthetic limbs with a seemingly natural sense of touch. They also could build synthetic neural circuitry to use to study how the brain processes information or what goes wrong in neurodegenerative diseases.

As a step toward these applications, a team of researchers has developed a way to direct the growth of axons, the connection-forming arms of neurons. They use transparent silicon nitride microtubes on glass slides to encourage the cells' axons to grow in specific directions. The cultured nerve cells grow aimlessly until they bump into one of the tubes. The axon then enters the tube, and its growth is accelerated 20-fold. Silicon nitride already is used in some orthopedic devices, and could serve as a substrate for electronics to interface with the growing neurons.
