
Comment Bell & Howell (Score 1) 523

Way back when, Apple hadn't yet established relationships with public schools. School administrators didn't know how to classify computer equipment, anyway. The Bell & Howell company came to the rescue: they were vendors of audiovisual equipment like film projectors. Bell & Howell agreed to let Apple use their connections with school districts in exchange for the computers being rebranded as Bell & Howell equipment, in Bell & Howell livery.

This is why the first computer I ever got time on was an all-black sleek Apple II+ that looked like it belonged on the Death Star.

The original Bell & Howell Apples are now almost completely forgotten. But man, do I ever have fond memories of them.

See one for yourself here.

Comment Re:How to automate C++ auditing? (Score 1) 341

And before you start spewing about vectors vs C-style arrays performance, I would suggest you revisit some simple profiling code; vectors are just as fast as c-style arrays these days.

Actually, no.

The C++ standard doesn't give any guarantees on how quickly a request for a dynamically allocated block on the heap must be serviced. Most modern desktop CPUs and OSes do this very quickly, but there are embedded platforms where allocating memory on the heap takes multiple milliseconds.

Likewise, modern CPUs tend to cache vector contents aggressively, in ways that are absolutely beautiful. But on embedded systems with simpler CPUs, every element access may well go through a pointer indirection.

The world's a much bigger place than server farms and laptops. In those areas, yeah, std::vector<T> is a lifesaver. But in the IoT world there's still a very real need for C-style stack arrays.

This is, incidentally, the entire motivation for std::array<T, N> in C++11. The embedded guys insisted on a container with an STL-compatible API that would compile down to a straight-up C array. I've never found anything in the embedded space where a std::array<T, N> was an inappropriate choice, but I've been on lots of projects where a std::vector<T> was simply not in the cards.
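To make that concrete, here's a minimal sketch (the function and variable names are mine, purely for illustration): a std::array lives on the stack with the layout of a plain C array, yet still speaks the STL API.

```cpp
#include <array>
#include <numeric>

// std::array<T, N> is an aggregate wrapping a T[N]: stack-allocated,
// size fixed at compile time, no heap allocation anywhere.
int sum_readings() {
    std::array<int, 4> readings{3, 1, 4, 1};
    // On every mainstream implementation it has exactly the size of
    // the underlying C array -- zero overhead.
    static_assert(sizeof(readings) == sizeof(int[4]),
                  "std::array adds no overhead");
    return std::accumulate(readings.begin(), readings.end(), 0);
}
```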

Comment Lots of errors. (Score 1) 341

I've been programming in C++ since 1989 and have been getting paid to do it since 1993. I've seen the language grow up, and I don't think you're anywhere near right. You've got some serious misunderstandings of the language.

Because it does not have momentum, it will probably never develop momentum. Like Lisp, it may be a neat language, but it will almost certainly be consigned to the dustbin of history.

I'm also an old LISP hacker (since the early '80s). LISP is not "consigned to the dustbin of history". It was foundational in the development of modern languages, and the concepts it introduced to programming are still with us today. I personally side with Google's Peter Norvig: LISP is still around, we just call it Python.

If Rust is doomed to share LISP's fate, I think the Rustaceans would consider that a victory beyond their wildest imaginings.

It should also be noted that well written C++ can be just as good as Rust.

Meaningless. Well-written X can of course be as good as Y. Well-written C is just as good as Rust. The question that's relevant to software engineering is "how much effort is required to do the task well in X versus well in Y?"

As someone who's entering his fourth decade of C++ programming: yes, the modern dialect of C++ is wonderful. But it still has an absurd number of gotchas and corner cases. Unless you've already made the massive investment in learning them ("can I use the universal initializer syntax to initialize my object, or will it get turned into a std::initializer_list? And remind me again why vector<bool> is a bad idea that will screw me over if I try to use it?"), I would genuinely recommend another language for low-level work.
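For anyone who hasn't hit those two gotchas yet, here's a small sketch of each (the helper names are mine, just for illustration):

```cpp
#include <vector>

// Brace-init gotcha: when a std::initializer_list constructor exists,
// {3, 7} always picks it, never the (count, value) constructor.
std::vector<int> three_sevens() { return std::vector<int>(3, 7); } // {7, 7, 7}
std::vector<int> braces()       { return std::vector<int>{3, 7}; } // {3, 7}

// vector<bool> gotcha: it's a bit-packed specialization, so operator[]
// hands back a proxy object into the bit storage, not a bool&.
bool proxy_tracks_vector() {
    std::vector<bool> flags{true, false};
    auto ref = flags[0];   // a proxy, not a copy of the bool
    flags[0] = false;
    return ref;            // reads the *current* bit: false
}
```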

The first, and somewhat less important principle of C++ is template metaprogramming. This approach to software development is entirely geared towards reduction of redundancy in code writing.

No. That's generic programming, period. Template metaprogramming is different: it exploits the fact that C++'s template instantiation facility is Turing-complete to perform at compile time things which in other languages would be deferred until run time.

This can be a really big deal. In C you might assert that the size of an integer is what you're expecting, but you won't know until you try to run the code. Your assert will give you a clue as to why your code failed in production at 3:00am, but you'll still get the call at 3:00am telling you everything blew up. In C++, a static_assert evaluates that same question at compile time, and if the architecture you're compiling for doesn't have the necessary word size you'll know it when the compilation halts.
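A sketch of that contrast (the 32-bit requirement is just an example assumption, not anything from the original post):

```cpp
#include <climits>

// Compile-time check: if the target's int is narrower than 32 bits,
// compilation halts right here -- no 3:00am phone call.
static_assert(sizeof(int) * CHAR_BIT >= 32,
              "this code assumes at least 32-bit int");

// The runtime equivalent can only fire once the code is already
// running -- i.e., after it has shipped.
bool runtime_word_size_ok() {
    return sizeof(int) * CHAR_BIT >= 32;
}
```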

And that's just scratching the surface! Template metaprogramming is also used in things such as the Blitz++ numerical libraries (I'm old, yes, I remember Blitz++) to optimize matrix operations to the point where it beats FORTRAN matrix multiplication. Nothing like unrolling loops at compile time, automatically, to give your code a performance boost. And of course, libraries like Boost are continually pushing out the frontiers of what we can do with template metaprogramming.

Templatized code is actually about abstract algebra (as no less than Stepanov has said), allowing us to separate algorithms from the types they operate on. But template metaprogramming is mostly about moving things normally done at runtime into the compile-time cycle. It's not about code reuse.
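As a small, standard illustration of that runtime-to-compile-time shift (the classic textbook example, not anything from the original post): a factorial computed entirely during template instantiation.

```cpp
// The "computation" is template instantiation itself: the compiler
// recursively instantiates Factorial<N-1> until it hits the base case,
// so Factorial<5>::value is a compile-time constant.
template <unsigned N>
struct Factorial {
    static constexpr unsigned value = N * Factorial<N - 1>::value;
};

template <>
struct Factorial<0> {   // explicit specialization stops the recursion
    static constexpr unsigned value = 1;
};

// The result is usable wherever a constant expression is required,
// e.g. as an array bound -- zero runtime cost.
static_assert(Factorial<5>::value == 120, "computed by the compiler");
int buffer[Factorial<4>::value];  // an array of 24 ints
```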

This is an extension of the idea of inheritance where you re-use base class code, and only modify what you need to adapt the base to your use case.

Not really. Inheritance is inevitably about types: you can't talk about inheritance without talking about the type of the parent and what behaviors get inherited. Generic programming is about separating type out from the discussion and instead talking about the algorithm in an abstract-algebra sense. (And then you have abominations like the Curiously Recurring Template Pattern which fuse them both.)
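For readers who haven't met it, a minimal CRTP sketch (Shape and Square are hypothetical names of mine):

```cpp
// Curiously Recurring Template Pattern: the base class is templated on
// its own derived class, so the "polymorphic" call below is resolved
// statically at compile time -- no vtable, no runtime indirection.
template <typename Derived>
struct Shape {
    int area() const {
        // Static downcast: safe because Derived inherits Shape<Derived>.
        return static_cast<const Derived*>(this)->area_impl();
    }
};

struct Square : Shape<Square> {
    int side;
    explicit Square(int s) : side(s) {}
    int area_impl() const { return side * side; }
};
```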

And then we get to...

Object oriented programming is a mechanism for creating a software structure that forces most types of bugs to be compile time bugs.

No. Just. No.

There is no universe in which this paragraph is anywhere near right. OOP arrived in C++ in 1979, with the very first iteration of the language (C with Classes), and let me tell you, as someone who has actually had to work with cfront, that compiler did absolutely nothing to turn my run-time bugs into compile-time ones, nor did object orientation magically provide this capability.

But by the early '00s, around GCC 3, when C++ compilers started to produce scarily optimized template code? About that time is when template metaprogramming took off, and that was when I began to replace my C-style casts with static_casts and began to get warnings about "uh, boss, that cast isn't valid, it'll blow up on you at runtime".

Your example is also weird: you say the non-OO way would involve a struct that contains data fields and a type ID, but, uh -- that's what an object is: it's a struct containing function pointers and data objects. The example you give can easily be written in C by anyone who understands function pointer syntax. Remember, C++ classes were explicitly designed to be translatable into C structs-and-function-pointers. (That's how the first major C++ compiler, cfront, worked. It parsed the language and spat out equivalent C, which was then compiled.)
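A rough sketch of that translation, in C-style code (hypothetical names, and only the flavor of what cfront emitted, not its actual output):

```cpp
// An "object" is just a struct whose fields include function pointers,
// and a "method" takes the struct as its explicit first argument
// (the spelled-out `this`).
struct Animal {
    const char* name;
    int (*legs)(const Animal* self);  // the "virtual method" slot
};

static int dog_legs(const Animal*)  { return 4; }
static int bird_legs(const Animal*) { return 2; }

// "Virtual dispatch" is just a call through the function pointer.
static int count_legs(const Animal* a) {
    return a->legs(a);
}
```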

OOP converts this type of flow control into virtual dispatch, which can be validated at compile time,

... except that virtual dispatch explicitly cannot be validated at compile time. Virtual dispatching is done at runtime. Only static dispatch can be done at compile-time. (Note: before you put together a toy example that uses the virtual keyword, be careful: if the compiler can statically determine the type, it's allowed to implicitly convert your virtual call into a static call.)
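A small illustration of the distinction (the types are mine, and whether a compiler actually devirtualizes the second call is an optimization, never a guarantee):

```cpp
struct Base {
    virtual int id() const { return 1; }
    virtual ~Base() = default;
};
struct Derived : Base {
    int id() const override { return 2; }
};

// The compiler cannot know at compile time which id() runs here:
// the dynamic type of *p exists only at runtime.
int call_through_pointer(const Base* p) { return p->id(); }

// Here the static type IS the dynamic type, so the compiler is
// permitted to devirtualize this into a direct (static) call.
int call_direct() {
    Derived d;
    return d.id();
}
```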

I'm afraid that I'm sounding like an angry old man standing on my porch talking about these kids today and looking around for my shawl. I apologize if that's the way I'm coming off. But you seem to have some really weird misapprehensions about the C++ language, and I really hope you'll correct them.

Comment Re:I just don't buy the shit MIT... (Score 5, Insightful) 435

For a good computer scientist...

Ah, the No True Scotsman fallacy.

that basic world view is ingrained in their soul.

No. Definitively, no.

I was born in 1975. By 1979 I knew I was going to be a hacker. No kidding: I was sitting on Mrs. Walters' kitchen floor discovering recursion by drawing geometric shapes. I remember looking at this Easter egg I'd decorated in a recursive pattern and being in awe, and thinking I wanted to draw recursive patterns on eggs forever.

I was there for Flag Day in 1983 when ARPANET became the Internet. I was eight years old and the local college computer science department viewed me as their mascot, I guess. I'm grateful to them for the time I got to spend on LISP Machines.

Today I'm 44. I hold a Master's degree in computer science and am a thesis away from a Ph.D. I've worked for the United States government's official voting research group (the now-defunct ACCURATE) and private industry. I've spoken at Black Hat, DEF CON, CodeCon, OSCON, and more. I think that I meet your, or anyone's, definition of a good computer scientist with a long career.

And I am telling you, brother, you are wrong.

In the late '80s and early '90s there was a USENIX T-shirt given to attendees. "Networks Connect People, Not Computers." It was a neat shirt and I wore mine until it was shreds, not because I liked wearing a ratty T-shirt but because there are so many of us who need to learn this lesson.

Logic is the tool we use to serve humanity. But if you let logic blind you to the fact other people are human beings with human feelings who need to be treated like human beings, then you just stopped being a hacker and you started becoming a tool.

Hackers serve humanity. We don't rule it. And we're not excused from the rules of human behavior.

I really wish RMS had learned this. It's too late for him. It's not too late for you.

Comment A big NO. (Score 1) 428

As a guy who did his EMT training in '92, let me give you a giant fuck you, buddy.

It has never been an EMT's job to put themselves in harm's way for a patient; in fact, we were specifically trained not to do that. Job number one is to ensure the scene is safe before going in; otherwise we're at grave risk of being taken out by whatever has already critically injured at least one person. Adding one more to the casualty count solves nothing: it just means you need another ambulance.

If there's a downed power line draped over your car and you're unconscious behind the wheel, well, sorry, but you're going to be waiting there until either the power company or the firefighters tell me the scene's electrically safe.

"Emergency medical personnel used to be expected to put themselves in harm's way to protect people." You've clearly been watching way too much Hollywood.

Comment No. (Score 1) 171

In a word, "no".

Within the event horizon, there is literally no path 'outside'. It isn't that getting there involves an infinite redshift: it's that there is literally no geodesic leading out. Within the event horizon space twists in on itself such that all directions lead deeper inwards towards the singularity.

You have tremendous freedom to move about in time, but your freedom to move about in space gets sharply curtailed. It's exactly the reverse of the spacetime situation outside the event horizon, where we have tremendous freedom to move in space but are only allowed to move forwards into the future.
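For the general-relativity-inclined, that role swap can be read straight off the Schwarzschild line element (a standard result, sketched here; not part of the original comment):

```latex
% Schwarzschild line element, with Schwarzschild radius r_s = 2GM/c^2:
\[
  ds^2 = -\left(1 - \frac{r_s}{r}\right) c^2\, dt^2
       + \left(1 - \frac{r_s}{r}\right)^{-1} dr^2
       + r^2\, d\Omega^2 .
\]
% Outside the horizon (r > r_s) the dt^2 coefficient is negative, so t is
% the timelike coordinate. Inside (r < r_s) the factor (1 - r_s/r) changes
% sign: r becomes the timelike coordinate, and decreasing r is as
% unavoidable as moving into the future is outside.
```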

Comment The Binder of Doom (Score 5, Interesting) 198

In 1999 I was hired by a Midwestern telco -- in the interests of not getting sued I won't say which: I'll just say their market cap used to be in the billions and now you could buy them with the lint in your pocket -- to do security remediation on their billing system. I spent weeks poring over architectural diagrams, going through source code, examining protocols. After a while I realized I had some really scary information, so I asked my manager for a safe.

"Just put it all in a binder," she said. "We trust you to keep an eye on it."

The Binder of Doom was a nondescript black binder about three inches thick. It had no cover page and no markings: I didn't want anyone to realize the secrets that were in it. I carried it around with me everywhere. I slept with it in bed with me. That's how terrified I was these secrets would come out.

Then the Binder of Doom got worse. Having completed my survey, I now devised attacks on the system. I found ways enterprising individuals could fleece the company out of truly mind-boggling sums, and how difficult it would be to detect these attacks with the then-current security infrastructure. By the end of six months the Binder of Doom was stuffed to bursting and I was giving serious thought to filing for a concealed-carry permit. I wondered if the sheriff's department would understand if I told them I was routinely carrying around a binder with a *conservative* worth to a criminal syndicate of $100 million.

I went back to my manager. I told her I was done. It was time to remediate the risks. "Oh, excellent," she told me, "because we just ran out of money for the remediation."

Uh. What?

"Management has decided the main risk is in unsecured communications links, so just ensure we're using PGP on everything and we'll call it good."

I asked if she wanted the Binder of Doom.

"No, you hold onto it for a while."

So I became increasingly disgruntled, bitter, and sarcastic. I told everyone I worked with that I'd been retasked to "secure" our network using PGP -- old-school PGP 2.6 at that, not GnuPG (which had just reached 1.0) -- and oh God this is awful and if this company lasts another year it'll be a miracle and...

I was shortly thereafter cashiered for having a toxic attitude towards work. I walked into the parking lot, got into my car, and tossed the Binder of Doom into the passenger seat. As I drove away I realized something was horribly wrong, but didn't realize what until I was pulling out of the lot:

I HAD THE BINDER OF DOOM IN MY PASSENGER SEAT.

I returned to the office and tried to walk inside, but was met by an HR rep at the door who told me if I didn't leave they'd call the police and file a trespass charge. I held up the Binder of Doom to the HR rep. "Do you want this back?" I asked.

"No," she told me clearly. "Keep it. We just want you to leave."

I turned around, gobsmacked, and left the company holding detailed plans for how to embezzle $100 million or more... which the company had just thoughtfully delivered into the hands of a disgruntled former employee.

(And if you're wondering what I did with the Binder of Doom, it sat on my bookshelf for a few days tempting me before I threw it into an incinerator and threw the ashes into a strong wind.)

Comment And we want it this way! (Score 1) 156

More to the point: refusing to prosecute unless A or B is met is genuinely good for national security. If people know their mistakes are forgivable they're going to be much more inclined to cooperate with investigators to help seal the breach. If people think they're looking at 10-to-20 for their carelessness, they're far more likely to lawyer up.

Comment Re:Well, there goes the 4th Amendment again... (Score 1) 204

Go read that opinion again. (It's another Scalia one.)

In that case, the officer was (a) in a home and (b) did not have the homeowner's permission to take hold of anything. The home is what ramped the protections up to the max; the fact the homeowner did not consent to anything kept those protections in force.

It's much different from the driver of a car giving evidence directly to a cop. The protections were lesser, and the driver waived them.

Comment Re:Well, there goes the 4th Amendment again... (Score 1) 204

Please, go read the opinion again. Particularly read Scalia's opinion, where he lays out the reasons why an infrared camera is an illegal search of a home. It has to do with the fact the home is the bastion of the Fourth Amendment. There is literally nowhere that receives more Fourth Amendment protections than the home.

A set of blank cards, which someone voluntarily gives to the police, receives far less protection. If a cop asks me for a birthday card I'm holding, and I voluntarily hand it over, and the cop opens it up and finds I've tucked in a baggie containing a bump of cocaine, has the cop committed an illegal search? Under your logic, yes, since the bump wasn't in plain view.

But the plain view requirement does not apply when the police have lawful possession of the evidence!

Good grief, man. This is high-school civics class stuff.

But seriously, read Scalia's opinion.

Comment Put it in perspective. (Score 2) 204

Alice and Bob are driving down the road when they're pulled over by cops. Alice is driving. Bob gets arrested on an outstanding warrant. As Bob's getting out of the car, the cops see a black plastic bag underneath Bob's seat. They ask Alice about the bag. She says, "This? Oh, it's just oregano, officers. A lot of oregano. No, we don't have receipts for it, and, uh, we bought it at ... err, from some guy. But it's just oregano. See?", and gives it to the cop. The cop, upon opening the baggie, sees what looks like oregano. But the volume of the oregano is much more than you'd need for a pizza, so the cop figures it might be marijuana and decides to run a field test on it. Ultimately this field test is turned over to the State Police, which are able to conclusively say it's marijuana. Bob is now facing marijuana possession charges and complains his Fourth Amendment rights were violated.

That's exactly what happened here. The defendant was arrested on an outstanding warrant, the arresting officer asked what was in the bag, the driver gave the bag over and said he and the defendant bought 143 gift cards from "someone", but couldn't identify whom, nor provide any receipts, and their business plan was to "resell" these cards for a profit. Put all that together and it's on the same level as telling the cop your weed is oregano -- it's a lie that's completely transparent.

Since the cops were given the evidence, they did not seize it illegally. Since the cops had an incriminating statement from one of the participants, they had probable cause to check for illegality. Legal seizure plus probable cause equals go directly to jail, do not collect a $200 gift card.

This Slashdot headline is misleading to the point of being journalistic malpractice.

Comment Re:What liberal arts actually means (Score 1) 420

A BA in a science is a BS with the math and other difficult parts removed.

I said that was true for institutions which offer both. And even then, it's not that the math is removed -- it's that a couple of upper-level courses covering esoterica are removed to make room for a better grounding in the humanities.

My friends with BAs in math did the full gamut of differential and integral calculus, number theory, differential equations, analysis, linear algebra, statistics, and more. Even as a CompSci major I took differential and integral calculus, differential equations, and statistics.

Comment Re:What liberal arts actually means (Score 4, Insightful) 420

Liberal arts is rooted in theoretical nonsense...

I hold a B.A. in computer science from a fairly good private college. One of my best friends graduated with a triple-major B.A. in physics, mathematics, and computer science, from the same institution. Other close friends from undergrad received B.A. degrees in chemistry, biology, geology, environmental science, and botany.

In fact, my undergrad alma mater doesn't offer the B.Sc. degree at all.

In 20 years in the software industry, not once has anyone ever asked whether I hold a B.A. or a B.Sc. It's a total nonissue. Some institutions offer the B.A., some offer the B.Sc., some offer both but differentiate them on how many differential calculus classes you've taken.
