
Comment Re:To quote Einstein (Score 5, Insightful) 381

I think you're confusing feature creep with a comment that was meant to be about edge scenarios. Allowing someone to configure parameters that were never spec'ed to be configured is feature creep (gold plating, extra coding, call it what you will), and I agree it should be avoided, since it adds unnecessary (or not obviously necessary) "complexity".

Handling an edge condition that was implied but not explicit in a specification is what is typically meant by a "corner case", and that's not the same thing you described. Think of recognizing that the customer asked for something logically impossible (they want two data sets to reconcile, but the data turn out to be at incompatible cardinalities), or for something that, upon investigation while building the app, wasn't precise enough (they asked for their standard green, but their standard palette only includes red and blue).
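To make that first example concrete, here's a minimal, hypothetical sketch (the reconcile_by_key function and its data shapes are mine, not anything from a real spec): the happy path assumes a one-to-one pairing, and the corner case is noticing when the cardinalities don't actually support that.

```python
from collections import Counter

def reconcile_by_key(left, right, key):
    """Pair up rows from two data sets on `key`, assuming a one-to-one match."""
    left_counts = Counter(row[key] for row in left)
    right_counts = Counter(row[key] for row in right)

    # Corner case: the spec implied one-to-one, but the data disagrees.
    duplicates = sorted(k for k in set(left_counts) | set(right_counts)
                        if left_counts[k] > 1 or right_counts[k] > 1)
    if duplicates:
        raise ValueError(f"cannot reconcile one-to-one; duplicate keys: {duplicates}")

    right_by_key = {row[key]: row for row in right}
    return [(row, right_by_key.get(row[key])) for row in left]

# One-to-many data trips the check instead of silently producing bad pairs.
left = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
right = [{"id": 1, "amt": 10}, {"id": 1, "amt": 15}]
reconcile_by_key(left, right, "id")  # raises ValueError
```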

It's nearly impossible to specify all of those prior to coding, which is why the typical "waterfall" development techniques have fallen out of vogue. You're always going to learn things while coding, and this is one of the main contributors towards apparently unnecessary complexity. If I design version 1 of a program perfectly, and customers have new requirements for version 2, it's unlikely that the "simplest" implementation of version 1 will be the one that is most conducive to an upgrade. You end up with a choice between refactoring completely or sacrificing some efficiency and simplicity to graft the new features onto an otherwise good version 1.

I think Dr. Dobb's is nitpicking, though. There are definitely many ways to address, measure, or understand simplicity, and I agree that it should not be THE goal in and of itself. But the idea of making code easy to read, easy to understand in both the micro and the macro sense, and just generally "simpler" has many merits.

Comment Re:Who is "we"? (Score 1) 601

"We" is a tricky word, but ultimately I do find it ironic that it seems like more people are bent out of shape over the NSA/Snowden mess than were over the Patriot Act (although I have no numbers, I may be remembering history incorrectly). Many of "us" were against the explicit and well-publicized expanse of government surveillance powers and the erosion of privacy more than a decade ago that clearly precipitated the sorts of things the NSA has been doing since. Has been doing, quite possibly, LEGALLY, since, and because of things like the Patriot Act.

The fact that the NSA leaks brought more public discourse is probably a good thing, although I'm not sure I agree it was justified. But I think it was predictable given the post 9/11 tradeoffs ostensibly for safety over privacy, and I think "we" had a strong opportunity to tell the government not to go down this path long before Snowden's leaks came to light.

Comment Re:Bogus argument (Score 4, Insightful) 311

If you're worried about the lineage of a binary then you need to be able to build it yourself, or at least have it built by a trusted source... if you can't, then either there IS a problem with the source code you have, or you need to decide if the possible risk is worth the effort. If you can't get and review (or even rewrite) all the libraries and dependencies, then those components are always going to be black-boxes. Everyone has to decide if that's worth the risk or cost, and we could all benefit from an increase in transparency and a reduction in that risk -- I think that was the poster's original point.

The real problem is that there's quite a bit of recursion... can you trust the binaries even if you compiled them, if you used a compiler that came as a binary (or from Microsoft)? Very few people are going to have access to the complete ground-up builds required to be fully clean... you'd have to hand-write assembly "compilers" to bootstrap tools until you get truly useful compilers, then build all your software from that, using sources you can audit. Even then, you need to ensure the firmware and hardware are "trusted" in some way, and unless you're actually producing the hardware, none of these are likely options.
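As a small illustration of the "build it yourself and compare" idea, here's a minimal sketch (not tied to any particular toolchain; the file paths are just whatever you pass on the command line): if a build is reproducible, the binary you compiled locally and the binary you were handed should hash identically.

```python
import hashlib
import sys

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 so large binaries don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    local, upstream = sys.argv[1], sys.argv[2]
    a, b = sha256_of(local), sha256_of(upstream)
    print("local   :", a)
    print("upstream:", b)
    print("MATCH" if a == b else "MISMATCH: the binaries differ; investigate the build")
```

Of course, this only pushes the trust question back one level, which is exactly the recursion above: the hashing tool, the compiler, and the build environment all have to be trusted too.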

You COULD write a reverse compiler that's aware of the logic of the base compiler, and structure your code so that you can compile it, reverse it, and get back something comparable to what you put in, but the headache there would be enormous. And there are so many other ways to earn trust or force compliance -- network and data guards, backups, cross-validation, double-entry, or a myriad of other things depending on your needs.

It's a balance between paranoia and trust, or risk and reward. Given the number of people using software X with no real issue, a binary from a semi-trusted source is normally enough for me.

Comment OpenVMS and PDP's relationship... (Score 1) 336

"Not sure about the OpenVMS vs PDP comparison"... Since one is software and one is hardware, the confusion makes sense, but the comparison was basically valid:

The PDP-11 was 16-bit and gave way to the 32-bit VAX-11 (except, apparently, in nuclear power plants). The operating system developed to run the VAX-11 was VAX-11/VMS (later just VMS). As the OS matured and Digital Equipment Corp (DEC)'s hardware moved to the Alpha CPUs, VMS matured along with it, picked up the OpenVMS name, and was eventually ported by HP to Itanium.

While all of those hardware platforms ran multiple operating systems (early UNIX variants even ran on the PDP-11), VAX and VMS were tightly coupled in the minds of anyone who worked with them, and the PDP/VAX lineage was well established. In that way, comparing the lifecycle of the PDP-11 to OpenVMS makes a lot of sense (and, honestly, I'm very surprised by which lasted longer!).

Comment Re:So what is it? (Score 2) 166

It's a news aggregator... the progression of RSS/Atom readers, if you will. I like them -- I get most of my news through random other places (like Slashdot), but for things that might otherwise slip by, like rarely updated must-read blogs, I liked Google Reader. It's pretty easy to live without, but it did combine the ten or so sites I regularly hop around into one link, which was helpful.
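For anyone who hasn't used one, the core of an aggregator is genuinely simple. Here's a minimal, hypothetical sketch (the feed URLs are placeholders, it only handles plain RSS 2.0, and real feeds need far more robust parsing) that merges the items from several feeds into one list:

```python
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "https://example.com/blog/rss.xml",   # placeholder URLs; substitute your own feeds
    "https://example.org/news/rss.xml",
]

def fetch_items(url):
    """Yield (title, link) pairs for each <item> in a simple RSS 2.0 feed."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        root = ET.fromstring(resp.read())
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        yield title, link

if __name__ == "__main__":
    for url in FEEDS:
        for title, link in fetch_items(url):
            print(f"{title}\n    {link}")
```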

Comment Problem Solving Is Why (Score 1) 656

Parent got it right; it's about problem solving. I have dual math and CS degrees, and while most of the actual math escaped me decades ago (I couldn't solve half the diff-EQs or integrals now that I could in college), the practices and thought processes have (IMnsHO) made me a better programmer. Programming is about efficiency as much as or more than it is about knowing any specific language or being able to execute a particular task. Most important, I think, is the ability to have faith that your code is correct and complete... proofs in linear algebra and number theory were immensely helpful for that. Testing edge cases and knowing that your loops will terminate properly flex the same muscles as proofs by induction. I think of Pollard's rho more while doing database programming than I ever did in math classes, but I'm glad someone pointed it out to me there.
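Since Pollard's rho came up, here's a compact sketch of the factoring version (the classic f(x) = x^2 + c random walk; a toy example of my own, not production code). It's also a nice exercise in trusting that a loop terminates for the right mathematical reasons:

```python
import math
import random

def pollard_rho(n):
    """Return a non-trivial factor of a composite n (retries on unlucky choices of c)."""
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)

        def step(x):
            return (x * x + c) % n

        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = step(x)          # tortoise: one step
            y = step(step(y))    # hare: two steps
            d = math.gcd(abs(x - y), n)
        if d != n:               # d == n means the walk collapsed; retry with a new c
            return d

print(pollard_rho(8051))  # 8051 = 83 * 97, so this prints 83 or 97
```

The inner loop is Floyd's tortoise-and-hare cycle detection, and the expected running time comes from a birthday-paradox argument on the walk taken modulo a hidden factor.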

Math can also be directly applicable, depending on what you're going into. Visual and game programming is full of geometry and trigonometry. Artificial intelligence, big data, and data mining all require statistics, hashing algorithms, efficient tree traversal, and all sorts of things that span the boundary between CS and math. In the end, though, all of programming is just implementing algorithms, and all algorithms are just math problems. The two complement each other brilliantly.
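As a tiny example of the geometry/trig point, rotating a 2D point around the origin is one line of trigonometry that shows up constantly in graphics and game code (a generic sketch, not from any particular engine):

```python
import math

def rotate(x, y, degrees):
    """Rotate the point (x, y) counter-clockwise around the origin."""
    theta = math.radians(degrees)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return (x * cos_t - y * sin_t, x * sin_t + y * cos_t)

print(rotate(1.0, 0.0, 90))  # approximately (0.0, 1.0)
```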

Comment Commodore "Compute's Gazette" Magazines (Score 1) 623

My dad brought home a Commodore and I subscribed to Compute!'s Gazette (I think that's what it was called) -- a magazine with a lot of Commodore stuff in it. One thing they had was pages and pages of machine code that you had to type in, with no debugger or syntax to speak of. I learned a LOT from that, and from the built-in BASIC the machine had. The first thing I really remember programming to completeness was a Julia and Mandelbrot set generator... in Commodore BASIC. It was not fast; I could see the program drawing pretty much every pixel. Good times.
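That generator translates almost directly into a few lines of modern code. Here's a rough ASCII re-sketch of the same escape-time idea (nothing like the original BASIC, and the viewport constants are just ones that frame the set reasonably):

```python
def mandelbrot_ascii(width=78, height=24, max_iter=50):
    """Print a coarse ASCII rendering of the Mandelbrot set."""
    for row in range(height):
        line = []
        for col in range(width):
            # Map the character grid onto a window of the complex plane.
            c = complex(-2.0 + 3.0 * col / width, -1.2 + 2.4 * row / height)
            z = 0j
            for _ in range(max_iter):
                z = z * z + c
                if abs(z) > 2.0:     # escaped: definitely outside the set
                    line.append(" ")
                    break
            else:
                line.append("*")     # stayed bounded: inside (or near) the set
        print("".join(line))

mandelbrot_ascii()
```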

I ended up with a degree in computer science, but I'd say that was more an opportunity to practice than it really was how I "learned" to program. Algorithms and operating systems classes had some concepts that I hadn't run across, and every class was an opportunity to learn or find new snippets of knowledge. The formal things I learned, like Bresenham's circle algorithm and topological sorts, or anything from the Dragon Book, the volumes of Knuth (ah), or the numerical programming books, were important conceptually... good for writing efficient code, and great ways to avoid reinventing the wheel. But the only way to "learn" to code is to code.
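For the curious, Bresenham-style circle drawing is a good example of the "important conceptually" part. This is a sketch of the midpoint variant, using only integer arithmetic; the toy version just collects pixel coordinates rather than drawing anything:

```python
def midpoint_circle(cx, cy, r):
    """Return the integer pixel coordinates of a circle of radius r centered at (cx, cy)."""
    points = set()
    x, y = r, 0
    err = 1 - r
    while x >= y:
        # Each (x, y) pair yields eight symmetric points around the center.
        for dx, dy in ((x, y), (y, x), (-y, x), (-x, y),
                       (-x, -y), (-y, -x), (y, -x), (x, -y)):
            points.add((cx + dx, cy + dy))
        y += 1
        if err < 0:
            err += 2 * y + 1
        else:
            x -= 1
            err += 2 * (y - x) + 1
    return sorted(points)

print(midpoint_circle(0, 0, 3))
```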

Comment Re:What else is in the "industry"? (Score 1) 209

Military still seems "proprietary" to me. If they meant "commercial", I could see a difference. I also considered "embedded" or "firmware" style code that, while software, is more closely tied to a physical hardware implementation. All of those still seem either "proprietary" or "open source", though, and you're right (@stillnotelf) that these would raise rather than lower industry averages.

It could include things like JavaScript that's just out in the wild. If you were to pull the programmatic pieces out of one-off websites... things that were neither marketed nor sold, not really managed as software, just put out there... and count them, code quality would probably drop. I'm thinking of websites with funny animations or hand-coded scripts to do navigation or whatnot. These wouldn't be "open source" in the sense that there's no statement of open copyright, and they wouldn't be "proprietary" in the sense that no one is marketing the code or working to save, publish, or reuse it, and while copyright may exist, no one is really worried about protecting it (due to the one-off nature). Still, it's a weird statistic.

I wonder if they meant a "standard" as in a target or an accepted limit that is somewhat arbitrary rather than an "average" (which the article actually uses) of real-world code. This would make sense since the average they cite is exactly 1.

Comment Re:and all the children are above average (Score 2) 209

Wow, yeah, I posted an almost identical sentence myself. Eerie. (Although I didn't have a Wobegon reference... sorry.) But yeah, it seems like an odd sentiment. Internal-use software is still either "proprietary" or "open source"... isn't it? Good point, though: if someone calculated the bugs in my Excel macros as if they could be used for general-purpose computing, I'd be in sad shape. (ObNote: I use Excel macros as rarely as possible, and normally only at gunpoint.)

Comment Playing along with the ridiculousness... (Score 1) 629

I agree with most posters that the logic in the post is hugely flawed (predicting something about the future by arguing that we can't know enough to predict it is inane). But more constructively: our constant access to information hasn't sated our desire for more information. Information collection is driving the recent knowledge boom as much as, if not more than, ease of access. Besides, no matter how much time passes, if we haven't visited another world we won't have information about that world at our disposal. You have to collect information before you can use it... that's WHY further exploration will always be a goal (unless, you know, we obliterate ourselves somehow in the meantime, or find something more important in any given short term).

Privacy

Submission + - Giving away personal info byte by byte for "security"? 1

ZahrGnosis writes: "Through work, and bad planning, I've been subscribed to a lot of trade magazines. I don't generally hate them; most come electronically now, and some are actually useful. What I despise is the annual call to renew my "free" subscription and update my info. OK, giving away my work e-mail, title, and line-of-business info isn't that bad; it's practically good marketing. But at the end of each call the phone people say "in order to verify that we talked to you, can you tell us..." followed by something tiny and seemingly innocuous: the last digit of the month you were born in, or the first letter of the city you were born in. Today it was the country you were born in. I've started lying, of course, because no one needs to aggregate that sort of personal information byte by byte, but now I'm convinced that's exactly what they're doing. Does anyone have any details on this? Can someone confirm or deny that these seemingly innocuous "security" questions are being aggregated? Yes, I tried some google searches, but my fu seems weak in this area... or no one is talking about it yet. Bueller?"
