Comment You don't want to _know_ about the broken stuff (Score 1) 430

I did try to get the coding standard fixed.

Meanwhile, elsewhere in the code, in full compliance with the coding standards I found:

(1) Unconditional static down-casts from base types to derived classes despite the possibility of error-event classes being the return value. (E.g. classes A, A_failed, and B, where B and A_failed were derived from A, and then a key static_cast from A* to B* without any check for A_failed at all.)

(2) Shaving down (bit shifting and masking) pointers through a void* arg into four bytes (char) that were then pushed into a byte queue, where they were later popped off as four bytes and shifted back into a pointer of some type. (The "real time programmer" who came from a VxWorks background didn't believe in just making an array of void* and moving all the bytes at once, for whatever reason.) [Also broken because the A* to void* to B* three-way conversion isn't necessarily safe: it should be cast to A*, reinterpret_cast to void*, then reinterpret_cast back to A*, then dynamic_cast to B* to be safe and symmetric.]

(3) So many unsafe operations in the module call prototypes that I eventually just made my code "correct" (e.g. call-safe), then put a conversion layer that used the unsafe API in both directions, called that translation unit "unsafe.cc", and wrote lots of forwarding functions that spelled out why the calling convention was flirting with disaster, so that all the unsafe calls and unsafe casts were in one pile and in one place.

Item 3 was somewhat insurrectionist, because I wasn't allowed to get any of my criticisms acknowledged by, let alone past, my boss, whose "it worked when we tested the prototype code that one time" attitude kept things tightly broken.

So we had nicely regimented coding standards, but now I always look at the brand name of any medical equipment I see sitting next to a bed, because I know what the code base for one particular brand really looks like and how little they gave a rat's ass about doing things right (as opposed to doing things the way someone decided they should be done based on single test runs).

That guy who noticed that if we built buildings the way we build software, the first woodpecker to come along would destroy civilization? Yeah, he wasn't exactly wrong.

Comment Re:Ya to me sounds like "I'm special" syndrome (Score 1) 430

Treating all programmers as interchangeable morons who cannot be trusted to write code is a sign of managerial immaturity.

An outstanding programmer often knows when rules must be broken, just as an outstanding jazz musician knows when to use discord.

Now, just because the Dunning-Kruger effect causes programming noobs to assume they are masters deserving of liberty doesn't mean that the masters are a priori being immature.

Fault: there is too much baby in this bathwater. Get a sieve before proceeding. Session closed... 8-)

Comment And yet, you are wrong to "find it impossible..." (Score 1) 430

I have worked on projects that lost hundreds of millions of CPU cycles because the coding standards encoded individual ideals into runtime performance killers. (The example I have placed elsewhere: "must use getters/setters" plus "may not put function definitions inside class definitions" turns "class foo { int X; ... int getX() const { return X; } };" (which can be optimized down to a register load) into a not-at-all-optimizable far call from each point of use into foo.o (the object file), after a potential PIC (position independent code) fixup for a shared library.)

And this stupidity can waste a _lot_ of man hours. To get my part of the medical device that _that_ coding-standards bug was written under to run in acceptable time (e.g. to not overrun my CPU time budget in a freaking real-time heart monitoring app), I had to break the coding standard and put the getters/setters (or occasionally the plain "public" variable) into the class definition anyway, then check it into the version control system, then go through the "naughty programmer" output list and create a bug report for each such optimization and assign that bug to each "naughty naughty" message. Then the bug review team had to deal with those bugs. Then the code review team had to approve those optimizations.

Even with this only costing me a couple of hours on the one set of modules, when you consider the ten or twelve people the automation system then had to nag, and the hours _they_ lost, you get into a lot of wasted time overall.

Now add the part where every _other_ programmer silently followed the automatically enforced rules and ran over the time budget for their code (so the system was too slow), and all _their_ code had to be fixed once everybody noticed that _mine_ was not so plagued.

Then the cost of the project running late and eventually being determined to be "not worth the effort being expended" and getting canceled outright...

Well, truly hundreds of man hours and _many_ thousands of dollars were wasted on a project that was largely killed because all the programmers were muzzled into paralysis by a huge steaming pile of these sorts of pointless restrictions (many of which would have been good for a _class_ in programming but most of which were _toxic_ to a real project).

Well, you know, there are reasons that failed projects fail, and sometimes those reasons involve over-regimentation of the otherwise creative process of finding solutions and expressing those in code.

Comment Re:Standards are (_Not_ Usually) Good (nor bad) (Score 3, Insightful) 430

Standards, and the enforcement of same, are (usually) a symptom of the "interchangeable morons" school of management. That school presumes that all problems have a (Ayn Rand-ish) uniform solution that all _programmers_ will process identically.

A small number of "do not do"s with "unless you have good cause" _guidelines_ is reasonable, but something as firm as a "standard" is a great way to make your great programmers no better than your worst over time.

Standards often contain bugs themselves. Things that create a hidden cost on the programmer and the program that can bog both down.

examples:

Even Microsoft eventually abandoned their "standard" of encoding the variable type as dirt on the front of variable names, such as "lpintThingy", having plagued their code with Thingies that are no longer long pointers to integers and that cannot be globally searched and replaced without hazarding the destruction of other code.

Combined rule failure (use getters and setters + don't put member function definitions inside of class definitions => what would be a register load operation becomes an optimization resistant far call across translation units ** every dang time you set or read a scalar).

If you cannot trust your programmers to write good code then making them format it so it _looks_ like good code is a wasted effort.

If you cannot trust your great programmers to write great code, eventually they will stop even trying to do so, and you will be left with either a hassle-avoiding idiot or someone looking for a new job.

If you cannot trust your new programmers to understand your previous code, then your new programmers are probably inferior to your older coders.

If you are not winnowing out the _bad_ programmers via rational code review then your management is useless.

All but the most rudimentary coding guidelines are productivity and creativity and performance murderers.

Every company eventually realizes this, on and off, for a while, each time a management team ages into a job, and then forgets it again when they hire new managers.

Comment (In support) (Score 1) 430

Most "coding standard bugs" are hidden in a meta-level of reasoning, which makes them harder to untangle than actual crap code.

True Story: I was working at a medical equipment manufacturer writing C++. These two atomic rules, placed far away from one another in the standard, made a mess. See if you can spot it.

(A) No member function may be defined within the class definition, and instead must be defined in the translation unit for that class. [E.g. you have to put the member definitions in the .cpp file, not the .h file, so "class X { ... void foo() { /* implementation */ } };" is not legal.]

(B) Access to member data may only take place via "getter" and "setter" functions. [As opposed to putting the variables in the "public:" part of the class.]

Both harmless enough by themselves. But I opened a crap-ton of bugs on this issue, because the two rules taken together turned simple register load/store operations into unoptimizable far calls between translation units for each get/set operation. So I put my getters and setters in the class definitions like a sort-of sane person (I didn't try to force sanity on them completely by just making some of the trivial values public; I don't think they could have taken the strain) and, as required by the version control integration with the coding standards enforcement and bug tracking tools, I filed a request for exception for every single damn such usage and let them choke on their procedure.

But there was a reason that only _my_ code didn't run over its CPU time budget.

  A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. -- Emerson.

Comment Non-Whitespace standards can be very harmful. (Score 1) 430

True Story: Consider these two rules...

(1) Getters and setters must be used for all member variable access.

(2) No function may be defined within the class definition body, and must instead be defined in the corresponding translation unit. (In C++ terms, you have to put all your member functions in the .cpp file and not the .h file, etc.)

So now every get/set operation (e.g. a register load or store) is turned into a far (inter-module, cross-translation-unit, unoptimizable) call, with arguments, stack frame, etc., to a remote function to do the register load/store.

Create a variable with that Microsoft horror where you prefix the variable name with its type, "lpszFileName" (long pointer to zero-terminated string named FileName), and then change the underlying type after you have written all the code, so that some long pointer to int is now a long pointer to a long but still says int, or is now an opaque 16-bit value instead of a pointer at all, but the names in countless blocks of code still lie.

And as far as the whitespace thing, I have a unicode non-break space with your name on it, particularly if you write in Python.

Coding standards that are _dumb_ can be _incredibly_ _dumb_ in many hidden ways.

Comment Google should then provide signed certs (Score 3, Insightful) 299

This cut at the free flow of information, and the parent poster's allegation that the cost is trivial, suggest that if it were such a nothing, then Google should offer a means to comply without forcing people to go out and pay a third party.

If it's so cheap and such a nothing, then what's the problem with them providing what is needed to interact with their own service?

Comment In which world do preferences not matter again? (Score 1) 599

You will note that I said "warmer", not "better". Preferences vary, and people can tell the difference no matter what you choose to believe.

You know why there is artificial hiss added to VoIP? Because perfectly accurate digital silence is "not as good" as fake analog hiss when it comes to working with human perception.

See, we are analog beasts. We evolved in an analog world. And we _like_ analog. Part of analog is signal _loss_ through smoothing. How much of which features of sound an individual _likes_ is an _individual_ taste.

Accuracy is not always king, and "better" isn't a universal place. You keep using "better" to mean "more accurate", so you have a religious-grade opinion about someone else's subjective experience. That kind of makes you the dick pissing on other people's preferences in the name of an absolute.

So you say accuracy is better, and they say warmer signal is better. Why do you think you are the one who gets to choose for everyone?

Hubris, my young man, is its own punishment. That you are bothered enough by a subjective opinion in others to the degree that it is rant-worthy means that you are suffering your own little mania.

Comment Eh.... (Score 1) 735

The problem is that the U.S. has displaced Caveat Emptor with Caveat Vendor. We put everything on the seller nowadays. That is too much as well. The sweet spot is some more Caveat Everydamnone, with some enforcement all around. The Emptor is _not_ supposed to get a completely free ride in a rational society.

Sure, someone should be keeping the vendors in check. But the buyer is _supposed_ to beware as well.

Complaining after the fact is just lazy bullshit.

Comment tubes... (Score 1, Interesting) 599

Thing about tubes. I generally agree, but there is a warm thing about tubes that *is* better. Digital sampling vs. analog circuitry is an aurally distinguishable feature. In digital sampling there is no trending, no inertia, to the samples. Tubes provide a continuous representation of the analog waveform, where digital apparatus (transistors, or god forbid, digital media 8-) provide snapshot sampling. The harmonics of each are distinct, since the tubes will represent the interstitial times skipped by a digital medium.

That said... have I rushed out and bought a tube set? No. Do I care about the difference? Not really. Do I think this is the same as the vinyl question? Sort of. Do I care enough? No.

One thing that gets lost on most people is that what they don't perceive may still be perceptible to others.

I think most "audiophiles" have been duped. Monster Cable selling "gold plated HDMI cables to remove digital distortion" is complete and utter bullshit foisted on a fatuous public. On the other hand, I can and do hear a difference between continuously variable analog signals and digital signals in many settings. My ex was way more sensitive in the audio range. I can see the difference between motion blur and high frame rate and he cannot. (I have better eyes; he had better ears.)

Distinctions that you personally don't perceive are not _necessarily_ imperceptible to others. People vary.

How much that variance matters compared to a technology is a completely subjective question.

But yes, while I agree that most of the things are completely in people's heads, there are differences.

Don't be too dismissive. There is _some_ baby in that bathwater.

Comment U.S.A. U.S.A. !! (Score 1) 75

Here we go! Having broken _our_ system here in the USA, we always find a way to break other systems worse than our own.

It's so much easier than fixing our own problems.

Dimming innovation at home? Make sure it's freaking impossible in the lands of our competitors.

Now on to South America and Asia.

USA! USA!

Comment Add USB 3.0 in there too. (Score 1) 330

I do a lot of movement onto and off of compact flash media and such. I recently got a USB 3.0 card reader and woo-doggy is it faster.

Similarly, I would expect that paying the tiny extra sum for 3.0 drives would let you stack a couple of CD/DVD read/write devices onto your system a lot more efficiently.

You really can bump your head into the 2.0 data limits pretty easily at times.
