Comment: environments, team, & domain matter (Score 1) 232

by dakra137 (#45892881) Attached to: "Clinical Trials" For Programming Languages?
The execution environment matters if the application runs in one.

Execution size & time matter if execution will be long or done many times, or in a constrained environment.

Compile time matters if it only has to run once. That's why there was WATFIV.

If there are any, the standards of the team or organization that produce or consume the source code matter.

Readability and maintainability sometimes matter. I did a lot with APL, including enhancing a program where the flow chart was over 40 pages, but the program was so well structured, it was easy. In most personal programming cases, however, any complex function had to be written and tested in one day. After that, it was almost undecipherable, even by the author.

Verbosity matters to poor typists, and it gets in the way of understanding. Did the person who came up with "Blah blah = new Blah;" stutter? In what way is Groovy inferior to Java?

Domain really matters. Once I needed something to generate several drawings for a patent application. I considered generating SVG, and I considered macro capabilities in a CAD package. In less than a day, I learned MSLogo and made the diagrams with it. If I had to do it again, I might use Python turtle graphics, but it would have been more verbose.

Submission: Finally: Electronics & Media Accelerated Burn-in at Home

Submitted by dakra137
dakra137 (1590245) writes "Big Lots stores are now offering the ideal product for accelerated burn-in, or at least heat stress testing, of electronics and media such as DVDs, vinyl records, tape, floppies, Edison wax cylinders, etc.
Warning: This item appears to be unvented, so it is not suitable for media destruction.
This has to be one of the world's worst product ideas ever."


Comment: Banning what can be used in games of chance (Score 1) 238

by dakra137 (#44252653) Attached to: Florida Law May Accidentally Ban Computers and Smartphones
>> "any machine or device or system or network of devices" that can be used in games of chance.

Besides card tables, this also includes Candyland and other board games with spinners or dice. The entire game is a "system."

One way to get the governor voted out of office would be for the state to ban the dreidel.

This is also a way to prevent any more Wheel of Fortune roadtrips to Florida.

Comment: Don't trust important calculations to toys (Score 1) 211

by dakra137 (#44189189) Attached to: LibreOffice Calc Set To Get GPU Powered Boost From AMD
Are the results correct? Well, I've double-checked the formulae.

So what? If your computer doesn't use Error Correcting Code (ECC) memory, or at least parity memory, soft or hard errors may go unnoticed.

If you must use toy-class computers for important work:
Verify the data and formulae and calculate file hashes on a business-class computer, then:

for (i = 1; i < 3; i++) { transfer to toy[i]; verify the hashes; calculate; }
gather the results; compare;
if (same) { hurray; } else { try again; }
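The pseudocode above, fleshed out as a Python sketch. The function names, the hash choice, and the stand-in for the transfer step are all mine, not any real tooling:

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Hash the input so corruption in transit (or in memory) can be detected."""
    return hashlib.sha256(data).hexdigest()

def run_on_toys(data: bytes, calculate, n_toys: int = 2):
    """Run the same calculation on several non-ECC ('toy') machines and
    compare the results; a disagreement suggests a soft error somewhere."""
    reference = file_hash(data)
    results = []
    for i in range(n_toys):
        # the transfer to toy[i] would happen here; re-verify the hash on arrival
        assert file_hash(data) == reference, f"transfer to toy {i} corrupted the data"
        results.append(calculate(data))
    if all(r == results[0] for r in results):
        return results[0]  # hurray
    raise RuntimeError("results disagree; try again")
```

The point of the design is that a single soft error is vanishingly unlikely to corrupt two independent runs identically, so agreement is strong evidence the answer is clean.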

Comment: Don't be disrespectful of the elder languages (Score 1) 276

by dakra137 (#44117093) Attached to: Join COBOL's Next Generation


* COBOL isn't evil. It is good for its domain. It is very good for nested data structures.
* An example is given that goes beyond the typical.
* I claim not to be a COBOL bigot.
* I explain where COBOL programmers came from originally.
* There are some aspects of COBOL I like.

Don't be disrespectful of the elder languages just because their domain isn't what you work on.
COBOL programs are probably handling your payroll, bank, and investment accounts.


>> For those of us who are unfamiliar, could you describe what it is that COBOL does that other languages don't cover as well?
>> Pages and pages and pages of foreplay before you get any action.
Yes, there is stuff at the top that is largely documentation, followed by a FILE-CONTROL section with information about the files to be used, and then the DATA DIVISION. In many cases the DATA DIVISION, the part of the program that describes the variables, condition codes, and data structures, is larger than the procedural logic.

Most of the discussion here has been about the procedural code in COBOL programs, not the DATA DIVISION and its structures. One can accomplish very interesting things with the hierarchic data structures of COBOL and PL/I. The following example shows how to march through a variable length record containing instances of 2 kinds of variable length segments, each of which could appear many times. It leaves out some of the boring stuff.

01 LAYOUT-1.
        03 LL-1 PIC S9999 USAGE COMP-3.
        03 SEG-TYPE PIC XX.
                88 TYPEAA VALUE "AA".
                88 TYPEBB VALUE "BB".
        03 AASEGMENT.
                05 AA-FIRST-FIELD ....
                05 AA-SECOND-FIELD ....
        03 BBSEGMENT REDEFINES AASEGMENT.
                05 BB-FIRST-FIELD ....
                05 BB-SECOND-FIELD ....


open the file
perform until end of file
        read a record
        point to the first segment
        perform until past the end of the record
                when TYPEAA
                        blah blah blah
                when TYPEBB
                        blah blah blah
*               slide over to the next segment by adding LL-1 to the position
Admittedly, this is not your typical COBOL program, but stuff like this can be and is done with COBOL when necessary. My first real COBOL program read 7-track (6-bit characters + parity) tape where there were multiple variable-length records within a block. To process the records on an IBM mainframe, I had to deal with the fact that the lengths were 12 bits that had been read into two 8-bit bytes, so the length was really byte0 + 64*byte1. All this was doable in COBOL. Once I got the data out, I did the first level of analysis in FORTRAN and the fun stuff in APL.
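For illustration, here is the same marching technique sketched in Python. The exact segment layout (two length bytes combined as byte0 + 64*byte1, then a two-character type, then the payload, with the length covering the whole segment) is my assumption for the example, not the original tape format:

```python
def parse_segments(record: bytes):
    """Walk a record containing back-to-back variable-length segments
    of two types, each of which may appear many times."""
    pos = 0
    segments = []
    while pos < len(record):
        # 12-bit length read into two bytes: byte0 + 64*byte1
        ll = record[pos] + 64 * record[pos + 1]
        seg_type = record[pos + 2:pos + 4].decode("ascii")  # "AA" or "BB"
        payload = record[pos + 4:pos + ll]
        segments.append((seg_type, payload))
        pos += ll  # slide over to the next segment
    return segments
```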

I wouldn't try to create and select from the 4 dimensional outer product of 4 vectors in COBOL, and I wouldn't and didn't in FORTRAN, as long as I had access to APL.

When is COBOL a preferred choice?
* When the data structure definitions are many and many are subsidiary to one another.
* When it's the language the existing example is written in, and the example just needs some tweaking for a new purpose.
* When the auditors have to read it, as previously stated.
* When business users express, and want to be able to read, the business rules. (Although a lot of that is now going into rules engines which accept COBOL and crank out COBOL, Java, etc.)

Please don't accuse me of being a COBOL bigot. My first computer language was FORTRAN, and the next one I used a lot was APL.

>COBOL code isn't written by programmers, but by business admins - the accountants, management, etc who basically don't want to learn "code" but do something that feels natural to them.

Consider that the first several generations of business programmers were not computer science majors. There was no such thing. Once there was, they wouldn't go near COBOL. COBOL was the domain of business and later the MIS (Management Information Systems) majors. They would not know a truth table if they were stretched over one and tortured for violating DeMorgan's Laws.

Companies could not hire programmers; they had to train them. The applications were accounting and record keeping, so they hired accounting majors. (Companies hired engineering, statistics, math, physics, and music majors for technical computing.) The trainees wrote programs that followed the examples they were taught. Many companies have COBOL style guides that specify which language capabilities are permitted and which are not. They also specify standards for section and paragraph names. The result is that reading and understanding old code is relatively easy. They usually prohibit use of MOVE CORRESPONDING.

I like the ability to have loops with the test at the top (possibly 0 times) or the bottom (at least once through). I get that with COBOL and MS BASIC.
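For what it's worth, the difference can be sketched in Python, which natively has only the test-at-the-top form; the test-at-the-bottom loop has to be emulated:

```python
def count_with_test_at_top(items):
    """Test at the top: the body runs zero times if items is empty
    (the spirit of COBOL's PERFORM WITH TEST BEFORE)."""
    count = 0
    i = 0
    while i < len(items):
        count += 1
        i += 1
    return count

def count_with_test_at_bottom(items):
    """Test at the bottom: the body runs at least once, even on empty input
    (the spirit of COBOL's PERFORM WITH TEST AFTER, or BASIC's DO...LOOP WHILE)."""
    count = 0
    i = 0
    while True:
        count += 1
        i += 1
        if i >= len(items):
            break
    return count
```

The only observable difference is on empty input: the top-test loop reports 0 passes, the bottom-test loop reports 1.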

In the early '90s I designed and developed a framework for SOA-like interoperability across OS/2, OS/2 with CICS/OS/2, CICS/MVS, and IMS/VS. Programs could be in COBOL on all those platforms or C on OS/2 or CICS/OS/2. The COBOL client and service applications were portable. The framework was itself written in COBOL and portable among all of them. The compilers were from IBM for the mainframe and Micro Focus elsewhere.

Don't be disrespectful of the elder languages just because their domain isn't what you work on. COBOL programs are probably handling your payroll, bank, and investment accounts.

Comment: DIFF: Quantum mechanic vs. Auto mechanic (Score 1) 276

by dakra137 (#44114839) Attached to: Quantum-Tunneling Electrons Could Make Semiconductors Obsolete

Q: What's the difference between a quantum mechanic and an auto mechanic?

A: A quantum mechanic can get his car into the garage without opening the door.

Note: When I thought that joke up over 35 years ago, there was little risk that "Auto mechanic" would be misconstrued as referring to an autonomous programmed agent, except in Science Fiction.

Comment: Tweets no; presence yes. (Score 1) 358

I consider Facebook and Twitter to be personal-social, not business. LinkedIn is for business. It is where recruiters in many industries are looking for candidates. They troll profiles and the appropriate groups. They also post jobs in the groups. My LinkedIn profile is unusually thorough, with dozens of project entries, and as a result it produces very much on-target contacts from recruiters.

Somebody from Nigeria asked me to mentor him towards certification in something. I was concerned that maybe this would turn into a scam or my electronic interactions might end up with an inbound malware payload. By googling him I found his interactions and comments on other people's blogs and emedia columns going back several years. The name, id match, and content gave me the confidence that he was both bona fide and an experienced practitioner.

I am sure there are blogs, columns, help sites, and discussion groups in your field. Participate. You'll get ideas and contacts. Of course, if your industry doesn't publish because it is covert, for good or for bad, that is another story. ;-)

Comment: What I use (Score 1) 656

by dakra137 (#43886761) Attached to: Ask Slashdot: How Important Is Advanced Math In a CS Degree?

I do system modeling, architecture and design, capacity planning, design for high availability, high capacity, high performance, and continuous operations, and occasional programming. I am not in a research organization. I am in the world of commercial IT. Most of my career has been in technical sales.

I use simple algebra all the time. Workload growth over time implies exponentiation. In the last few years I have used logs and even quadratic equations in system analysis and modeling.

I use Boolean logic (AND, OR, NOT, XOR, IMPLIES, DeMorgan's Laws, etc.) to construct, understand, and debug others' complex if statements and hierarchies of them.
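A quick Python sanity check of De Morgan's laws, for anyone who wants to see them hold over every combination of truth values:

```python
def demorgan_holds(a: bool, b: bool) -> bool:
    """Check both of De Morgan's laws for one pair of truth values:
    NOT(a AND b) == (NOT a) OR (NOT b), and
    NOT(a OR b)  == (NOT a) AND (NOT b)."""
    law1 = (not (a and b)) == ((not a) or (not b))
    law2 = (not (a or b)) == ((not a) and (not b))
    return law1 and law2
```

This is exactly the kind of rewrite that untangles a deeply nested if: push the NOT inward and flip the connective.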

I use probability to understand variations in the workload coming in to servers, their business (busy-ness), and how to combine multiple workloads together. Some of Excel's probability functions have an unnecessarily limited domain; I had to substitute alternative versions in VBA that added and subtracted the logs of the components and then exponentiated to return the results.
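Here is the log trick sketched in Python rather than VBA. The binomial PMF is my example of a function whose naive form overflows for large inputs; the idea is the same as the comment describes: add and subtract logs of the components, then exponentiate at the end.

```python
import math

def binom_pmf_logspace(k: int, n: int, p: float) -> float:
    """Compute C(n,k) * p^k * (1-p)^(n-k) entirely in log space, so huge n
    and tiny p don't overflow or underflow the intermediate factorials."""
    log_coeff = math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
    log_prob = log_coeff + k * math.log(p) + (n - k) * math.log1p(-p)
    return math.exp(log_prob)
```

The naive `math.comb(n, k) * p**k * (1-p)**(n-k)` dies on n in the millions; the log-space version stays well inside floating-point range.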

When considering the capacity and responsiveness of servers (machines and programs), I use queuing theory and consider where things get buffered and queued along the way and at the layers between clients and servers.

I use graph theory to understand network flows, linkages between relational tables, multiply linked lists, and graph databases.

When modeling communications links or systems availability, with many components that have very low error or failure rates, I do binomial expansions and sometimes Maclaurin or Taylor series.
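A minimal Python illustration of why the first terms of the binomial expansion suffice when failure rates are tiny (the numbers here are made up): (1-p)^n = 1 - np + C(n,2)p^2 - ..., and with p small the quadratic term is already negligible.

```python
def availability_exact(p: float, n: int) -> float:
    """Probability that all n independent components are up,
    each failing independently with probability p."""
    return (1.0 - p) ** n

def availability_first_order(p: float, n: int) -> float:
    """First two terms of the binomial expansion of (1-p)^n: 1 - n*p."""
    return 1.0 - n * p
```

With p = 1e-6 and n = 100, the two agree to within about 5e-9, which is why hand expansion is usually good enough for back-of-the-envelope availability work.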

I do most of this in spreadsheets, but sometimes the provided functions are simplistically written, so they cannot handle combinations of large and small parameters. Then I have to substitute a better algorithm.

Some people are willing to zero in on an optimum of something via successive approximation, but when a function is readily differentiable and its derivative is easy to solve for a zero, isn't that embarrassing?

I have created and navigated through a four dimensional array A[i,j,k,l] only once, to deal with the consequences of combining four independent discrete probability distributions of requirements for a type of resource.
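Here is roughly how that combination works, sketched in Python. The function walks the full outer product of the input distributions (the 4-D array A[i,j,k,l] for four inputs) and accumulates the distribution of the total requirement; the coin-flip distributions in the usage note are my toy data:

```python
from itertools import product

def combine_requirements(dists):
    """Combine independent discrete distributions of requirements for a
    resource. Each dist maps requirement -> probability; the result is the
    distribution of the total requirement across all of them."""
    total = {}
    for combo in product(*(d.items() for d in dists)):
        need = sum(req for req, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p  # independence: probabilities multiply
        total[need] = total.get(need, 0.0) + prob
    return total
```

For two fair coin-like distributions {0: 0.5, 1: 0.5}, this returns {0: 0.25, 1: 0.5, 2: 0.25}, and it generalizes unchanged to four inputs.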

Comment: civilian and military uses (Score 1) 244

by dakra137 (#43484349) Attached to: Researchers Report Super-Powered Battery Breakthrough
This research was funded in part by the US Air Force. If this technology can really be tweaked to out-discharge supercapacitors at similar or better energy densities, I am surprised it wasn't declared a military secret.

On the civilian side, I am sure many hearing aid users will look forward to recharging their device, maybe by induction while wearing it, rather than consuming zinc-air batteries.

Comment: Error Correcting Code (ECC) memory (Score 1) 591

by dakra137 (#43367971) Attached to: If I could change what's "typical" about typical laptops ...
People expect a computer to run a program correctly, fail with an error code, or crash. But what if that's not so? What happens when errors can just slip by, producing erroneous results? Sometimes it doesn't matter. In business and industry, it often does. Even tablets are increasingly used in critical health care.

Except for "workstation class" desktops, today's desktops and laptops have no error detection for soft memory errors, which are mostly caused by cosmic rays, alpha particles from the plastic surrounding the chip, and other causes. These code and data changes persist through memory refresh cycles until a new value is stored into memory by the application, operating system, or an input device. This isn't a problem for servers, since they typically have ECC (Error Correcting Code) memory, which detects and corrects these errors.

In 1981, IBM introduced "parity memory" to the world of personal computing, turning the personal computer into a real business machine. When there was a memory error, the original IBM PC simply stopped running, producing no output rather than false output.

Memory was expensive, ~$1K/MB ca. 1988, and dropping parity would save ~11%. Consumers and then businesses started buying PC clones without parity memory from Dell, Compaq, and others. Unless the changed memory causes a program or system crash, these computers give no indication of a soft error. They just deliver erroneous results. This demonstrated that most businesses do not value accuracy in desktop computing enough to pay for it. Being "market driven," even IBM stopped using the more expensive parity memory in desktop computers. The executives of both the PC and chip divisions said the market had made a bad choice.

Even though ECC memory is inexpensive nowadays, most non-server CPUs, chipsets, and motherboards do not support it.

How pervasive is the problem? One trade rule of thumb is to expect one soft error event per gigabyte-month at sea level. The higher a computer is above sea level, the more likely it is that a cosmic ray will flip some bits. As chip technology advances and geometries shrink, each cosmic ray changes even more bits. Google's 2009 paper "DRAM Errors in the Wild: A Large-Scale Field Study" indicated the problem is an even worse combination of hard and soft errors. Some computers are set to do a memory test whenever they are powered on, unless recovering from hibernation. As OS's have improved, the mean time between Power-On-Resets (POR) has increased. As memory size has increased, the factory default setting has become not to do a memory test even at POR. It takes too long.
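The rule of thumb is just multiplication; a trivial Python sketch, with the default rate of one event per gigabyte-month at sea level as quoted above (adjust it upward for altitude):

```python
def expected_soft_errors(gigabytes: float, months: float,
                         rate_per_gb_month: float = 1.0) -> float:
    """Expected number of soft-error events, using the trade rule of thumb
    of one event per gigabyte-month at sea level."""
    return gigabytes * months * rate_per_gb_month
```

So a 16 GB laptop left running for a month can expect on the order of 16 soft-error events, none of which it will report without ECC.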

Using and saving a complex spreadsheet at 35,000 feet is not a smart thing to do.

I highly recommend that we all use machines with ECC for critical work. Fork, and do not re-accept into your mainstream source, executables and critical content that have been saved on non-ECC systems.

Feasible? Not yet. Make this a cause within the blogosphere, your company, vendors, Intel/AMD, and the trade press?

Maybe. Until then, caveat emptor, and caveat computor.
