
Comment Re:Hopper? No. She's the grandmother of COBOL (Score 1) 339

Grace Hopper did not invent COBOL

COBOL was ultimately designed by a committee, but Grace's early compilers had a lot of influence on the language design.

The military and other organizations found it difficult to build financial, logistics, and administrative software for multiple machines that each spoke a different language, and thus formed a joint committee to create a common language. (Science and research institutions already had FORTRAN for cross-compatibility.)

Basically, the COBOL committee soured into disagreement. As the deadline for the first specification approached, the committee fractured into a "git-er-done" tribe and a "do-it-right" tribe. The git-er-done tribe essentially cloned Grace's language (FLOW-MATIC) with some minor changes and additions, because they knew they didn't have time to reinvent the wheel. Grace's language was road-tested.

As the deadline arrived, the git-er-done group was the only tribe with something close to ready, and so that's the work the committee heads ultimately submitted. There were a lot of complaints about it, but the heads wanted to submit something rather than fail outright. (The story varies depending on who tells it.)

Later versions of COBOL borrowed ideas from other languages and report-writing tools, but the root still closely mirrored Grace's language. Therefore, it could be said that Grace Hopper's work had a large influence on COBOL.

(It's somewhat similar to the "worse is better" story between Unix/C and a Lisp-based OS: http://dreamsongs.com/WorseIsB... )

- - - - - - -

As far as what orgs should do about existing COBOL apps, it's not realistic to rewrite them all from scratch, at least not all at once. That would be a long and expensive endeavor. It's probably cheaper to pay the higher wages for COBOL experts while gradually converting sub-systems as time goes on.

However, whatever you pick as the replacement language could face the same problem. There's no guarantee that Java, C#, Python, etc. will be common in the future. Rewriting COBOL into Java could simply be trading one dead language for another.

I know shops that replaced COBOL "green screen" apps with Java applets. Now Java applets are "legacy" too, and a pain to keep Java updated for.

Predicting the future in IT is a dead-man's game. At least the COBOL zombie has shown staying power. If you pick a different zombie, you are gambling even more than staying with the COBOL zombie.

If it ain't broke, don't fix it. If it's half-broke, fix it gradually.

Comment Re:The Answer Comes Around 1am (Score 1) 138

There is indeed more social pressure on men to be the breadwinners, similar to how women are pressured to look attractive. Thus we'd expect young men to work harder and longer to try to get the promotions. If you are pressured by society to do X, you are more likely to do X.

It may not be "fair", but that's society as-is. A quota system doesn't factor this in.

Comment Human Nature [Re:Company's Fault] (Score 1) 138

perhaps companies ARE mistreating women and minorities which WOULD make it the company's fault

The company can't force employees to like someone. If there are known incidents, they can perhaps do something, but most "mistreatment" is subtle and/or unrecorded. The organization cannot micromanage social encounters at that level.

In general, many people are tribal jerks. I've had white colleagues who told of being mistreated when they worked in a uniformly non-white group, such as all Asian. The "minority" in the room is often targeted. Sometimes it's driven by resentment of "white culture" discriminating against them in general; they channel that frustration onto an individual who happens to be white.

I'm not sure how to fix this because it's probably fundamental to human nature. Mass nagging about "being good" only goes so far. If you over-nag, people often do the opposite as a protest to the nagging. (Is the word "nagging" sexist?)

Comment Re:High-brow fails [Re:It depends on the use] (Score 1) 416

forcing yourself into a pure functional style means that your code can run anywhere because it doesn't care about the context in which it runs.

Most planners focus on the current needs, not future needs. Whether that's rational (good business planning) is another matter. It's generally cheaper to hire and get future maintenance on procedural code. If and when machine performance issues override that, the planners may THEN care.

It depends on the project. If most of the processing for an app happens on servers in cloud centers, then it's probably already parallelizing user sessions. Parallelizing at the app level thus won't really give you much more parallelism potential, especially if most of the data chomping is done on the database, which should already be using parallelism. Web servers and databases already take advantage of parallelism, so parallelizing the app level yields diminishing returns. If the code is looping over 10,000 items in app code itself, the coder is probably doing something wrong.
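
As a minimal sketch of that division of labor (table and column names are made up for illustration): instead of looping over rows in app code, hand the aggregation to the database engine, which is free to parallelize under the hood however it likes.

    import sqlite3

    # Hypothetical schema and data, for illustration only.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("east", 10.0), ("west", 25.0), ("east", 5.0)])

    # Anti-pattern: haul every row into app code and loop over it.
    total = 0.0
    for (amount,) in conn.execute(
            "SELECT amount FROM orders WHERE region = 'east'"):
        total += amount

    # Better: one declarative query; the engine does the chomping
    # (and any under-the-hood parallelism) itself.
    (total,) = conn.execute(
        "SELECT SUM(amount) FROM orders WHERE region = 'east'").fetchone()
    print(total)  # 15.0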

Comment Re:FP becomes more popular than OOP? (Score 1) 416

The industry learned the hard way that OOP works well for some things but not others. For example, OOP has mostly failed in domain modeling (modeling real-world "nouns") but is pretty good at packaging API's for computing and networking services.

The industry may also learn the hard way where FP does well and where it chokes. Some understandably don't want to be the guinea pigs.

Comment Re:Functional Programming Considered Harmful (Score 1) 416

If you work on Chevies when everyone else is working on Fords, then you may have difficulty finding mechanics for your customized Chevies. I've yet to see evidence FP is objectively "better" in enough circumstances and niches to justify narrowing staffing options except for certain specialties. (I've complained about lack of practical demonstrations elsewhere on this story.)

Comment Re:High-brow fails [Re:It depends on the use] (Score 1) 416

Addendum and corrections:

The "actor model" seems pretty close to event driven programming. I don't know the official or exact difference. But my key point is that the event handling programming and interface is procedural. The only non-procedural aspect may be that requests for further actions may need a priority value (rank) and to be submitted to a request queue. For example, a game character may request a "shoot arrow" event on their part as a follow-up. But the event handler writer doesn't have to concern themselves with the direct management of the event-request-queue.

"Any more than...query writers...don't have to know..." should be "Any more than...query writers...have to know...". Remove "don't"

Comment Re:High-brow fails [Re:It depends on the use] (Score 1) 416

is more important now because of the trend towards more and more cores

But the bottleneck is not the CPU itself for a good many applications. And specialized languages or sub-languages can handle much of the parallelism. If I ask a database to do a sort, it may use parallelism under the hood, but I don't have to micromanage that in most cases: I don't care if the sort algorithm uses FP or gerbils on treadmills. It's similar with 3D rendering: the designer submits a 3D model with polygons and texture maps, and a rendering engine micromanages the parallelism under the hood. That "root engine" may indeed use FP, but the model maker doesn't have to know or care.
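
For illustration, a hypothetical sort routine (names invented) that parallelizes internally while callers see an ordinary sort call:

    import heapq
    from multiprocessing import Pool

    def _sort_chunk(chunk):
        return sorted(chunk)

    def parallel_sort(data, workers=4):
        if len(data) < 10000:  # parallelism isn't worth it for small inputs
            return sorted(data)
        size = (len(data) + workers - 1) // workers
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        with Pool(workers) as pool:          # sort the chunks in parallel
            runs = pool.map(_sort_chunk, chunks)
        return list(heapq.merge(*runs))      # merge the sorted runs

    if __name__ == "__main__":
        # The caller neither knows nor cares what's under the hood:
        # FP, threads, or gerbils on treadmills.
        print(parallel_sort([5, 3, 8, 1, 4]))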

And event-driven programming can hide the fact that parallelism is going on, or at least provide a non-FP interface. For example, in a game, a character may be programmed by how it responds to various events. There may be 10 events going on at once, but the app dev is only focusing on one at a time per character or character type. Granted, it may take better languages or API's to abstract parallelism well. The "root engines" may make heavy use of FP, such as the database, rendering, and event-handling engines, but the 2nd level, typically called "application developers", probably won't need that any more than SQL query writers (usually) have to know how the database engine was written. But only time will tell...

Comment Plane Truth [Re:Why??] (Score 1) 228

who invented flight thing over again, they will just keep on redefining what flight is until they are first.

Not comparable to this situation per sister message, but as far as the first manned plane flight goes, the definition matters because it was relatively trivial to attach a motor to a propeller, bolt it onto a thing with wings, and lunge skyward for a short period of time. After all, gliders, such as hang-gliders, were already common by then.

One could argue it was really an evolution, but the Wright Brothers were way ahead of the others in terms of control for several years regardless of who made the first lunge into the air. They were doing figure-8's when others could barely turn.

They finally lost that distinction when others moved the "tail" to the back instead of the front and perfected it, which made planes safer.

Comment Re:Why?? (Score 1) 228

Why are Americans so desperate to prove that everything happened there first...[such as] who invented flight...

What is "first" in this bone case in terms of competitor nations? I don't see any relationship between that and planes. If the claim were that Americans reached America before Europeans did, then it may be comparable, but then its meaningless. I-dont-geddit

Comment Re:I Have No Trouble Making Accurate and Precise.. (Score 1) 216

I've done many project plans for clients, and when I give them the results, they always bitch. But, when the project is actually delivered, they finally agree that I was right in the first place. After that, it gets easier...[with] THOSE clients.

Indeed. Many PHB's have to learn the hard way:

"Who knew healthcare would be so difficult?"
     

Comment Re:No, but they require planning (Score 3, Insightful) 216

The product was finished as described in the requirement documents, but generally didn't work 100% like the customer expected.

Yip, it's generally easy to make an estimate for a clear specification. But customers rarely know what they REALLY want until they see something in production. This is a very common problem, and I don't know any easy solution: mind-reading machines don't exist.

One partial solution is for the technical analyst to become a domain expert first, but obviously this is often not practical. Further, sometimes the main customer/manager wants something rather odd that is a quirk of their personality: you may build something that fits the domain, but they want to see their domain in a peculiar, quirky way.

Another partial solution is "RAD": rapid application development tools. Someone who knows the tool well can usually spit out something pretty quickly and change it fairly quickly.

However, RAD tools are not known for being flexible in the longer run, such as when UI styles and expectations change. They achieve RAD in part by marrying business logic to the UI. This marriage means less "marshaling" code between the database, biz logic, and the UI, BUT it also hard-wires the app to UI assumptions. Keeping up with the UI Kardashians can be a major PITA: just when GUI's were maturing, the web came along; just when the web was maturing, multi-device-handling needs came along. The current "in" thing is going to be klunky because it's not mature yet.
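
As a tiny, hypothetical sketch of that trade-off (all names invented): the RAD-style version wires a pricing rule into the screen code, while the separated version keeps the rule in its own layer so only the thin UI code has to chase each new style.

    # Coupled (RAD-style): the discount rule lives inside the screen
    # code, so a UI overhaul means rewriting business logic too.
    def render_invoice_screen(order):
        discount = 0.1 if order["total"] > 100 else 0.0
        print("<b>Total: %.2f</b>" % (order["total"] * (1 - discount)))

    # Separated: the rule sits in its own layer; when GUIs give way to
    # web and web to mobile, only the marshaling/UI code changes.
    def discounted_total(order):
        discount = 0.1 if order["total"] > 100 else 0.0
        return order["total"] * (1 - discount)

    def render_invoice_html(order):
        print("<b>Total: %.2f</b>" % discounted_total(order))

    render_invoice_screen({"total": 150.0})  # coupled version
    render_invoice_html({"total": 150.0})    # separated version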

For now, it looks like we are stuck with some degree of organic meandering to get something the customer is actually happy with; but organic meandering takes more time and money and is hard to estimate up front.
