
Comment Re:And JCL and System 360 Mainframe HLA and CICS? (Score 1) 372

Obviously a disgruntled and inexperienced mainframe user - sorry :-(

And yes, for a customer, I rewrote (from a hex dump) an old ASM routine that they'd lost the source for - simply so I could make it 64bit safe.

However, one does not need to go through CICS to compile a COBOL program - though your shop may have created a CICS transaction to do compiles.

As for IDEs, if you don't have ISPF and plugins for most of that, complain to your management! My first IDE was emacs (BSD 4.2) with functions to compile and point me at all the compiler messages.

As far as Java/AJAX/REST/XML/... yes, there are still some rough edges - after all, these are, by and large, bolted onto very old (and stable) code.

Comment Re:BUGS (Score 1) 372

Program by contract, anyone ?

That was supposed to be Ada but, at the time, compiler technology was not up to snuff - so things like:
* Loop pre-conditions
* Loop invariants
* Loop post-conditions

just were not validated.
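To make that concrete - a minimal C sketch (a hypothetical example of mine, not from any Ada compiler) of hand-checking those loop conditions with assert(), which is about all most languages of that era gave you:

```c
#include <assert.h>

/* Hypothetical example: binary search with the loop conditions above
   written out by hand.  In Ada these were meant to be validated by the
   compiler; in C you only get runtime checks via assert(). */
int bsearch_int(const int *a, int n, int key)
{
    int lo = 0, hi = n;                  /* pre-condition: a[0..n) is sorted */

    while (lo < hi) {
        assert(0 <= lo && hi <= n);      /* loop invariant: bounds still hold */
        int mid = lo + (hi - lo) / 2;
        if (a[mid] < key)
            lo = mid + 1;
        else
            hi = mid;
    }

    assert(lo == hi);                    /* post-condition: the range collapsed */
    return (lo < n && a[lo] == key) ? lo : -1;
}
```

The difference, of course, is that asserts vanish under NDEBUG and prove nothing; a contract-checking compiler was supposed to validate these statically.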

But yeah, you obviously have never tried to reconstruct the logic of a 65K+ line long COBOL program :-(

Comment Re:If it ain't broke don't fix it. (Score 1) 372

They are learning COBOL


The problem is the language is COBOL. Programmers should learn hexadecimal and binary (machine-level code) and then go into application-layer programming from there, but that is neither cool nor trendy.

I learned Assembly fairly early on, and I would agree that, once you know how the machine operates, it is much easier to express what you want in any higher-level language (neglecting things like synchronization points) - even OO constructs often boil down to simple things, like calling through a pointer. I still write vastly more C than C++ code.
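For instance, here is a hypothetical C sketch (names are mine, purely for illustration) of the "calling through a pointer" that a virtual method dispatch boils down to:

```c
/* Hypothetical sketch of what an OO virtual call reduces to at the
   machine level: a function pointer stored in the object, and a call
   site that just dispatches through it. */
struct shape {
    double (*area)(const struct shape *self);   /* one "vtable" slot */
    double w, h;
};

static double rect_area(const struct shape *s) { return s->w * s->h; }
static double tri_area(const struct shape *s)  { return s->w * s->h / 2.0; }

/* "polymorphic" code: it never needs to know which shape it was handed */
static double total_area(const struct shape *shapes[], int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += shapes[i]->area(shapes[i]);      /* call through the pointer */
    return sum;
}
```

A real C++ compiler adds a shared vtable per class rather than a pointer per object, but the generated call is the same indirect jump.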

I have read more hex dumps than many extant programmers (long before IPCS), and can still read many opcodes (ld, st, ...) - but I would not recommend starting at that level. Assembly and/or (restricted) C code fulfill the requisite learning objectives nicely enough.

However, this is not a path to a feature-rich resume (lacking in super models).

AFAIK there are still no viruses on MVS & VM systems. TPF and CICS still function wonderfully; there is just a huge price to get into the game with a mainframe system vs. PCs/minicomputers.

Well, when was the last time you spent some quality time perusing the microfiche PL/X listings for MVS (I doubt they still exist for z/OS) ?
The barrier for entry is quite high here... for the nonce !

Joe Random Hacker isn't going to easily get access to a z/OS (or z/VM) image to start inspecting object files for issues.

Comment Re:Translate COBOL to other languages? (Score 1) 372

> I've had to resort to using FORTRAN to C converters several times, mainly to turn code written by scientists into something useful in a product. It was ALWAYS worth the effort!

Really ? Do you not understand the largest difference between FORTRAN and all other current languages ?

In many applications, the difference may not be noticeable, but in many mathematical cores, the transformation is fatal !

FORTRAN stores arrays in column-major order, whereas nearly everyone else stores them in row-major order.

This difference can be the difference betwixt high rates of page/cache/TLB misses, and running/scaling smoothly.
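A hypothetical C illustration of why the loop order matters once you cross that boundary - both functions compute the same sum, but the second one walks the array the way a naive FORTRAN transliteration would:

```c
#include <stddef.h>

#define N 1024

/* C is row-major: a[i][j] and a[i][j+1] are adjacent in memory. */
static double a[N][N];

/* Inner loop varies the LAST index: stride-1 access, cache-friendly in C. */
double sum_row_major(void)
{
    double s = 0.0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            s += a[i][j];
    return s;
}

/* Naively transliterated FORTRAN loop order: each access jumps
   N * sizeof(double) bytes, thrashing cache and TLB in C - while the
   very same order is the fast one in column-major FORTRAN. */
double sum_column_major(void)
{
    double s = 0.0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            s += a[i][j];
    return s;
}
```

A mechanical FORTRAN-to-C converter that does not swap the loop nests (or transpose the arrays) silently turns the fast version into the slow one.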

Comment Re:My brother and I were talking about this (Score 1) 372

Patently false... Last I checked there were still more lines of COBOL in business and government than all the other languages, combined.

I'll grant the fact that newcomers don't want to get their hands dirty... they want something that enhances the resume for the next job... something we see with all the 'legacy languages'.

I was in the vanguard during the stupid Y2K false apocalypse - writing supervisor-level code to track storage from a date field all the way through a program, in order to generate a report saying 'This field is a date, or is based upon a date value'.

I don't plan on updating that code for the Y2038 issue, but I fully expect COBOL & PL/I to still be in wide use at that time.

Comment Re:It will probably never die (Score 1) 372

Full agreement, on most all accounts !

As noted above (AC, sorry - fscked up), modern COBOL has many of the flaws of other major languages (even pointers, &deity forbid).

The control flow of old COBOL programs is horrendous (I've seen > 65K line, monolithic programs), where a C++ guy would do a metric crap-tonne of 20+ line functions.

COBOL PERFORM flow is handled by a (5, maybe 4) way return mechanism - an optimization under patent. It drives optimizer folk bat-shit crazy (as if they weren't already there) :-)

I have also seen an inordinate amount of old and new programmers who fail to grasp the power of the COBOL I/O system - and pay dearly in runtime costs because they chose the wrong file organization...

Comment Re:What is needed.. (Score 1) 372

Really, have we learned nothing from ANDF ?

I actually started down this path, but got side tracked by earning a living wage.

The idea was to take a PL/I and Ada basis as an IL, and provide language-based decoders/encoders - so one could achieve idiomatic code in any source->source translation.

Interesting, yes; lucrative - kinda... There are companies that make their living doing this... but the bugaboo is in the details, especially with PL/I derivatives and the macro processor.


What's Happening As The University of California Tries To Outsource IT Jobs To India ( 483

Long-time Slashdot reader Nova Express shares an epic column by Pulitzer Prize-winning journalist Michael Hiltzik. It details what's happening now as the University of California tries to outsource dozens of IT jobs -- about 20% of their IT workforce -- by February 28th. Some of the highlights:
  • The CEO of UCSF's Medical Center says he expects their security to be at least as good as it is now, but acknowledges "there are no guarantees."
  • Nine workers have filed a complaint with the state's Department of Fair Employment and Housing arguing they're facing discrimination.
  • California Senator Feinstein is already complaining that the university is tapping $8.5 billion in federal funding "to replace Californian IT workers with foreign workers or labor performed abroad."
  • Representative Zoe Lofgren (from a district in Silicon Valley) is arguing that the university "is training software engineers at the same time they're outsourcing their own software engineers. What message are they sending their own students?"
  • 57-year-old sys-admin Kurt Ho says his replacement spent just two days with him, then "told me he would go back to India and train his team, and would be sending me emails with questions."
  • The university's actions will ultimately lower their annual $5.83 billion budget by just 0.1%.

Comment Re:LLVM byte code (Score 1) 230

The article explains what ANDF is, but it doesn't say what was wrong with it. What was wrong with ANDF?

That is a very good question. The quest for an UNCOL goes back to the mid-1950s.

There is a terse but readable history/background, up to and including early ANDF. That paper seems to argue that a change in relative costs betwixt machines and programmers changed the landscape to where the complexity was not worth doing a true UNCOL.

Things changed from complexity to business politics, much like we see now with cell carriers (and lock-in): customers wanted to be able to move work from one vendor to another, and up sprang another organization to guide that - the OSF.

A changing playing field (enter MS and Apple) and business politics seemed to doom OSF and most of its work - as it turned to 'every man for himself' until Linux came into being and pretty much changed everything... How many vendors still have their own Unix ?

I've been playing in this industry since the mid 1970s, professionally since the early 1980s - most all with a bent towards compilers and operating systems.

My reading of the tea leaves entails a rebirth of the UNCOL idea - again due to changing relative man/machine costs, and the prevalence of the IoT.

I believe we will actually wind up with the ANDF variant of UNCOL:
* You compile the language du jour (with whatever optimization, say to SSA, you can do at a high level), and generate a distribution package (apk, deb, rpm) that includes the UNCOL instead of binaries.
* At install time, the UNCOL is compiled to native (for some definition of native) code... This, too, should be optimized.

This is where Android is already moving - ART vs. Dalvik. On smaller devices, you don't want the interpretation overhead unless you have a scheme to handle JIT and build up a native application as the various codepaths are exercised.

Today, we have fewer architectures to contend with (x64, ARM, PPC), and a better battering ram (open source/crowd source & funding) - i.e., driving from the correct side.

This still requires effort by the hardware business to create the UNCOL->native translator.
