You sure remember the Fox incarnation.
Good! I feel better.
5 years of experience in a technology that has only existed for 14 months and can't be taught in a classroom outside of business anyway. The requirements are way past ridiculous and border on the insane.
There's a "shortage" of good liars. I know a guy who was a fantastic BS-er that way. He had a network of fake references, for example. "Sure, he was doing Java for us in 1989. We used the first beta out. And he used Silverlight when it was still Bronzelight."
I felt too slimy to copy his techniques, but in a competitive world where a position receives hundreds of resumes, it's "survival of the fibbist", I hate to say.
Ego comes first...The unqualified never know that they are unqualified. It's just a bunch of meanies [to them], picking on them.
Heaven forbid if we ever got a president like that.
Or, white men are conditioned to an environment of abrasive competition, and not to complain about such behavior.
In a more general sense, different cultures value different things in different proportions, and that is going to create conflict. "X people don't do enough Y" and/or "X people do too much Z".
Our egos make our own culture the center of the universe, and we try to shape the universe in our image. A recipe for conflict.
Most women I've known put money far above men's looks. If I had to use a point system, I'd assign it as such:
Earning power/potential: 60 pts.
Protecting and caring: 30 pts.
Looks/muscles: 10 pts.
True, the muscle part could be seen as "protecting and caring". I'm rather large in general, so perhaps that part mostly took care of itself despite me NOT resembling a super-hero. A man small or slight in stature may need muscles or karate skills to make up that portion of the report card.
Women want to be able to walk down the street at night with their guy and feel safe. There are different ways to achieve that. Some men fake it well with pure attitude.
Grace Hopper did not invent COBOL
COBOL was ultimately designed by a committee, but Grace's early compilers had a lot of influence on the language design.
The military and other organizations found it difficult to build financial, logistics, and administrative software for multiple machines that each speak a different language, and thus formed a joint committee to create a common language. (Science and research institutions had FORTRAN for cross compatibility.)
Basically, the COBOL committee grew sour with disagreement. As the deadline for the first specification approached, the committee fractured into a "git-er-done" tribe and a "do it right" tribe. The git-er-done tribe basically cloned Grace's language with some minor changes and additions, because they knew they didn't have time to reinvent the wheel from scratch, and Grace's language was road-tested.
As the deadline came up, the git-er-done group were the only tribe with something close to ready, and so that's the work the committee heads ultimately submitted. There were a lot of complaints about it, but the heads wanted to submit something rather than outright fail. (The story varies depending on who tells it.)
Later versions of COBOL borrowed ideas from other languages and report-writing tools, but the root still closely mirrored Grace's language. Therefore, it could be said that Grace Hopper's work had a large influence on COBOL.
(It's somewhat similar to the "worse is better" story between Unix/C and a Lisp-based OS: http://dreamsongs.com/WorseIsB... )
- - - - - - -
As far as what orgs should do about existing COBOL apps, it's not realistic to rewrite it all from scratch, at least not all at once. That would be a long and expensive endeavor. It's probably cheaper to pay the higher wages for COBOL experts, but also gradually convert sub-systems as time goes on.
However, whatever you pick as the replacement language could face the same problem. There's no guarantee that Java, C#, Python, etc. will be common in the future. Rewriting COBOL into Java could simply be trading one dead language for another.
I know shops that replaced COBOL "green screen" apps with Java Applets. Now Java Applets are "legacy" also. (And a pain to keep updating Java for.)
Predicting the future in IT is a dead-man's game. At least the COBOL zombie has shown staying power. If you pick a different zombie, you are gambling even more than staying with the COBOL zombie.
If it ain't broke, don't fix it. If it's half-broke, fix it gradually.
There is indeed more social pressure on men to be the breadwinners, similar to how women are pressured to look attractive. And thus we'd expect young men to work harder and longer to try to get the promotions. If you are pressured by society to do X, you are more likely to do X.
It may not be "fair", but that's society as-is. A quota system doesn't factor this in.
perhaps companies ARE mistreating women and minorities which WOULD make it the company's fault
The company can't force employees to like someone. If there are known incidents, they can perhaps do something, but most "mistreatment" is subtle and/or unrecorded. The organization cannot micromanage social encounters at that level.
In general, many people are tribal jerks. I've had white colleagues who told of being mistreated when they worked with a uniformly non-white group, such as all Asian. The "minority" is often targeted. Sometimes it's driven by resentment of "white culture" discriminating against them in general. They channel that frustration into an individual who happens to be white.
I'm not sure how to fix this because it's probably fundamental to human nature. Mass nagging about "being good" only goes so far. If you over-nag, people often do the opposite as a protest to the nagging. (Is the word "nagging" sexist?)
Most American banks aren't building those kinds of buildings *now*. I think they stopped doing that in the '50s. Seeing that kind of building implies they've been around a long time. I don't know if it was considered over-spending when it was done; it was a more common thing to do in the early 20th century.

It may have been a kind of reassuring message to people who grew up in the Depression--a "we're here to stay" expressed in architecture. Banks also may have been in competition at that time to pull in well-heeled customers who didn't want to be seen going into a shabby building. People cared about stuff like that back then--guys wore suits all the time, and fedoras with the suit as they were meant to be worn, with no hint of irony.
forcing yourself into a pure functional style means that your code can run anywhere because it doesn't care about the context in which it runs.
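A minimal sketch of that idea in Python (the function names and the deadline scenario are made up for illustration): a function that reaches out to hidden context, such as the system clock, behaves differently depending on where and when it runs, while a pure version that takes everything as arguments behaves identically everywhere.

```python
import datetime

# Impure: depends on hidden context (the system clock), so its result
# varies with where and when it runs.
def days_until_deadline_impure(deadline):
    return (deadline - datetime.date.today()).days

# Pure: everything the function needs arrives as an argument, so it
# behaves identically on a laptop, a server, or inside a test harness.
def days_until_deadline_pure(deadline, today):
    return (deadline - today).days

# The pure version is trivially testable with a fixed "today":
print(days_until_deadline_pure(datetime.date(2030, 1, 10),
                               datetime.date(2030, 1, 1)))  # → 9
```

The pure version can be cached, replayed, or run in parallel without surprises, which is much of what the "run anywhere" claim is getting at.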
Most planners focus on the current needs, not future needs. Whether that's rational (good business planning) is another matter. It's generally cheaper to hire and get future maintenance on procedural code. If and when machine performance issues override that, the planners may THEN care.
It depends on the project. If most of the processing for an app happens on servers in cloud centers, then it's probably already parallelizing user sessions. Parallelizing at the app level thus won't really give you much more parallelism potential, especially if most of the data chomping is done on the database, which should already be using parallelism. Web servers and databases already take advantage of parallelism, so it would be diminishing returns to parallelize the app level. If the code is looping over 10,000 items in app code itself, the coder is probably doing something wrong.
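A small sketch of the "let the database do the chomping" point, using an in-memory SQLite table (the `orders` table and amounts are invented for the example): instead of dragging 10,000 rows into app code and looping, push the aggregation into SQL, where the engine can optimize and parallelize it.

```python
import sqlite3

# Hypothetical example: a 10,000-row "orders" table, in-memory for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(10_000)])

# Anti-pattern: pull every row into app code and loop over it.
total_in_app = sum(amount for (amount,) in
                   conn.execute("SELECT amount FROM orders"))

# Better: one aggregate query; the database engine does the work.
(total_in_db,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()

print(total_in_app == total_in_db)  # → True
```

Same answer either way, but the second form sends one row over the wire instead of 10,000, and a real server-grade database can parallelize the scan internally.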
If you're doing it right, you should never be doing anything twice. Anything you do should become a packaged and re-usable element that doesn't need to be coded again. I don't care whether you call it a subroutine, an object, or what have you.
Software development, done right, should grow exponentially--with a highly fluctuating exponent. No task should be predictable because no task should closely resemble anything you've done before. If it does, you shouldn't need to develop new code, you should just be able to re-use old code.
Well, OK, this is a gross oversimplification, but it does capture something fundamental about software development.
In the past I've found that managers almost prefer to do things repetitively, over and over, the same stupid way. They love what is conceptually close to a duplication of the essentials of the last job because, although it's highly inefficient, it's also highly predictable. They would much rather have a near-linear curve of accomplishment versus time than a much faster, but much less predictable, exponential-with-fluctuating-exponent curve.
The typical manager would probably order you to recode the same thing ten times rather than "waste time" writing a subroutine.
(To be fair--it's hard to write a truly re-usable piece of code and easy to waste time in the name of re-usability and write code that isn't actually re-usable).
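To make the subroutine point concrete, here's a toy sketch in Python (the name-cleaning task and `normalize_name` are invented for illustration): the same few steps recoded inline each time, versus packaged once as a reusable element.

```python
# Duplicated: the same normalization recoded inline each time.
name1 = "  Alice SMITH "
clean1 = name1.strip().lower().title()
name2 = " bob JONES  "
clean2 = name2.strip().lower().title()

# Packaged once as a reusable element -- call it a subroutine, a method,
# or what have you; the point is it never gets recoded again.
def normalize_name(raw):
    return raw.strip().lower().title()

print(normalize_name("  Alice SMITH "))  # → Alice Smith
print(normalize_name(" bob JONES  "))    # → Bob Jones
```

The caveat above applies even here: deciding what the routine's real boundary is (should it also handle middle initials? Unicode?) is the hard part, not the extraction itself.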
I remember this coming up during the Y2K flap. COBOL programmers are dying out because companies have decided, for whatever reason, that they aren't willing to pay them much. A quick Google search--I don't know how reliable--shows me:
COBOL Sr. Software Engineer / Developer / Programmer $88,049
Java Sr. Software Engineer / Developer / Programmer $103,239
But that understates the difference because the various job titles shown for COBOL positions are predominantly lower-sounding and lower-paid positions.
As several have said, COBOL isn't hard to learn--I don't know it, but I crammed it once for a test. Like all languages, it takes a while to get good at, but it's not especially difficult, nor is it especially bad.
The best strategy would be to render unto COBOL what is COBOL's--keep good legacy systems in COBOL--and pay enough to give people a reason to learn the language.
I'm going to go downtown, park at the sturdy Bitcoin building, walk in past the colonnades and marble lobby, right up to the sturdy oak desk of my local and well-respected Bitcoin representative, and seek reassurance that his institution is sound, and that my deposits are safe, fully insured, and returning the advertised rate of interest.
The industry learned the hard way that OOP works well for some things but not others. For example, OOP has mostly failed in domain modeling (modeling real-world "nouns") but is pretty good at packaging APIs for computing and networking services.
The industry may also learn the hard way where FP does well and where it chokes. Some understandably don't want to be the guinea pigs.