Comment Re:Stick with Safari (Score 1) 303

I would use CyberDog, but it's the Apple browser that doesn't get any respect any longer.

Hahaha, wow. CyberDog! I worked for an ISP back in the day, doing telephone technical support. A customer called in one time and told me they had CyberDog on their system, so I had to download and install it on mine so I could help them use it. I remember mentioning it to my coworker/boss, the sysadmin, who cracked up about the name CyberDog. From then on it became a running joke about us needing to officially support CyberDog. Good times.

Comment Re:Should you leave HTTP for Gopher? (Score 1) 303

I miss Archie, Veronica, and Jughead.

As do I. (I assume you were being sincerely wistful.)

I miss curated web site indexes. I miss the plethora of distinct search engines that used to exist. I miss pressing 'g' and typing in a URL in Lynx. I miss the burps and chirps of a dial-up modem. I miss having civil conversations with interesting people, free of trolls. I miss opening up Pine and seeing one or two letters from friends instead of a dozen from spammers. (Though I don't miss the chain letters. Sheesh!) I miss perusing the seemingly endless lists of newsgroup topics. It was glorious and awe-inspiring, and while it was still largely a text experience, much was left to the imagination.

When did the Internet become overrun by corporations? When did it become fractured and politicized? For a brief moment we were all Netizens in an egalitarian society, united by our common interests. Truly, I miss the simplicity of the Internet that was.

Comment Trade-offs of change (Score 1) 383

Any LOB (Line Of Business) application will require periodic tweaks and refreshes to remain relevant to operations. Depending on the business, the time between required changes could be measured in days, months, or years. Regardless, when the time comes, a choice must be made: improve (in the "remodel" sense) the existing application, develop an entirely new one, or adopt/adapt a COTS (Commercial Off The Shelf) solution.

For incremental changes, improving the existing application is likely the least disruptive option. The procedures associated with the software upgrade process should already be fully baked within the organization. Retraining costs--both time and money--will be limited to application changes only. There won't be a need to maintain side-by-side systems as there is while proving out an entirely new technology. TCO (Total Cost of Ownership) is understood.

However, it's always worth considering what positively impactful changes a new, custom LOB application or a COTS solution could provide. It could be that the hardware required to run it would be cheaper or more readily available. It could be that the programming talent required to maintain it would be cheaper or more readily available. It could be (in the case of COTS software) that hiring people familiar with using and supporting it would be cheaper than training them to use and support a proprietary solution. And, of course, it could be that it offers desirable features that are cost-prohibitive or technologically infeasible to implement in a legacy LOB application.

Which solution you choose depends on how risk averse you are, how much money you have to spend, and whether you need to gain a competitive advantage over (or catch up to) your peers.
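
To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. All of the dollar figures and the five-year horizon are invented placeholders for illustration, not real estimates:

    # Back-of-the-envelope TCO comparison. Every figure here is a made-up
    # placeholder, not a real estimate.
    options = {
        "remodel existing app": {"upfront": 50_000, "annual": 20_000},
        "develop new app":      {"upfront": 250_000, "annual": 10_000},
        "adopt/adapt COTS":     {"upfront": 100_000, "annual": 30_000},
    }

    HORIZON_YEARS = 5  # evaluation window; match it to your refresh cycle

    for name, cost in options.items():
        tco = cost["upfront"] + cost["annual"] * HORIZON_YEARS
        print(f"{name}: {HORIZON_YEARS}-year TCO = ${tco:,}")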

COBOL is an old language, and it is still used in a lot of places. Just like a human language, it will continue to exist for as long as there are people who choose to use it. As to whether it should be abandoned in favor of newer languages, I submit that the same argument could be made for phasing out most of the world's time-honored human languages. Why continue to use something that has bizarre syntax rooted in ancient history? Maybe because it continues to work, regardless of its accumulated cruft. (I don't see much traction to adopt Esperanto as a world standard!) Unless we go out of our way to shun and discourage the use of COBOL, I suspect it will continue to be used where applicable for a long time to come. People who don't want to use COBOL are welcome to use something else. There is no shortage of languages.

Comment Multiple Reasons for "Modern" UI (Score 1) 489

Here's my armchair analysis:

Too much white space, huge margins...
-Facilitates usability on touchscreen devices. Fingers are significantly less accurate input devices than mice, and present a variable-size contact surface. An effective touchscreen UI provides enough space between screen elements to accommodate the full range of human hand sizes without increasing errors from mistaken selections. (See the sizing sketch after this list.)

...too little information
-Result of one-size-fits-all UI design for touchscreen devices. An effective touchscreen UI must have sufficient space between elements, as described above. However, screen sizes are also variable. The most popular touchscreen format is the cellular telephone, which has a very modest I/O space. An effective touchscreen UI must switch between output-centric and input-centric display modes in order to make the best use of limited screen space.

Text is indistinguishable from controls
-Risky UI design choice. Possibly a result of taking context-sensitive design to an extreme. Since nothing indicates which screen elements are controls, any of them could potentially be a control, and new functionality can be added with no change to the existing UI. If the designers correctly anticipate how users will explore the interface, the software is deemed intuitive by those who use it. Otherwise, it is likely to be considered incomprehensible.

Text in full-CAPS
-Stylistic choice. Reduces readability, but is one mechanism for distinguishing text from controls.

Certain controls cannot be easily understood (like on/off states for check boxes or elements like tabs)
-Subjective. Possibly a result of an ever-expanding number of UI toolkits. Graphical controls must be sufficiently unique to comply with copyright law, but sufficiently similar to existing controls to be understood. More toolkits mean less room for unique controls, unless similarity is sacrificed.

Everything presented in shades of gray or using a severely and artificially limited palette
-Results from a desire for consistent UI presentation. Can be used to differentiate text from controls. If colors indicate function, or have some other logical mapping within the software, a limited palette makes it easier to provide alternate color schemes with consistent, non-overlapping usage.

Often awful fonts suitable only for HiDPI devices (Windows 10 modern apps are a prime example)
-Results from tailoring UI design for an ideal output device, and failing to have suitable alternate presentations for other devices.

Cannot be controlled by keyboard
-Results from tailoring UI design for an ideal input device, and failing to have suitable alternate controls for other devices.

Very little customizability if any
-Results from a lack of resources to implement it. User-defined alternate views add challenges for development, as well as for documentation and technical support.
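
On the touch-target point above, here's a minimal Python sketch of the underlying arithmetic. The ~9 mm fingertip contact size and 460 ppi display are assumed example figures, not prescriptions:

    import math

    MM_PER_INCH = 25.4

    def min_touch_target_px(target_mm: float, ppi: float) -> int:
        """Convert a physical touch-target size to on-screen pixels."""
        return math.ceil(target_mm / MM_PER_INCH * ppi)

    # Assumed figures: a ~9 mm fingertip contact patch on a 460 ppi phone display.
    print(min_touch_target_px(9.0, 460))  # -> 163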

I would add the following peeves to the list:

Lack of initial input-field focus and sensible tab order on forms.
-Results from mistakes made during application planning, development, and testing. (A sketch of getting it right follows this list.)

Entirely pictographic user interfaces.
-Results from the need to internationalize UI controls. Text costs money to translate; pictographic interfaces shift the onus of translation to the user. Can lead to the phenomenon described by the OP, "certain controls cannot be easily understood." Also, graphics can often fit in less space than written text, so pictographic interfaces may help ease screen-space constraints on mobile devices.
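
And on the focus/tab-order peeve, here's a minimal Tkinter sketch of getting it right, relying on Tk's default behavior of traversing widgets in creation (stacking) order:

    import tkinter as tk

    root = tk.Tk()
    root.title("Focus and tab order")

    # Tk traverses widgets in creation (stacking) order, so create the
    # fields in the order users should Tab through them.
    tk.Label(root, text="Name").pack()
    name_entry = tk.Entry(root)
    name_entry.pack()
    tk.Label(root, text="Email").pack()
    email_entry = tk.Entry(root)
    email_entry.pack()
    tk.Button(root, text="Submit").pack()

    # Give the first field initial focus so the user can type immediately.
    name_entry.focus_set()

    root.mainloop()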

Comment Debugging nightmare! (Score 1) 499

I haven't read the article. However, the idea of this sort of processing technology finding its way into general purpose computers makes my skin crawl. Isn't there already enough uncertainty in computing? We already have spurious hardware malfunctions and hard-to-reproduce software glitches to contend with. It's hard enough when troubleshooting to discriminate among the existing, overwhelmingly unintentional sources of computing errors without also having to consider the possibility of intentionally sloppy chips.

A number of people have voiced concern about using sloppy chips in the context of banking and finance. At least in those contexts you're dealing with data that's not obfuscated in any way--corporate financial statements aside. Consider what would happen if you attempted to compress or simply encode data with a sloppy processor. If you used any existing lossy algorithm, the results would be worse than you'd expect, but still potentially within the realm of acceptability. If you used any existing lossless algorithm, the results would be useless. By including trustworthy error-correction metadata you could perhaps overcome the issue of data unreliability, but you'd have to perform additional calculations to correct the data, thus eating into the efficiency gain achieved by using a sloppy processor. The end result? Greater uncertainty of data correctness, greater complexity of design to overcome that uncertainty, and greater difficulty debugging reported issues.
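
Here's a quick Python sketch of that lossless failure mode, using zlib with a single simulated bit error standing in for a sloppy processor's mistake (the payload is made-up sample data):

    import zlib

    payload = b"Line-of-business records, row 42;" * 100  # made-up sample data
    compressed = bytearray(zlib.compress(payload))

    # Simulate a single transient "sloppy" bit error in the compressed stream.
    compressed[len(compressed) // 2] ^= 0x01

    try:
        zlib.decompress(bytes(compressed))
    except zlib.error as exc:
        print(f"Lossless decompression failed outright: {exc}")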
