Comment Re:Debunk? (Score 1) 301
heh. I do wonder if their use of these is considered 'fair use' or if, like in the case of the Obfuscated C entry, it really is a copyright violation and the coders should get compensated?
I'm inclined to agree - "malaise" is an opinion, seen more easily from the outside than from within. Ask any social commentator to look at today's workers or kids and how much they 'veg' in front of a TV or a computer, and it becomes clear that the work itself, with rare exceptions, is not in any way fulfilling.
He was right that automation doesn't necessarily lead to total unemployment: it changes the jobs, and within a company it reduces them, but on the whole we have more jobs now than we did then (we have more than twice as many people, but close to the same employment rate), so the system adapts to unemployment, eventually.
I don't think he was using the term 'spiritually' to imply any aspect of religion, so those who interpret it that way are misreading him.
perhaps true...but then I wasn't coding on a laptop in the 4x3 era.
and even so, I still live in alt-tab hell, because a laptop screen, even at 1440x900, can't show me my code, a browser window large enough to see the page I'm fixing, and the Firebug console to figure out what the bleep is wrong with it, all at the same time.
gimme that thunderbolt...
All that said, very small pixels on a laptop-sized screen don't necessarily make things more readable. On a tablet, where you can rotate to vertical orientation to read, it might be better, but on a laptop fixed in horizontal orientation I can't see it being much use for text, and while it may render a Blu-ray more accurately, your eyes won't know the difference at that size.
(Oh, and in addition to fixing it, I have a Thunderbolt monitor for my Mac laptop at work, and an iMac with a larger screen ratio too...and I've yet to meet a Mac developer who wants to go back to a mere 1080 after finally getting a screen with serious real estate like that.)
I fixed it. I said I fixed it for myself: I have one monitor in vertical orientation, and I shrunk my font size down to 7 point. Seems to work; I can generally get most of a file on the screen at once.
Why did it not catch on? Because the display was horrendously expensive, being almost entirely custom, while CRT components for 4x3 screens were extremely cheap and off-the-shelf. It had nothing to do with coder preference at the time and EVERYTHING to do with how much a corporation was willing to spend on its coders. As such, few outside of Xerox even got to try it. Nobody knew what they were missing, because everybody knew it "wasn't that difficult to work around".
Still, maybe someday it might be nice to have better options and better designs out there instead of stuff that "isn't that difficult to work around." Coding and building the right thing (or a more flexible thing but with better default settings) should be inherent in interface design, but it still isn't, because coders like you seem to be just fine with stuff that "isn't that difficult to work around."
Because of entertainment sources, laptops and desktop monitors are all wide-screen 16x9...
In short, it just doesn't work when the medium is text. (Say what you will about the coming illiterate age at this point...)
1080 is actually very uncomfortable for those of us who were coding on 1440x1280 4x3's prior to the HDMI standard locking us all down to 1080. I personally keep an external monitor rotated 90 degrees in order to have a decent working space, separate from my "entertainment" and browsing space.
Who else had a long vertical orientation to the monitor, knowing it was a better way to work? Xerox PARC.
(And gee, many of our problems in education go away when one addresses the poverty that makes education impossible, rather than constantly trying to change an education system that has otherwise worked for generations.)
So long as we just focus on *treatment* of the sick, costs will continue to spiral.
A general influx of cash doesn't just fund treatment of the sick - it starts to alleviate the issues that allow disease to spread in the first place: lack of hygiene, lack of vaccination, lack of clean water, lack of balanced food and complete meals, lack of general preventive care, and lack of birth control - ALL things people in poverty already go without.
Health care costs get under control when the focus is on prevention rather than treatment: you spend FAR less money when fewer people get sick. When you use the capital to address the causes of disease rather than just treating it, you spend much less on treating the ones that got away.
Relatedly, this is why insurance companies love birth control - a pill a day and a box full of condoms is far cheaper to them than the thousands of dollars for examinations, the birth, emergency natal care, and having to cover the kid for the next 26 years.
Recent obnoxious examples include Facebook (the version bundled with my phone was 9 meg; the current one is 34 meg, plus almost as much data memory used up) and Amazon's app store (a 5 meg app that brings in more than 17, and sometimes as much as 49, meg tagged as *data*, not cache).
I dropped Amazon from my phone (an older HTC Evo, so it only has 160 meg or so for app+data) because of this, and just gave up on those apps I'd purchased through it (some of which were on Google Play). Even so, any time I need to update FB, I generally have to wipe the data out to make room for the download.
Sheesh...
when the tulips start to bloom again.
College ain't supposed to be easy.
In CS, the killer is usually electromagnetism or calculus-level probability.
In Physics, it is usually diff-eq's.
In math, it is usually partial diff-eq's.
Yes, the exceptions to the "rules" in org-chem are maddening...but if they weren't, prescriptions and pharmaceuticals would be easy. Instead, they are rife with mistakes, side effects, false positives, and a lot worse, and if you don't have the background to understand at least to a degree why, then I'll be damned before I let you write me a prescription for anything.
But seriously, this is college leading to one of the toughest post-grad programs our society has to offer. It is supposed to be hard. Deal with it or get out.
You really don't get it: money spent on a product that adds no value to the product is money lost.
If you don't get that simple *fact*, I'm not going to bother saying anything more.
Which was my point, as I thought I'd made clear from the very beginning. Keeping up with software updates (which theoretically aren't supposed to change anything unless they make it better, even if that means an API difference to code to) is one thing; generally, one can hold a patch back and coordinate it with some other release cycle.
Having to adapt for an arbitrary, non-technical reason mandated by a clueless government is something else, a very EXPENSIVE something else, since, as I noted, it adds no value to the product but costs as much as a large new feature would. And in this case the deadline wasn't set by us either, but by that same government - we couldn't hold the patches back. Quite the opposite: we needed to get them in and tested ASAP, since customers using our app to project into the future needed that date information to be accurate.
My response (see the title this thread has had all along) was to the suggestions for getting rid of DST that come up twice a year, every year, right on schedule, from clueless dolts who think it would be "easy" to do and would have no impact except some alleged positive one based on some analyst's arbitrary definition of "productivity". While getting rid of DST might eventually have a long-term benefit, it would still have an expensive short-term cost that needs to be carefully weighed against the experiences of 2005-2007.
Just having some database isn't enough in and of itself. One needs to be able to process it and interpret it correctly so that it "makes sense" to the business model one is using.
In addition, code that looks backwards in time (reporting engines, for example, or event history viewers) needs to be able to use the old rules as it looks back, while applying the current (and potentially new) rules looking forward, and it has to know at what year/date a country decided to cut or change DST. This is one reason the tz folder/files on Linux boxes (for example) are as large as they are.
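To make that concrete, here's a minimal sketch (assuming Python 3.9+ with the standard `zoneinfo` module, which reads that same tz database): the same calendar date lands on different sides of the DST boundary before and after the 2007 US rule change, and the database carries both rule sets so lookups into the past stay correct.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; backed by the system tz database

ny = ZoneInfo("America/New_York")

# Same calendar date, one year apart, straddling the 2007 US rule change
# (DST start moved from the first Sunday in April to the second Sunday in March).
before = datetime(2006, 3, 15, 12, 0, tzinfo=ny)
after = datetime(2007, 3, 15, 12, 0, tzinfo=ny)

print(before.tzname())  # EST - still standard time under the pre-2007 rules
print(after.tzname())   # EDT - already daylight time under the 2007 rules
```

A reporting engine that hard-coded only the new rule would mislabel every pre-2007 timestamp in that three-to-four-week window.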
No, we're not mission-critical like medical stuff, but that didn't stop the customer base from griping while the DST-handling code was broken, before I was assigned to fix it.
And this DST issue is somewhat different from what I was originally ranting about: the amount of work I went through building test plans and test cases to verify that the software I was working on at the time handled the DST change correctly, while coordinating and tracking all of the OS and software patches (there were 10 updates to Java's JDK/JVM alone in a span of 5 weeks, ALL related to this issue) and ensuring our stuff would work.
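For flavor, one kind of edge case those test plans had to cover (a minimal sketch using Python's `zoneinfo`, not our actual test harness): the 2007 spring-forward created local times that simply never existed, and software has to normalize them rather than crash or silently mislabel them.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

ny = ZoneInfo("America/New_York")

# 2:30 AM on 2007-03-11 never happened in New York: under the new 2007
# rules, clocks jumped from 2:00 EST straight to 3:00 EDT that morning.
nonexistent = datetime(2007, 3, 11, 2, 30, tzinfo=ny)

# Round-tripping through UTC normalizes the gap time forward into EDT.
normalized = nonexistent.astimezone(timezone.utc).astimezone(ny)

print(normalized.hour, normalized.tzname())  # 3 EDT
```

Code that scheduled anything for a "2:30 AM" that no longer existed was exactly the kind of thing we had to hunt down before the deadline.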
"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."