Comment Lochner v. New York (1905) to Parrish etc. (1937) (Score 1) 545

That is a great historical link, thanks! And it leads to this other one on West Coast Hotel Co. v. Parrish, for two Supreme Court decisions that led up to the Great Depression and then its resolution:
http://en.wikipedia.org/wiki/W...
"West Coast Hotel Co. v. Parrish, 300 U.S. 379 (1937), was a decision by the United States Supreme Court upholding the constitutionality of minimum wage legislation enacted by the State of Washington, overturning an earlier decision in Adkins v. Children's Hospital, 261 U.S. 525 (1923). The decision is usually regarded as having ended the Lochner era, a period in American legal history during which the Supreme Court tended to invalidate legislation aimed at regulating business.[1]"

Comment Studies show hours worked past 40/wk unproductive (Score 5, Insightful) 545

So, ultimately, the whole thing is self-defeating in general. Crunch times may be one thing, but on a regular basis, productivity declines even as people look busy.

One example:
http://www.inc.com/jessica-sti...
"The most essential thing to know about the 40-hour work-week is that, while it was the unions that pushed it, business leaders ultimately went along with it because their own data convinced them this was a solid, hard-nosed business decision....
        Evan Robinson, a software engineer with a long interest in programmer productivity (full disclosure: our shared last name is not a coincidence) summarized this history in a white paper he wrote for the International Game Developers' Association in 2005. The original paper contains a wealth of links to studies conducted by businesses, universities, industry associations and the military that supported early-20th-century leaders as they embraced the short week. 'Throughout the '30s, '40s and '50s, these studies were apparently conducted by the hundreds,' writes Robinson; 'and by the 1960s, the benefits of the 40-hour week were accepted almost beyond question in corporate America. In 1962, the Chamber of Commerce even published a pamphlet extolling the productivity gains of reduced hours.'
        What these studies showed, over and over, was that industrial workers have eight good, reliable hours a day in them. On average, you get no more widgets out of a 10-hour day than you do out of an eight-hour day."

With software, it is so easy to introduce a bug when you are tired or distracted (one reason team programming often saves money). A bug (especially a conceptual one) can be very expensive to debug down the road, especially if it makes its way into production. How many times have programmers spent days chasing a bug that turned out to be a one-line fix? So, it may well be that longer hours mean *negative* productivity and higher costs for the extra hours worked past 40 per week, even when the employee is not paid for those hours.

There is another complicating factor. Big companies in the 1970s such as HP or IBM invested in actually training employees, creating the pool of workers that Silicon Valley drew from initially. Investing in employee training is now rare, due in part to there being little loyalty on either side of the employee/employer relationship in many companies. So, given that the tech industry moves so fast, where does the training time come from (including time to read Slashdot :-)? Ideally, training should happen during those 40 hours. But in practice, many people working in IT have to keep current on their own time.

Yet training produces many benefits:
http://www.psychologicalscienc...
"A new study from a team of European researchers found that job training may also be a good strategy for companies looking to hire and retain top talent. When workers felt like they had received better job training options, they were also more likely to report a greater sense of commitment to their employer.
    For the study, psychological scientists Rita Fontinha, Maria Jose Chambel, and Nele De Cuyper looked at IT outsourcers in Portugal, who must constantly update their skills in order to keep up with the fast pace of new technology. The researchers hypothesized that when people were happy with the training opportunities their employer provided, they would be more motivated to reciprocate with an enhanced sense of loyalty to the company.
    This kind of informal balance of expectations between employees and management is known as a "psychological contract." When workers feel that their employer has fulfilled their obligations under the psychological contract, they're more motivated to uphold their end of the perceived bargain by working hard and staying with the company."

As you point out, this culture of (needless) overwork does discriminate against people with families. Likely it contributes to the frequent focus on hiring young programmers who can put in 60-80 hour weeks? The same programmers who often ignore customer service and will leave at a moment's notice, chasing the next shiny new thing that comes along, increasing workload through the cost of turnover?

So, there is a big IT cultural issue here... Not clear how to fix it though, but regulations like in the 1970s that had companies paying for overtime could be a start. As with the original 40 hour work week, if IT companies accept these overtime rules, it will probably be because they realize it is ultimately in their own financial interest and a way to level the playing field for everyone...

Soulskill added the link to the DSLE page with overtime regulations, which says: "The exemption described above does not apply to an employee if any of the following apply: ... The employee is an engineer, drafter, machinist, or other professional whose work is highly dependent upon or facilitated by the use of computers and computer software programs and who is skilled in computer-aided design software, including CAD/CAM, but who is not in a computer systems analysis or programming occupation."

I can only think that engineers and machinists had better union representation back then? :-) I wonder, with the successful and amazing Orion launch today by NASA, whether most of the engineers and machinists working on that Orion project were working 80 hour weeks with no overtime pay? Would the launch have gone better with sleepy machinists milling the rocket exhaust nozzles? Would the launch have gone better if the ground crew fueling it had been up half the night? It seems crazy to me to even consider such things as plausible -- no good engineering manager would allow that -- yet those are exactly the conditions so many software projects labor under. Is it any wonder there are so many bugs?

That said, engineering managers in older disciplines have gotten good at estimating projects, whereas "Software is Hard". So, mis-estimates are going to keep putting pressure on everyone to deliver by superhuman effort. Here is an insightful idea, if also an ironic challenge:
http://gamearchitect.net/Artic...
"Scott Rosenberg coins this as Rosenberg's Law: Software is easy to make, except when you want it to do something new. The corollary is, The only software that's worth making is software that does something new."

Comment "Working hours: Get a life" at economist.com (Score 5, Interesting) 545

Thanks for the link, AC: http://www.economist.com/blogs...
"Working hours: Get a life ... The Greeks are some of the most hardworking in the OECD, putting in over 2,000 hours a year on average. Germans, on the other hand, are comparative slackers, working about 1,400 hours each year. But German productivity is about 70% higher. ... So maybe we should be more self-critical about how much we work. Working less may make us more productive. And, as Russell argued, working less will guarantee "happiness and joy of life, instead of frayed nerves, weariness, and dyspepsia"."

Interesting comments there, like the ones on work culture in South Korea, and I've just read the first couple of the hundreds...

Comment Felt similar about the "firing" bit as extreme (Score 1) 254

I especially liked the link to "empathy is a core engineering value" though: http://www.listbox.com/member/...

Linked from: https://www.joyent.com/blog/th...

And if so, should not empathy extend throughout all levels of a learning organization, including between managers and subordinates? Everyone is learning all the time, including about cultural changes. Firing someone, rather than first trying to understand the situation, the individual's motives, and whether change is needed or possible, does not seem "empathic". Perhaps that is the kind of thing you tend to learn after many years of experience as a parent or other long-term caregiver (including a long-term manager or mentor), when you see someone learn and grow and change over a long time?

Plus, as other comments suggest here, there is an assumption in this blog post that may ignore the possibility the issue was about consolidating minor changes rather than having them as individual commits. If this issue was deemed by enough of the community to be important, maybe a more systematic patch would indeed be in order? One tiny change is not much work, but it may set a bad precedent?

Also, it is not empathic to the coworkers and the rest of a company and community depending on someone to fire that person without notice, without reasonable review or attempts at remediation, for a less-than-egregious offense (contrast with, say, someone accused of physically assaulting a coworker). The issue there is proportion and risk/harm assessment.

So, the response of "we would have fired him" seems too extreme in multiple ways.

I am all for meaningful diversity in workgroups, like discussed in this book:
"The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies"
http://www.amazon.com/The-Diff...

However, the problem with some of these "politically correct" initiatives or statements, which seem on the surface helpful for promoting "diversity", is that they can actually make workplaces more stressful for *everyone*. Someone can bully with the rules (or their interpretation of them) just as much as, or more than, with a fist... Here is a website by psychologist Izzy Kalman that explores issues related to bullying and truly creating happy, productive workplaces by *really* emphasizing empathy and forgiveness and growth and free speech:
http://bullies2buddies.com/

Just think about it -- does everyone at Joyent now need to be afraid of getting fired if they check the word "he" into the codebase, even by accident? Or maybe by saying "he" accidentally at a meeting? There are potential unintended consequences of creating a different sort of hostile workplace climate, as many US schools are finding out these days as a result of "zero tolerance" policies (where biting a cracker at lunch into the shape of a gun can get you in deep, deep trouble).

For reference, here is what makes for happy productive creative workplaces in general (Autonomy, Mastery, Purpose):
"RSA Animate - Drive: The surprising truth about what motivates people"
https://www.youtube.com/watch?...

Anyway, these are all complex issues about language, sex, management, control, gender roles, cultural change, recruitment, productivity, norms, and more. They are tricky to talk about or write about without seeming uncaring or inept because of various assumptions people make about the context or the people involved -- and the fact that none of us are "perfect" (and that perfection can be in the eye of the beholder based on priorities). It is sad to see such great software get mired in them. But I guess they are present in some form wherever we go or whatever we do.

BTW, since this whole furor is supposedly at root about making women feel more accepted and happy in technology, here is a NY Times story on what has made women happier than average in at least one Western country:
"Why Dutch women don't get depressed"
http://www.nytimes.com/2007/06...

A tangential quote from there: :-)
    "Once married, however, sex often took a back seat; for some early Calvinists even sex within marriage was sinful, de Bruin says, and Dutch women sublimated their sexual energy into domestic bullying.
    "They ordered the men around - there are many stories of bossy women and subordinate men," she said. "We know this from the literature of the 16th century, and it hasn't changed."
    Modern Dutch men are expected to share the chores at home, "without being told, or when told," de Bruin said. The Dutch woman "wants the man to do housework to help her feel equal, but he has to do it her way."
    Which perhaps raises the question, do Dutch men get depressed?
    Not much, according to de Bruin, who says that the behavior of the sexes evolved simultaneously, that Dutch men like their women bossy while Dutch women are not keen on macho men. Still, she sympathizes with men who have to negotiate a jungle of rules that they never understand and that are always set by women.
    "Luckily," she said, "most men have enough Tarzan in them to like a bit of a jungle.""

That is an example of how happiness relates, in part, to cultural expectations and their acceptance. Reading that, I can perhaps understand my own parents' marriage a bit better (both being immigrants to the USA from the Netherlands), including in contrast to the family lives of the mostly Catholic Italian and German families I grew up around -- as well as the stereotyped TV families...

Also on that theme of women in the Netherlands:
http://www.economist.com/blogs...
"IN AMERICA, as pretty much everywhere in the world, the happy narrative of development and freedom has involved more women working in the cash economy, achieving financial independence and thus greater autonomy. It's interesting when you find a country that seems to buck these sorts of universal narratives, and as Jessica Olien points out in Slate, the Netherlands bucks the women's-development narrative in a pretty odd fashion: it has extremely high indicators for gender equality in every way (education, political participation, little violence against women, ultra-low rates of teen conception and abortion) except that women don't work. Or not full-time, anyway, at anything like the rates at which women work in most OECD countries. ... "When I talk to women who spend half the week doing what they want -- playing sports, planting gardens, doing art projects, hanging out with their children, volunteering, and meeting their family friends -- I think, yes, that sounds wonderful. I can look around at the busy midweek, midday markets and town squares and picture myself leisurely buying produce or having coffee with friends.""

Personally, I feel the movement of women into the paid workforce (and out of roles in the subsistence, gift, and planned economies; see my website on that) has weakened the USA, given that men have not moved to take up the slack. As a consequence, in the USA, we see our home life suffering for want of real cooked food from private gardens, our non-profits suffering for lack of volunteers, and our politics failing for lack of anyone to pay attention or get involved as a volunteer in political campaigns or go to town meetings. Plus, women in the USA are a lot more unhappy now than before (according to studies), probably because many of them gave up the potential for a role with a lot of autonomy and creativity and mastery (raising kids and running a household) for low-status, low-wage jobs with a lot of supervision. My "gender neutral" approach to the factoid in the nested quote is to want a "basic income" for everyone in the USA.... :-)

Related study:
"What's Happening To Women's Happiness?"
http://www.huffingtonpost.com/...
"First, since 1972, [in the USA] women's overall level of happiness has dropped, both relative to where they were forty years ago, and relative to men. You find this drop in happiness in women regardless of whether they have kids, how many kids they have, how much money they make, how healthy they are, what job they hold, whether they are married, single or divorced, how old they are, or what race they are. (The one and only exception: African-American women are now slightly happier than they were back in 1972, although they remain less happy than African American men.) ... The second discovery is, this: though women begin their lives more fulfilled than men, as they age, they gradually become less happy. Men, in contrast, get happier as they get older. ..."

That is the cultural backdrop behind so many cultural trends and changes in the USA. So, people are still tinkering with trying to improve that without really asking deep questions about a fundamental shift in most women's lives to ones with less autonomy, mastery, and purpose as they took paying jobs in an economy that has seen stagnant wages (and unpaid overtime) and lots of competition between workers for limited promotions? Sure, working as a software developer at a company like Joyent might seem purposeful (changing the landscape of FOSS web apps we all rely on), but really, most jobs in the USA don't have that level of obvious purpose, in part because work has often become so specialized and compartmentalized.

Obviously expectations about gendered pronouns are in flux in the USA (related to other social changes) and that causes various sorts of stress. Norms are definitely shifting. I pretty much always reword sentences to avoid choosing he/she for individuals or I use "they" incorrectly on purpose. A sentence with "he" for an arbitrary individual person does sound more archaic, although, to me, a sentence with "she" using the alternating she/he style also sounds forced. It's a bit astounding to think that issue has now bubbled up to the point where entire communities are getting forked over it.

Comment Re:Effort dilution (vs. Stigmergy) (Score 1) 254

"The scourge of Open Source disguised as choice.."

All too true, too often. And the failure of the Linux desktop to gain traction is a prime example of that (until finally, in essence, the Chromebook).

That said, "Stigmergy" is a way that large structures (like the FOSS landscape?) can get built by entities following relatively simple local rules. For example, termites build big complex mounds by getting excited when they see other termites having accomplished something small but interesting (creating an arch).
http://en.wikipedia.org/wiki/S...
http://www.evolutionofcomputin...

See especially:
http://journal.media-culture.o...
"Collaboration in small groups (roughly 2-25) relies upon social negotiation to evolve and guide its process and creative output. ... Collaboration in large groups (roughly 25-n) is dependent upon stigmergy."

Although in the termite case they are increasingly joining their separate actions together, versus the NodeJS case, where the community is splitting apart. Maybe this is more akin to how a new generation of termite "queens" and their consorts takes to the air and finds a new place to create a new mound?

In any case, as a software developer moving into using NodeJS (and JavaScript in general) for new projects, this is not the sort of news I really want to hear. That is because it seems, in the short term, to increase risk (including from dilution of effort and community). In the long term I can, of course, be cautiously hopeful that the social and organizational issues will get worked through one way or another.

Fortunately -- and this is why I like the JavaScript ecosystem even as I find JavaScript the language awkward to work with -- there are many possible JavaScript containers to run stuff in. Here are a couple more for the server:

http://nodyn.io/
"Nodyn is a Node.js compatible framework, running on the JVM powered by the DynJS Javascript runtime"

http://ringojs.org/
"Ringo is a CommonJS-based JavaScript runtime written in Java and based on the Mozilla Rhino JavaScript engine. It takes a pragmatical and non-dogmatic stance on things like I/O paradigms. Blocking and asynchronous I/O both have their strengths and weaknesses in different areas."

So, in the stigmergic sense, the idea of JavaScript everywhere (including on the server) is taking off as all us little FOSS termites get excited about the idea and work together on various arches. And with ways to compile C to code that can run efficiently on a JavaScript runtime, I wonder if we will see more and more adoption of JavaScript containers and further improvements in them.

While divisions like this look painful, when you step back and look at the landscape of millions of software developers who like to develop software (sometimes in different styles or with different emphases), this kind of forking is inevitable.

Reflecting on this, though: I started shifting away from Python around the time that "Benevolent Dictator for Life" Guido van Rossum created a new, (somewhat) backward-incompatible version of Python (3) while the community kept pushing support for the old one. Perl faced a similar issue with its move to Perl 6. I'm sympathetic to that dilemma for the original authors, but those are, to a lesser extent, and maybe with less drama, other examples of these tensions between individual and community control over priorities and future directions.

We probably need to develop much better understanding of what makes a FOSS project a success (in terms of community dynamics) and how it can stay a success despite trying to fix up early design choices.

Sometimes workarounds can keep things together for a time, like how JSLint/JSHint and "use strict" are a workaround for the problematic issue of JavaScript variables being global by default when not declared with var. That seems to be the right answer for the community, but the right technical answer for the programmer would really have been to change the language so that variables are scoped differently.

For JavaScript, it seems the influence of the original developer was lost very early on, and a de facto community process happened very quickly. However, the evolution of JavaScript as a population of capabilities has been a painful process for anyone who has programmed in it over that time. Again though, libraries and frameworks like jQuery or Dojo or others (NodeJS and CommonJS) tried to patch a community solution as a workaround on top of fundamental initial language spec issues (no modules, no Deferred, no basic string manipulation like startsWith, etc.).
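That patching pattern is easy to see in miniature. Before ES6 finally added String.prototype.startsWith, libraries shipped small conditional shims like this one (the fallback body here is my own sketch, not any particular library's code):

```javascript
// Install a startsWith fallback only when the runtime lacks a native one --
// the same conditional-polyfill pattern jQuery-era libraries relied on.
if (!String.prototype.startsWith) {
  String.prototype.startsWith = function (search, position) {
    position = position || 0;
    // The string starts with "search" at "position" exactly when the
    // first match at or after "position" is at "position" itself.
    return this.indexOf(search, position) === position;
  };
}

console.log("nodejs".startsWith("node"));  // true
console.log("nodejs".startsWith("js"));    // false
console.log("nodejs".startsWith("js", 4)); // true
```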

Sadly, much of all this new development still has not reached the ease of use or consistency of, say, VisualWorks Smalltalk from the early 1990s (which was killed by licensing costs, not being FOSS, and limps along as a proprietary product after missing its chance to be adopted by Sun instead of Java because the then-"owner" of VisualWorks wanted runtime fees for set-top boxes).

Anyway, back to NodeJS coding...

Comment ROI for Innovation vs. Conquest (Score 2) 140

I was reading "Pandora's Seed: The Unforeseen Cost of Civilization" by Spencer Wells this morning. He makes the point that hunter/gatherers tend to walk away from social conflicts, whereas people in large militaristic agricultural hierarchies instead tend to end up fighting wars for resources, as they see no other alternatives. I had a lot of youthful optimism in the 1970s stemming in part from the US space program and many space-related TV shows (Thunderbirds, Star Trek, Space: 1999, Lost In Space).

To be potentially capable of the military conquest of the planet Earth, a country probably has to be of the scale of WWII Germany or the USA -- having about 5% of the planet's population and land. So that means, ignoring moral aspects and such, the maximum return on military investment for empire can be at most about 20:1 relative to the total resources (including people) you are starting with and essentially gambling. By contrast, investments in research & development, such as the space program (like Orion) or new energy sources (hot or cold fusion, dirt-cheap solar PV, or whatever), have the potential to produce much greater returns than 20:1.

Imagine if the USA had poured the cost of the Iraq war (three or more trillion US$ at this point) into fusion research. We might have 1000X as much cheap, less-polluting energy to use (including for space launches) as we have now. Increasing human capability to get into space and live there in self-replicating space habitats could potentially produce another 1000X or more return in land area to live in. Even at 100 trillion dollars to make the first such self-replicating space habitat, the ROI is so much higher than that of preparing to fight a global war of empire-building.

Maybe we can see a return to other ideas, like those from back when NASA overall was more optimistic under Carter?
"Advanced Automation for Space Missions"
http://en.wikisource.org/wiki/...
"This document is the final report of a study on the feasability of using machine intelligence, including automation and robotics, in future space missions. The 10-week study was conducted during the summer of 1980 by 18 educators from universities throughout the United States who worked with 15 NASA program engineers. The specific study objectives were to identify and analyze several representative missions that would require extensive applications of machine intelligence, and then to identify technologies that must be developed to accomplish these types of missions. This study was sponsored jointly by NASA, through the Office of Aeronautics and Space Technology and the Office of University Affairs, and by the American Society for Engineering Education as part of their continuing program of summer study faculty fellowships. Co-hosts for the study were the NASA Ames Research Center and the University of Santa Clara, where the study was carried out. Project co-directors were James E. Long of the Jet Propulsion Laboratory and Timothy J. Healy of the University of Santa Clara."

There are probably nuances here regarding how much of the country is at risk in such a military gamble and so on, as well as the value of military investments for deterrence (how much is enough?), but that is the broad-brush picture I've always seen based on that early optimism. And given that a supervolcano like Toba (mentioned by Spencer Wells as killing off most humans about 70,000 years ago) or a pandemic (like Ebola) could wipe out most people (from a decade-long winter and a new ice age), it seems investments in cooperation to develop productive innovations, including space habitats, have a much better risk/reward ratio than most military investments, which ultimately still don't secure you against supervolcanoes or plagues and similar things.
http://en.wikipedia.org/wiki/T...

While it has sometimes been called "The Conquest of Space", it is a very different thing from the conquest of lands that already have people on them, just using them differently -- as exemplified by the Sand Creek Massacre against Native Americans, which just passed its 150-year remembrance:
http://en.wikipedia.org/wiki/S...

Technology is an amplifier. What human emotions and aspirations do we want it most to amplify?

Comment Congrats to NASA on a great launch! (Score 2, Informative) 140

What great news to wake up to! Hoping for many more optimism-promoting successes like this on the road to humans living in space habitats that can duplicate themselves from sunlight and asteroidal or lunar ores.

Here is a PBS NewsHour video with launch footage:
http://www.pbs.org/newshour/up...

BTW, that PBS NewsHour Orion article led me to another PBS NewsHour article which formed the basis of my most recent "optimistic" Slashdot story submission on how restoring 1970s overtime regulations could boost the US economy:
http://slashdot.org/submission...

With a stronger economy, maybe there would be even more demand for space-related ventures of all sorts?

Submission + - Should IT professionals be exempt from overtime? (pbs.org) 1

Paul Fernhout writes: Nick Hanauer's a billionaire who made his fortune as one of the original investors in Amazon. He suggests President Obama should restore US overtime regulations to the 1970s to boost the economy (quoted by PBS NewsHour):
"In 1975, more than 65 percent of salaried American workers earned time-and-a-half pay for every hour worked over 40 hours a week. Not because capitalists back then were more generous, but because it was the law. It still is the law, except that the value of the threshold for overtime pay--the salary level at which employers are required to pay overtime--has been allowed to erode to less than the poverty line for a family of four today. Only workers earning an annual income of under $23,660 qualify for mandatory overtime. You know many people like that? Probably not. By 2013, just 11 percent of salaried workers qualified for overtime pay, according to a report published by the Economic Policy Institute. And so business owners like me have been able to make the other 89 percent of you work unlimited overtime hours for no additional pay at all.
    The Obama administration could, on its own, go even further. Many millions of Americans are currently exempt from the overtime rules--teachers, federal employees, doctors, computer professionals, etc.--and corporate leaders are lobbying hard to expand "computer professional" to mean just about anybody who uses a computer. Which is almost everybody. But were the Labor Department instead to narrow these exemptions, millions more Americans would receive the overtime pay they deserve. Why, you might ask, are so many workers exempted from overtime? That's a fair question. To be truthful, I have no earthly idea why. What I can tell you is that these exemptions work out very well for your employers. ...
    In the information economy of the 21st century, it is not capital accumulation that creates growth and prosperity, but, rather, the virtuous cycle of innovation and demand. The more innovators and entrepreneurs we have converting ideas into products and services, the higher our standard of living, and the more people who can afford to consume these products and services, the greater the incentive to innovate. Thus, the key to growth and prosperity is to fully include as many Americans as possible in our economy, both as innovators and consumers.
    In plain English, the real economy is you: Raise wages, and one increases demand. Increase demand and one increases jobs, wages and innovation. The real economy is simply the interplay between consumers and businesses. On the other hand, as we've learned from the past 40 years of slow growth and record stock buybacks, not even an infinite supply of capital can persuade a CEO to hire more workers absent demand for the products and services they produce.
    The twisted irony is, when you work more hours for less pay, you hurt not only yourself, you hurt the real economy by depressing wages, increasing unemployment and reducing demand and innovation. Ironically, when you earn less, and unemployment is high, it even hurts capitalists like me. ..."

If overtime pay is generally good for the economy, should most IT professionals really be exempt from overtime regulations?

Comment Thanks for response on lead & crime & Tipp (Score 1) 48

I wonder if the reply counts as a good enough "reputable source" to update the Wikipedia article on the Tipping Point?

I was also glad to see two questions mentioning automation issues (one referencing a basic income). Maybe we'll see a new book on that as Malcolm Gladwell explores those issues more in depth?

Comment Re:What's happening to Linux? (Score 1) 257

I went through this around 2007-2008 when, after running Debian as a desktop for about five years on two desktops, my wife and I got tired of the breakage with every major update. While I was willing to put up with more, my wife got tired of me spending a few hours trying to sort things out on her desktop with every update -- often basic things like graphics driver issues in a multi-monitor setup. Power saving never worked (I gave up on it).

What often drove updates was wanting to use the latest version of Eclipse or Firefox or other applications. My wife went first, going to a Mac Pro, and I followed about a year later. We're still using that hardware, although upgraded in various ways (memory, drives, graphics cards and monitors).

That said, Linux is everywhere, and those years of working with it all the time have been very useful in maintaining servers (including in VirtualBox) and embedded hardware (NAS, routers, media, other) which generally face fewer updates than desktops. I feel Linux settled down to stability a couple of years after that (driven in part by Ubuntu's widespread adoption) -- although it sounds like instability has picked up again. I feel that in general about FOSS -- maybe the old guard is getting bored or old or tired or busy or burned out, and new people move to web stuff?

Of course now, my wife's Mac Pro from 2007 is not supported for an upgrade past Snow Leopard. Mine is, but I'm not sure if it is worth it yet. But, more and more, new software requires later OS versions as a minimum. And there are no more Snow Leopard updates. And my wife's machine has a sporadic kernel panic or something once every few weeks or so. And mine has also been doing some lockups, although not recently after resetting the PRAM.

There were some big disappointments in leaving Debian. I liked cut-and-paste under Linux, where selecting something put it in the copy buffer. Mac is harder, including weirdness about having to menu-click within the selected text to pull up a copy menu. Apt-get was great (when stuff was compatible) and a sad loss to not have. Also, Mac's GUI design with a single global menu is just *terrible* on a multi-monitor setup, especially if the monitors are different heights; having a menu per application window like Linux makes so much more sense. I also don't like the fact that I could easily (without copyright concerns) virtualize old Linux setups, but you can't really do that with Mac OS X -- in that sense, all my work feels "contaminated" by copyright issues. That said, Apple Time Machine "just works" as a backup solution (ignoring the risks of having a plugged-in backup hard drive in a worst case).

Comment Or Google could be made into a public utility... (Score 1, Troll) 237

Just saying, there are other options; whether we pursue them is a different story. Google's non-search activities (like Google Apps, Chromium, other Google Labs stuff) generally only make significant financial sense to the company in the context of its search business, so breaking up Google means those spinoff businesses would probably go bankrupt almost immediately.

What was really wrong with an AT&T that funded Bell Labs and created UNIX, with a government-mandated 5% or so of revenue spent on (essentially free and open source) R&D? As someone once said, Bell Labs was funded by people dropping dimes into boxes across the country. Telephone costs have changed in the USA since the breakup, *but* it is not really clear how much of that had to do with the "baby bells" and competition, and how much had to do with Moore's law and an exponential reduction in computing costs per MIPS that made packet switching (even in the home) so much cheaper.

See:
"The End of AT&T: Ma Bell may be gone, but its innovations are everywhere"
http://www.beatriceco.com/bti/...
"It's 1974. Platform shoes are the height of urban fashion. Disco is just getting into full stride. The Watergate scandal has paralyzed the U.S. government. The new Porsche 911 Turbo helps car lovers at the Paris motor show briefly forget the recent Arab oil embargo. And the American Telephone & Telegraph Co. is far and away the largest corporation in the world.
    AT&T's US $26 billion in revenues--the equivalent of $82 billion today--represents 1.4 percent of the U.S. gross domestic product. The next-largest enterprise, sprawling General Motors Corp., is a third its size, dwarfed by AT&T's $75 billion in assets, more than 100 million customers, and nearly a million employees.
    AT&T was a corporate Goliath that seemed as immutable as Gibraltar. And yet now, only 30 years later, the colossus is no more. Of the many events that contributed to the company's long decline, a crucial one took place in the autumn of that year. On 20 November 1974, the U.S. Department of Justice filed the antitrust suit that would end a decade later with the breakup of AT&T and its network, the Bell System, into seven regional carriers, the Baby Bells. AT&T retained its long-distance service, along with Bell Telephone Laboratories Inc., its legendary research arm, and the Western Electric Co., its manufacturing subsidiary. From that point on, the company had plenty of ups and downs. It started new businesses, spun off divisions, and acquired and sold companies. But in the end it succumbed. Now AT&T is gone. ...
    Should we mourn the loss? The easy answer is no. Telephone providers abound nowadays. AT&T's services continue to exist and could be easily replaced if they didn't.
    But that easy answer ignores AT&T's unparalleled history of research and innovation. During the company's heyday, from 1925 to the mid-1980s, Bell Labs brought us inventions and discoveries that changed the way we live and broadened our understanding of the universe. How many companies can make such a claim?
    The oft-repeated list of Bell Labs innovations features many of the milestone developments of the 20th century, including the transistor, the laser, the solar cell, fiber optics, and satellite communications. Few doubt that AT&T's R&D machine was among the greatest ever. But few realize that its innovations, paradoxically, contributed to the downfall of its parent. And now, through a series of events during the past three decades, this remarkable R&D engine has run out of steam. ...
    The funding came in large part from what was essentially a built-in "R&D tax" on telephone service. Every time we picked up the phone to place a long-distance call half a century ago, a few pennies of every dollar--a dollar worth far more than it is today--went to Bell Labs and Western Electric, much of it for long-term R&D on telecommunications improvements.
    In 1974, for example, Bell Labs spent over $500 million on nonmilitary R&D, or about 2 percent of AT&T's gross revenues. Western Electric spent even more on its internal engineering and development operations. Thus, more than 4 cents of every dollar received by AT&T that year went to R&D at Bell Labs and Western Electric.
    And it was worth every penny. This was mission-oriented R&D in an industrial context, with an eye toward practical applications and their eventual impact on the bottom line. ..."
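As a quick sanity check on the figures in the quoted passage (the revenue and R&D numbers below are taken from the article; the "4 cents" total includes Western Electric's spending, which the article says was even larger than Bell Labs'):

```python
# Rough check of the quoted 1974 AT&T R&D figures (numbers from the article above).
att_revenue = 26e9    # AT&T revenues: "$26 billion"
bell_labs_rd = 0.5e9  # Bell Labs nonmilitary R&D: "over $500 million"

bell_labs_share = bell_labs_rd / att_revenue
print(f"Bell Labs alone: {bell_labs_share:.1%} of revenue")  # ~1.9%, i.e. "about 2 percent"

# Western Electric "spent even more", so the combined share plausibly
# exceeds 4 cents of every revenue dollar, as the article says.
```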

In this context, "search" (and a related constellation of applications) has become a public utility. So, just treat it like one. Facebook could likewise be treated that way. As could Microsoft.

In general, these sorts of market failures (given that rich market leaders tend to get richer and more market-leading) show a fundamental problem with free market ideology in practice in the 21st century. It does not matter for the social/political consequences if Google might someday be replaced in our attention by some next huge monopoly market-spanning entity. The point is that this keeps happening, with significant effects on our social and political fabric, while only the company names change.

In any case, if Moore's law continues for another couple decades, today's Google server farm's computational capability might fit on a laptop of the 2040s, which could also store all the surface internet content of today. At that point, with all the possible human cultural content you might want stored and searchable just inches from your brain, what would Google's business model be? So, in that sense, this political power issue may be self-limiting, although we will see new issues, as "the right to be forgotten" will take on new complexities on the order of asking the populace to forget about what it previously learned about someone...
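The Moore's-law extrapolation above can be sketched as simple arithmetic (a rough sketch: the two-year doubling period and the 25-30 year horizon are illustrative assumptions on my part, not precise claims):

```python
# Back-of-the-envelope Moore's-law extrapolation.
# Assumption (illustrative): capability per dollar doubles every ~2 years.

def capability_factor(years, doubling_period=2.0):
    """Growth factor after `years` years, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# From roughly the mid-2010s to the mid-2040s is about 25-30 years:
for years in (25, 30):
    print(f"{years} years -> ~{capability_factor(years):,.0f}x")
```

Even at the low end, that is thousands of times today's capability per dollar, which is why a 2040s laptop matching today's server farm is at least arithmetically plausible.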

Comment Build or support alternatives where you are... (Score 1) 71

Whatever makes sense with your skills, resources, and connections... These alternatives are there to provide the seeds for a next generation. They can be things like non-profits, for-profits, hobbies, community organizations, libraries, social networks, barter exchanges, citizens groups focused on one important local issue like a better library or better infrastructure of some sort, a movement for a basic income, LETS systems, or whatever. A healthy society has a good mix of subsistence, gift, exchange, and planned transactions. If you think the system is out of balance, then create or support counterbalancing forces (in a legal, healthy, and optimistic way). Tiny non-profits across the USA are suffering from lack of leadership and members as TV and the internet and dual-income families soak up all the otherwise spare volunteer time. The "old" USA from a century or so ago had those strong traditions of a mix of all those things, and such a mix is at the root of "Democracy" IMHO.

I used to think Debian provided one example of alternative governance, although lately mostly bad news on that front regarding the systemd issue. Hopefully it will move past that and become stronger through some self-reflection.

Search on "Michael Ruppert Evolution" on his "From the Wilderness" site for some related interesting reading, where he tried to move to another country and it didn't work out (an extreme case, and I dismiss his worries about "Peak Oil" as overblown, but he had some insights there about building where you are now and are connected).

Comment You might like: "Marxism of the Right" (Score 1) 197

http://www.theamericanconserva...
"This is no surprise, as libertarianism is basically the Marxism of the Right. If Marxism is the delusion that one can run society purely on altruism and collectivism, then libertarianism is the mirror-image delusion that one can run it purely on selfishness and individualism. Society in fact requires both individualism and collectivism, both selfishness and altruism, to function. Like Marxism, libertarianism offers the fraudulent intellectual security of a complete a priori account of the political good without the effort of empirical investigation. Like Marxism, it aspires, overtly or covertly, to reduce social life to economics. And like Marxism, it has its historical myths and a genius for making its followers feel like an elect unbound by the moral rules of their society.
    The most fundamental problem with libertarianism is very simple: freedom, though a good thing, is simply not the only good thing in life. Simple physical security, which even a prisoner can possess, is not freedom, but one cannot live without it. Prosperity is connected to freedom, in that it makes us free to consume, but it is not the same thing, in that one can be rich but as unfree as a Victorian tycoon's wife. A family is in fact one of the least free things imaginable, as the emotional satisfactions of it derive from relations that we are either born into without choice or, once they are chosen, entail obligations that we cannot walk away from with ease or justice. But security, prosperity, and family are in fact the bulk of happiness for most real people and the principal issues that concern governments."

I would add "community" and "health" as public goods government should also help support.

BTW, to underscore the point that charity only tends to work well in communities where people are well known to each other (either that or an abstract gift economy like JP Hogan wrote about), see:
"Switzerland's shame: The children used as cheap farm labour"
http://www.bbc.com/news/magazi...
"Gogniat, his brother and two sisters were "contract children" or verdingkinder as they are known in Switzerland. The practice of using children as cheap labour on farms and in homes began in the 1850s and it continued into the second half of the 20th Century. Historian Loretta Seglias says children were taken away for "economic reasons most of the time... up until World War Two Switzerland was not a wealthy country, and a lot of the people were poor". Agriculture was not mechanised and so farms needed child labour.
    If a child became orphaned, a parent was unmarried, there was fear of neglect, or you had the misfortune to be poor, the communities would intervene. Authorities tried to find the cheapest way to look after these children, so they took them out of their families and placed them in foster families. ...
    The extent to which these children were treated as commodities is demonstrated by the fact that there are cases even in the early 20th Century where they were herded into a village square and sold at public auction. ...
    "Children didn't know what was happening to them, why they were taken away, why they couldn't go home, see their parents, why they were being abused and no-one believed them," she says.
    "The other thing is the lack of love. Being in a family where you are not part of the family, you are just there for working." And it left a devastating mark for the rest of the children's lives. Some have huge psychological problems, difficulties with getting involved with others and their own families. For others it was too much to bear. Some committed suicide after such a childhood.
    Social workers did make visits. David Gogniat says his family had no telephone, so when a social worker called a house in the village to announce that she was coming, a white sheet was hung out of a window as a warning to the foster family. On the day of this annual visit David didn't have to work, and was allowed to have lunch with the family at the table. "That was the only time I was treated as a member of the family... She sat at the table with us and when she asked a question I was too scared to say anything, because I knew if I did the foster family would beat me." ...
    The Farmers Union agrees with the principle of compensation, but is adamant that farmers should not have to contribute. You have to understand the times in which these children were placed into foster care, says union president Markus Ritter. Councils and churches had no money. Farming families were asked to take children who had fallen on difficult times or had one parent, so the farmers were fulfilling a social function. Does he acknowledge abuse occurred? "We received a lot of feedback from children who were treated really well... But we are also aware that some children were not treated properly." ..."

Of course, either big business out of control or big government out of control (or both at once) is a terrible thing, like a fire let loose to rage and burn everything good in its path. Libertarian criticism is often valid, even if the solutions put forth by "propertarian" libertarians may be found wanting in various extreme aspects. (BTW, there are also "Libertarian Socialists", who are better represented in Europe and are what the rest of the world outside the USA thinks of when people say "libertarian" -- an example being Noam Chomsky.) So, given that our society is no longer small-scale enough for some older social processes to work well (short of rethinking and remaking our infrastructure, which is maybe a good idea in any case), we need to think about a healthy balance, which can be a very hard thing to achieve or maintain.
