Comment: He's right, but the conclusion may require nuance. (Score 2) 145

by aussersterne (#49384039) Attached to: The End of College? Not So Fast

Here's the thing: we may not actually want every otherwise unmotivated late teen sitting listlessly through college courses just because the only alternative is going back to the dorm to twiddle their thumbs. Some things:

- There is an oversupply of graduates these days in most fields and at most levels
- A dawdling, unmotivated student is not doing their highest-quality learning
- Even students that will eventually use what they learn may not do so for years
- In the meantime, what they learned is getting very rusty between learning and use

So with these things said, *how about* a model in which:

- People are not motivated to learn something until they need to
- Once they need to, they are happy to blast through it intensely
- And they will put it to use right away
- And their motivation comes from needs (for a raise, to be competitive, etc.)

I would think this would help to mitigate some of the particular supply/demand problems on all sides (for an education/for students/for graduates as employees).

The one caveat, and it's an important one, is that we do of course want people to be generally mature, thoughtful, capable, and culturally literate if they are going to be participating in society, and right now high schools are failing utterly at even touching these points.

So to address that need, let's just require a minimal level of "general" college-level education, say a one-year or two-year degree that has no "major" or "minor" selections and issues no grades, but certifies literacy in politics/citizenship, social science (particularly social problems), national culture, basic quantitative reasoning, and so on. That's enough to make someone a careful thinker and to better teach "how to learn stuff."

This general education certification would be required in order to:

- Vote
- Get a business license
- Sit on a corporate board

But it would be disconnected from particular vocational or other subject-oriented learning, which would be delivered via, say, MOOCs as well as face-to-face alternatives. And instead of a major in a single discipline, outcomes from MOOC courses could be used to calculate a nationally databased and relatively involved (many measures) "bar chart" for each student: a tally of their experience and competence in particular subject areas, expressed quantitatively as a figure without an upper bound, added to with each additional course, and perhaps incorporating quantitative feedback about their performance from employers as well:

So instead of wanting someone with a 4-year degree and a "major" in computer science, employers could seek someone with their general education certification along with "at least a 1400 in OS design, a 650 in Java, and a 950 in medical organizations and systems," and so on.

Over the course of a lifetime, scores in any particular area could continue to increase, either by taking additional MOOCs to get more exposure, or by having employers report on accumulated skills and experience to the system.

So someone who took only a few courses in X in school, but who on the job, over 20 years, became the best X in the country, would see this gradually reflected in their national education/experience scores as the years of experience and successes mounted.
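To make the idea concrete, here is a minimal sketch of how such an open-ended scoring ledger might behave; the class name, subject names, and point values are purely illustrative, not part of any real system:

```python
from collections import defaultdict

class SkillLedger:
    """Hypothetical per-person record of subject-area scores.

    Scores have no upper bound and only accumulate: each completed
    course and each piece of employer feedback adds to the totals.
    """

    def __init__(self):
        self.scores = defaultdict(int)

    def complete_course(self, subject, points):
        # Each completed MOOC adds to the subject's running total.
        self.scores[subject] += points

    def employer_feedback(self, subject, points):
        # Employers report on-the-job competence into the same totals.
        self.scores[subject] += points

    def meets(self, requirements):
        # An employer's query, e.g. {"OS design": 1400, "Java": 650}.
        return all(self.scores[s] >= v for s, v in requirements.items())

ledger = SkillLedger()
ledger.complete_course("OS design", 900)
ledger.employer_feedback("OS design", 600)   # experience keeps accruing
ledger.complete_course("Java", 700)
print(ledger.meets({"OS design": 1400, "Java": 650}))  # True
```

The point of the sketch is only that scores are monotone and unbounded, so a late-career change of field shows up naturally as one set of totals overtaking another.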

Meanwhile, we'd also no longer have the weird mismatches that arise when an employee has a degree in Y but actually works in Z, and then has to explain this in various ways to various parties. First of all, at the level of the 1-or-2-year general education, they would no longer get a "degree in" Y at all. That would be handled by MOOCs and represented in the various numbers that increase as a result of completing them.

But if someone did do an about-face and choose an entirely different subject or work area in life, this would also gradually be reflected in their education/experience scores. We'd know when someone who'd studied chemistry in their '20s finally became a "real biologist" because their scores in biological areas would begin to overtake their scores in chemical areas, and so on.

Of course none of this is plausible because social systems simply don't work this way. But (after a long digression) getting back to the point, it's not all that bad that a MOOC isn't the same as forcing someone in their late teens or early '20s to sit in a classroom and drag their feet through a BA/BS.

Comment: A social scientist translating for them (Score 2, Informative) 442

by aussersterne (#49366895) Attached to: Experts: Aim of 2 Degrees Climate Goal Insufficient

What they're trying to say, with the usual feminist-sociology over-loquaciousness, is:

For some on the planet, keeping it under 2 degrees will preserve a relatively familiar or at least acceptable quality of life.

For others on the planet, quality of life can only be preserved by keeping it under, say, 1.5 degrees, or even one degree.

The first group (those who can live with the higher threshold) occupy the upper portions of the global economic scale, and a 2 degree rise is acceptable for them because they can afford the technologies and tools (to put it crudely: air conditioners, new home materials, new kinds of agricultural output, etc.) that make it tolerable.

The second group (that can't live at the 2 degree threshold, and really need a lower one) are going to tend to be in the lower portions of the global economic scale, who won't have access to the technologies and tools that make a 2 degree rise livable for those at the top of the scale.

Policymakers and scientists tend, by virtue of their privileged position, to be in the first group, and have thus set the 2 degree rise in connection with thinking of their own, best-case lifestyles, rather than—say—a member of one of the globe's largely impoverished equatorial populations without access to much in the way of resources, tools, or technologies already.

It's a good point: the effects are not uniform, and if 2 degrees is the upper bound for the people who are the globe's *most* comfortable, then it's probably a bad upper bound in general, because it will "cook" (even more than already occurs) those that are the *least* comfortable.

It was, however, bad language and clarity—which is a sin that social science commits far too often.

Their point is well taken.

Comment: I'll join the chorus: Mac. (Score 1) 385

by aussersterne (#49289585) Attached to: Ask Slashdot: Choosing a Laptop To Support Physics Research?

Get a nicely configured MBP and be done with it.

It's the most common platform in research and academic settings for individual use these days, which means that there is a social dimension to the available support (i.e. people around you can help with problems). Meanwhile, the platform is narrow enough and the OS and hardware tightly bound together enough that one-off bugs and edge cases are exceedingly rare (which is not the case for Linux).

And Apple has very reasonable quality control in both hardware and software.

Having done a Ph.D. and dealt with the pressures and complexities that come therewith, I'd say that the overriding concerns should be reducing the PITA factor, keeping downtimes short, eliminating unexpected behavior and gotchas to whatever extent possible, and buying in to the largest on-the-ground support network (i.e. installed customer base) that you can find with identical hardware/software.

All of these things point to Mac for academic research settings.

Comment: You're right and wrong. (Score 2) 320

You're absolutely right about incentives and grant money.

How you tied this to the Nobel Prize is beyond me, so let's drop that.

The incentives are all about grant money and outside (the campus) capital. As a result, the science takes a back seat to market economics, market-ing (both of corporate partners and of academic institutions themselves, which increasingly operate in a competitive marketplace for enrollments), management concerns, investors, etc.

This incentive structure is increasingly becoming the norm well beyond U.S. shores.

So the problem isn't that science is increasingly wrong, it's that scientists are increasingly doing labor that may *involve* science, but that is in fact product-oriented R&D driven by short-term investment timelines and economic and investor-friendly optics, and whether any of it is good *science* is secondary or tertiary to whether it's profitable, whether directly or indirectly.

Let the scientists go back to doing science first and money-making (whether to support their own tenure lines or to support corporate profits) second or even better, third, fourth, or fifth, and you'll find that the ship rights itself.

Comment: Mr. Moynihan should have read on the (Score 1) 375

by aussersterne (#49162227) Attached to: Google Wants To Rank Websites Based On Facts Not Links

problems of epistemology, including in science.

Note that there are no shortage of facts whose veracity depends on nuanced facets of context and condition, some of which are disputed.

For example, fact or not: "Linux is a difficult operating system to use, and is a better choice for geeks and hackers than for regular users."

Or how about:

"Android is an operating system written by Google."

Or how about:

"The Bermuda Triangle region has seen an unusually high number of ship and plane disappearances over the years, and may be a particularly dangerous place to travel."

Because unless Google's algorithms are very, very nuanced in their approach, each of these is going to be seen as carrying high levels of factuality based on the preponderance of content out there, particularly in high-authority sources.

Of course, if statements like the first and third are too complex for Google's rankings to evaluate, and the system can only work with very simple assertions on the order of "Milk is white" or "Obama is a Democrat," then it's going to do practically nothing (good or bad) for the rankings, since facts with this level of consensus are generally undisputed, even by those that promote falsehoods.

Comment: This shifts the weakness in Google's rankings (Score 3, Interesting) 375

by aussersterne (#49160739) Attached to: Google Wants To Rank Websites Based On Facts Not Links

from gameability (in short, SPAM) to politics. Rather than punish above-board or non-predatory websites, it will punish both subversive and innovative thought that runs well ahead of social consensus. Sure, it will also eliminate willful misinformation, but it turns Google into an inherently conservative, rather than socially innovative, force.

Can't say I think it's better. Probably not any worse, but certainly not a panacea.

Comment: Sociological problem: CYA (Score 5, Insightful) 158

by aussersterne (#49148335) Attached to: Invented-Here Syndrome

Part of the problem is the CYA issue.

If you're writing the code, you sound like a laborer ("I have to..."). If it breaks, it's your fault and you're on the hook publicly.

If you present a third-party component in a meeting, you sound like a manager ("I propose that we..."). Once three or four other people in the meeting have concurred, if something breaks it's the third party's fault. A ticket or two are initiated, it's someone else's problem and everybody gets to cast blame somewhere beyond the walls of the company.

Rational behavior, regrettably.

Comment: You're absolutely right. The desktop is over. (Score 1) 393

by aussersterne (#49071329) Attached to: PC-BSD: Set For Serious Growth?

I have no idea why people are arguing with you about this. The evidence (not least from the desktop computing industry) is everywhere, with catastrophically declining sales over the long term, offset by increases in mobiles and tablets—which, incidentally, Linux has already won, though in large part by leaving the distro community behind.

Linux could actually conquer the desktop in the end—a few years down the road when desktop computing is a specialized, professionals-only computing space. The users of other desktop operating systems are slowly bleeding off to mobile and tablet.

But this can only happen, ironically, if distros and devs stop trying to conquer the desktop in the present. If they continue down the path they're on, the long-term desktop community, which would be a natural fit for the Linux of yore, will probably be on some other OS. (MacOS? Surely not Windows at this point.)

Comment: Are the job description and your actual requiremen (Score 1) 809

by aussersterne (#49048685) Attached to: Ask Slashdot: What Portion of Developers Are Bad At What They Do?

That is to say, did you call for applications from *deeply* experienced people that know esoteric systems X, Y, and Z and that have previously worked with New Hot Language Q and Languages Of The Week I and J?

Or did you ask for unusual gurus that understand and have a *broad* range of experience with a wide variety of fundamental computing concepts and theory and can apply them correctly while rapidly getting up to speed on new environments and/or languages?

Because you're complaining about not getting the second group, while most of the job listings posted in industry are the first group.

There is often little overlap between the two, and HR departments and managers seem to default to looking for the first even when they actually need the second.

At a more prosaic level, if you specifically need someone that is going to understand general purpose encryption tools, you can also put that in the description.

A lot of the frustration with "not being able to get talent" in tech comes down to not asking for (or being willing to hire based on) what is actually needed. Instead, everyone is in CYA mode and making job listings and hires that are buzzword-rich and, thus, easily quantifiable ("he hit the right series of checkboxes, it's not my fault that he sucks, I did my part...")

Comment: Re:How about just don't buy a phone from the carri (Score 1) 100

by aussersterne (#49045179) Attached to: Starting This Week, Wireless Carriers Must Unlock Your Phone

Try NET10. If you got in last year, you could get 2GB + throttling to 3g HSPA unlimited everything for $40/mo., month-by-month (no contract).

New signups right now get 3GB + throttling to 64kbps unlimited everything for $45/mo., month-by-month (no contract).

AT&T's 2-year contract price for 3GB is currently $80/mo.

NET10 GSM plans use the AT&T network, so the coverage is the same and the phone compatibility is the same.

Comment: How about just don't buy a phone from the carriers (Score 4, Interesting) 100

by aussersterne (#49043037) Attached to: Starting This Week, Wireless Carriers Must Unlock Your Phone

in the first place?

There are some FABULOUS devices coming out of China these days, readily available on eBay and Amazon, with high specs, Android KitKat or Lollipop, and sold at half the price or less vs. offerings from the carriers.

Just got a Huawei Honor X1 and am using it with an MVNO in the US. The retail price of the new off-contract phone from China, purchased on eBay, was about what the two-year on-contract retail price of a similarly specced Android device is in the U.S. The MVNO contract, with "unlimited" data (throttling to HSPA+ after the first several GB every month) is less than half the price of a similar contract at a major carrier.

There's no reason to buy on-contract phones any longer.

Comment: Yup, bewildering management. (Score 2) 294

by aussersterne (#48995839) Attached to: Radioshack Declares Bankruptcy

They seem to have decided, a number of years ago, to try to be Best Buy in 1/20th of the floor space and at higher prices, while rebadging major-brand products under their own woefully antiquated and little-known house brand. This all but ensured that consumers would gravitate to Best Buy instead, where the major brands they actually recognized remained on display.

It started to make zero sense sometime in the late-1980s and it just got worse and worse from there.

I still buy parts, diagnostic equipment, and accessories for many tech items in the house; I just buy them online now. I bought a pack of about 30 DPDT switches the other day for $5.00 or so. I don't need 30, I just need one. I'd have been just as happy to pay Radio Shack $2.99 for a single switch and have it the same day, only the local store doesn't carry that stuff any longer.

Comment: Mom-and-Pops don't survive in America (Score 3, Insightful) 294

by aussersterne (#48995829) Attached to: Radioshack Declares Bankruptcy

because suburbanites and flyover folks won't shop in them. When a mom-and-pop and a competing national chain open on the same block, the entire crowd flocks to the national chain, particularly in smaller communities. Hell, they're even proud to have them. Getting a Wal-Mart means they've arrived; it puts them on the map.

The only places where mom-and-pop shops still survive are heavily blue urban areas, where they continue to do well. That's no accident.
