
Comment Re:Beware the gig economy... (Score 1) 136

If most people swap jobs every few years, does it really make sense for employers to be responsible for their retirement savings?

I have to agree with you that the whole "mandatory retirement contribution" thing is pretty much just an annoyance for all concerned today. I understand the rationale for it (because most people are incapable of long-term planning), but it's a terrible idea for people who are in jobs for a short term and need to deal with all sorts of different retirement accounts spread out in various systems.

When I was younger, I took a few short-term jobs and once worked as a state employee for a couple years. In all of these I was required to make mandatory "contributions" out of my pay toward retirement accounts. Well, with the state employee system, I was told when I left that I didn't have enough years to qualify for a pension, so unless I returned to the system, my account would accrue interest for 5 years, after which it would become dormant. I could withdraw the money and put it in an IRA or something at any time, but by doing so, I would forfeit the years of experience I had accrued in the system.

Anyhow, at first I wanted to keep my options open, so I didn't withdraw immediately. And then the interest rate the account was paying was pretty good compared to how things were going in the market at the time, so I figured I'd just withdraw after the 5 years. Well, after 5 years, I contacted them, and the money was GONE. Turns out they had changed the policy since I was employed, and rather than accounts just going dormant after 5 years, you were summarily deleted from the system, and all of your retirement money was forfeited and returned to the state pension pool.

Next, I had a very short-term gig (actually indirectly for a different state government) and was again required to contribute mandatory retirement. In this case, it was only a few hundred dollars, but they didn't really notify me of all the details, so I wasn't even aware of this balance or where it went. A couple years later, I had moved, but they decided to send my money to some 3rd-party account management service, which charged a $15/month fee to hold my money. I wasn't aware of any of this (because I had moved and wasn't even really aware that I was owed benefits) until the 3rd-party company finally located me and sent me a statement that this account was being closed, since my balance had fallen to $12 (all the rest had gone to fees), and they could no longer maintain it... so they literally sent me a check for $12 after taking all the rest of my retirement money.

I'm currently fighting to get another small amount out of such a 3rd-party company because of other retirement benefits I wasn't even made aware of for a short-term contractual position nearly a decade ago.

Cumulatively, all of the money I've lost isn't that much compared to my main retirement savings, but it's preposterous that people are constantly making forced contributions to retirement accounts for short-term employment -- sometimes without even being told clearly that they're receiving those benefits, other than via an unclear line on a paystub -- and then having to jump through hoops to track down or keep that money, and then deal with the hassle of moving it to other accounts.

Comment Re:I hate euphemisms.... (Score 1) 136

This is easily mitigated these days: don't have a family. Now you only have to worry about making enough money for yourself, and saving enough for yourself on days you don't have any work.

I'm not sure if you're being serious or sarcastic. But if you're serious -- even assuming you can personally save enough money to avoid begging on the street, you do realize that most people in the world have families whether they like it or not? E.g., parents, who might get ill and become unable to take care of themselves, or even siblings who might end up out of work or whatever. Just because you don't have kids to feed doesn't mean you won't end up with family obligations to help support someone. Such situations happen more commonly than you might think, and they can be a significant strain on people with low incomes.

Comment Re:I hate euphemisms.... (Score 5, Insightful) 136

The "gig" economy is a bullshit attempt to glamorize and hide the real issue, which is a population outpacing the availability of stable employment that provides necessary benefits.

In some cases, this is being driven by population vs. employment. But in many cases, simple greed is a major contributing factor. It's so much cheaper to operate a business with a bunch of part-time workers. Many businesses would prefer it, if they could get away with it.

Instability should never be viewed as a good thing.

Yeah, unfortunately most folks in the past couple generations never had to see what the "gig" economies of the past were actually like. Back when you'd go to your local town square or down to the docks or whatever, and stand in line waiting for some potential employer to choose you for work FOR THE DAY. And then you'd break your back doing labor for the day, make enough money to feed your kids, and you'd be off again begging for work the next morning. If you hurt your back or got sick or whatever, you and your family were screwed. End of story.

This was what employment was like for LOTS of people for millennia. Skilled workers like craftsmen could sometimes get more stable jobs, because their skills increased the productivity of the business and employers recognized that.

But for laborer jobs or other things you could likely be trained to do in an hour or two? Not so much. And that's what many modern "gig economy" employers are exploiting again -- can you drive a car? Fine: you're a part-time Uber driver. Be sure to show up on time and be pleasant enough to keep the high ratings, or you won't have a job tomorrow.

Lots of people today criticize unions (sometimes rightly) for corruption, etc. But what unions fought so hard for, over a century or so, was finally to get modern civilization out of that model and to recognize that even laborers and unskilled workers deserve to be treated with dignity in their jobs, rather than discarded at the end of the day.

But no more -- I'm frankly shocked at how many younger folks seem brainwashed by all the hoopla over the supposed benefits of the "gig economy." People who know anything about history, on the other hand, see this for exactly what it is: an opportunity for businesses to return to a model where they make greater profits and have no obligations to their workers beyond today.

Comment Re:All these words (Score 1) 104

The problem is that someone (I think intentionally) co-opted "fake news" to mean "biased reporting"- that's not originally what it meant at all and a lot of people are still using the term to mean something else. "Fake news" originally (as in a couple of months ago) meant completely fabricated stories.

You gotta understand free expression of language these days -- I guess they're just "free wielding" the term.

(Or, at least that's what I'm guessing these two words from TFS might mean when put in sequence. But from the summary, I still can't understand exactly what Zuckerberg's note was "wielding." Didn't know Slashdot was into actually spawning NEW eggcorns.)

Comment Re:Not accidentally! (Score 2) 106

Well, as other replies have already pointed out, the strict distinction you're trying to make here doesn't really hold in English. Both "accidentally" and "inadvertently" can easily apply to something one was trying to avoid.

But I sense a problem with the headline too, and I think the real issue is -- why are the POTTERS mentioned at all? There's a kind of implication with the way the headline is worded that the potters "recorded" information, except they had no concept of what such recording might amount to or what that information might be. Really, the potters had no intentionality here at all.

This is even worse in the original NPR headline, which is "Iron Age Potters CAREFULLY Recorded Earth's Magnetic Field -- By Accident." That's more problematic from an English usage standpoint, because "carefully" implies they did something with care... specifically they "recorded" with care. But they didn't know they were recording anything, so how could they do it with care?

The clearer way to word all of this would of course be to take the potters out of it completely, since they weren't "recording" anything -- intentionally or unintentionally. They were making pottery. A better headline might be "Strength of Earth's Magnetic Field Recorded in Structure of Iron Age Pottery" or something. Even better, leave "record" out of it entirely, since that usually implies intentionally preserving information to begin with -- maybe "Historical Variance in Earth's Magnetic Field Strength Measured Using Iron Age Pottery" or something.

Comment Re:Management doesn't know what it wants (Score 1) 158

Those that are quite horrifying. I'm thinking call center jobs or any such service level position. Ones where you are not measured by how well you resolve the customer's issue but how many calls you get through and how quickly you do it.

Yeah, exactly. And this is NOTHING new. A couple of decades ago, I took a summer job in a collections department. It was a horrid job, but it paid reasonably well for a summer position. Our productivity was measured almost solely by the number of accounts we handled and the amount of money we brought in through collections. Whether we actually handled the accounts "well" wasn't really a factor (which led to gross inefficiencies and hordes of "problem accounts" that simply went more and more delinquent as they were passed off, because nobody wanted to take the time to deal with them). I managed to figure out ways to make my own account handling more efficient, so I actually processed significantly more accounts than anyone else in the office.

Anyhow, after I had been there for a couple months, they decided we still weren't "efficient" enough and that they weren't tracking us enough. (We frequently spent more time filling in stupid, useless spreadsheets tracking all the calls we made -- so that someone in management could monitor our "productivity" -- than we did actually making calls.)

And so they instituted a policy that we had to "log in" to our phones while we were at our desks, and log out whenever we were on break or at lunch or whatever.

About two weeks after this policy started, one day I ended up skipping lunch because I was dealing with a particularly problematic account. But then I took a longer afternoon break to make up for it and was out for 18 or 19 minutes instead of 10 minutes -- I figured this wouldn't be a problem, since I had given up my 30-minute lunch break and effectively worked "for free" for 20 extra minutes that day.

The next day, I get called over to my boss's desk, where she had been instructed by company-wide memo to reprimand me, because my name ended up at the top of some list of people who took extra-long breaks. My boss was apologetic, since she knew I was more productive than anyone else in the department, but this is what management were wasting their time doing by sorting some spreadsheet column or whatever and looking only at how long somebody was out for a single break.

Anyhow, with only a few weeks in the summer left, I simply said, "Sorry, these are unacceptable working conditions -- I'll pack up my things and leave," and simply left the job on the spot. (There had been other similar crap leading up to this too.) Only time in my life that I did that, but I think it was entirely justified. Last I heard, the entire collections division ended up shut down and outsourced a few months later, probably because workers were spending twice as much time filling out spreadsheets and logging into phones to prove how "productive" they were rather than actually doing work. Idiots.

Comment Re:Fake news is real (Score 3, Insightful) 893

Both items were passed off as "news" by seemingly legitimate news organizations. Both items are fake news - literally fake.

You seem to not understand the difference between "fake" and "incorrect/erroneous." If you hand a bouncer a "fake ID" at a bar, it doesn't mean you accidentally handed them someone else's ID, or credentials that were expired or otherwise unacceptable. A "fake ID" implies that you KNOWINGLY manufactured a false ID (or had someone do it for you) with intent to pass it off as real.

Do you have evidence that the reporters in question actually INTENTIONALLY passed along false information? If not, they were not "fake news" according to the standard definition of the English word "fake."

And they offered corrections. Here's the detailed account from Time about the MLK bust. The reporter corrected his tweets as soon as he recognized the error. That's NOT what actual "fake news" sites do -- because fake news sites KNOW their information is false when they MAKE IT UP, so they don't offer corrections.

As for the other incident, it's yet another example of poor reporting, but only because the Olympian gave an interview that IMPLIED a connection with Trump's immigration policies and only FOUR DAYS LATER tweeted that the incident had actually occurred in December. Again, we should be critical of the poor reporting here that then made an EXPLICIT connection with Trump -- reporters should have fact-checked when the event actually occurred -- but the Olympian in question was vague in her original interview and implied it had happened recently.

So, who exactly is at fault here? The Olympian was expressing concern over current immigration policies and made a vague reference to detention, which was only later clarified. Was she part of some massive media "conspiracy" to hide the truth until four days later? Or did she just innocently refer in an interview to an unpleasant experience she'd had with immigration -- and some media articles misinterpreted her vague timeline?

I'm NOT going to excuse those media reporters who implied a Trump connection -- they made a serious journalistic error by not doing appropriate fact-checking. We should condemn their actions and poor journalism.

But once more detailed information became available, they corrected their stories -- once again, that's NOT the practice of "fake news."

There are various bad journalistic practices in the world. And we should condemn them, and even fire journalists sometimes for making truly egregious errors or showing unreasonable bias or whatever. BUT UNINTENTIONAL ERRORS ARE NOT "FAKE NEWS." Fake news is a separate problem -- and a serious one that we ignore by misusing the English word "fake" or redefining it to dilute its meaning.

Comment Re:Uber? (Score 2) 640

This is a very interesting and well thought-out post. Thank you.

Most posts here miss the point that almost every event has multiple contributing factors. Obviously the driver was drunk. That's probably the biggest contributing factor. But could a car with unusual acceleration characteristics also be a contributing factor? Possibly. Heck, the car ran into a tree. Maybe somebody planted that tree there 40 years ago. Did that person contribute something to this accident? Yes, obviously... maybe if that tree hadn't been there, everything would be okay.

The issue isn't whether or not there were hundreds of contributing factors to an event, or whether any one of them could have prevented it ("Darn that tree planting guy!"), but rather which contributing factors may have displayed negligence and created a "hazard" -- either legally or morally.

I haven't driven a Tesla, so I don't know how it handles. Clearly there are a lot of Tesla drivers who like how they drive and don't see a problem with them. But the parent here has a valid point that at some point we may get to a place where accelerating power and handling in some cars become more hazardous for the average driver.

And that's about the only part of this story that's worthy of debate.

Should this story be on Slashdot? NO. NO. NO. Clearly, it is an attempt by the editors to play off of the libertarian sympathies of many people here who are still pissed at how Tesla has had to do battle with car dealership laws, etc., and whose ire has already been inflamed by ridiculous charges about how the media seems to be attacking Tesla whenever it can... and now here comes a grieving father who is lashing out at something that really COULD have contributed something to this crash (in addition to the alcohol, etc.).

Let's all just take a deep breath and acknowledge that none of us would want to be in a place where we are grieving for a child -- and that if we were, we'd likely want to find "someone to blame" too. And then let's move on from the TROLL NAMED BeauHD WHO POSTED THIS STORY HERE TO GET EVERYONE YELLING.

Comment Re:He does have a point... (Score 3, Interesting) 251

However, I'd like to see some discussion of his statement.
Would a better connection between humans and machines be beneficial?
What would be the benefits/ problems?
How could this be achieved?

To discuss something meaningfully, you need to have a freakin' clue how it might work. At this point in time, we don't. We don't know how the brain works. We don't have anything close to strong AI. The best interfaces we're looking at now are things like moving artificial limbs. To speculate on what might happen IF we could do all of this would be sort of like walking up to Isaac Newton in the 18th century and saying, "Sir Isaac, what problems do you think will occur with the internet next year? What will the major benefits/problems be of new advances?" Even if you explained the basic idea of the internet to Newton, I doubt he'd have enough perspective to meaningfully debate what might happen.

But, having put forth that disclaimer, I'll just note a few complete speculations in response to Musk. First is that his argument seems premised on the idea that a faster interface from brain to world would be beneficial to humans. Maybe it would. OR maybe our brains are somewhat limited in maximum input/output in ways that we can't really understand yet because we've never tried what he's proposing. Typing is about the right "speed of thought" for me to create coherent text. I've tried dictating, and I need to pause, correct, and reword too much for it to be useful to me. That seems to be how my brain works... although if I really needed to, I probably could retrain myself to dictate better.

But what if you increased my potential output 10-fold, 100-fold, a million-fold? Would that actually let me interact with the world better or faster? Or would it just result in gibberish, because my brain literally can't adapt to producing USEFUL output much faster than it already does? Or maybe the plasticity of the adult brain isn't enough to adapt -- so we try hooking up infants from birth with these things. Maybe it works... or maybe it just drives them insane, or causes other brain development that effectively renders them LESS functional than "normal" humans. I'm not saying this WILL happen, but it's a possibility when you're talking about an interface where we have absolutely NO IDEA what specs might work. Human brains have spent millions of years evolving to work efficiently at the speeds they do. Just because you could theoretically hook up a device to increase input/output doesn't mean the brain can actually change and adapt enough to make meaningful use of the throughput.

Also, I think it's important to note in a discussion like this that one of the PRIMARY hallmarks of human intelligence is FORGETTING. One of the things that makes humans so much better than machines is our ability for abstraction -- finding larger patterns so we don't have to parse the "stream of consciousness" directly all the time. And then we sleep, and our brains revisit the memories of things that we've evolved to assimilate as "important" data, while we forget millions of random little details of our day at the same time.

Effectively, we take a very TINY percentage of the "noise" that is input into our brains and actually remember it in any detail, mostly through complex pattern-matching that we're only even beginning to emulate in specific cases with computer algorithms. But the point is that there's only so much that we CAN assimilate into our brains -- and that goes not just for memories, but for new skills or whatever. (Think about when you've tried to learn a skill by "cramming" for a full day or two vs. when you've done practice for a few minutes/day over weeks or months. Your brain needs the "downtime" to assimilate new skills... increasing input or output seems unlikely to make that process faster.)

My speculation is that Musk's idea is rather pointless for somehow keeping humans "relevant" or whatever. IF we develop strong AI that can actually learn as well as humans do, the increased capacity and efficiency will potentially grow exponentially and render humans obsolete within a generation or two. Human/machine interfaces might provide some interesting advantages for humans, but I sincerely doubt we'd actually maintain an "edge" over pure machine intelligence once strong AI occurs. But who knows? Perhaps there is something special about biological systems that won't be replicated easily with circuits, so maybe a "hybrid" could still have a purpose. But to speculate on that would be even worse than Newton speculating on new developments of the internet.

Comment Re: Another Black Mirror episode (Score 1) 130

Kind of like the Tina Fey comments in an awards-show setting against Bill Cosby, years before serious accusations became public and widespread.

Just to note, the accusations WERE public and widespread back in 2004-2006 or so. It's just that few people took them seriously... the "Cosby Show dad" mystique and years of promoting Jell-O to kids seem to have protected him back then. Tina Fey was one of many back then who DID pick up on it, but most of the media just forgot about it.

I don't remember that stuff from 2005 or whenever, but I distinctly remember when I myself discovered it -- I somehow happened upon a story entitled something like "How we all forgot about how Bill Cosby is a rapist," and that was back in 2012, I think. That was a couple years before it was plastered all over the news again -- but once I read about it somewhere, it was easy to find all sorts of material on it, even from well before 2012.

Tina Fey was reacting to something that was actually public knowledge and had been the subject of news stories at major media sites... it's just that the rest of the media didn't pick up on the "drumbeat" until a decade later.

Comment Re: but but but (Score 1) 557

Yup, and this essentially amounts to doing things the way that MS Office does them. The way you've already learnt to do things is the easy way, because doing things any other way first requires unlearning the way you've already learnt.

Your point would have greater import if Microsoft itself didn't have a history of upending the UI and forcing users to completely unlearn the way they used to do things and learn a whole new system -- see the "Ribbon" debacle.

I'll admit you have points otherwise, but at least with a FOSS solution, you're likely to see a fork if any development group attempts to change a major application so drastically. And even if they don't, with FOSS you can at least pay some developers to maintain the old code -- likely for a lot less than licensing for deploying commercial software across an entire city/corporation for decades.

Comment Re:It's not office. (Score 1) 557

I don't need formula and all the other reasons for using latex are no longer that relevant.

Word processors are not appropriate for large documents that need consistent formatting. They're not desktop publishing applications. Aside from a more "pretty" output with less work, LaTeX also ensures lots of consistency across your document without having to think about it... whereas Word tricks you into thinking you've done something right with the WYSIWYG environment... until you accidentally do something that messes up the formatting.

Also don't want to spend time learning what is essentially a new language with often cryptic build tools so I can write a document.

One word -- LyX.

It's basically a GUI word processor of sorts, with LaTeX under the hood. Click a button to get a PDF.

Yes, if you need really specialized custom formatting or unusual features, you may need to dig around a bit to figure out how to do them. But the good news is once you solve a problem in LaTeX (and LyX), your solution usually "sticks." Solve a problem in MS Word with layout, and change some other random feature in your document, and suddenly your custom formatting screws up in all sorts of unpredictable ways. That's because MS Word is NOT a desktop publishing tool. If you want proper handling of large documents with consistent formatting, etc., you want to use something appropriate -- either LaTeX or something commercial like InDesign.

With LyX, you won't have the learning curve of LaTeX in pure text form. Mostly you just choose a document class appropriate to your task, select a few options to customize your formatting, and you're good to go. You can even use a non-TeX font, thanks to built-in XeTeX/LuaTeX support. I'm not going to oversell this, though -- you will probably spend a couple days setting up a custom document preamble to get everything exactly the way you want it (if you care about typography... but if you care about typography, you wouldn't be using Word or any normal word processor).
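To give a sense of what that "couple days of preamble setup" actually produces, here's a minimal sketch -- the class, font, and spacing choices are purely illustrative examples, not anyone's actual thesis requirements (in LyX, the \usepackage lines would go under Document > Settings > LaTeX Preamble):

```latex
% Illustrative thesis-style setup -- every choice here is an example,
% not a recommendation or a university requirement.
\documentclass[12pt]{report}
\usepackage{fontspec}              % XeLaTeX/LuaLaTeX only: use system fonts
\setmainfont{TeX Gyre Termes}      % example: a free Times-like font
\usepackage[margin=1in]{geometry}  % margins set once, applied everywhere
\usepackage{setspace}
\onehalfspacing                    % a common thesis-guideline requirement

\begin{document}
\chapter{Introduction}
Consistent formatting comes from the preamble, not from hand-tweaking
each page -- change a setting above and the whole document follows.
\end{document}
```

The point is the one made above: once these settings are in the preamble, they apply uniformly to every chapter, and adding a figure or table later can't silently undo them the way it can in a WYSIWYG word processor.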

But if you don't care about typography so much, you just need to conform your thesis to your university's requirements. Some schools actually have LaTeX templates available for use (officially or unofficially)... but if not, you may need to do some customizations. Luckily, you can often just ask in a TeX forum somewhere (e.g., on StackExchange) and people will frequently just give you the appropriate commands to include if you ask your question clearly.

As someone who went through the process of writing a thesis, and who also helped a couple of others deal with last-minute formatting problems in MS Word, let me just say: you're going to spend at least a few days dealing with formatting issues no matter what. If you go with MS Word, unless you're a wizard who knows all the possible places Word will screw things up, you're going to spend several days at the end dealing with headaches where the text just doesn't flow properly, or where that figure/table/image/whatever simply disappears or completely ruins the formatting of an entire chapter for no apparent reason.

LyX isn't a perfect solution, since ideally you need to be familiar with the underlying LaTeX code to fix the few things that do go wrong. But if you're just doing one document like this, you can likely get the support you need on a forum. Chances are many of the questions you may have are already answered for you out there.

Comment Re:A lunar eclipse at full moon... (Score 5, Informative) 28

A lunar eclipse at full moon is a given ??? Are you crazy? No, its not!

I think you misunderstood GP, which admittedly was a bit poorly worded. I'm pretty sure GP meant that IF a lunar eclipse is happening, it IS a full moon. So two of the "triple features" mentioned in the summary are bound to occur together anyway... all the nonsense about the "snow moon" notwithstanding. (What is the sudden obsession with old moon names in the past year or two? For years, very few people used these terms outside of the Farmer's Almanac, and suddenly they're all over the news... and people keep acting like they're significant -- "Ooooh, a 'SNOW moon'... ooooh, a 'HARVEST moon'..." -- these happen every single year and mean nothing other than what month it is.)

The fact that the summary doesn't acknowledge that full moons are just normal at lunar eclipses makes it sound silly at best, ignorant of basic astronomy at worst. Terms like "rare convergence" make it even worse. And lunar eclipses aren't exactly "rare" events to begin with, although there are some unusual features of this penumbral one... but the fact that it's penumbral will actually make it less interesting to look at to the average observer.

Is it just me, or has there been an increase in hyperbolic astronomy stories recently? There was all the "supermoon" nonsense last year -- again, mildly interesting for astronomy nerds, but not so impressive for the average Joe who barely looks up at the sky. (The moon really wasn't THAT much bigger.) Now we're billing a "triple feature" for a lackluster lunar eclipse coupled with a full moon (which would be there anyway if an eclipse is happening), and a comet that you need binoculars to see.

I'm all for getting people to look at the sky and to get interested in astronomy, but if you overbill the significance of such things, I don't think it helps.

Comment Re:"...which begs the question..." (Score 1, Insightful) 341

No matter how many people use literally to mean figuratively, no matter how many dictionaries take note of the inverse usage, it is still wrong, and anyone trying to avoid looking like a moron would be wise to steer clear of incorrect uses. Ditto "begging the question".

While I absolutely agree with you that educated speakers/writers need to simply avoid "begging the question," I also absolutely disagree with you about your use of the word "wrong" here.

Language is about communication of meaning. It's not a "game" where you get to "win" if you check off enough of the "rules." I'm not sure there is ANY English speaker out there familiar with the phrase "begging the question" who is unfamiliar with its meaning to "raise the question," and generally it's clear from context if this is the meaning intended.

Meanwhile, I can guarantee you that outside of philosophical circles and wacko grammar pedants, NO ONE will understand you if you use "begging the question" to mean petitio principii.

Hence, 98% of people will understand the common meaning of the phrase "begging the question" to mean "raises the question," and of the remaining 2%, the 1% of philosophers won't much care which meaning you use. And the other 1% of wacko grammar pedants actually KNOW about the modern usage, so they'll understand it too, even if they mutter under their breath.

So, if we're looking at language as successful communication, using "begging the question" to mean "raising the question" has a near 100% success rate in communication, and a 1% failure rate among the lunatics who don't realize language isn't a weird game where you keep score. But if you use "begging the question" to mean petitio principii, you'll likely only communicate with 2% of your audience at best (and that's assuming an educated audience). Communication failure.

There are all sorts of reasons "begging the question" was doomed to failure as an English phrase from the start -- it was a bizarre and archaic attempt at a translation of the Latin phrase even when it was coined hundreds of years ago, and it was based on a poor Latin translation of the original Greek. The "modern" meaning of "raising the question" has been used in learned discourse and by good writers for well over a century -- in previous threads about this, I've posted an example of a debate in Parliament from the 1820s I think where the "new" meaning was already so well established that a representative could make a pun on the two meanings.

The battle has been lost. "Wrong" is meaningless here.

That said, I'll agree with you that "literally" is a different sort of beast, since it has much greater potential for confusion between the two meanings. That doesn't mean I would condemn the new meaning as illegitimate -- but I agree that there's a good reason to stick to the original meaning there. "Begging the question" is no longer even in the running. I avoid it everywhere not because of confusion (since EVERYONE knows what it means, i.e., what you declare to be "wrong"), but because of the tiny minority of self-righteous lunatics who can't understand that educated usage has already changed... about a century ago.
