IL had free rides for all senior citizens from 2008 to 2011; costs forced them to cut it to just low-income seniors.
No, there are no free rides. What you mean is, "Illinois decided to have taxpayers buy rides for certain people from 2008 to 2011"
However, many independent artists would leap at the chance to sign with a label, since 10% of something is better than 100% of nothing.
THIS. People always make the mistake of looking at high revenues that big-name artists get and dream of doing that themselves.
But that's kinda like dreaming of playing for the NBA or NFL or whatever -- sure, it happens, but 99% of the kids out there playing high school sports will never have a chance at those sorts of salaries.
It is common for creative people to assume that they create the only value that matters, and that marketing, promotion, and distribution are all worthless.
Exactly. There's this new myth of "YouTube-o-genesis" -- just put your stuff up on YouTube, and users can "discover you," and then you start raking in the big bucks, no labels or whatever needed.
And yes, that HAS happened. But for every sudden "YouTube sensation," there are 10,000 people out there who are uploading stuff that gets 5 views only from their friends. And among those 10,000 unlucky people are usually loads of talented folks... they just need some help getting attention.
Labels can still be a path to help that (though they're not the ONLY path). Getting a few percent of revenue from a label that actually promotes you, gets you gigs, etc., is likely a lot better than the beer money people chip in when you just sing at the local karaoke bar.
And I hate the RIAA's abusive copyright tactics as much as anyone else here, and I'll be the first to criticize labels that do bring in large revenues for their executives and staff, but pay a pittance to artists. Nevertheless, they CAN still serve a function, and thus many independent artists still DO sign on.
Really? So I just got a 'raise' at work from deflation; somehow that is demotivating?
Here's the fallacy -- how long do you think that "raise" will last in a persistent deflationary economy? Prices are going down, because monetary value is going up. That means corporate revenues go down. People with large amounts of money invest much less, because an investment would have to have a LARGE rate of return to actually be worthwhile... otherwise, you just hide your money under your mattress.
So, fewer investors, decreasing prices... corporate revenues go down. And somehow you think you get to keep your "raise" at your current salary in deflated dollars?? Fat chance. Eventually, they need to start decreasing your salary -- probably by even more than enough to keep "pace" with deflation, because of the decreased revenues. Or they just start laying people off.
But that's only the tip of the iceberg. Why would you buy property in a deflationary economy, when it is likely to be a depreciating asset? Loans become nearly impossible to justify -- banks would still have to charge interest on them to justify them, which means you're throwing money at a depreciating asset, while the principal of your loan and your payment sizes effectively grow due to deflation. And given the depreciating value of assets, banks are likely to require additional insurance fees in case of default (a lot more than they have on risky mortgages today).
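To put rough numbers on the loan problem, here's a minimal sketch. The 3% deflation rate and the $1,000 payment are purely illustrative assumptions, not figures from any real economy:

```python
def real_value(nominal, deflation_rate, years):
    """Real purchasing power of a fixed nominal amount after `years`
    of steady annual deflation at `deflation_rate`."""
    # Prices fall by deflation_rate each year, so the price level shrinks;
    # dividing by it shows what the fixed payment "costs" in today's goods.
    price_level = (1.0 - deflation_rate) ** years
    return nominal / price_level

# A fixed $1,000 loan payment under 3% annual deflation (illustrative):
for year in (0, 5, 10):
    print(year, round(real_value(1000.0, 0.03, year), 2))
```

With those assumed numbers, the same fixed payment costs you roughly 16% more in real terms after 5 years and about 36% more after 10 -- even though the number on the check never changed. That's the sense in which the principal and payments "effectively grow."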
People stop trying to get loans to open new businesses. Investors stop financing them, unless it's basically a "sure thing," since they can "make money" just by stashing their cash away. People stop taking out loans for basic things like houses and other real estate.
"But," you say, "Maybe that's a good thing. Maybe people should learn to save up more before making a large purchase." Okay, except who do they rent from in the meantime if they don't take out a mortgage on a house? The people owning rental property face the same difficulties in maintaining a rationale for owning it. If it's decreasing in value, along with other goods, rents will eventually be driven down too (along with the decreasing salaries). Why invest in maintaining property? -- it's just throwing money at a continuously depreciating asset.
If you're a landlord in such an economy, the best strategy is probably to dump your property now and get more money out of it while you still can before its value decreases further.
And we can go on and on. People hoarding cash and dumping most other investments leads to economic stagnation, then worse. Eventually this results in a deflationary "spiral" and the economy tanks.
Oh sure, throughout all of this SOME people will still invest and spend money, but it becomes increasingly hard to justify.
People who support deflation generally never think through even the basic next steps in their logic. They just think they'll magically have "higher salaries" coming from somewhere to spend on cheaper goods. That doesn't happen in real economies. The only people steady deflation is good for are people who have giant money bins already. For everybody else, you'd be much better off with the 1-2% mild inflation and actually having a more active economy.
They would not even seek to stop it, considering the value of 1-2% per year "normal" (that's a tax on wealth, BTW).
Small amounts of inflation are NOT a "tax on wealth." I suppose you might consider it a "tax on money you hide under your mattress."
But in the real world, mild inflation encourages people with wealth to get that money out from under their mattress and invest it somewhere or do something with it.
Deflation, on the other hand, encourages hoarding of money, which means investments have to have much larger returns to seem worthwhile, so most people prefer to just keep their money "under their mattress." And why buy anything unless you need it right NOW? If you wait a year, it will be effectively "cheaper" since the value of your money has grown.
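The "just wait a year" incentive is simple arithmetic. A sketch, with an illustrative 2% deflation rate and a made-up $500 appliance:

```python
def price_after_waiting(price_today, deflation_rate, years):
    # Under steady deflation, the same good's nominal price falls each year,
    # so cash held idle gains purchasing power just by sitting still.
    return price_today * (1.0 - deflation_rate) ** years

# A $500 appliance under 2% annual deflation (illustrative):
print(round(price_after_waiting(500.0, 0.02, 1), 2))  # cheaper next year
print(round(price_after_waiting(500.0, 0.02, 5), 2))  # cheaper still in five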
Basically, deflation means economic activity decreases significantly, which means EVERYBODY loses except for those who have huge amounts stuffed under their mattresses.
There are some folks under the mistaken impression that deflation will be good even for normal folks, since their money will buy more. Except do you really think your salary can stay the same in a consistently deflationary economy? Fewer people are investing or buying anything, which means economic activity goes down, which means less revenue at a lot of companies. That means the average Joe either is looking at a salary decrease to "keep pace" with deflation (actually likely more than that, due to the effects of decreased economic activity overall, which depress revenues) or else they just start firing people.
Staying at a steady "0" (no inflation or deflation at all) is nearly impossible. So, yes, government policy tends to try to stay stable at a low inflation rate, which both gives a mild buffer to the rate (avoiding the dangers of deflation) and gives a mild encouragement to investors to keep pumping money out, rather than hiding it under the mattress. Having a slight preference for economic activity over inactivity benefits us all significantly.
Unless you have a giant money bin full of enough to keep you going for the rest of your life already, you likely would do much WORSE in a deflationary economy than having mild 1-2% inflation with its supposed "tax."
See, if you actually want to debunk this, these are the questions you have to answer.
I wasn't trying to "debunk" anything. I was just offering a definition of "fake news" vs. erroneous or incorrect reporting vs. biased news.
Why are we calling this "fake" news instead of "incorrect news" or "wrong news" or "wacko conspiracy theory"?
Because "fake news" has a very clear meaning that should be apparent to anyone who knows what the word "fake" means. Where do you use the word "fake"? You use it in places where something that is known to be false by the originator has the appearance of truth.
That's different from "wrong" or "incorrect" because those can result from simple errors. "Fake" implies that the person who creates the "news" KNOWS it's fake.
Fake news can have lots of different motivations.
-- It can be satire or parody, like the Onion.
-- It can be produced by people who just want to make money -- as apparently happened in this past campaign with some Balkan teens (who are hawking this fake news just like people hawk fake watches or "designer" purses).
-- It can be deliberate propaganda, made up by someone with a particular perspective intended to energize (or outrage) other people with that perspective.
-- It can even be a hoax created by those who want to embarrass their opponents by getting them to "take the bait" and then reveal that it's BS all along (again, something that multiple people have admitted doing to try to sabotage the past election).
All of these things are encompassed by the clear and unambiguous word "fake," i.e., something KNOWN TO BE false that looks like the real thing.
There are lots of folks who have been reading headlines about "fake news" recently and assuming it's about something else -- e.g., partisan sites spreading biased propaganda. But that's NOT FAKE NEWS. That's opinion or biased reporting or whatever. It may have its own problems, but biasing or distorting news by selectively choosing what to report or how to report it is NOT FAKE NEWS.
Actually making something up and knowingly publishing something literally false ("Person X did Y in city Z" when you know that didn't happen) *IS* fake news.
If AI makes people obsolete, who will those companies peddle their wares to, and obtain income from?
Here's an even better question, given the headline: If CEOs believe AI makes people irrelevant, does that imply AI will make CEOs obsolete too? (Setting aside the obvious quip that "CEOs aren't really people.")
Various studies over the years suggest that CEOs don't necessarily provide significant benefits to corporations: CEO pay does not correlate well with company performance, past CEO performance does not correlate well with future results, and so on, to the point that some have wondered whether it's mostly random chance whether a given CEO succeeds. (Studies that attempt to separate the chance component from the skill component in CEO performance seem to indicate the CEO accounts for only a few percent of the total.) Given all that, isn't it an argument for putting an AI in the CEO chair??
After all, even a lowly janitor can clearly point to clean floors at the end of the day to prove his worth to a company. CEOs seem to lead to mediocre performance or even declines about as often as they lead to successes, and those ups and downs are pretty unstable over the course of a career. (And if you get one "down" that's just too much, they give you the golden parachute, and it's off to the speaking circuit.)
By the way, I'm not at all questioning the fact that many CEOs are smart people or whatever. I'm saying that markets are volatile, and CEOs who are determining a future course are navigating uncertain "waters," just like stock pickers and fund managers -- another group that, as a whole, has been shown to rarely exceed chance over long enough periods of time. If anything, the critical function of a CEO is to provide "leadership" by making it look like there's a steady, clear course... but whether that course needs to be determined by somebody with a seven-figure salary vs. an AI vs. a magic 8 ball... I'm not so sure.
I agree with what the parent says here, but two other basic issues I'd point out in terms of "how Linux is better" are:
(1) Choice. I'm not talking about distros necessarily, which sometimes seem to fragment the Linux community unnecessarily. I mean just choice in general about what to install, how to manage it, how to use it, etc. I've definitely noticed a trend in software over the past 15 years or so to HIDE or outright DISABLE more and more choices.
Linux is sometimes criticized for TOO much choice -- too many configuration options, etc. But there are, well, choices in Linux (distros, desktop choices) that hide a lot of that stuff if you want. But if you WANT to be a "power user" and configure exactly what's on your system and how it works, you can.
(2) Support for lesser hardware. To me, this is a HUGE deal, because there are loads of cheap computers (both desktops and laptops) today that you can get for a few hundred bucks, sometimes less. And they're perfectly functional for the basic stuff that most people do 95% of the time -- email, web browsing, basic documents, etc. Compared to the hardware specs of computers from 10 or 15 years ago that did the same stuff, they are hugely advanced, and a large number of people who don't do significant processor-heavy tasks simply don't need more.
But often these cheap computers come with the latest Windows release and other bloated software that just serves to slow them down. Or, even if they work well for the first year or so, various "updates" and "patches" and just the accumulated crap on a Windows system slow the system to a crawl.
With Linux, I'm able to pull a 10+ year old laptop out of my closet, install a modern, up-to-date OS with up-to-date security patches and support, and it runs well. Yeah, I'm obviously not talking about people who need advanced gaming support or complex video editing or whatever -- but if you just need a basic cheap system, Linux has plenty of options for you.
That's actually where it has shined for friends and family I've recommended it to. They have a computer that's 2 or 3 years old and just seems to be "slow." They ask me for advice on buying a new one -- and I have to tell them, "Well yeah, you COULD get that cheap laptop for only $300, but it's probably going to run slow out of the box. So you might want to spend a bit more." Or, I can just say, "Here -- install Linux, then see what happens." And now their old computer will work for several more years.
To me, that's one of the little-discussed aspects of what makes Linux better for some people. Over the years, it has literally saved me thousands of dollars, just by not having to buy more expensive hardware or upgrade just to run the latest bloat put out by Microsoft. That cost savings alone (even setting aside the cost savings of using free application software, etc. too) is not insignificant.
Look: I'm not an idealist by any means. I get exactly why it's a pain to support Linux. But I started replying in this thread to someone who was blaming Ubuntu and Linux Mint for AMD not supporting a new version of Xorg. To me, that's a completely BS argument. Get mad at AMD. Even get mad at the Xorg Devs. But this is nominally a thread on Linux Mint -- why the heck are we even having this discussion??
And why, as an end user, do I care about this? I need something that works.
I completely understand your frustration. I was only pointing out that it may be misdirected.
For the package maintainers, a newer version of xorg was apparently more important than driver compatibility.
Probably because this was an LTS release. Ubuntu was forced to make a call about whether to include a newer, better-featured version of Xorg or support an older one for the next 5 years.
To elaborate on that: somewhere along the road the xorg developers decided to break something. How hard is it to design something and keep it (forward) compatible? For xorg, apparently very hard. I'm totally ready to believe they had their reasons to do so, but you simply cannot expect all other involved developers to run along behind them, within months, when they make a change that breaks stuff, totally ignoring the significant amount of testing the AMD developers would have to do.
Do you think other hardware companies make such excuses when Microsoft releases the next Windows version and stuff breaks? No, they are happy to provide support for their own devices. Also, the "months" thing is disingenuous -- the prospective changes to Xorg were likely known well before that; the "months" just count from the date the final version of Xorg was RELEASED.
But just take a second and consider what the implications of your proposal are. You complain that the developers for hardware should not be required to support their own products for some OSes because it's excessive work, but you have no problem asking the Canonical developers to spend the next 5 years supporting a deprecated version of Xorg on an LTS release, just over a video driver that will likely be obsolete in a year or two?
I'm absolutely sure that the Canonical developers didn't want this stuff to break, but they had to make a call on providing the best service for their product overall for the next 5 years.
In the case of Ubuntu 16.04, the AMD user is left out in the cold, no matter who is to blame.
Okay, I might agree, but you clearly have strong feelings about who to blame.
And this is why people who say 'Linux will never be ready for the desktop' are proven right.
What you have proven is that developers of closed-source drivers are unwilling to provide the same resources for supporting Linux as they are for Windows. This fact was and is well-known. Linux does the best it can, spending HUGE amounts of time trying to reverse engineer drivers to provide support for this kind of hardware, because the closed-source folks don't share their info.
If Linux developers could put all the time they spend developing and maintaining reverse-engineered drivers into actually MAINTAINING decent drivers based on specs direct from the manufacturers, Linux would likely have significantly better hardware support than Windows.
Nothing to see here, move along.
Unless you lost your laptop, in which case you may want to look closely at the photo and maybe try to get it back.
After many years of Ubuntu use as my primary desktop, the thing that drove me away was the end of support for the closed-source AMD video drivers.
What does this have to do with Ubuntu? AMD ended their support.
Someone decided that the open source drivers were 'good enough'. Well, they are not, at least not for what I was doing.
Yep, that "someone" was AMD. They apparently decided to focus more on a new Linux driver project, as noted in the posts from AMD folks quoted in the above link. Ubuntu isn't able to offer "support" for a closed-source driver that apparently breaks with the newer versions of Xorg. (I'd note that AMD had months to prepare before the new version of Ubuntu upgraded to the newer version of Xorg, and it's been a year or more and AMD hasn't updated their driver.)
And the choice to use the drivers as released by AMD was removed.
Because it might break your system.
Imho, Ubuntu, and all derivatives like Mint, suddenly alienate half their user base with that decision.
How was it Ubuntu's fault (let alone Mint's, who didn't do anything here) that AMD stopped updating their drivers for Linux? Ubuntu and its derivatives aren't the only distros that this created problems with -- anyone who is using a version of Xorg released in the past year will have the same problem. And since Xorg is standard across most Linux distros, this truly has nothing to do with Ubuntu (or Mint) per se.
So, I'm back to Windows 10, which serves my needs.
Yep -- AMD decided to update their drivers for the latest Windows version. Ubuntu can't do so, because they don't have the source code.
Why are you angry at Ubuntu when the people who stopped the support are AMD?
I don't mean to sound insulting, but you do understand what the implications of "closed-source driver" are, right? Ubuntu would likely be happy to provide support and updates if they had the source code... but they don't, and AMD won't release it.
Fuck experts. You're not right because of a certificate or credential.
I think you make the common mistake of equating "experts" with "credentials." Experts are people who actually KNOW stuff. Credentials are sometimes useful for identifying experts, sometimes not. There are plenty of people who don't have a certificate in X but who nevertheless may BE experts in X, simply due to their experience, their own independent study, etc.
So, no -- the fact that you have a credential absolutely does not mean you're right. But the fact that you KNOW more stuff does make it more likely that you're right in that area than someone who doesn't know that stuff.
If you have a cogent argument, the argument is right.
Nope -- this is a return to Aristotelian and formal-logic thinking. Just because I can create an argument that LOOKS valid according to whatever rules doesn't mean it is correct.
If you're good at your job (whatever you're an expert in) you can explain your argument persuasively.
Ah, now we're getting down to what you actually want to talk about, i.e., rhetoric and the art of persuasion. Back in the day (19th century, parts of the 20th especially in elite academies), high schools used to offer classes in rhetoric. Not only were you taught the art of persuasion, but you were also taught about a lot of logical fallacies in argumentation, so you could spot BS as well as deliver it. Rhetoric was a standard subject in formal education dating back to ancient Greek and Roman times, but it has fallen out of favor in the past few generations with rather disastrous results.
Being a good, persuasive speaker or writer actually is something you can be taught (at least somewhat). It's not just a "natural talent" or whatever, which many people think today. The problem is at some point in our quest for increasing specialization, we stopped teaching specialized "experts" rhetoric and good communication skills.
Can you see why "experts" are worthless, and what is needed is persuasive arguments?
Actually, far from being worthless, it's often very beneficial when an expert is also trained in the art of rhetoric and persuasion. The best people at persuasion are those who know the facts thoroughly -- i.e., experts.
How to persuade "I understand why you think that way, plenty of smart people would, knowing what you know. Here are some things you don't know, and why they're important".
If you think that's the key to persuasive argument, you're on the right track, but there's a LOT more to it. If you think that merely stating things that other people don't know and why they're important will win most arguments, you obviously haven't been involved in most arguments on the internet for example. People simply aren't very much into listening to detailed arguments anymore. They choose a "camp" and read 140-character tweets and hear 15-second soundbites. If you don't make your case, or if you don't seem on "their side," they tune you out and find a different source.
There are really only a couple ways to win in that culture: (1) redesign a rhetorical strategy that works in 140-character tweets. Trump seems to have mastered this, but it's hard to get across nuance. (2) Convince the "other side" that you are one of them, so they'll actually sit down and listen to you talk for more than 15 seconds. That often involves a LOT of work, if not outright pretending to be something you're not.
There are precious few places where you're going to get people's attention for an entire oration anymore -- at least not without half of your audience just tuning out or changing the channel or going to the next video clip on YouTube or whatever.
TLDR: saying "experts should be respected" is how you get Trump.
Partly -- that in itself is a rhetorical fallacy known as "appeal to authority" (or, if you want to be fancy, argumentum ad verecundiam). Except you're turning that fallacy on its head -- since it actually is NOT a fallacy when the "expert" is actually an expert. Experts DO matter. Your argument that they need to be trained better in rhetoric is valid, but it doesn't follow that experts are irrelevant, since true expertise coupled with good persuasion is the way to win an argument.
But I have a hard time understanding how anybody could forget their laptop at a TSA checkpoint.
A lot of people experience anxiety and distraction when they're going through the security line. You're being led around like cattle and are subject to a bunch of random rules that could result in a bunch of pain and delays (maybe worse) if you aren't careful to pay attention. Doing an extra check to make sure you have everything may not always be at the top of your list.
Just a few ways that immediately come to mind:
(1) You're getting on a 6am flight, so you're going through security at 5am and haven't had a cup of coffee yet because the TSA won't allow you to carry one. So you're just in a "haze."
(2) You have small children or are accompanying a person who can't take care of their own stuff for some reason, so you're juggling a huge number of bins and bags and trying not to forget anything, while also trying not to hold up the line.
(3) The TSA personnel distract you with some bogus extra search procedure that makes you feel uncomfortable... or they are overly brusque with you, which makes you a little paranoid (because they have the power to detain you). So you're distracted by this other stuff -- in ADDITION to having to deal with the indignities of putting your belt and shoes back on, packing up your little "baggie of liquids," etc., while people are crowding around trying to do the same.
Lots of other scenarios. I had a good friend (not at all an idiot or scatter-brained) who forgot his once, but luckily realized it when he got to his gate and went to do some work. He came back and retrieved it in time. I had another acquaintance who lost his and did NOT recover it.
I actually ended up adopting my own crude "reminder procedure" after hearing about these -- I commonly carry my laptop in some sort of sleeve in my bag anyway. I used to just reach in my bag and grab the laptop to put in the separate bin. Now I take the sleeve out of the bag and put it in the bin with my laptop bag (but outside of it), and my laptop obviously in a separate bin. I obviously will need to deal with the sleeve before I depart the TSA area. Just in case I'm distracted, I think there's a much lower chance that I'll just unthinkingly place my empty laptop sleeve back into my bag without realizing my laptop's missing. I doubt I'd forget my laptop, but I know how often it happens, so a little extra precaution doesn't hurt.
[banning] under-18-year-olds texting sexually explicit images. Of course, he doesn't have the slightest idea about how to go about tackling these problems
I can just see tomorrow's news story:
HEALTH SECRETARY ANNOUNCES NEW PARTNERSHIPS TO CURB SEXTING AMONG TEENS
Jeremy Hunt has reported that he has not only found partnerships to reduce teen sexting -- he claims he can do it without costing the UK government anything. In an interview, Mr. Hunt said, "I was shocked at how easy this was! A U.S.-based organization contacted me within a few hours of my announcement saying they are 'highly experienced' with such materials and were willing to screen most of the images for no cost whatsoever. The organization is called NAMBLA, though I don't recall exactly what the acronym stands for; they said it very quickly in our conversation. I queried them about privacy concerns, and they said, I quote: 'We will treat these images as if they were a prized possession.' I was also concerned about security, and they said, 'We have a lot of experience handling such materials and keeping them secret.' "
Unfortunately, Mr. Hunt said the group will only handle male images for free, due to unspecified issues in their screening apparatus. But an open call on the Health Secretary's website for "Experienced people willing to screen images and search for nude teenage girls" has already received hundreds of applications from volunteer organizations.
"I prefer rogues to imbeciles, because they sometimes take a rest." -- Alexandre Dumas (fils)