Sort of. It wasn't that they didn't "want an expensive show", but that Farscape's time slot had a good enough lead-in (which I believe was SG1 at the time) that they could put something cheap and crappy into the time slot and still get decent ratings. Not necessarily great ratings, not even ratings as good as Farscape, but good enough that the savings in making a cheaper show would still make the time slot more profitable overall.
This is a fairly common thing. Let's say you have a popular half-hour sitcom at 8pm, and another popular one at 9pm. You might think that the smart thing is to put a 3rd popular show at 8:30, so that you'd really lock people in, but if you pay attention, that's not typically what happens. Instead they put a weaker show (or a new show they're trying out) into the 8:30 slot, since they know that being sandwiched between two popular shows will mean it gets pretty good ratings, even though it stinks. Basically, a lot of people will watch a crappy show because they're too lazy to change the channel.
So that was the rumor on what led to the demise of Farscape. It was getting good enough ratings to make money, but SciFi thought the time slot had good enough shows around it that they could fill the slot with a cheap crappy show, and people would still watch it.
Even if that's not really what happened to Farscape (and who really knows?), the point remains that the way advertising and time slots work has an influence on the kinds of programs that get produced. As DVRs and streaming services have become more common, some of those effects are probably becoming less pronounced. In a service like Netflix (where their original programs aren't aired on broadcast TV at all), these kinds of considerations shouldn't have any effect at all.
The story is that no-one, including Apple, has figured out how to keep improving smart watches with new features or longer battery life. We seem to have hit the limit after two generations, so it seems unlikely that they will achieve mass market appeal any time soon.
I think it's just too soon to say that. We just hit the second generation Apple Watch, and as I pointed out, there are reasons why they probably wouldn't want to make massive improvements in the second generation. Chips keep getting smaller, more powerful, and more efficient. Batteries are continuing to get better. There's no reason to think we've "hit the limit" and won't see further improvements in the future.
To keep the phone market going Apple has to have a big bit of "innovation" every year, a reason to upgrade.
That's not really true. Apple releases a new model every year, but they only introduce a major new revision every other year. Part of what my post was trying to point out is that this upgrade cycle is intentionally matched to the upgrade cycle that most people have for their phones (cell phone contracts tend to be 2 years, so people tend to buy new phones every 2 years). The big question here is: how long does Apple think people will generally want to hold onto their watches before upgrading to a new model? Every two years? Maybe three? Their research might say it'll only be every five years, and if so, we can probably expect that Apple might provide minor revisions every year, but only release major watch revisions every five years. (I suspect it'll be longer than 2 years but less than 5.)
This might help make some sense of my general view about lying, which is that it's not quite as simple as "honest people" and "dishonest people". I'm sure there are some people who are truly dishonest, in that they've thought very clearly about what the truth is and are being intentionally deceptive. However, I know a number of people where I'd be more inclined to say that they're just not really thinking about it.
That might sound weird or a little nonsensical, but what I mean is, there's a certain level of mental activity required to "be honest". It's not just about the courage to voice your opinion, but also whether you go through a certain kind of thought process. To give a common example, if you ask your coworker, "How are you doing?" there's a decent chance that person will say, "Good" without even thinking about it. They might be miserable, but it's not necessarily an intentional deception. Maybe they're just being polite, or they don't want to share. Or maybe they're just responding because that's the proper conventional response to the question.
To give a slightly more complex example, if I ask what your favorite movie is, you might just say "Pulp Fiction" even though that's not your favorite movie. Maybe it's a movie that came to mind that you liked. Maybe it was a movie that you decided was your favorite well over a decade ago, and you've just used that as your answer when people ask, even though there are other movies you like better. Or maybe you said "Pulp Fiction" just because you thought it was a good answer that other people would agree with.
I used to think that it was as simple as "being honest" or "being dishonest", but I've realized over the years that a lot of times, we just end up giving whatever answer is quick and easy, or the safe answer that won't cause trouble. Some people do it more than others, and I've known a few people for whom communication isn't really about conveying information, but more about social maneuvering. And I don't even mean that it's malicious, since it may be as innocent as just saying whatever will get you to like them and make everyone get along. I think it's not even necessarily an intentional deception, but instead it's more like they're not even thinking about the truth content of their answer in the context of "true" or "false", but more like "achieves the desired effect" or "doesn't achieve the desired effect".
So I'm rambling a little, but I wonder if the amygdala has a role in the evaluation of truth content. If my general thought is correct, it'd be reasonable to think that there's some part of the brain which is being under-used in people who "end up giving whatever answer is quick and easy".
Netflix knows exactly what people want
Also, they're in a position to care about what the viewers want. The TV networks, meanwhile, are built to care much more about what advertisers and their clients want.
You might expect it's the same thing, since advertisers will want whatever people will watch. However, there are some subtle differences that have big effects. For example, they don't like controversy, so while they're trying to get a big audience, they're also making sure they don't ruffle anyone's feathers. If they're trying to get Walmart or Chick-fil-a advertising money, then there'd better not be anything in the show that could be considered anti-Christian or pro-homosexuality.
There's also a tendency to look for shows that will hit certain demographics who are thought to be likely to buy specific kinds of products. So, for example, a children's show might get cancelled in spite of critical acclaim and high viewership, if it turns out that kids aren't buying the toys and merchandise associated with that show. Two shows with similar budgets and viewerships might have very different fates, depending on whether the viewing demographics are expected to have a lot of disposable income, or to correlate with products that the advertisers want to sell. So networks are going to focus on making teenager shows to market Clearasil, and old-man shows to market Viagra. If you're not in a demographic that's considered a desirable market, then they're not particularly trying to make shows for you.
There's also another, similar problem that Netflix avoids by having an on-demand viewing model, as opposed to having shows compete for a time slot. On network TV, a show might be making enough money to pay for production and turn a profit, but it might still be cancelled if a network thinks that another program would make more money in that time slot. This was one of the rumored reasons for the cancellation of both Firefly and Farscape, for example.
All of this is why you see a lot of cheap reality TV that appeals to the lowest common denominator. It doesn't much matter whether the show is good or whether there's a substantial audience on the edge of their seat waiting for the next episode. Networks are just looking for cheap, uncontroversial programs that will make it easy to sell advertising.
I'm not even sure how they would monetize it and I don't think they know either.
If the rest of the web is any indication, it'll be by shoving advertising down our throats.
the Apple Watch 2.0 only really offers waterproofing. no real advances that people would dump another $350+ to replace their 1 year old Apple Watch 1.0
I think this really needs to be taken into account in the whole discussion. The big story is that Apple Watch sales are down from last year?
You have to figure that a large percentage of people who wanted Apple Watches bought them last year, when they were first released. Most people don't usually replace their electronics after only a year. Even with cell phones, they wait 2 or 3 years, and that's about as frequent as it gets. Given that smart watches are mostly being used as watches and to display notifications from your cell phone, it seems possible that the smartwatch upgrade cycle will be less frequent.
Also, the "Series 2" model is ultimately a minor upgrade. It has GPS in the watch, which may be important to some people. It's waterproof and the old one isn't officially waterproof, but was still more water resistant than advertised. It's not thinner or lighter, the battery doesn't last longer, and it doesn't even look different. Some people will want to upgrade after only one year, but I wouldn't expect most Series 1 owners to think it's worth buying a Series 2.
Given that, I would assume that there'd be a big spike of sales when the Apple Watch was first released, followed by a few years of diminishing sales. I even had a theory (which so far has worked out) that Apple would avoid making a lot of small incremental changes every year. Given the novelty of the product, some people probably held off buying it the first year because they wanted to see if the following year's model would show substantial improvements. Now that we've seen only minor improvements for Series 2, that may have led some of those people to go ahead and buy one, which may explain why their sales aren't even worse.
My basic theory is that Apple has a cycle in mind for how often they'll release major updates with major design changes, and it's basically on the same time frame that their marketing experts tell them that people will be willing to buy a new smart watch. I don't know if that's 2 years or 4 years, but it's not going to be 1 year.
If women choose not to go into computing fields, why should they be forced (or even encouraged) to do so?... How about letting people pick the field(s) they want to go into without telling them what they "ought" to do based on a pointless metric or percentage?
My brain jumped to a few different places when I read these questions. The first is, in pushing for greater inclusion of women, I think there's an implication or assumption that women would like to get into these fields, but are not able to. It doesn't really seem true to me, but maybe some people have other experiences? My experience has been that most of the places I've worked (admittedly doing support, not programming) would have loved to hire more women, and made efforts to do so, but very few women even sent in resumes. But like I said, it's possible that some women could tell stories where they felt discriminated against.
The second thing that went through my head was, it does seem fair to ask the question, "Why are there so few women in tech?" Even if the answer is that women aren't generally interested, it only raises the question, "Why not?" Some people might not like the idea that there's an innate/genetic reason for it, but it also might have to do with our educational system, or something about how technology managers work. It may be a larger societal message, where we're telling women that they're not going to be good at that kind of job. If we had a clear understanding of why women weren't pursuing those kinds of careers, we would then be in a position to say, "That's fine, and not something we want to try to solve." Not knowing what's going on or why, I don't think we can say that it's not something we should fix. It may even be that women are seeing a problem in the industry that's harming all the workers, and it's a thing that men are just more willing to tolerate. If so, fixing that problem may benefit everyone.
The last thought I had might begin to answer your questions more directly: When you want to hire people who are good at a job, it's good to attract everyone you can and maintain a large and diverse talent pool. It increases your chances of finding the people you need. I'm not even talking about anything related to social justice, but just the practical matter of trying to hire people. You want a big talent pool. As people are fond of pointing out, it also potentially drives down the cost of labor, but it also increases the chances of finding someone with the exact qualities and skills you're looking for.
the iPhone 7 Plus is expected to be the main model benefitting from this transition.
So when they say they're switching to the iPhone 7, they're really saying they're switching from Android to iPhone. The iPhone 7 Plus (5.5" screen) is only a little smaller than the Galaxy Note 7 (5.7" screen), and larger than the Galaxy S7 (5.1" screen).
The Electoral College isn't particularly helping Clinton here. If anything, it's probably going to end up helping Trump in that it skews political power *toward* less populous states. For example, Wyoming will go to Trump. While Wyoming accounts for only a tiny fraction of the country's population, it still gets three electoral votes, so its voters carry a disproportionate amount of electoral weight.
The Electoral College was designed to prevent populous areas from exerting too much control over the federal government, and given that populous areas tend to be more liberal, it usually works in favor of the Republicans.
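To put rough numbers on that skew, here's a minimal back-of-the-envelope sketch (the populations are approximate 2016 estimates I'm assuming for illustration, not figures from the original post):

```python
# Rough illustration of how the Electoral College weights voters in small
# states more heavily than voters in large states.
# Populations are approximate 2016 estimates (assumed for illustration).
states = {
    # state: (population, electoral_votes)
    "Wyoming": (585_000, 3),
    "California": (39_250_000, 55),
}

for name, (population, electoral_votes) in states.items():
    per_million = electoral_votes / (population / 1_000_000)
    print(f"{name}: {per_million:.2f} electoral votes per million residents")

# Wyoming works out to roughly 5 electoral votes per million residents,
# California to roughly 1.4 -- so a Wyoming voter carries something like
# 3-4x the electoral weight of a California voter.
```

The exact ratio shifts a bit with each census, but the direction is always the same: the two "senatorial" electors every state gets regardless of size are what tilt the weight toward the least populous states.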
until they can figure out why Donald Trump sniffs constantly when he's talking.
I can think of a few possibilities:
a) As you suggested, he's doing cocaine.
b) It's not cocaine, but Trump actually gets off on snorting the ashes of your dead grandmother.
c) He has Parkinson's, but is going to claim it's pneumonia.
d) He's trying to hold back the tears because mean old Hillary hurt his feelings.
The real reason? Simple: people are lazy as shit. If you give them a chance to slack off, they will.
I don't agree that people are lazy, but you are pointing out one potential problem with telecommuting.
A potential problem with having a Slashdot discussion is that you're generally talking to a bunch of programmers. That's a problem here because you're talking to people who are used to a particular kind of job, one where it's relatively easy to measure output. I wouldn't generally have a problem with programmers telecommuting, because what I care about is their output, and you can assess whether they're doing what they're supposed to by looking at the quality and quantity of their output.
But there are different kinds of jobs. With some jobs, there's not a real "output" that you can look at. They aren't building something where you can look at the results and say, "If this is well made, then this person did a good job." The job might have deliverables that can't easily be produced remotely, or the job's purpose might have completely different dynamics. To give a really simple example, it doesn't make sense for a McDonalds worker to telecommute.
I've managed a few helpdesks over the years, and I generally don't like people telecommuting for that purpose. One reason is that I need to make sure there's coverage at any given time, and it's much harder to gauge who's actually available when they're not physically present. Another is that it really helps to be able to see who's frustrated and who's struggling. I can overhear what's going on, and just as important, the technicians can overhear what's going on. They can hear how others are handling their calls. They can pick up good habits from each other, and they can hear when someone is struggling and say, "Hey, let me help you with that." Sure, I could try to use metrics and base people's performance on number of cases closed per week, or customer satisfaction surveys. Anyone worth their salt knows that, at best, those metrics don't tell the full story.
Meetings are also more problematic with telecommuters. Things like Google Hangouts seem like they'd take care of it, but you end up wasting a bunch of time because someone is having webcam issues, or you can't hear people very well and they have to repeat themselves. If you can get away with text chats, I find that actually works better, but that doesn't work for all communications. Sometimes a quick in-person chat is really so much easier and more effective.
Companies pay people for being at their desk 8 hours a day
Sometimes that's reasonable. It depends on the job, but I've managed IT support staff, and yes, some of them are being paid to be at their desk during certain hours. Essentially, they're being paid to be available and answer phones, so that when users call in, someone is there, ready, available to help. It's really important, then, that they're there for the exact hours they're supposed to be.
You make it sound like a bad thing, but walking around and getting a feel for how things are going is something many good managers do.
It's not about spying on people. It's not necessarily about catching people slacking off, though sometimes that happens. More often than not, it's about helping people. You hear someone getting frustrated, you see someone struggling with something, or you catch the vibe that one group has too much on their plate. As a good manager, you step in and help them find a solution.
More often than not, when I see someone slacking a little, I ignore it. My people work hard, and deserve an occasional break. I'm much more interested in keeping them on the right track, and keeping them from overloading.