I assume you mean licensee.
And when a reader vendor messes with their design, they get cranky. Still, I suspect that there's a good reason for this change.
If memory serves, Kindle historically had significant bugs in its rendering, caused by bugs in WebKitGTK. One of the bugs I've seen involved fonts with miscalculated baselines showing up with text that was squished in bizarre ways. Another bug resulted in fonts being rendered either too heavy or too light with the default antialiasing mode—I don't remember which. If they fixed these bugs, it probably resulted in significant rendering changes to ancient fonts like Helvetica.
In other words, assuming the device actually got closer to correct rendering, this is a good thing.
At least they finally started allowing you to ignore the publisher-preferred font in recent years. Some books published that way were illegible, and it's obvious that Amazon employees do not use their own products.
It's a fine line. If a reader goes too far in overriding default fonts, you can have readability problems with things like drop caps. Same goes for overriding the font color (e.g. forcing the color to black could result in black-on-black text if you have an inverted-text decoration at the top of a chapter). And some manufacturers' devices annoyingly override fonts by default, which results in a diminished experience in books that use different fonts to convey meaning (e.g. computer books that use code font for symbol names).
IMO, what we really need are standards that all the reader vendors agree upon, including:
And so on. Then again, half the readers ignore large swaths of the CSS specification already (and probably the HTML and EPUB specs, where applicable), so I cynically wonder if they would just ignore these sorts of standards, too....
Their EqualLogic and Compellent lines are developed and supported out of Nashua. That might move around now that they own EMC, but for now, it is there.
This would have to be done carefully, i.e. you can't post an edit after someone has clicked the reply button (not actually posted the reply). And the person replying would need to be notified if the post had been changed since the page was loaded.
I would suggest instead:
This simplifies the logic by not needing to globally track whether anybody is currently replying to a post, whether they've cancelled that reply, etc. It also maximizes the window for edits.
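The checks described above could be sketched roughly like this (a minimal sketch; every function and field name here is hypothetical, not anything Slashdot actually implements):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    body: str
    last_edited: float  # unix timestamp of the most recent edit

def can_accept_edit(reply_started_at: Optional[float]) -> bool:
    """Reject an edit once someone has clicked Reply (but not yet posted)."""
    return reply_started_at is None

def reply_needs_warning(post: Post, page_loaded_at: float) -> bool:
    """Warn the replier if the post changed after they loaded the page."""
    return post.last_edited > page_loaded_at
```

The second check is the cheaper one to build on: it needs only a timestamp comparison at reply-submission time, with no global tracking of who is mid-reply.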
When I hear someone say "Get rid of AC," I interpret that as "Children should be seen and not heard," where adults == people who have taken the time to register, and who have some form of local reputation on the line. You're not wrong, but you're missing out on some priceless truth from time to time if you do that.
Agreed. Eliminating AC would be bad, m'kay.
Sometimes there are valid reasons to post as an AC, like when you work for a company that's being talked about or whose product/technology area/business model is being talked about, and you want to correct other people's mistakes without taking the risk that something you say might get quoted by the news media as "a(n) [insert company here] employee said". Yes, some people abuse that privilege for shilling, but lots of people take advantage of that privilege to avoid risking their jobs when they're saying something critical of those companies, too, or saying something neutral that still might be taken the wrong way.
The same goes for sensitive topic areas (though these tend to be kind of outside Slashdot's normal focus area). Sometimes, an AC post might take a contrarian devil's advocate position to encourage people to dig deeper and form a more nuanced opinion, without taking a huge risk of personal embarrassment if someone thought that the poster actually believed that position. The poster might even be willing to share certain personal insights anonymously that he/she would be embarrassed to post in an attributed fashion because of the stigma associated with it, such as talking openly and honestly about having been sexually abused or something.
And heck, sometimes an AC post might even be a whistleblower. Obviously, without any way to verify the authenticity of that person, there's the risk of abuse and even libel, but on the flip side, there's also a very real possibility that an AC might say something that gets people looking in the right places to find out about something really bad that a company is doing.
So I think that eliminating ACs would be a really bad idea. With that said, I wouldn't mind limits on how many times you can post as an AC from a single IP without logging in, nor would I mind stronger spam filters/redundant content filters on AC posts. There are probably a fair number of things that you could do to reduce the most egregious AC abuse without affecting its legitimate use much at all, and I'd be in favor of those approaches.
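That per-IP limit could be as simple as a sliding-window counter (a sketch only; a real site would persist this and account for shared IPs behind NAT):

```python
import time
from collections import defaultdict, deque

class AnonPostLimiter:
    """Sliding-window cap on anonymous posts per source IP."""

    def __init__(self, max_posts: int = 5, window_s: float = 3600.0):
        self.max_posts = max_posts
        self.window_s = window_s
        self._times = defaultdict(deque)   # ip -> recent post timestamps

    def allow(self, ip: str, now: float = None) -> bool:
        now = time.time() if now is None else now
        q = self._times[ip]
        while q and now - q[0] > self.window_s:
            q.popleft()                    # drop posts outside the window
        if len(q) >= self.max_posts:
            return False
        q.append(now)
        return True
```

Logged-in users would simply bypass the limiter, which is the point: the cost falls only on the most egregious anonymous flooding.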
I was thinking the same. IMO, the rule should be no moderation of descendants or ancestors, i.e.
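That rule could be sketched as a simple tree walk (assuming a parent-pointer comment store; all names here are hypothetical):

```python
from typing import Dict, List, Optional

def may_moderate(authors: Dict[int, str],
                 parents: Dict[int, Optional[int]],
                 children: Dict[int, List[int]],
                 user: str, target: int) -> bool:
    """Forbid moderating a comment if `user` wrote it, wrote any of its
    ancestors, or wrote any of its descendants."""
    node: Optional[int] = target
    while node is not None:              # walk up the ancestor chain
        if authors[node] == user:
            return False
        node = parents[node]
    stack = list(children.get(target, []))
    while stack:                         # walk down the subtree
        node = stack.pop()
        if authors[node] == user:
            return False
        stack.extend(children.get(node, []))
    return True
```

Siblings and unrelated subthreads stay fair game, so you can still moderate a discussion you merely read.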
I've seen it abused in other places. A staff member was losing an argument and started threatening bans for people who disagreed with him.
That doesn't really apply to people who get only 5 moderator points once every few months, though. There's only so much damage that we plebs can do.
Nah. It should be "Geeks React", and it should look like either goatse or an ASCII middle finger.
The wasted power argument isn't silly. Yes, there's a lot of power waste that I could shave off elsewhere. I'm not one of those people who puts mechanical power switches on their TVs to save a few dollars a year or anything. But when I buy electronics, I do expect manufacturers to do their best to minimize power consumption, within reason, and I actually have replaced equipment when I determined that the power savings would pay for the replacement in under 5 years, so as a consumer I do at least pay some attention to power consumption. It isn't the most important thing, but it isn't unimportant, particularly when my marginal power rate is almost 35 cents per kWh.
More importantly, I'm savvy enough to understand that every device in my house wastes some amount of power, and when I upgrade equipment, I expect the new equipment to be at least as efficient as whatever it replaced, and to always exhibit lower idle power consumption. Whenever technology goes the opposite direction, there had better be a darn good reason why, and saving two seconds when I plug my phone in at night doesn't qualify as a darn good reason to me.
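To put rough numbers on that payback reasoning (the wattages and prices below are illustrative examples, not measurements of any real device):

```python
def annual_cost_usd(watts: float, rate_per_kwh: float = 0.35) -> float:
    """Yearly cost of a load drawing `watts` continuously (24/7)."""
    return watts / 1000.0 * 24 * 365 * rate_per_kwh

# Hypothetical example: new equipment idles at 12 W where the old idled at 2 W.
extra_per_year = annual_cost_usd(12) - annual_cost_usd(2)
# At 35 cents/kWh that's about $30.66/yr of pure idle waste. Run the other
# way, a $150 replacement that cuts idle draw by 10 W pays for itself in
# roughly 150 / 30.66, or about 4.9 years -- just under that 5-year bar.
```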
As for convenience, as I said, it takes all of two seconds to plug my phone into a charger when I get home at night. To me, any convenience argument is silly, because the difference in convenience between a charging pad on my bedside table and a bare cord is so slight.
My rMBP just turned 1 year old this month, though, and while I'm still on the first power supply (and using the MagSafe to MagSafe 2 adapter to make use of my wife's power supplies in order to limit the overall number of failure-prone devices), the jacket has severely yellowed and recently started to brown around the last 2 inches or so of cable.
Take it to an Apple store ASAP. If it really is that new, they should replace the supply under warranty. That yellowing indicates massive overheating, probably caused by a partial break of the insulation between the hot wire and the shield, resulting in arcing internally within the cable. This is potentially dangerous (fire risk), so you should stop using the cord immediately and get it replaced.
That brings back memories of the yo-yo power supplies for the PowerBook G3 series, early iBooks, and maybe early PowerBook G4s. At some point, I discovered a smoky haze on one of mine, and ignored it. Later, I saw it plugged in with the overhead lights off, and you could actually see the electrical arcs through the clear cable jacket. It was pretty cool looking, but needless to say, I stopped using that cord.
There's a wide range of MagSafe cables, with widely varying robustness:
With that said, none of them rise to the level of "good", much less "great". Presumably in an effort to reduce the power filtering hardware inside the computer itself, Apple used a shielded two-wire cable. So even though the outer jacket is amazingly tough, the wires inside have to be pretty small or else the shield and outer jacket would be ridiculously unwieldy. This means that there are fewer strands, so the wires fail sooner, and there's less insulation around each wire, resulting in a higher probability of shorting out against the shield than if they were larger (and infinitely higher than if there were no shield to begin with).
What can Apple do to fix the problem? A simple three-step solution would go a long way towards fixing it:
This won't fix the breakage, but replacing a $10 cable will be a heck of a lot more palatable to customers than buying an entire $80 power supply just to replace a $10 cable. It will also decrease the number of SKUs that they have to keep on hand, and (assuming they use a standard connector for the other end) will also make it possible for third parties to build alternative power supplies for Macs (including external batteries) without having to resort to horrible hacks like cutting the wires off of Apple power supplies and soldering connectors onto them.
More significantly, because they wouldn't have to rev the entire power supply every time they tweak the cables, the lower overhead of improving them might result in faster iteration towards something that's more reliable. For example, they might use a slightly larger plug with a small filter cap inside it. That way they could maintain backwards compatibility with hardware that lacks sufficient filtering, while allowing them to move to a more traditional, bonded two-wire cable without a shield (or, if they really feel the need to provide a ground that is separate from the negative side of the DC, a bonded three-wire cable). I suspect that this would fix the breakage problem entirely.
Tesla is attracting engineers because what they're producing seems useful and world-changing, while Apple's products, though nice, have yet to be truly great. Musk's empire (Tesla, SpaceX, and SolarCity) is bringing grander changes to the world, something that perhaps history will look back on with fondness and say our generation accomplished something. While the preceding generation's Apple was credited with bringing the personal computer to the masses, the iPhone and iPad, while wonderful technology, would at best be footnotes in history.
No, not really. They brought PDA technology to the masses in a way that Palm et al failed to do repeatedly. Unfortunately, now that Apple has done that, there's really no more room for significant innovation in that space. The products exist, and the market is mature; you can make small, incremental improvements, but otherwise, the only thing left to do is to drive the price down to the point where everyone can afford one, and that's not likely to increase profits by much.
Apple's biggest problem has nothing to do with coming up with the next iDevice, and everything to do with the flawed assumption that this "killer new device" should be an iDevice in the first place. Apple has gotten into a rut where every new product since the QuickTake (or, arguably, the iPod) has basically been a new variation on a computer. I mean, an iPad is basically a laptop without the keyboard. An iPhone is basically a smaller iPad. So when you look at it from a high enough altitude, Apple hasn't really come up with any new hardware products in two decades. Even the Apple TV is basically just a little general-purpose computer. Apple is basically a one-trick pony at this point, and they've had an amazing run at it, but like all companies that are too focused on a narrow area of technology, there's only so far that they can go down that path before the market becomes saturated. And eventually, a disruptive player will enter the market and start eroding its market share, and then the company will become another Microsoft, and that new disruptive player will become the next Apple.
There's only one way for Apple to avoid that fate, and that is to expand its focus by building something that isn't just a glorified computer in a different form factor. Apple could do this either by disrupting an existing, struggling market or by starting a new market for a new class of products. That second one is hard, but the first one is easy. And Apple's existing technology could be used synergistically to dramatically improve the products in those markets.
If Apple is willing to take on a niche market, the DSLR market has a lot of room for improvement. It won't ever be a huge market, but it is likely to remain a solid pro market for the foreseeable future, and would be easily disrupted by a company with the hardware engineering resources that Apple could bring to the table. And if they bought Canon (market cap 30.59B), they'd get a great collection of well-built lenses, good lens engineers, and a respected brand name that they could build upon. Now imagine a DSLR that integrates well with iOS, runs a better OS under the hood, and has a more capable CPU to allow for amazing features that just aren't possible right now.
Apple could also buy iRobot and add iBeacon support to Roomba's existing Wi-Fi triangulation to produce an absolutely jaw-droppingly accurate automated vacuum system. It could automatically avoid areas that you mark as sensitive, use visual sensors to avoid sucking up your Lightning cables, and so on. It could tell the difference between the cat and a piece of furniture, and slowly creep up on the cat until it gets the message and moves. And they could bring the price down to something that normal people can afford, by taking advantage of the Apple hardware teams' skill at reducing manufacturing costs and increasing yields.
Apple could buy Tesla, use their battery tech to improve Apple's laptops and cell phones, and use Apple's tech to improve the in-car experience. It could then make Apple stores be "showrooms" to get around all those pesky state laws requiring a physical presence for sales within those states. Synergy.
Apple could buy Disney/ABC/ESPN and make live streams available by subscription on Apple TV for people who have cut the cord. This would, of course, immediately bring all the other networks back to the table in a state of panic, and they would finally succeed in turning cable companies into dumb pipes. It could then spin it back off to avoid various conflicts of interest that would likely hurt the quality of Apple's hardware and software products in the long run (see also Sony).
Apple could put money into solar and biotech research, along with any number of other fields that could use a shot in the arm.
There are so many amazing things Apple could do with its huge cash horde that could revolutionize the world as we know it, while simultaneously being great business moves that would open up new markets and new revenue streams. Not all of them are insanely exciting, but all of them are potentially quite profitable. Apple just needs to have the vision to take that next step towards building products that are more than just a computer in a different form factor.
I question Facebook, too. I can't imagine anybody in their right minds wanting to work in Menlo Park. IMO, the peninsula is the worst part of the Bay Area in which to work. From anywhere that mere mortals can afford to live, figure half an hour of driving at two miles per hour up 101, and an hour driving back in the evenings. That entire time is basically spent just going through Palo Alto.
And if you don't eat at FB's cafeteria (I assume they have one—I've never worked there, just near there), good luck finding any food at all. There are two tiny shopping plazas that have a couple of restaurants, none of which are walking distance from anywhere. The parking lots are full from about 11 to after 1, so you either drive around for ten minutes waiting for parking or drive a couple of exits up the 101.
Compare this with, for example, Apple, where there are probably a dozen restaurants within an easy walk of the main campus, and where you're right at the confluence of two major highways, one of which is usually passable at any given moment, and right next to De Anza, which is a viable city street alternate for 85 if you're heading south towards Los Gatos, Saratoga, or Santa Cruz or north towards Sunnyvale. (Unfortunately, Stevens Creek isn't a viable city street alternate, in my experience, thanks to very poorly timed traffic lights. Otherwise, Apple's location would be utterly amazing.)
But Google stealing people from Apple? Sure. It happens all the time. And startups steal people away from both of them. Honestly, Apple is a victim of its own success in many ways. Nobody goes to Apple thinking that they'll get stocks and options that will skyrocket in value these days, because the stocks aren't going that direction at any appreciable rate. And lots of the old talent made enough money off of AAPL to let them retire, so the company would have to be God's greatest gift to humanity if it wanted to retain most of those folks.
There is nothing anyone can offer Jony Ive that he doesn't already have.
His own company?