If — no, when — age reversal becomes a reality, who gets to live?
Well, me... duh.
A half dozen or so pieces of fossilized bone, one of which is part of a femur; NONE of which are hips, legs, feet, brain case, etc. With what they have, they might as well have pictured our long-snouted proto-croc riding a Harley!
RTFA. One of the comments in the article points out that "skull and spine morphology is highly diagnostic of locomotor adaptations."
Bigger battery - yeah, it would be nice, but it's not hard to keep it charged up
(and keeping two charged up is more aggravating)
You're kinda stepping all over your own point, there.
- thunderbolt and the ability to drive an insane number of displays
Have you tried this? My wife's work HP has a mini-DisplayPort, and I have a splitter that allows us to plug one cable into the laptop and drive two DVI displays, for a total of three desktops. When plugged into the Mac, all it does is mirror; OSX doesn't see the displays as separate displays. Does the multiple-display-over-one-cable only work when daisy-chaining Thunderbolt, or did you find a Thunderbolt-to-DVI that actually works with OSX? I have to say, IME the Windows multiple display support has been superior. I haven't tried Linux yet.
Laptops with all these features have always come in at similar prices.
Maybe. IME Apple quality control is crap.
Also, I really love how my current MBP plugs into my display. One cable for power, USB, and display. The thunderbolt displays are basically a solid docking station.
I agree, that's pretty awesome, as long as you have some sort of Thunderbolt-capable receiver, which always seem to be unreasonably expensive. Thunderbolt is serial, which means any device that doesn't have a pass-through becomes a terminator. This means that USB will always be in the mix, which means a USB hub, and most devices will be USB. Connecting multiple drives means USB, or finding Thunderbolt drives that have a pass-through, which limits options severely and again pushes up the price. As long as you enter into Thunderbolt with the expectation that it's only a glorified docking port, I think it's a great solution.
Try to do the opposite now.
Build an $800 desktop PC, then try to build a Mac with similar specs and look at the price. You will end up with that expensive Mac Pro.
As I posted elsewhere in this thread, in my experience that would be a mistake. You'd get a superficially pretty device with third-rate internal components. Every one of the three MBPs I've purchased has had some sort of internal hardware failure within days of receipt (when I was lucky) or just after the warranty expired (when I wasn't). I have no fewer than six non-Apple laptops in my basement, the oldest dating back to 1997, and they all still work; they were replaced only because I wanted to upgrade the platform. Those include laptop upgrades for both my wife and myself, so we were averaging upgrades about once every 4 years. After we switched to Apple, we were averaging upgrades about once every 18 months, and every one was prompted by a hardware failure.
I'm a "lapsed Apple guy"... ran MacBook pros for years, had iPhones... now I'm Android and Windows. Reason? The "Genius Bar".
Oh, man... I feel your pain. Although I'm still running Apple products, I'm slowly migrating away as technology upgrades permit. Your portrayal of the Genius Bar is right on, but the real reason for me is the quality of the hardware (which you later reference in passing).
Apple products are beautiful on the outside, but they're crap on the inside. My very first MBP had a faulty CD-ROM drive -- it sounded like a Harrier jump jet when it spun up. Since I took it back within a couple of days, they just swapped it out, and I wasn't worried. 14 months after they gave me the second MBP, something on the motherboard crapped out (or so they say... it was suspiciously immediately after an OS upgrade), and I had to pay $400 to get it replaced. Not long after that, the battery swelled alarmingly, deforming the case -- they fixed that for free, but I don't know if that's only because I made such a stink about it having just come back from the shop. I upgraded to a Retina, and just after the warranty expired on that one, the hard drive went out. So my takeaway has been: you're paying a premium price for crap hardware; when they offer you the extended warranty, take it, because the internal hardware is not designed to survive past a year.
All of this would be merely annoying, but the real kicker is that both heterogeneous and homogeneous Apple solutions are crap. Time Machine doesn't work well with mounts served by Linux; for me, after a few months the backups start taking hours to complete, so I bought an AirPort Time Capsule. Apple doesn't put software on either the Apple TV or the AirPort to allow streaming content from the AirPort to the Apple TV without a Mac in the mix: you have to be running a Mac with iTunes even for audio streaming (or use AirPlay with a third-party app -- but that still requires a Mac). That's either gross incompetence, or blatant commercial greed driving customers to buy more products, when there's more than enough processing power on either of the two devices to decode compressed video. AirPlay is really tempting, but it's flaky; I often need to reboot my MBP to get it to see the Apple TV (and in my house this is with an AirPort providing Wi-Fi, so there's no non-Apple technology in the mix), and sometimes the Apple TV stops seeing the machine running iTunes and I have to reboot that to get streaming to work again.
After a couple of times having to run around rebooting machines just to watch a movie while the family waited, I gave up. I'm now running XBMC on an Odroid, connected to another Odroid running SqueezeServer. It wasn't as easy to set up as the Apple products, but ease of setup is worth nothing to me if the products don't work reliably. Oh, and the TV remote will control XBMC over the HDMI interface, meaning an end to having to use two remotes. I haven't gotten around to testing AirPlay on XBMC, and I still have the Apple TV in the mix because there's no Netflix app for XBMC on the Odroid; it isn't all rainbows and unicorns, yet.
My next laptop is going to be an XPS 13, or an X1, or whatever is thin and has decent hardware support in Linux at the time I make the purchase. OSX is nice, but if I'm spending that much, I want more than just a sexy shell: I want quality internal hardware, and I would really prefer to never have to deal with the Genius Bar again.
Thank you. While I don't agree with you, it's nice to see an opinion opposing systemd that consists of more than a mixture of cussing and ad hominem attacks.
The problem is that Poettering and company have hijacked the mainstream Linux that almost all Linux users run and changed it into something unrecognizable.
This is amazing! Where did they get such an ability? If only we could convince them to use their hijacking super-powers for good; imagine, maybe they could finally make the Year Of Linux a reality.
I realize that on Slashdot, where people tend to be highly math-oriented, it's a popular fallacy to believe that a calorie is a calorie is a calorie. However, studies like this one have been coming out for years now showing that that's simply not true.
Amen. As you say, some foods are more difficult for the body to extract calories from. The body will end up extracting (roughly) the same number of calories from foods with the same caloric content, but the rate of extraction differs, and that can make all the difference. That's one of the ideas behind diets like South Beach: they try to avoid insulin spikes, which helps control hunger, which in turn helps dieters resist over-eating. Insulin spikes cause all sorts of interesting physiological reactions beyond making a person hungry, such as fatigue, and can contribute to the development of diseases such as diabetes. While not the cause of all the world's woes, insulin spikes (and the foods that cause them) are good things to avoid unless, you know, you suddenly need to outrun a bear. If you're being chased by a bear, by all means suck down that energy gel. And don't bother running downhill; that's a myth -- bears can run downhill as fast as they can uphill, and they can hit 37 miles an hour. So the bear will still get you, but at least after eating the energy gel you'll taste a little sweeter.
The 140 calories from a can of Coke are not equivalent to, and will not have the same effect on your body as, the 150 calories from 1/4 cup of steel-cut oats. Your health will be better for eating the oats. I don't know whether bears prefer Coke or oats; probably Coke, but don't quote me on that.
I honestly can't remember the last time I read software documentation. API docs, yes; functional specs, yes; but documentation for an application with a user interface? Wow. No.
Then women are the ones who are advantaged. They get more pardons, reduced sentences, and easier parole. And don't even get me started on domestic issues and divorce, where they are more likely than not to be seen as the victim, and to get alimony and child custody...
The domestic and divorce issue depends heavily on the state in which the divorce or domestic-violence case occurs. In California, the law does tend to favor women in divorce proceedings; in Georgia, on the other hand, men are favored. Marriage-equality laws are too young to provide any evidence about whether these biases are sex-related or are a reflection of the state's definition of fair division. With a national average of around 15% of dual-spouse families having the woman as the primary earner, it may be difficult to get a statistically meaningful answer, although without marriage or income equality, it's essentially a gender bias.
I'd like to hear from more people with smart watches who are happy with them, to better understand the appeal.
I'm really happy with the almost-smartwatch Pebble. I wear it in preference to my other watches (and I've always worn a watch). There are three things I like most about it:
I have gotten more positive comments about this watch from strangers than about any other single thing I've ever owned. I've been asked about it on public transport in NYC and in check-out lines in Philadelphia and London, and twice I've had people literally stop mid-sentence to ask about it.
I hear this a lot from people who write unmaintainable code that's full of 'clever' tricks that usually have no measurable impact on performance and, when they do, actually end up making things slower.
A microcontroller has almost no relationship to the kind of system that you find in a modern desktop or even mobile phone.
And I hear this a lot from developers who write buggy, inefficient code that fails under load. Ain't anecdotal evidence great? It's almost as good as straw man arguments.
OP was talking about the fact that when people don't understand the basic fundamentals of how computers work, they make poor design and coding decisions. Having a good understanding of what's going on under the hood, at all levels, is critical for a lead developer.
Computers are not intelligent. They only think they are.