Kevin Mitnick's Twitter has this update:
We'll know, now. See, there are two astronauts who happen to be twins. And they have sent one astronaut into outer space and the other astronaut will stay here on Earth. As time passes on board the International Space Station, we will see whether NASA Astronaut Scott Kelly develops strange new neurochemistry the likes of which humanity has never before seen, or if he stays normal like his brother Mark. Time will tell whether this theory about the brains of people in space twisting and contorting in untold, seemingly impossible ways holds water or not.
Maybe this is all an elaborate test to make sure that the paper wasn't computer-generated content (something that apparently has made it through peer review before).
I was going to wait and hold my thanks until after I actually had the operation done and my "transition" complete, but right now I feel like a complete idiot so here goes: thank you, science, for twisting my brain in a knot.
I'll go right down in the basement and haul up my holographic projector, I'm sure the wife won't mind if I borrow the kids' sleep chamber cryo unit to fire up the display. After all, how can they mind since they're all holograms? Ha, ha! Clever people we are these days.
Yes, I've been wondering when they were going to get around to building the operating system for that machine all of us ran out and bought in 1974 or whenever. I knew it was a good investment. Everybody said, "no, dude, don't buy the holographic projector, there's no O.S. for that, yet." But I just punched them in the god damned groins and ran away laughing because I'm a genius.
I'll have to dig it out of my giant mountain of 3D glasses and virtual reality headsets and body hoists, but that will give me an opportunity to sort them all by decade. Maybe they will make an operating system for the VR things or 3D glasses next, who knows, it's Microsoft -- whose motto is, "If It's Further Away From the Command Line Then It's The Future, We Will Guarantee You That Much."
Hopefully this return to common sense and keeping our high-tech up to date with actually running software signals that very soon Microsoft will publish the first operating system that runs entirely on teledildonics. Then we can call customer support and ask them why they're fucking us so bad!
It's so awesome seeing these old-timey sideshow barkers peddling snake oil from their wagons.
"my goodsh are the REEEAAL goodsh!"
"no god damnit he's a liar, he sells poison!"
"snake oil is the best poison money can buy!"
"see that, he's just after your money!"
"you heard it here first, folks, he's giving his away for FREE!"
"no wait a minute!"
I have a phone with an active FM receiver (it uses the headphone cord as an antenna), and this thread got me wondering about things like emergency radio, "scanners", etc. I ended up finding some old threads in other forums where people discovered that this phone model's FM rx is in a chip that also has tx capability. But Broadcom doesn't want to share the pinouts, and it looks like the threads all died. HTC EVO 4G, if anybody's interested. This is along the lines of transmitting over a number of meters, of course, not communication at a distance.
Correct me if I'm wrong, but I think most cell phones don't have the power supply to transmit on the FM broadcast band over several miles, let alone the antenna. And if you can't reach across several miles, then what is the emergency purpose of the FM rx/tx capability? Anybody nearby is probably going to be similarly affected by said disaster. If you're worried about being separated from loved ones during a tumult, then you're probably going to want to be able to scan for them over a large area.
I'm just not sure what the argument is for having a bunch of underpowered FM rx/tx going on in the middle of a disaster.
Now, if you really want this to go through, what you have to do is find a disaster where all of the following can be clearly shown to be true:
a) numerous people died
b) there was an internet or mobile outage or lack of signal
c) it can be shown that the internet or mobile outage, or the lack of signal, contributed to the untimely deaths
d) it can be shown that a little FM rx/tx would have saved their lives
Just one such occurrence could actually get you what you want. No company likes to be the actual, literal bad guy, and these dormant rx and tx capabilities would start showing up a lot more often.
Still I don't think it will get any better than walkie-talkie across a few dozen meters.
Let's say there's a rescue team after you and you need to transmit your location details. If they're that close, you can yell. If you're under a bunch of debris, you're not transmitting very well anyway. If you're thinking of triangulation and mapping a bunch of blips of potential victims, well, they already have GPS operating.
I'm just not seeing the point.
Probably a mistake considering the "new" one is next to impossible to use.
Or maybe Evil Google just felt like making it hard for people to look shit up.
And I'm not suggesting that C wouldn't make a good first language; in fact, I already suggested it in another comment. But it's not "an educational language".
The fact that LOGO is not on the list the author linked to makes me feel a little perturbed. Like, wtf is wrong with people? There's a mention of "brick logo" as a footnote in some other language's paragraph, and C is mentioned as "educational" (wtf?), but not one shout-out to LOGO.
Did anybody else have any sort of reaction to that? Or did everybody just not even notice?
Also, I took a look at that list and: no.
Don't teach him a useless joke/toy language like the ones on this list. It'll build a bad habit, and the kid will be one of these losers saying "I don't know how to program but I got Code::Blocks and here's my console emulator, shouts out to the one guy who gave me that voodoo asm to build inline and make it work real fast, everybody please stop sending me e-mails about getting rootkitted, this thing totally passes a virus scan."
If you want something that has a strong visual appeal but teaches actual programming practices and has actually been used in industry, I suggest you teach LOGO. LOGO is super-super-super simple, easy shit, and you can learn it yourself as you go. Most LOGO primers and tutorials are practically at a child's level anyway, because the language is so simple. Of course, if you don't know a single thing about trigonometry or geometry, you'll probably see it as a useless language.
Over any single one of the weird "robots" and "kids oriented" languages on that list you linked to, I would recommend LOGO.
I bet you can even find a "LOGO for kids" or some shit if you look for it. For decades, LOGO has often been used as a first language for youngsters, so I'm kind of scratching my head over how it passed you by.
I think 7 is too early. The kid should be outside playing and using his imagination with real world objects at that time.
I also recommend using C. It's simple, and compilers for C-derived languages typically support some version of it.
I learned to program in BASIC on the Atari and the Sinclair ZX-80 when I was 8 and a half. I don't recommend using numbered-line BASIC, or any BASIC at all. If I could go back and somehow influence how I was taught, I would tell my parents to find something that supports parameterized function calls instead of GOSUBs. C would be best. But if you're really intent on using BASIC for some reason, on the PC I recommend Microsoft's QuickBasic, as it lets you get away from the rather intimidating edifice of Visual Studio. You'll also have to sandbox it inside something like DOSBox to get it to run on a modern Windows environment, so there's a plus.
If you opt to just get an old clunker instead of emulating DOS, I recommend a 486 DX4/100. The architecture is simple enough for a kid to learn into adolescence, but powerful enough to show how impressively quickly computers can complete some tasks. I also recommend an ATI "All-in-Wonder" graphics card, because it features CGA, EGA, and VGA, so your kid can learn about legacy graphics as well as switching modes. You shouldn't use a CRT if you can help it; if the kid wants to get into the inner workings of the screen, you'll have an armful of stuff to teach him about electrical safety first. So get a modern flatscreen, and get a VGA/EGA plug adapter if you have to, to keep on the side for EGA projects. As long as the flatscreen is unplugged, he shouldn't die of electric shock touching anything inside of it.
The great thing about older machines is that a lot of the components are visible at a macro scale. It's a lot easier to differentiate between the resistors, capacitors, and inductors in older machinery. Nowadays they're all tiny little squares with little print designating what they are. It's also easier to work on older boards in terms of soldering and other "circuit bending".
All that being said, I recall some hobbyists telling me back in the day that Apple computers made the best projects. One guy said he had obtained a dozen Apple IIes on the cheap, and because Apple computers are made to network easily, he was able to use a later Apple model to organize all the IIes into parallel computing. An exercise like that could be fun, albeit space-consuming.
If you're going this sort of computer-engineering route, involving getting to know the hardware, I recommend also teaching the kid assembly. On older machines like the ones I mentioned, and using older operating systems, this is less of a headache. By comparison, I was looking into "high-level assembly" for Windows systems, and the skeleton needed just to open a window with a button to close it again was large enough to dissuade me from going much further. ASM in DOS was far more elegant, which is why these days, if you mention writing something in assembly, most people think you're crazy, even though, once again, many popular compilers support inline ASM.
When I was fiddling with old Sinclair or Atari machines, the latest hardware was stuff like the 80286. And when I finally got an 80286, the latest hardware was the Pentium, and so on. Getting things done with older hardware gives you two special perspectives on everything: (1) getting to know how everything works, because the machines and operating system aren't so enormous and bloated that it's overwhelming, and (2) having to make do with less memory and processing power, which forces you to learn things like optimization and paging. People use memory like it's crack today and talk all tough like their memory is infinite, but little do they know Windows is paging to disk quite often because of programming practices like that. And those same people speak about memory management in their favorite object-oriented languages like it's impossible to perform. Trust me, you would much prefer that your kid is one of those people who can do their own memory management. If you give them a shiny brand-new computer to learn on, they'll have no incentive to do better than use it like crack like everybody else does.
They should keep the original melody and syncopation, but they should play it in really whacked-out "80's b-movie" style metal guitar.
I forgot the photo shoots FHM did of Anderson, as well.
Also, considering the stars basically stated publicly that they had some romance going on and had considered marriage, what does it matter if Mulder is getting any action with Scully? Duchovny was (maybe is) getting action with Anderson. The best thing the show could do is keep the tension high by not having the characters hook up.
There is a "new MacGyver" project going, involving the original producer (or was it the original writer?).
A new Quantum Leap would be cool. There's a pretty cool John Maus song by the same title that's kind of about the same subject.
If it's wanking you're concerned with, maybe you should do a Google search for the images of Anderson (and Duchovny, perhaps) that Rolling Stone did.