Comment Didn't the movie industry try something like this? (Score 1) 737
Oh yeah, Divx.
Whatever happened to that?
I found DMT did not produce any of those effects for me personally. Instead, in my case it disabled my synaesthesia for about 15 minutes. This was extremely disconcerting at first, but once I understood the effect a bit better (and realized I was still breathing despite not receiving the normal feedback from my lungs) it was more interesting than alarming.
While I'd been aware that I have multiple forms of synaesthesia, ranging from common ones like grapheme-color, to rarer ones like lexical-gustatory, to just outright weird ones like numeric-topology, there were quite a few more subtle ones I hadn't taken into account. Some standouts in particular were synaesthetic mixes from channels like emotional state, internal sense (from organs), facial recognition, and temperature. It can be a hard thing to explain, because to me that's just how the world is.
Before I saw a documentary on it in my 20s I thought everyone experienced things this way. Then I learned otherwise. After experiencing things without it for a little while, I feel kinda bad for everyone else. But on a positive note, I was able to understand how I was generating social anxiety in a feedback loop (emotional-visual overlay with complex things like facial recognition bias) and haven't needed anything for anxiety since the experience with DMT.
My thanks to the creator of this poll for specifying "caffeine" and not "coffee". As a person with the (rather rare) condition of an allergy to coffee, it is appreciated.
It may sound petty, but you have to realize how annoying it is to constantly explain why I don't want to go for coffee on a date, don't want any after eating at a restaurant, or on a plane flight, or when buying a donut, bagel, or breakfast combo. You should see the looks I get at McD's when I ask for milk with an Egg McMuffin. People don't often take just "no thanks" for an answer, they want to know what's wrong with me that I don't consume the default beverage.
And of course once I explain, they say something along the lines of "Oh that's so horrible, I couldn't live without caffeine!", and then I have to explain that I enjoy caffeine very much (in moderation); the problem is probably one of the other thousand or so chemicals in coffee, possibly one of the dozen-plus carcinogens.
The worst part is the sensitivity to the smell. It is by far the most sensitive allergy I have in terms of air quality. I sympathize with those allergic to peanuts because I have some inkling of what they must have to deal with. Fortunately it's not quite that bad for me, but bad enough that I forbid it from my own house/car and constantly battle severe nausea in places that specialize in brewing it.
Interfaces are often overrated like that.
Having dealt with the very specific headache of using and modifying an FFT function before, one problem that is difficult to avoid is that most coders who write math libraries and things like FFT functions tend to write them in math terminology. That is, lots and lots of cryptic single-letter variables that are obvious in purpose to a math major (or someone who's done the math for engineering/physics/CS) but completely obscure to anyone else. Sometimes x or w or n or k as a variable choice is just a lazy programmer. Other times it's mathematically significant. If you don't have the experience to know it's the latter and not the former, it can turn into a real headache real fast, especially since many math programmers are very lazy about documentation, assuming everyone will either know the math or go look it up. And if you're somehow in the position of having to deal with functions written by programmers from different scientific disciplines (math + physics, for example), you can just forget about consistent and descriptive variable naming. To each of them a single letter IS descriptive and not obscure at all, but to anyone else it's not, and their conventions don't agree with each other (there are too few letters for that).
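To make the gripe concrete, here's a minimal recursive FFT sketch in Python (my own illustration, not from any particular library), written the way a math-trained programmer typically would. The comments spell out what the standard DSP letters mean; strip them out and you have exactly the kind of code I'm complaining about:

```python
import cmath

def fft(x):
    # Standard DSP notation: x is the input sequence, N the transform
    # length, n the input (time) index, k the output (frequency) index,
    # and W the "twiddle factor" e^(-2*pi*i/N). Obvious if you've seen
    # X[k] = sum over n of x[n] * W^(n*k); line noise if you haven't.
    N = len(x)
    if N <= 1:
        return list(x)
    E = fft(x[0::2])  # E: FFT of the even-indexed samples
    O = fft(x[1::2])  # O: FFT of the odd-indexed samples
    W = [cmath.exp(-2j * cmath.pi * k / N) for k in range(N // 2)]
    # Combine halves: X[k] = E[k] + W^k * O[k], X[k + N/2] = E[k] - W^k * O[k]
    return ([E[k] + W[k] * O[k] for k in range(N // 2)] +
            [E[k] - W[k] * O[k] for k in range(N // 2)])
```

Rename E, O, and W to even_spectrum, odd_spectrum, and twiddle_factors and the structure is readable to any programmer; keep the single letters with no comments, and only someone who already knows the math stands a chance.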
You only THINK you had a horrible weekend. You're actually just a personality fragment of the guy still in the deprivation tank from a few posts up.
No, the problem is that EA made a game out of a tool. Granted they picked the best person possible for doing this (Will Wright), but the fact remains that Spore was an R&D testbed for next-generation user and procedural content generation systems, and not a game.
I'm pretty sure that Spore started off as the Creature Editor on someone's desktop at EA and they showed their boss how they'd made "3D Studio Max For 8-Year-Olds." Which granted is a phenomenal achievement. I'll gladly give them credit for making a tool that is an absolute triumph of technology overcoming complexity and making advanced computing accessible to the mainstream. But it's not a game.
So after seeing this fantastic tool, EA called in "that guy who made the awesome tool-based games," i.e., Will Wright, and said "OK, we're going to throw an entire game's worth of money at R&D on these new tools, because they'll let us make all our other games in the future far cheaper (and we can fire all the redundant texture artists and animators). But to justify it we need it to become a game. So your job is to make these tools into a game, so that users will buy it and beta test it before we use it seriously in a real game."
This is why Spore got delayed and delayed and delayed and "reworked because it just isn't fun according to testers." Wright basically promoted it as Sim Earth 2.0, tossed in the rest of his bag of tricks, and I would say did a reasonably good job at making a boring production tool into a fun game. But you can see the rough edges where the game was kinda pasted into the gaps between tools (the city/civ stage is laughably simplistic, for example). And just so people wouldn't feel cheated, they were nice enough to throw an entire mediocre real game (the space stage, a.k.a. Sim Earth 2.0: Sim Earth In Reverse) on top, so it feels like you actually played something instead of just designing content for them.
So gee, think I'm surprised that suddenly they're outsourcing API testing to the mainstream as a "contest" too?
But to be fair, EA sees the production costs of games skyrocketing, and they knew something would have to be done sooner or later. That "something" is likely to be procedural asset generation and user-generated content; everyone in the industry knows asset and content creation is where the time and money bottlenecks are. So producing the modeling, texture, and animation systems in Spore first means they're a generation ahead of their competitors.
P.S. I still love Spore, but I also still wish for a more technical Sim Earth 2.0. At least he was smart to put the hard part (terraforming) at the end this time instead of the beginning like it was in Sim Earth though.
Wow, you mean MS actually has one up on Apple for once? Needy windows (like new IMs) flash on the taskbar a few times and then become the solid "attention orange" if you ignore them. I only wish the name of the user/title of the window on the taskbar icon would change to reflect which one updated last, rather than which one I interacted with last (i.e. sometimes it looks like someone has messaged, but it's really another user's window in the group). I'm not sure if this falls upon the application (Pidgin, Google Talk) or the OS to update that info: either the attention code needs to show the title of the updated window on the taskbar as the representation for the group, or it's actually the application deciding what to show in the taskbar.
Here's my most frequently bitched about UI complaint:
18. Faster access to High Performance power plan
Clicking on the battery flyout from the taskbar notification area offers two different power plans: Balanced and Power saver. Windows 7 laptops are configured by default to use the Balanced plan since this setting best balances a good experience while promoting more environmentally friendly power use. However, some customers tell us they want to be able to quickly toggle between Balanced and High Performance (yet another power plan). We've made a change to now show the latter in the flyout menu when it is enabled under the Power Options Control Panel.
This has been perhaps my biggest complaint (which goes to show you something) about the Win7 beta on my laptop (Acer Aspire 6930). It takes 2 clicks to switch from high performance or power saver to balanced. But to switch from high performance to power saver or vice-versa takes 5, for no good reason. It involves clicking the taskbar icon, opening a window for "more power options", clicking "show additional plans" despite ample room to show the third plan, clicking the selection button, then closing the window. 5 clicks vs 2, because we can't handle a third power choice? I'm glad someone is awake over there.
And here's probably my second most bitched about UI complaint:
33. Reviving familiar entry points
Mando writes, "In Win7 the Win+E shortcut opens an explorer window but the path is 'Libraries' instead (which isn't where I want to go most of the time). Is there a way to configure the target folder of 'Win+E' or is there an alternate shortcut that will get me to the 'Computer' path like it did in Vista?" RC reverts the behavior and now the shortcut will launch the "Computer" Explorer. Also, we changed the link in Start Menu -> Username to match the Vista behavior.
And bonus, here's my most bitched about hardware support complaint, which I mentioned in another slashdot thread a couple days ago:
29. Improving the headphone experience
Customers informed us that sometimes their audio streams did not properly move from the default speakers to their headphones. The fix required an update to the algorithm we use to detect new devices. In RC the transition works more reliably.
Most of the rest of the stuff sounds pretty good too. I'll admit I've been a bit skeptical about this whole deal of pinning things to a taskbar that doubles as the quicklaunch, mostly because I'm used to all my quicklaunch apps being on the left and not having to hunt between open apps to launch a new one. But that Win-# shortcut sounds like it will justify the whole deal for me, so I will withdraw my complaint on it pending testing of that feature.
#1: Acer Aspire 6930 bought on post-xmas sale from Staples. Core 2 Duo T5800, 4GB DDR2 667, 250GB SATA HD, Integrated Intel 4500MHD, Intel 5100 wireless.
Problems: Sometimes audio driver doesn't automatically detect headphones plugged in and switch speaker output to headphone jack. Oh and HDMI audio may have the same issue if turned on while hooked to a TV that's off.
#2: Piece of Junk (literally) desktop. Core 2 Duo E6300 @ 3.63GHz on Asus P5B, 2GB DDR2 1066, ATI HD4850, 400GB SATA HD.
Problems: None.
#3: Toshiba Portege 4010 (so old it came with Windows 2000 installed because XP wasn't even out yet): Intel Pentium III mobile 933MHz Low Voltage, 512MB RAM, 30GB IDE HD, Intel 2200BG wireless, ALi integrated video and MB chipset from hell.
Problems: The newest video driver for the integrated Trident Blade3D (DX7-class) video is circa 2002. Windows 7 build 7000 automatically detects the install issues, retries with compatibility settings, and succeeds. The driver works, except when it tries to create an overlay surface it locks up. This is not a bluescreen; the chipset actually freaks out because it's crap and the driver is badly written. Same issue under XP (which the driver was written for) on this machine. Using the video in SVGA mode solves the crash problem but is too slow for video playback. Fine for browsing and word processing though.
Performance is slow, but usable on a 9-year-old laptop. Checking memory usage with the default install of "Ultimate" edition using Win7's Resource Monitor shows it defaults to using only about 300MB of RAM, leaving 200MB+ free for apps and cache. This is with all the bloated defaults running, like Homegroup services etc. Despite the fact that it's still beta, it fares much better than Vista, and I'd say it's even on par with XP in terms of running within limited resources, while delivering more features than XP.
So yeah, color me impressed. No, it's not going to render Toy Story in realtime on a 386 with EGA while making toast and finding Sarah Connor, but still, that's a decade-old laptop (which means it's a steaming turd of proprietary crap) and Win7 is still usable on it, without a week of fiddling with settings first. Considering MS is talking about "netbook versions" of Win7, I'd say there's definitely a chance of them producing a contender for the lower-spec hardware out there that fares much, much better than Vista did.
Without life, Biology itself would be impossible.