Comment Re:Too late, but not entirely too little (Score 2) 355
Android has an NDK to develop native apps that target the CPU instruction set directly. Unreal Engine for Android isn't written in Java.
Well, the Chinese have managed to design a phone with a screen, dual radios, WiFi, Bluetooth, FM radio, and a dual core Cortex-A9 CPU that can be effectively sold for $40 or so (if you buy it in China, not online). If the Chinese can build a CPU core that's two generations newer into a product with support for 3 radio standards and a screen, and sell it for $5 or so more than the Pi, why is Broadcom struggling with an outdated 12-year-old core on a product with no wireless?
You can buy a dual core Cortex-A9 Android phone in China for about $40, give or take. And that comes with a screen. Sorry, SoCs are dirt cheap these days and the price point isn't an excuse to ship a 12-year-old core (seriously, ARM11 came out in late 2002).
Why are they still shipping the same CPU core that was in the iPhone 2G? ARM is at least 3 generations ahead already. ARM11 doesn't have NEON (proper SIMD) instructions, so it's crap for multimedia processing (sure, they make up for it with their GPU core for the usual codecs, but that doesn't help if you want to write your own code).
Seriously, when the Pi first came out one of my first complaints was that the CPU core was woefully outdated and I already owned several boards with much more recent ARM cores, and several years later they still haven't upgraded it? WTF? What sense does this make? Does Broadcom not have a license for more modern ARM cores? Are the licensing fees too high to ship it in a low-cost product? (Answer: no, plenty of Chinese SoC vendors are doing it). What's the issue?
Twitter measures tweets in characters, not in bytes. Japanese is not 2x denser than English in bytes, but it's 2x denser than English in characters, and that's what Twitter cares about.
Try writing a few tweets (or tweet-sized snippets) in English and in Japanese sometime, see for yourself.
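A quick way to see the character/byte distinction in Python (example strings are mine, just to illustrate):

```python
# Twitter's limit counted characters, not bytes. Japanese packs far more
# meaning per character, even though each character costs more bytes.
s_en = "Hello"       # 5 characters, 5 bytes in UTF-8
s_ja = "こんにちは"  # 5 characters ("hello"), 15 bytes in UTF-8

print(len(s_en), len(s_en.encode("utf-8")))  # 5 5
print(len(s_ja), len(s_ja.encode("utf-8")))  # 5 15
```

Same character count against the 140 limit, but the Japanese string is a complete word on its own.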
140 characters isn't enough
SDKs are useful to investigate and develop homebrew exploits (they provide information on the system architecture), but they are not useful for actually developing homebrew unless you want to end up with a situation like the Xbox 1 (the original) where all homebrew (except for Linux) was basically illegal because compiling it meant using the SDK and the resulting binaries were not legally redistributable. As a counterexample, the Wii has a fully open source homebrew SDK (though some bits have a questionable history and are arguably non-cleanroom reverse-engineered SDK code from games, but that's a much finer point than outright using the official SDK).
Given what I've heard of the Xbox One security architecture, it's going to be a tough nut to crack, SDK or not.
Moving faster causes time to slow down (special relativity), but so does being in a deeper gravitational well (general relativity). As you move away from the Earth, the two effects act in opposite directions (but with unequal magnitudes). I'm too lazy to do the math right now, but here's a walkthrough (for the case of GPS satellites, but the same equations hold; you just need to know the distance from Earth's center to Death Valley and to Mount Everest, and work out their linear velocity from that).
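For the GPS case, the first-order numbers fall out of a few lines of Python (a rough sketch using Newtonian potentials and approximate textbook constants, not an authoritative derivation):

```python
# Two competing time-dilation effects for a GPS satellite clock,
# to first order. All constants are approximate.
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
c = 2.998e8          # speed of light, m/s
r_earth = 6.371e6    # mean Earth radius, m
r_gps = 2.656e7      # GPS orbital radius, m

# Circular-orbit velocity: v = sqrt(GM/r)
v = (GM / r_gps) ** 0.5

# General relativity: the satellite sits higher in the potential
# well, so its clock runs FASTER relative to the ground.
grav = (-GM / r_gps + GM / r_earth) / c**2

# Special relativity: the satellite is moving, so its clock runs SLOWER.
vel = -v**2 / (2 * c**2)

day = 86400
print(grav * day * 1e6)          # ~ +45.7 microseconds/day
print(vel * day * 1e6)           # ~ -7.2 microseconds/day
print((grav + vel) * day * 1e6)  # net ~ +38.5 microseconds/day
```

For GPS the gravitational term wins; for Death Valley vs. Everest you'd plug in their radii and Earth-rotation velocities instead.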
Um, no, x86 CPUs are nothing like ARM and I'm not aware of any commercial x86 CPU with an ARM backend. Yes, modern x86 cores use a RISC-ish microcode backend with an x86 decoder frontend, but that doesn't say anything in favor of ARM. All it means is that the industry has collectively agreed that CISC as a microarchitecture is a stupid idea - not necessarily as an instruction set.
I'm not a fan of x86 myself, and I think it's a stupid design with a vast amount of baggage causing a significant power/performance impact when designing an x86 CPU (that Intel can get away with because they're a generation or two ahead of everyone else in silicon tech), but then again ARM isn't the pinnacle of RISC either (though I do think it's better than x86).
Me, I'll take whatever microarch gets the best performance per watt at whatever TDP is relevant. If Intel can pull that off with x86-64, by all means. If ARM AArch64 ends up ahead, awesome. If both are about equal, I'll take whatever's more practical based on other factors.
And since this is a camera passthrough, not an optical overlay, that's a glaring implementation flaw. Properly aligning the head tracking framerate, camera framerate, and rendering would let them render the virtual objects in lockstep with the physical ones (at least at speeds where motion blur isn't a significant issue; you can fake that by minimizing motion blur in the real image by using a short shutter time on the cameras).
So, they're locking out things that can brick the card (flash ROM/fuses, screw up thermal sensors) and apparently a hint of OS security (the Falcons that respond to userspace commands can no longer access physical memory, only virtual memory). The latter sounds somewhat bizarre, considering the firmware should be fully under the control of the driver, not userspace (I guess/hope?), but not unreasonable. Maybe there are software security reasons for this.
Nouveau is free to continue using its own free blobs or to switch to nvidia's. If they start adding restrictions that actively cripple useful features or are DRM nonsense, then I would start complaining, but so far it sounds like an attempt at protecting the hardware while maintaining manufacturing flexibility for nvidia. This isn't much different from devices which are fused at the factory with thermal parameters and with some units disabled; the only difference is that here firmware is involved.
NV seem to be turning friendlier towards nouveau, so I'd give them the benefit of the doubt. If they wanted to be evil, they would've just required signed firmware for the card to function at all. The fact that they're bothering to have non-secure modes and are only locking out very specific features suggests they're actively trying to play nicely with open source software.
Not 2^16 (Unicode already has way over 2^16 codepoints assigned). The maximum Unicode codepoint value is 1114111 (0x10FFFF), which is somewhat over 2^20 (and happens to be the highest codepoint encodable in UTF-16).
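Python exposes exactly this ceiling, so it's easy to check:

```python
import sys

# Python's codepoint limit matches Unicode's: U+10FFFF == 1114111
print(sys.maxunicode)         # 1114111
print(hex(sys.maxunicode))    # 0x10ffff
print(sys.maxunicode > 2**16, sys.maxunicode > 2**20)  # True True

# The highest codepoint needs a surrogate pair in UTF-16:
top = chr(0x10FFFF)
print(len(top.encode("utf-16-be")))  # 4 bytes = two 16-bit code units
```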
It's 2Ah, so charging it in the claimed ~30 seconds means 240A.
Now, it could be that their battery runs at a higher voltage (and thus isn't really 2Ah, but they're quoting that figure as a 3.7V Li-Ion-equivalent capacity), or that there is a power converter built into the battery pack (unlikely for a prototype, though). Still, even for a 37V battery (vs. 3.7V for a normal Li-Ion cell), we're talking 24A. That cord didn't look like a 24A cord, and I highly doubt they were using a voltage higher than 37V to charge (especially not with exposed banana jacks like that).
I call the demo highly dubious if not an outright fake/mock.
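The back-of-the-envelope arithmetic, assuming the claimed ~30-second charge and the stated 2Ah capacity:

```python
# Average charge current = capacity / charge time.
capacity_ah = 2.0       # claimed pack capacity, amp-hours
charge_time_s = 30.0    # claimed charge time, seconds

current_a = capacity_ah / (charge_time_s / 3600)
print(current_a)  # 240.0 A at the nominal 3.7 V

# If it were really a 37 V (10-cell-series) pack with the same energy,
# the 3.7V-equivalent capacity drops 10x, and so does the current:
current_37v = (capacity_ah / 10) / (charge_time_s / 3600)
print(current_37v)  # 24.0 A -- still far too much for that thin cord
```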
Sorry for the threadjack, but this is yet another case of horrible security reporting.
From watching the video, what seems to have happened here is that eBay chose phpBB for their community forum, but did not integrate its authentication system directly with eBay's on the server side. Instead, the site was set up as a standalone system, and whoever implemented the integration had the bright idea of hardcoding the forum password for everyone as username+123456, and then having the eBay login page issue a hidden POST request behind the scenes to authenticate users to the community forum section.
Thus, this allows anyone to trivially impersonate anyone else on the forum. It shouldn't have anything to do with the rest of the site, though. Nor does this have anything to do with initial passwords, salts, or any of the other terms that have been thrown around.
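A minimal sketch of the flaw as reported (function names are mine; the suffix is the one described):

```python
# The forum password is derived purely from the PUBLIC username,
# so knowing someone's username is enough to log in as them.
def forum_password(username: str) -> str:
    return username + "123456"   # hardcoded suffix per the report

# The eBay login page then fires a hidden POST with these credentials;
# an attacker can construct the same payload with nothing secret:
def forum_login_payload(username: str) -> dict:
    return {"username": username, "password": forum_password(username)}

print(forum_login_payload("victim_user"))
```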
A case of absolutely braindead login integration for the community site, but not something that would allow people to take over others' main eBay accounts. What this says about the people running eBay is another matter entirely...
Real Users never use the Help key.