
Comment Re: "Destroyed" is such a harsh term... (Score 1) 77

plus, "JTAG" is very implementation-dependent. AFAIK, for example, you can't use an Atmel JTAG-Ice200 to reflash anything besides specific Atmel MCUs. Want to reflash a PIC? You need to buy yet another proprietary JTAG programmer. And so on, for every common microcontroller family. And if some third-party (Keil?) DOES make a multi-platform JTAG, it will be (a) mind-blowingly expensive (like almost every 'pro-grade' tool for non-hobby embedded development), (b) suck, or (c) both.

Comment Re: 24 cans (Score 1) 215

If you consumed 24 cans of Diet Mtn Dew or Diet Pepsi per day, you'd spend the NEXT day with horrific, explosive diarrhea and a pounding caffeine-withdrawal headache... assuming you didn't die from cardiac arrest first. At the VERY least, the cardiac arrhythmia would be pretty unpleasant. 24 cans contain about 1.2 GRAMS of caffeine (24 cans * ~50mg/can = ~1,200mg).

Comment Re: Neat--until... (Score 2) 218

Though they weren't really relevant to PCs or Windows, the Coldfire chips are a good example of this kind of design change. Although they were marketed as having an m68k heritage, they basically took away most of the instructions and addressing modes that made the original 680x0 so incredibly convenient to program in assembly language.

RISC processors were developed to be efficient and cheap. The m68k was developed to be convenient for assembly-language programmers. The 680x0 family indulged programmers in ways that would be almost *inconceivable* today (it even had instructions for manipulating binary-coded decimal... they weren't terribly useful on computers like an Amiga, ST, or Mac, but apparently were a Very Big Deal(tm) back when programmers routinely had to deal with legacy BCD-encoded data from mainframes). Being able to directly manipulate BCD values meant not having to go through the trouble of converting them to and from 8/16/32-bit binary values first.

Examples of things Coldfire took away:

* the "decrement and branch conditionally" instructions. Sure, behind the scenes, they were basically two simpler instructions automatically glued together and executed back to back from a single opcode... but damn, they were nice to have.

* most of the immediate addressing modes not involving a register as the source or destination. On a 680x0, you could stuff a specific byte value into an arbitrary memory location by doing something like, "MOVE.B #$69, $dff000" (storing hex 0x69 in address 0xdff000 in a single gulp). On a Coldfire, you have to load-then-store (load $69 into a data register, then store that register's value at the desired target address).

* and of course, all the BCD-related instructions (ok, losing THEM didn't really bother me much, as you probably guessed... but I probably would have loved them if I'd been born about 10 years earlier).

Comment Re: Nope. Bought a Nexus years ago; disappointed. (Score 1) 119

OK, fine. Try this: name one real phone available for purchase today by end users with the following features:

* full-speed compatibility with at least one American phone network. This is a hard one, because thanks to bastardized American LTE, even our nominally-GSM carriers have become as de-facto proprietary as Sprint & Verizon.

* 2GHz+ CPU, 3+ gigs of RAM, and 64+ gigs of fast flash. Bonus points for microSD, removable battery, and/or the ability to charge quickly.

* 2160x1440 or better display.

* Released with all the source code, build scripts, and documentation necessary for knowledgeable end users to independently implement support for later releases of Android, even without the active blessing or cooperation of the vendor.

The problem isn't that Linux EVER breaks binary compatibility... it's the fact that it routinely and casually breaks binary compatibility up, down, left, right, diagonally, and with "three snaps in 'Z' formation" with every single new build (let alone version).

The fact is, end users are powerless to exert any kind of meaningful market influence or economic pressure over Qualcomm, because they have a de-facto monopoly over American LTE. If you want full-speed LTE on an American network, it's basically "Qualcomm or nothing". At least if we had some degree of meaningful binary kernel module compatibility, we could limp along with the original binary drivers when a new version of Android gets released and the phone's manufacturer has abandoned it because it's no longer a current model.

Comment Re: Nope. Bought a Nexus years ago; disappointed. (Score 1) 119

> Seemed to contradict yourself there, son.

Not really. There's no contradiction between, "they have the institutional knowledge and resources to do it" and "Google's management isn't interested in dedicating their best senior developers for several months to take leadership of Android's binary kernel-driver problem".

The fact is, if it weren't for Android, Linux's device driver issues would be mostly irrelevant, because they'd meaningfully affect *maybe* a few thousand actual users. Google is the entire reason why roughly 97% of the Linux-running devices on earth run Linux at all, and it's high time they took responsibility and assumed leadership for fixing its driver and bootloader mess for the sake of Android's own users. Because god knows, they're just about the only ones in a position to actually DO it. GNU would rather have everyone rot in Tivo-ized hardware hell for all eternity than concede defeat to Qualcomm for the sake of empowering end users to make the best of a situation with only bad and worse alternatives.

The biggest single problem with Android phones and tablets is the fact that, with the POSSIBLE exception of Intel-based Chinese devices capable of dual-booting Windows and Android, there's no direct equivalent to a PC BIOS (and even with dual-boot devices, it's iffy). On a PC, there are well-defined universal standards for making an operating system bootable from fixed and removable storage media, and they've evolved in compatible ways since the 1980s. Everyone agrees upon where the boot sector goes, where in RAM it should be loaded, and how it should be interpreted during the first moments after powering on the device. With Android devices, there's no such thing... every single vendor does it differently, and most of them take advantage of the opportunity to lock down the device and exercise control over the owner's experience long after its purchase by the end user.
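
For contrast, the PC-side boot contract is simple enough to sketch in a few lines of C. The struct and field names below are mine, but the offsets and the 0xAA55 signature are the classic MBR layout every BIOS has honored since the 1980s: the firmware reads the first 512-byte sector into RAM at 0x7C00 and jumps to it if the signature checks out.

#include <stdint.h>

#pragma pack(push, 1)
struct mbr_partition_entry {
    uint8_t  status;          /* 0x80 = bootable, 0x00 = inactive */
    uint8_t  chs_first[3];    /* legacy cylinder/head/sector of the first sector */
    uint8_t  type;            /* partition type ID (0x83 = Linux, 0x0C = FAT32 LBA, ...) */
    uint8_t  chs_last[3];
    uint32_t lba_first;       /* first sector, linear (LBA) addressing */
    uint32_t sector_count;
};

struct master_boot_record {
    uint8_t  bootstrap[446];             /* real-mode boot code */
    struct mbr_partition_entry part[4];  /* the fixed four-slot partition table */
    uint16_t signature;                  /* 0xAA55 (stored little-endian on disk) */
};
#pragma pack(pop)

_Static_assert(sizeof(struct master_boot_record) == 512,
               "an MBR is exactly one 512-byte sector");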

Comment Re: where's the PC of Mobile Computing? (Score 1) 119

Actually, I developed software for Windows Mobile for work back around 2006-2008. ;-)

One feature I really, really miss from dotnetCF -- it didn't force you to bend over backwards and write explicitly-asynchronous code when you were trying to implement some blatantly-linear activity (like "display a form on the screen", "submit its contents to a server and wait for the response", "deal with its response", "display the next form", "submit its contents to a server and wait for the response", and so on). You could literally just wrap it in a dotnetCF class that allowed you to write it as a faux single-threaded sequential task, and let dotnetCF itself do the UI thread-juggling for you. It wasn't the OPTIMAL way to write an app like that, but it made implementing a simple sequence of submitted forms absurdly easy to do. I wrote my first server-submitting dotnetCF app in about 3 hours (most of which was spent reading a few chapters of a book)... I think my first Android app that did something comparable took the better part of a week to write. It amazed me how complicated Android managed to make things that were absolutely TRIVIAL to do with Windows Mobile.

The biggest single weakness of WinMo was the fact that it was LITERALLY impossible to develop a custom WinMo 5 or 6 "phone/dialer app" using dotnet compact framework... you could only do it in C, using (semi-)private APIs with minimal documentation and no example code to speak of. But from what I recall, that was actually one of the new features that were supposed to be in Windows Mobile 7.0 (before Microsoft abandoned it).

Comment Re: Nope. Bought a Nexus years ago; disappointed. (Score 1) 119

Another major annoyance: no Android phones -- not even NEXUS phones -- allow you to use the stock ROM as a STARTING POINT for further modifications (by furnishing a build script with complete source and any binary blobs required to build the stock ROM). Instead, you're forced to throw the baby out with the bathwater and reimplement the phone's functionality in its entirety (since AOSP itself usually has major stock features missing, even for a Nexus).

For YEARS, I've been wanting to make a slightly-modified kernel that acts like you have the display orientation set to "manual", BUT reads the accelerometer and sets the orientation ONE TIME immediately after the user toggles the display off and on by pressing the power button twice. Basically, offering a compromise between "auto" and "manual" -- "semi-automatic". Toggling the power button twice is a quick & easy gesture, and IMHO setting the orientation immediately (but ONLY) after turning on the display is just common sense. It blows my mind that anyone at Google thinks the way auto-orientation has worked ever since we lost slide-out keyboards is actually acceptable. At the very least, Google's auto-orientation-setting routine should have enough logic to notice the user violently rotating the phone (and maybe listen for an angry "God DAMN it!") in the immediate aftermath of an auto-orientation change... especially when the phone's display is angled downwards (ie, the user is lying in bed holding the phone above his face).
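
A minimal C sketch of that "read it once, then hold it" logic is below. read_gravity() and set_rotation() are hypothetical stand-ins for whatever sensor/display hooks the port would actually use, and the axis sign conventions depend on the sensor's coordinate frame -- the point is just that the accelerometer gets sampled exactly once per screen-on.

#include <math.h>

enum rotation { ROT_0, ROT_90, ROT_180, ROT_270 };

/* Hypothetical platform hooks -- stand-ins, not real Android/kernel APIs. */
extern void read_gravity(float *gx, float *gy, float *gz);
extern void set_rotation(enum rotation r);

/* Pick a rotation from the gravity vector in the device's X/Y plane. */
static enum rotation rotation_from_gravity(float gx, float gy)
{
    if (fabsf(gx) > fabsf(gy))              /* gravity mostly along X: landscape */
        return (gx > 0.0f) ? ROT_90 : ROT_270;
    return (gy > 0.0f) ? ROT_0 : ROT_180;   /* gravity mostly along Y: portrait */
}

/* Called once, right after the double power-button press turns the screen on. */
void on_screen_turned_on(void)
{
    float gx, gy, gz;
    read_gravity(&gx, &gy, &gz);            /* one accelerometer sample, and only one */
    set_rotation(rotation_from_gravity(gx, gy));
    /* No further rotation changes until the screen is cycled off and on again. */
}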

I'd also love to implement what I call "CrashCam Mode" -- Crash, as in "you see a jet about to crash (or some other newsworthy event, like a police officer beating the crap out of a 95 year old woman in a wheelchair) and only have a second or two before it'll be too late to film the million-dollar video for CNN". Basically, if the user presses the power button four+ times within 400ms, instantly disable autofocus, set focus to infinity, and start capturing video at the maximum resolution and framerate while launching the camera app itself. For good measure, if the camera supported 120fps, you could have the odd frames set to an exposure suitable for either indoor lighting or morning/late-afternoon daylight, then alternate the even frames between under-exposed and over-exposed (to ensure that you'd end up with at least 30fps of usable video if the lighting were really dark or bright).
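
The trigger itself is just press-counting inside a time window. Here's a rough user-space C sketch using the Linux input-event interface; the /dev/input/event0 path is only an example (the power button's event node varies per phone), and trigger_crashcam() is a hypothetical placeholder for the camera side (lock focus at infinity, start max-res capture).

#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

extern void trigger_crashcam(void);   /* hypothetical: the actual camera work */

int main(void)
{
    int fd = open("/dev/input/event0", O_RDONLY);   /* example path only */
    if (fd < 0) { perror("open"); return 1; }

    long long window_start_ms = 0;
    int presses = 0;
    struct input_event ev;

    while (read(fd, &ev, sizeof ev) == (ssize_t)sizeof ev) {
        /* only care about power-key *press* events */
        if (ev.type != EV_KEY || ev.code != KEY_POWER || ev.value != 1)
            continue;

        long long now_ms = (long long)ev.time.tv_sec * 1000 + ev.time.tv_usec / 1000;
        if (presses == 0 || now_ms - window_start_ms > 400) {
            window_start_ms = now_ms;       /* start a fresh 400 ms window */
            presses = 0;
        }
        if (++presses >= 4) {               /* 4+ presses inside the window: go */
            trigger_crashcam();
            presses = 0;
        }
    }
    return 0;
}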

Oh... and I'd also remove the 911 emergency-dialer-without-unlock that seems to be the new norm, and make it impossible for the dialer screen to activate unless I've either fingerprint-unlocked the phone, or done a complex gesture like the Donut/Eclair/Froyo-era "deliberately slide the dot along a precise arc to unlock". Frankly, I'm more likely to die from an aneurysm in a moment of rage after hearing DTMF tones coming from my pocket (when the phone is SUPPOSED to be locked) than I am to die because some random onlooker couldn't use my phone (instead of their own) to call 911.

Comment Re: Nope. Bought a Nexus years ago; disappointed. (Score 5, Insightful) 119

The difference is, "desktop" Windows has historically given us compatibility with drivers written for older versions (sometimes as old as NT4) -- imaging drivers being the one notable exception, due to TWAIN's brain-dead pre-WDM architecture.

In contrast, Linux only abstracts its ABI for *applications*, not the kernel itself. For example, suppose I have a 4.10.10 kernel compiled for AMD64 using gcc, and a loadable kernel module built for that kernel. Now, suppose I have an identical computer running a 4.10.10 AMD64 kernel compiled with clang (just to give another widely-used compiler as an example). In most cases, the .ko file built for the "gcc" kernel will die a horrible death on the "clang" kernel... or possibly even on another 4.10.10 kernel compiled with gcc using slightly different options.

Basically, Linux doesn't even *try* to maintain driver binary compatibility, even within THE SAME KERNEL VERSION, while Windows bends over backwards to maintain driver compatibility more or less "forever". AFAIK, it's an ideological decision... Linux's developers *want* to punish users of binary drivers & inflict the maximum possible pain, totally ignoring the reality that end users (or at least, users of cell phones capable of doing LTE on American mobile phone networks) have ZERO influence on Qualcomm or Nvidia's licensing policies... ironically, empowering VENDORS over end users in the process.

Riddle me this: why could Linux use binary wifi drivers built for fsck'ng WINDOWS (via NDISwrapper), but can't even maintain binary compatibility between two sequential kernel releases with only minor differences? It's insane. I don't even blame Linus... I blame Google. Google has some of the best Linux kernel experts on planet earth. They could EASILY add an abstraction layer that preserved binary .ko compatibility across at least a few releases (think: a stable, open-source thunking layer that Android-certified drivers were required to use instead of directly referencing kernel structures... new release of Android? Just compile a new thunking layer for old binary drivers to use instead.)
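
To make that concrete: here's a minimal sketch of what such a shim could look like. All the andk_* names are invented for illustration -- nothing here is an existing Linux or Android API. The idea is simply that a binary driver only ever calls through a versioned ops table, so only the kernel-side implementation of that table has to be rebuilt for each new kernel or Android release.

#include <stddef.h>
#include <stdint.h>

#define ANDK_ABI_VERSION 3u            /* bumped only when this table's layout changes */

struct andk_ops {
    uint32_t abi_version;              /* checked by the driver at load time */

    /* memory + logging, wrapping whatever the current kernel provides */
    void *(*alloc)(size_t len);
    void  (*free)(void *ptr);
    void  (*log)(const char *fmt, ...);

    /* example device-class hook: register a network interface */
    int   (*register_netdev)(const char *name,
                             int (*xmit)(void *ctx, const void *frame, size_t len),
                             void *ctx);
};

/* Implemented (and recompiled) by the shim for every kernel release; returns
 * NULL if the requested ABI version can no longer be honored. */
const struct andk_ops *andk_get_ops(uint32_t wanted_version);

/* A binary-only driver's entry point would then look roughly like this: */
static const struct andk_ops *ops;

int example_driver_init(void)
{
    ops = andk_get_ops(ANDK_ABI_VERSION);
    if (!ops)
        return -1;                     /* shim too old/new for this driver */
    ops->log("example driver loaded against shim ABI %u\n", ops->abi_version);
    return 0;
}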

Comment Re: Flash another ROM (Score 3, Informative) 119

It might be a Verizon S4... VZW takes bootloader-locking 'evil' to creative new heights (lows?).

Apparently, when the Note 4 came out, Verizon actually paid Samsung extra to protect the Sprint version's bootloader the same way (Sprint itself was indifferent), just to make sure there wouldn't be another CDMA model with an easy-to-unlock bootloader. From what I recall, the Verizon model of one of Samsung's earlier phones could be cracked by flashing a Sprint bootloader to the Verizon phone... it temporarily bricked the phone (or at least disabled the radio modem), but you could then unlock the easy-to-unlock Sprint bootloader and reflash it with a second bootloader -- basically a Sprint Android bootloader with the Verizon radio modem firmware ripped in -- to end up with a working, bootloader-unlocked Verizon phone. Verizon was determined to keep that from happening again.

Comment Re: where's the PC of Mobile Computing? (Score 1) 119

We were well on our way towards getting it until Microsoft decided to kill off Windows Mobile for a replacement that was inferior to, and 2+ years behind, every other mobile platform at the time (instead of 5+ years ahead).

If "Windows Phone" (a/k/a Danger Sidekick OS, ported to C# & dotnet compact framework) had never existed & Samsung's latest & greatest phone today ran Hypothetical Windows Mobile 14 instead, upgrading your old Hypothetical WinMo 12 device to WinMo 14 would be like upgrading an older PC or laptop to a newer version of Windows... some new driver .dll files copied from a newer phone (or generic reference drivers downloaded from Qualcomm or Nvidia), and you'd be set.

WinMo 5 & 6 were ugly as sin out of the box, but the core OS itself was generally good, and had capabilities that were YEARS ahead of anything about to be released by Apple or Google. That's a big part of the reason why Microsoft makes about ~$14 in royalties for every new Android & Apple device sold... Android might have made the technology pretty, and Apple might have made it usable by nontechnical people, but MICROSOFT was the one who first delivered it as a working product to YEARS before an iPhone or Android was even a "thing".

RIP Windows Mobile.

Comment Re: Okay, but someone wrote the algorithm (Score 1) 388

Exactly. Imagine taking a snapshot of an Android device's RAM & using it to attempt reverse-engineering a running app without access to the .apk file used to install it... by reading the bare ARM machine code that ART JIT-compiled from the app's .dex bytecode (which was itself compiled from Java). Without assistance from an app like Ida Pro (which is somewhere between "AI" and "black magic" to begin with), it's basically impossible. Computers can grok 700 levels of recursion & dereferencing. Humans max out after a dozen or two (usually more like 6-9 levels).

Comment Re: My list? (Score 1) 467

The high point of PC joysticks came when they moved the reading logic into a microcontroller inside the stick, but kept using the old SB joystick port as a MIDI-speed serial port. Near-instant response, without soaking up 5-10% of the CPU just to repeatedly poll the goddamn USB port to ask the gamepad, "do you have anything to tell me?" hundreds of times per second.

A pox on the bastards at Intel who decided USB should be 100% PIO. In theory, USB 3.0 added an IRQ line, but AFAIK, it's still generally ignored & unused by HID drivers & devices.

Comment Re: Being able to understand the whole stack (Score 1) 467

Another double whammy with modern documentation: extraordinarily poor use of screen space, and no real "editing" to speak of. Like Google's Android docs, with oceans of empty whitespace around maybe 12-16 lines of actual text content, half the screen's width consumed by sidebars, and hyperlinks to hyperlinks to still-more hyperlinks (OK for reference, but awful for learning how to do something for the first time, when what you *really* need is a coherent & complete start-to-finish explanation of the topic).

Comment Re: Th best of days (Score 1) 467

Believe it or not, for a brief period circa 1997, installing ActivePerl on a Windows PC enabled PerlScript as a first-class IE4 scripting language, equal to JavaScript & VBScript. Except its sandbox was shockingly broken, so ActiveState disabled PerlScript a few months later (though, for a few years, you could STILL enable it by inserting a registry key along the lines of "I_AM_TRULY_INSANE" = 1).
