Comment: Re:Certainly it increased the price of the handset (Score 1) 62

by Todd Knarr (#49119941) Attached to: Antitrust Case Against Google Thrown Out of SF Court

Kinda-sorta. They can use Google's services, and pick and choose which ones. That doesn't require Google Play Services or Google's apps. Google publishes the APIs for all its services, and anyone's free to get a developer account, generate API keys and create their own apps that access Google's services (as long as they don't abuse the services, of course). What they can't do is preload Play Services and/or Google's apps (which are copyrighted and not open-source) without an agreement with Google, which is likely to require that Google's core apps be the defaults.

And frankly, any suit by the OEMs like you suggest would die on the first motion to dismiss, since the locked bootloaders and locked apps are the OEMs' and carriers' choice, not Google's. Google's quite happy with unlocked bootloaders and apps the end-user can uninstall at will; it's usually the carriers who don't want their profitable deals disrupted by users removing the relevant apps.

As for forced bundling, they'd have to restrict the suit to just the all-or-nothing bundling of the apps, without regard to Android itself, and that's likely to fail on the grounds that none of the apps have monopolies (there are far too many alternatives to GMail, for instance). If they try to use Android as the underlying monopoly, the suit will fail on a motion to dismiss, since the OEMs aren't required to bundle any of the apps as a condition of being able to use Android.

The main issue is that the OEMs and carriers want to be able to sell access to the very lucrative search box on the phone to the highest bidder while still using Google's apps for all the other things that users expect access to like their contacts and e-mail, but Google's set terms that force them to do all their own work if they want to do that.

Comment: Certainly it increased the price of the handsets (Score 2) 62

by Todd Knarr (#49102485) Attached to: Antitrust Case Against Google Thrown Out of SF Court

Certainly having those services included increased the price of the handset. But the same could be said for any of the software the carriers include on their phones (and usually prevent you from removing). Ditto even for hardware, having a camera or WiFi on the phone increases the price of the handset.

None of which matters. The question in an antitrust case isn't whether it increased the price, it's whether Google used its control of the Android OS to force vendors to include other Google services as a condition of using Android at all. And the answer to that question is no, Google doesn't do that. Amazon's phone runs Android without having Google services pre-installed (although you can install them yourself). The Kindles are a really good example: they run Android without any of the Google stuff at all, and there isn't even a way of installing the Google services. Several Chinese companies make Android phones without Google services. I even have that with my Galaxy S4: I flashed it with CyanogenMod, so I start out without any of the Google apps or services and have to bootstrap an installation of them myself if I want them. The only downside is that if you don't accept Google's terms for officially using Google Play Services and their apps, you can't use the related trademarks for much except referring to those apps.

So no, on antitrust grounds Google's OK, because they simply aren't using their monopoly on Android (although technically they don't even have a monopoly; see CyanogenMod and AOSP) to force other Google services on manufacturers, carriers or users. In fact Google isn't even being as strict as they legally could be. They'd be within their rights to deny any use of the Google services and apps except where the vendor had the full license, but Google doesn't go that far, because they realize it'd be a) stupid, since it would annoy users who'd then shy away from Google completely, and b) not in their best interests, since it'd prevent Google customers from using Google services, which would reduce Google's revenue. So all they do is say "You want to use our logos and brands and have access to all the official tools? You need to take our package. Otherwise, you'll have to install things by hand like anybody else." (Well, not completely by hand, since once they've done it once they can just clone the firmware image and flash it straight into the phones.)

Comment: Re:Give it a rest (Score 2) 755

by Todd Knarr (#49064119) Attached to: Removing Libsystemd0 From a Live-running Debian System

The people making that decision don't own and operate my servers, so they're not qualified to make that decision for me. I depend on those servers. I do not want them dependent on software that hasn't been in production service for long enough to have all the issues wrung out (and there are always issues when new software goes into production, I don't care how much the dev team may wish otherwise). I'll look at systemd late this summer and see how it's shaking out, and make any decision about adopting it next fall at the earliest.

Comment: Check your HR screening process (Score 1) 809

by Todd Knarr (#49050223) Attached to: Ask Slashdot: What Portion of Developers Are Bad At What They Do?

Seriously, check how HR or your agency is screening applicants. I've found that too many of them do keyword-based screening, throwing out any applicant who doesn't have exactly the keywords on their resume that you put in the job description. That can filter out the good candidates with broad backgrounds in favor of job-hopping contract people with the canonical "1 year of experience repeated 10 times", who know how to put the right keywords in to pass the screening. Have HR give you all the resumes they received, have one of your guys sort them to remove the ones who clearly don't have any relevant experience, then compare what's left to what HR thought passed screening.
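To make the failure mode concrete, here's a minimal sketch of the kind of all-or-nothing keyword filter being described (the function and the sample strings are hypothetical). Note how a candidate with a clearly relevant background still fails the screen if the literal keyword strings don't appear:

```c
#include <stddef.h>
#include <string.h>

/* Hypothetical all-or-nothing screen: reject unless every keyword
 * from the job description appears verbatim in the resume text. */
int passes_screen(const char *resume, const char **keywords, size_t n) {
    for (size_t i = 0; i < n; i++)
        if (strstr(resume, keywords[i]) == NULL)
            return 0;   /* one missing keyword -> rejected */
    return 1;
}
```

A resume reading "broad JVM background: J2EE, Hibernate, servlets" fails a screen keyed on the literal strings "Java" and "Spring", while a thin resume that parrots the job description passes.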

Comment: That's because of "Goto Considered Harmful" (Score 1) 677

by Todd Knarr (#49039723) Attached to: Empirical Study On How C Devs Use Goto In Practice Says "Not Harmful"

That result's because programmers got the ideas in "Goto Considered Harmful" pounded through their skulls while they were learning, and handled it like they would dynamite: it's very effective and the best tool for certain jobs, but it's also very dangerous and capable of causing a ton of damage so you should handle it with an abundance of caution. tl;dr: "Use it to crack huge boulders and tree-stumps, not to loosen bolts."
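A minimal sketch of the "dynamite" usage the study found in practice: goto as a single forward jump to one cleanup label for error handling in C (the function name and resources here are hypothetical):

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical example of the disciplined pattern: every error path
 * jumps forward to one cleanup label, so resources are released in
 * exactly one place no matter where the function bails out. */
int process_file(const char *path) {
    int rc = -1;              /* assume failure until everything succeeds */
    FILE *f = NULL;
    char *buf = NULL;

    f = fopen(path, "r");
    if (!f) goto cleanup;

    buf = malloc(4096);
    if (!buf) goto cleanup;

    if (fread(buf, 1, 4096, f) == 0 && ferror(f)) goto cleanup;

    rc = 0;                   /* success */

cleanup:                      /* single exit point frees everything */
    free(buf);                /* free(NULL) is a harmless no-op */
    if (f) fclose(f);
    return rc;
}
```

The jumps only go forward, to a single label; that restraint is what keeps the pattern readable. Backward jumps are the stick of dynamite pointed at your own foot.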

Comment: Attractive proposition (Score 3, Interesting) 288

by Todd Knarr (#49023789) Attached to: Quantum Equation Suggests Universe Had No Beginning

Equations and theories that not only explain current observations but also bundle up and deal with the things our other theories say we should observe but don't are attractive from a neatness standpoint. I'm skeptical when they make exotic and complex predictions we haven't seen any evidence of yet, but when they tie up all the loose ends without creating more, I usually take that as a sign there's something fundamentally right about that path. Only time and accumulated evidence will add certainty to it, but I like the ideas in this one.

And as far as a universe with no beginning or end is concerned, what's the problem? I was dealing with infinite open shapes (lines, planes) in grade school, unending closed shapes are trivial (a circle, a sphere), and if you assume our universe is a 4-dimensional "slice" of an n-dimensional space it's not that hard to construct an arrangement where you can travel forever in any "direction" (since the time axis counts as a direction here) inside our universe without either encountering an edge or returning to your starting point. The math's brain-bending when you start, but it's like differential equations: migraine-inducing, and you hate it with the burning fire of a thousand suns right up until they describe the General Method, at which point you blink and go "Oh. That's easy. Why didn't you mention this in the FIRST PLACE?!"
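A purely illustrative low-dimensional version of that construction: embed a one-dimensional "universe" as a helix in 3-space,

```latex
% A helix: its projection onto the (x,y)-plane is a closed circle,
% yet the curve itself never revisits a point.
\gamma(t) = (\cos t,\; \sin t,\; t), \qquad t \in (-\infty,\, \infty)
```

Locally the path looks periodic, but $\gamma(t_1) = \gamma(t_2)$ forces $t_1 = t_2$ because of the third coordinate, so you can travel forever without hitting an edge or returning to your starting point. The same trick scales up to a 4-dimensional slice winding through a higher-dimensional space.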

Comment: Google Glass (Score 1) 458

by Todd Knarr (#48948571) Attached to: How, and Why, Apple Overtook Microsoft

The next thing along those lines will be Google Glass. Probably not in the exact form of Google's device, but a box in your pocket with all the electronics of a modern smartphone, using a Bluetooth-connected set of glasses (with a mic and headphones incorporated) for display. Not just a small prism: the breakthrough will come when it can use the entirety of both lenses for a coordinated display overlaying the wearer's field of vision, using pupil-tracking to identify exactly where the wearer's looking. Text input would be via voice recognition. We're fairly close to that; the display is the part of the tech where we still need work.

Comment: Re:not the point (Score 1) 375

by Todd Knarr (#48925283) Attached to: Why Screen Lockers On X11 Cannot Be Secure

You download a program that appears legit (and may be mostly legit, or be a hacked version of a legit program), and are running it.

But why would I do that? Almost all the programs I use come from the repository, and to get me to download one they'd have to compromise the repository first (which is possible, but not nearly as easy as just advertising a program for download). The rest are again ones I download from known sources, usually the developers' own official site, and again it's not trivial to compromise those sites.

The situation you propose only happens in the world of Windows where downloading random software from untrusted/unknown sources is routine. And if you're routinely doing that, you've got more problems than just a way to bypass the screen lock. The best way to avoid shooting yourself in the foot is to not blithely follow instructions but to stop and ask "Wait a minute, why are they asking me to aim a loaded gun at my foot and pull the trigger?". And if after pondering that question you still think following the instructions is a good idea, please report to HR for reassignment as reactor shielding.

Comment: Re:He's Not Justifying Retribution (Score 1) 894

by Todd Knarr (#48823983) Attached to: Pope Francis: There Are Limits To Freedom of Expression

Sure, if someone curses his mother, they shouldn't be surprised if he slugs them. However, note that if the police get involved it would be the Pope going to jail and being charged with battery, not the person who cursed his mother. You may be expected to have enough self-control not to curse like that, but you're also expected to have enough self-control not to respond to ordinary words with physical violence.

Comment: Both are correct (Score 1) 249

by Todd Knarr (#48794037) Attached to: Education Debate: Which Is More Important - Grit, Or Intelligence?

The way "intelligence" is used falls more under the heading of what I'd call "the skills you have". Some are innate physical abilities; many are probably learned, but we don't really know when or how, so they end up just being things that naturally come easy to you. They're the hand you're dealt. Grit and persistence are useful in making the most of the skills you have, practicing and refining them to get the most out of that hand. Both are needed.

We all know people who just don't get math, or have bad hand-eye coordination, or other things they're just bad at that pretty much preclude them being theoretical physicists or world-class tennis players and so on, no matter how much they might work at it. All the grit in the world won't help much if you're focusing on something you're just bad at. We also all know people who're very good at something and have the potential to be very successful in some fields, except that they won't put in any effort they don't absolutely have to, and so they never become successful. All the potential skill in the world won't magically make you good if you don't apply yourself.

The key, of course, is to apply grit and persistence to the things you're good at and the things you absolutely need, rather than to things you're bad at.

Comment: Re:HTTP/1.1 is just fine (Score 1) 161

by Todd Knarr (#48782905) Attached to: HTTP/2 - the IETF Is Phoning It In

It's not just a matter of decoding the packets. The big problem is usually in separating out the packets for the connections from one specific client while ignoring the packets for all the other clients, and then assembling those packets into a coherent order so you can see individual requests and responses rather than just packets. That's fairly easy to do at each endpoint, much harder to do when just sniffing traffic in the middle. And of course the code to decode packets and assemble them into a transaction is more complex than the code to just append to a string and write that string to a file. Not everything has that kind of logging already built in, and when I need to add it I'm usually pressed for time because it's a critical problem. tcpdump or wireshark will work, given enough effort, but I've too often seen them produce valid-looking but deceptive results: the filtering, selection and reporting were correct enough to look reasonable, but not quite completely correct, so the results showed me something that didn't exactly match reality. Debugging dumps finally revealed the discrepancy, and we got the problems solved.
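The reassembly step is the fiddly part. Here's a toy sketch of the core of what such tooling has to do, on synthetic segments (the struct and function are hypothetical, and real TCP reassembly also has to cope with sequence wrap-around, retransmits and overlapping segments):

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical simplified "segment": a sequence number plus payload.
 * Captured packets arrive out of order; the request/response text only
 * makes sense after sorting by sequence number and concatenating. */
struct segment {
    unsigned seq;
    const char *payload;
};

static int by_seq(const void *a, const void *b) {
    const struct segment *x = a, *y = b;
    return (x->seq > y->seq) - (x->seq < y->seq);
}

/* Sort segments into sequence order and concatenate payloads into out. */
void reassemble(struct segment *segs, size_t n, char *out, size_t outsz) {
    qsort(segs, n, sizeof *segs, by_seq);
    out[0] = '\0';
    for (size_t i = 0; i < n; i++)
        strncat(out, segs[i].payload, outsz - strlen(out) - 1);
}
```

Even this toy version is more code than "append bytes to a log file", which is the point: endpoint logging gets the ordering for free, a mid-network sniffer has to reconstruct it.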

Comment: Re:HTTP/1.1 is just fine (Score 1) 161

by Todd Knarr (#48776637) Attached to: HTTP/2 - the IETF Is Phoning It In

Most of the bandwidth for modern web sites goes to content, not the HTTP headers. That's even with content compression, which is already part of HTTP/1.1. Reducing overhead by going to binary in the headers isn't going to reduce the bandwidth requirements by enough to notice, and it comes at the cost of not being able to use very simple tools for diagnosis and debugging. I've lost count of the number of times I was able to use telnet or openssl and copy-and-paste to show exactly what the problem with a server response was, and to demonstrate conclusively that we hadn't misformed the request nor botched parsing the response. Having to use tools to encode and decode things would've led to the vendor questioning whether our tools were working right, and then I'd've had to figure out how to prove the tools weren't misbehaving (telnet and openssl were widely enough used that that wasn't a problem).
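That's the practical argument for a text protocol: the entire request is printable, so "logging" is just writing the string somewhere, and the response can be checked with plain string operations. A sketch (the helper function is hypothetical; the canned request is a valid HTTP/1.1 message you could paste into a telnet or `openssl s_client` session verbatim):

```c
#include <stdio.h>

/* An HTTP/1.1 request is just text: logging it means printing it. */
static const char *request =
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Connection: close\r\n"
    "\r\n";

/* Pull the status code out of a raw response with one sscanf; no
 * decoder needed. Returns 0 if the status line doesn't parse. */
int parse_status(const char *response) {
    int code = 0;
    if (sscanf(response, "HTTP/%*[0-9.] %d", &code) != 1)
        return 0;
    return code;
}
```

With a binary framing layer, both the "show the vendor the exact bytes" step and the one-line status check need a decoding tool whose own correctness then has to be argued.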

Comment: Re:HTTP/1.1 is just fine (Score 1) 161

by Todd Knarr (#48776461) Attached to: HTTP/2 - the IETF Is Phoning It In

Because none of that requires a new protocol? You can do that in HTTP/1.0, it's entirely a matter of client programming. And yes a protocol analyzer can decode a binary protocol for you, but it takes a bit of work to set them up to display one and only one request stream. A text-based protocol, meanwhile, can be dumped trivially at either end just by dumping the raw data to the console or a log file. Decoding and formatting a binary protocol takes quite a bit more code and adds work. As for bandwidth, the HTTP headers are a trivial amount of data compared to the content on modern web sites so gains from compressing the protocol headers are going to be minimal (content compression already exists in HTTP/1.1 and there's going to be little or no improvement there in the new protocol).

Comment: Re:It is called good coding. (Score 4, Insightful) 189

They have. But they didn't do it overnight; they did it in small bits at a time, and those 40-year-old systems were patched or updated and debugged with each change. The result is a twisted nightmare of code that works, but nobody really understands why and how anymore. And the documentation on the requirements changes is woefully incomplete, because much of it's been lost over the years (or was never created, because it was an emergency change at the last minute and everybody knew what the change was supposed to be, and afterwards there were too many new projects to allow going back and documenting things properly) or is inaccurate because of changes during implementation that weren't reflected in updated documentation.

As long as you just have to make minor changes to the system, you can keep maintaining the old code without too much trouble. Your programmers hate it, but they can make things work. Recreating the functionality, OTOH, is an almost impossible task due to the nigh-impossibility of writing a complete set of requirements and specifications. Usually the final fatal blow is that management doesn't grasp just how big the problem really is; they mistakenly believe all this stuff is documented clearly somewhere and that it's just a matter of implementing it.
