Comment Re: No (Score 1) 161

This. If everything is done in the client, the application will lag every time anything processor-intensive is done. Likewise, if the client has to call back to the server every time it does anything, the client will lag on high-latency connections or when the server is overloaded. There's a balance there; the trouble is that most developers don't know how to find it.

I think part of the problem is that web app developers seem to fall into one of two camps: do everything locally for the best chance of availability, or do everything remotely for the best performance. The first camp is almost right: they do achieve better availability by doing everything locally, right up to the point where their app becomes unusable due to processing lag (in other words, they're wrong). The second camp is also almost right: they do achieve the best performance in their local development environment, running in a VM on their workstation, where they're the only user; that falls apart the moment they add thousands of users and wildly variable latency between client and server (in other words, they're wrong).

What I prefer to do is provide all capabilities in the client (à la the first camp), then identify those that cause the application to lag and implement them on the server as well (à la the second camp). Once a function exists in both the client and the server, the client can run a job locally and on the server at the same time. If the local job finishes first, the client alerts the server and the server-side job is terminated; if the server returns its result first, the local job is terminated. I find that this structure provides the best performance as well as availability, since the faster resource is always the one whose result gets used; and if the server is unavailable, times out, or returns an error, the application still works. I find that most users are willing to accept occasional slowness during server outages and upgrades, especially when the application is generally very responsive under normal conditions.
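For the curious, a minimal TypeScript sketch of that race-and-cancel structure might look something like the following. The endpoint (/api/job) and the computeLocally function are hypothetical placeholders, not anything from a real codebase; the point is just the shape of the idea.

```typescript
// Race a local computation against a hypothetical server endpoint and cancel
// whichever one loses. Placeholder names throughout (computeLocally, /api/job).

async function runJob<T>(
  input: unknown,
  computeLocally: (input: unknown, signal: AbortSignal) => Promise<T>
): Promise<T> {
  const localAbort = new AbortController();
  const remoteAbort = new AbortController();

  // Local candidate: if it finishes first, tell the server its copy of the job can die.
  const local = computeLocally(input, localAbort.signal).then((result) => {
    remoteAbort.abort();
    return result;
  });

  // Remote candidate: if it finishes first, kill the local computation.
  const remote = fetch("/api/job", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(input),
    signal: remoteAbort.signal,
  })
    .then((res) => {
      if (!res.ok) throw new Error(`server returned ${res.status}`);
      return res.json() as Promise<T>;
    })
    .then((result) => {
      localAbort.abort();
      return result;
    });

  // First successful result wins. If the server is down, times out, or errors,
  // the local job still delivers, so the app keeps working.
  return Promise.any([local, remote]);
}
```

In a real deployment you'd also want the server to actually terminate the cancelled job (e.g. via a cancel message keyed on some job identifier) rather than just abandoning the HTTP request, but the client-side shape is the same.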

That, I'm sure, could be built upon to predict (e.g. based on bandwidth, latency, and local vs. server load and performance) which job will finish first, and to only start the remote job when it will be the clear winner (still starting the local job just in case). That would give you the benefit of reduced server requirements without impacting application performance (unless you take it too far and don't keep any spare compute power online). I haven't run into an instance where this has been necessary or where the savings would be worth the effort (as evidenced by the number of cancelled jobs compared to completed jobs), but I'm sure such a case exists.
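If someone did want that prediction layer, the gate could be as simple as the sketch below; every number and field name here is an illustrative assumption, not a tuned value from a real system.

```typescript
// Hypothetical predictor: only dispatch the server-side job when the estimated
// remote completion time clearly beats the local estimate. The local job is
// started regardless, as the fallback.

interface JobEstimate {
  localMs: number;          // measured or modeled local compute time
  serverComputeMs: number;  // modeled server compute time under current load
  roundTripMs: number;      // measured request/response latency
  transferMs: number;       // transfer time implied by payload size and bandwidth
}

function shouldDispatchRemote(est: JobEstimate, margin = 0.8): boolean {
  const remoteMs = est.serverComputeMs + est.roundTripMs + est.transferMs;
  // Require the remote estimate to beat the local one by a clear margin, so a
  // small estimation error doesn't burn server capacity for no user-visible gain.
  return remoteMs < est.localMs * margin;
}
```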

Comment Re:Hello Captain Obvious (Score 4, Insightful) 56

So, any ideas about how to go about "tracking terrorists"?

One need not have a good idea to be able to identify a bad one. In the absence of a good idea, the correct action is to take none at all, not to carry on with the bad idea because it's all we have. That's how idiots lose limbs cutting down trees and such; it's also how a free and great nation loses that freedom and greatness.

Comment Re: #2 (Score 1) 368

And I did it again... first sentence should read:

My experience (consistent across multiple machines, not just the one I'm typing on right now), and the many, many users posting about stability and performance issues that were introduced with Lion and have persisted since, would seem to disagree with this statement.

Comment Re: #2 (Score 1) 368

That's just false. OSX stability and performance in 10.10 is far, far better than, say, 10.4-10.6.

My experience (consistent across multiple machines, not just the one I'm typing on right now) and many, many users posting about stability and performance issues that were introduced with Lion and have persisted since. Yes, Apple had bugs before Lion, but none that both affected performance or stability and persisted through multiple versions of the OS. That's a new development under Cook, and a very bad sign for those of us who use our machines professionally.

Let's hit your list

... and let's also realize I didn't list every issue. If you want that, I can certainly do it; it'll be one of the longest posts I've ever made here, though.

That's a bug that gets fixed soon. Apple had bugs in 10.2, 10.3, 10.4...

It's been an issue since 10.7. What usability bugs can you point out, pre-10.7, that persisted for multiple releases? I'm honestly asking, since 10.6 was the first version I used.

There were many more bugs in Jobs' day.

Shall I provide my comprehensive list? I only listed a handful of the more aggravating issues I've dealt with in the past couple days.

You sound like you have a worm or something, that isn’t OSX.

Then that worm shipped on this machine, as the issue persisted when migrating from another machine, despite installing only trusted binaries direct from the developers (e.g. Firefox, Chrome, Adobe stuff, really not a whole hell of a lot else). Considering that this has been an issue for me since the early Yosemite betas (and not before then) and Avast hasn't picked anything up, I'm gonna have to say it's not a worm. Especially considering that everyone I know who uses Messages and doesn't reboot every other day has the same issue. It's not like it *immediately* uses all of that RAM; in fact, that I refer to it as a memory leak should indicate to you that it behaves like one. It's very well-behaved right now because I just killed and restarted Messages last night, but in a few days it'll be right back up there.

The G4 had terrible throughput for memory and hard drives relative to CPU speed.

And that has what to do with Jobs? He didn't design the CPU; in fact, nobody at Apple did. The G4 was designed and built by Motorola (later Freescale) as part of the AIM alliance with Apple and IBM, which had been shipping PowerPC chips since the early '90s, years before Jobs' return to Apple. The company was not in a position to pull off an architecture switch at that point, so the move from G3 to G4 was a logical one; switching to Intel at that time would have killed Apple.

The G5 was excellent but then Jobs wouldn’t commit to a laptop version so just as his CPU problems were fixed he migrated away.

Jobs wouldn't commit to a laptop version because the performance-per-watt just wasn't there. It's hard to sell a laptop with a 45-minute battery life because the CPU chugs power rather than sipping it. On top of that, the G5 ran extremely hot, and no laptop cooling solution seemed able to keep it stable. The final nail in the coffin was IBM's failure to deliver faster chips as promised, coupled with its inability to supply enough parts. Freescale wasn't making these chips; they were all coming from IBM, and IBM wasn't making them fast enough (in either sense of the word) for the desktop sales Apple was seeing. Just imagine how things would have gone had they also tried putting them in laptops. Actually, it probably wouldn't have been much of a problem, as very few people would have wanted a laptop that melted its casing, overheated, and became unstable (assuming you had it plugged in, since on battery it wouldn't have lasted long enough to hit those problems).

The CPU issues were solved when Jobs pushed everyone over to Intel.

Another area where Jobs made sacrifices was on his memory sourcing. Apple customers often had to pay 5x or more street price for memory.

Hynix and Samsung. The same memory Apple uses today. And RAM upgrades have similar markup today, as well. Your point?

By 2nd or 3rd I meant compared to individual phones.

Apple only sells one model in each class, in each generation. Over the last two generations, they've added a new class and now sell an entry-level and a high-end model. Of course you're going to find something the iPhone is better than in the rest of the entire market, which consists of hundreds of models targeted at all kinds of users. Since Apple only offers their very best, constrain your comparison to the very best Android has to offer, and keep your comparison within the same generation. You'll very quickly see your argument vanish. Yes, Apple wins on thickness and weight, whoop-de-doo. Oh, and the Moto X is the same generation as the iPhone 5; if you're saying that, in two generations (5s, 6), Apple hasn't caught up to that, I fail to see how you can say Apple is winning.

While the opposite is true on Android vs. iOS. If this were about Tim Cook that shouldn’t be happening.

That's up to the developers of those apps, honestly. You brought up third-party apps and I responded, but it was in no way part of my point. It has nothing to do with Cook, Jobs, or, really, Apple in any way. Your wording is telling, though: Jobs bred the Apple user's desire for a better experience, and that still lives on in the mobile platform that came into existence under his oversight; if this were about Cook, you're right, that shouldn't be happening. Keep in mind that I started this line of conversation talking about their computer offerings, not their mobile devices; you're the one who brought those into the fold. I think they're doing a fine job in the mobile space; the iPhone isn't my cup of tea, but they're going in the right direction for iPhone users. What they're destroying is their computer market, specifically laptops.

What I disagree with you on is a matter of fact, that OSX’s experience is worse.

That's a matter of opinion, not fact. If you want a fact, how about the fact that I never compared OSX to Windows in that way? Another fact: we don't disagree on that point. If I thought the Windows experience was better, I'd be a Windows user.
