
Comment Re:javascript is good (Score 1) 531

It is a perfectly reasonable thing to say if you're compiling JavaScript to an executable, which you're not.

You could say, "When an app is written in JavaScript, it is likely to [use more resources] than the same app written in PHP..."

You can say this because Server-side JavaScript and PHP operate in the same environment at the same level.

Statements like these require qualification to be valid. Server-side JavaScript may be more resource intensive than PHP because its libraries are younger.

Comment Re:Response from L4C list (Score 1) 431

I did read both posts.

Sorting 100,000 random numbers is a good academic exercise, but it is not the same as sorting 100,000 records (unless they're records of random numbers) and, in this case, is comparing apples and giraffes.

Also, your example does not take into account:
  a) the time required to make the request to sort
  b) the time required to perform the actual data sort
  c) the time required to deliver the data back to the client
  d) the time required to render the new page
  e) what happens when 2, 10, or 50 users want to sort 100,000 records at the same time

Sorting on the client is an excellent distribution of resources. The data doesn't change, only the presentation of it, which falls squarely in the realm of the client.

It's not necessarily an argument about which would be faster for a single use, but about which solution allows the application to scale and provides the best user experience. In this case, I think sorting on the client wins hands-down.
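To make that distribution-of-resources point concrete, here is a minimal sketch of client-side sorting, assuming the records were delivered to the browser once as an array of objects. The field names, the records, and the renderTable helper are hypothetical stand-ins, not anything from the original example:

    // Records fetched from the server once; sorting only changes the presentation.
    // 'records' and 'renderTable' are hypothetical names for illustration.
    var records = [
        { name: "Widget", price: 9.95 },
        { name: "Gadget", price: 4.50 }
    ];

    function renderTable(rows) {
        // hypothetical: rebuild the table's DOM from 'rows'
    }

    function sortBy(field) {
        records.sort(function (a, b) {
            if (a[field] < b[field]) return -1;
            if (a[field] > b[field]) return 1;
            return 0;
        });
        renderTable(records); // redraw locally -- no request, no server sort, no page reload
    }

    // Wired to a column header, e.g.: <th onclick="sortBy('price')">Price</th>

Every sort after the first page load stays on the client, so 2, 10, or 50 users sorting at once cost the server nothing.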

Comment Re:SQL? (Score 1) 431

That may seem to make sense, but it doesn't address the same crowd that ignores the server message indicating the validation failure.

I think that validation should be done on both the server and client side. The client-side validation provides immediate feedback, and a better user experience. If the submit button is disabled, the user is likely to abandon the order WITHOUT reading the form for validation errors, and you may get support calls telling you your app is broken because people can't be bothered to read. (Yes, that's bitterness you perceive.)

By "client", I mean both standalone apps and web deployments. If, for some reason, the client is unable to validate (disabled JavaScript, missing DLL, solar flare), the server can still stop invalid transactions. Also, by forcing validation on the server side, you're ready to develop electronic b2b interfaces to your system.

People are used to a web browser. Like it or not (believe it or not), it provides a consistent interface and a stable development platform. I would venture to say that JavaScript is a great language that allows you to create some complex, robust, and high-performance apps.

What we're really talking about here is similar to the oscillation in the VLSI industry -- the swing back and forth between dedicated coprocessors and general-purpose central processors. The FPU was integrated into the CPU. At least two levels of cache memory have been integrated into the CPU. Intel is pushing to integrate graphics into the CPU (again), but there was a good reason that dedicated hardware was created to handle graphics -- it's intense stuff, as are audio, high-speed networking, and bus interaction. There is a reason to keep these things off-chip. There is also a reason to adopt a client-server style of asynchronous communication at the hardware level: yes, there is some additional overhead and chattiness, but that drawback is greatly overshadowed by the simplification, the decoupling, and the allowance for a part to excel at a particular task at its own speed.
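The software analogue of that decoupling is the asynchronous request: the client fires off the work and keeps serving the user while the server runs at its own speed. A minimal sketch, with a hypothetical URL and callback:

    // Fire the request asynchronously; the UI keeps responding while the server works.
    function fetchRecords(onReady) {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "/records", true); // true = asynchronous
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                onReady(JSON.parse(xhr.responseText));
            }
        };
        xhr.send(null);
    }

    fetchRecords(function (records) {
        renderTable(records); // hypothetical renderer, as in the sorting sketch above
    });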

There may be a time when the browser could become the operating system. I think that level of abstraction is unnecessary, but that's me. The point is that we need good quality data, good performance, and minimal maintenance. We can have all three -- they're not mutually exclusive -- but only if we really understand the problems we're trying to address.

TeX was created to address the problem of inconsistent layout/publishing methodologies.

SGML was created as an attempt to establish a universal data description language to simplify inter-entity communication, so that you could concentrate on the data, not the transport mechanism.

HTML was created to combine several technologies for gathering information in the academic world.

The web browser establishes a medium in which you rely less on the transport mechanism and can play with the data, and it is an accessible development target. As with any development platform, inexperience and a myopic view of the problem can result in a crappy app.
