Electronic Frontier Foundation

Over 40,000 John Doe Copyright Troll Cases Dismissed 52

Posted by Roblimo
from the sometimes-the-good-guys-win dept.
Requiem18th writes "From the EFF site: Thousands of unnamed 'John Does' in P2P file sharing lawsuits filed in California, Washington DC, Texas, and West Virginia have been severed, effectively dismissing over 40,000 defendants. The plaintiffs in these cases must now re-file against almost all of the Does individually rather than suing them en masse."

Despite the dismissal, EFF has received reports that some Does are still receiving notices from their ISPs informing them that their identities are being sought in relation to these cases. If you get one, contact the EFF immediately.

AMD

AMD Demos Llano Fusion APU, Radeon 6800 Series 116

Posted by timothy
from the onward-ever-onward dept.
MojoKid writes "At a press event for the impending launch of AMD's new Radeon HD 6870 and HD 6850 series graphics cards, the company took the opportunity to provide an early look at the first fully functional samples of their upcoming 'Llano' processor, or APU (Accelerated Processing Unit). For those unfamiliar with Llano, it's a 32nm 'Fusion' product that integrates CPU, GPU, and Northbridge functions on a single die. The chip is a low-power derivative of the company's current Phenom II architecture fused with a GPU, and will target a wide range of operating environments at speeds of 3GHz or higher. Test systems showed the integrated GPU had no trouble running Alien vs. Predator at a moderate resolution with DirectX 11 features enabled. As for the Radeon 6800 series, board shots have been unveiled today, as well as scenes from AMD's upcoming tech demo, Mecha Warrior, showcasing the new graphics technology and advanced effects from the open source Bullet Physics library."

Comment: Re:Average (Score 1) 617

by minister of funk (#33062394) Attached to: School District Drops 'D' Grades

It's unfortunate that skilled trades seem to carry the stigma that if you go into one, you couldn't cut it elsewhere. That may sometimes be true, but there is, and always will be, a need for tradespeople, and I know some tradespeople who make very decent money.

I'm afraid that if the less academically inclined are relegated to skilled trades, we'll have a glut of people doing work they don't love, and our services will decline significantly in efficiency and safety.

Comment: Re:javascript is good (Score 1) 531

by minister of funk (#30310964) Attached to: Trying To Bust JavaScript Out of the Browser

It is a perfectly reasonable thing to say if you're compiling JavaScript to an executable, which you're not.

You could say, "When an app is written in JavaScript is it likely to [use more resources] than the same app written in PHP..."

You can say this because server-side JavaScript and PHP operate in the same environment at the same level.

Statements like these require qualification to be valid. Server-side JavaScript may be more resource intensive than PHP because the library is younger.

Businesses

+ - Firefox tops European browser market for 1st time->

Submitted by ruphus13
ruphus13 (890164) writes "The EC took a decidedly harder stance against Microsoft and its anti-competitive practices in the browser wars. Those restrictions seem to have yielded results. Firefox, for the first time, has the largest market share amongst browsers. From the post, "StatCounter is now reporting that Firefox 3.0 is the most popular browser in Europe--for the first time. Number one in Europe? That's a milestone, and a sign of very healthy browser competition in Europe. If the European Commission's recent efforts to force Microsoft to offer more browser choice in Windows succeed, Firefox may well stay number one." It is also interesting to note that Firefox has 100% market share on 1 continent — Antarctica! The article states, "I'm guessing the data comes from one user — and he's using Firefox.""
Link to Original Source
Google

+ - Should Google be forced to pay for news?-> 1

Submitted by Barence
Barence writes "The Guardian Media Group is asking the British Government to investigate Google News and other aggregators, claiming they reap the benefit of content from news sites without contributing anything towards their costs. The Guardian claims the old argument that "search engines and aggregators provide players like guardian.co.uk with traffic in return for the use of our content" doesn't hold water any more, and that it's "heavily skewed" in Google's favour. It wants the Government to explore new models that "require fair acknowledgement of the value that our content creates, both on our own site (through advertising) and 'at the edges' in the world of search and aggregation.""
Link to Original Source

Comment: Re:Response from L4C list (Score 1) 431

by minister of funk (#26668425) Attached to: The Case Against Web Apps

I did read both posts.

Sorting 100,000 random numbers is a good academic exercise, but it is not the same as sorting 100,000 records (unless they're records of random numbers), and in this case it's comparing apples and giraffes.

Also, your example does not take into account:
  a) the time required to make the sort request;
  b) the time required to perform the actual data sort;
  c) the time required to deliver the data back to the client;
  d) the time required to render the new page;
  e) what happens when 2, 10, or 50 users want to sort 100,000 records at the same time.

Sorting on the client is an excellent distribution of resources. The data doesn't change, only the presentation of it, which falls squarely in the realm of the client.

It's not necessarily an argument of which would be faster for a single use, but which solution allows the application to scale and provides the best user experience. In this case, I think sorting on the client wins hands-down.
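The point above can be sketched in a few lines of JavaScript. This is a minimal illustration, not anyone's actual implementation; the record shape and field names are hypothetical. The data fetched from the server stays put, and only its presentation order changes, with no extra round-trip:

```javascript
// Hypothetical records, as if already fetched from the server once.
const records = [
  { name: "Carol", age: 41 },
  { name: "Alice", age: 29 },
  { name: "Bob", age: 35 },
];

// Sort a copy by the given field so the original fetch result is untouched.
// Re-sorting by another column is another local call, not another request.
function sortBy(rows, key) {
  return [...rows].sort((a, b) =>
    a[key] < b[key] ? -1 : a[key] > b[key] ? 1 : 0
  );
}

const byName = sortBy(records, "name"); // Alice, Bob, Carol
```

Fifty users each re-sorting their own view this way costs the server nothing, which is the scaling argument in a nutshell.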

Comment: Re:SQL? (Score 1) 431

by minister of funk (#26667407) Attached to: The Case Against Web Apps

That may seem to make sense, but it doesn't address the same crowd that ignores the server message indicating the validation failure.

I think that validation should be done on both the server and client side. The client-side validation provides immediate feedback, and a better user experience. If the submit button is disabled, the user is likely to abandon the order WITHOUT reading the form for validation errors, and you may get support calls telling you your app is broken because people can't be bothered to read. (Yes, that's bitterness you perceive.)

By "client", I mean both standalone apps and web deployments. If, for some reason, the client is unable to validate (disabled JavaScript, missing DLL, solar flare), the server can still stop invalid transactions. Also, by forcing validation on the server side, you're ready to develop electronic b2b interfaces to your system.
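The both-sides approach can be sketched as a single validation function shared between the browser and the server. This is a hedged sketch with hypothetical field names and checks, not a real app's logic; the same predicate gives the client its immediate feedback while the server remains the authority:

```javascript
// Validate an order object; returns a list of error messages (empty = valid).
// Field names and rules here are invented for illustration.
function validateOrder(order) {
  const errors = [];
  // Deliberately loose email check: just "something@something".
  if (!order.email || !/^[^@\s]+@[^@\s]+$/.test(order.email)) {
    errors.push("email is invalid");
  }
  if (!(Number(order.quantity) > 0)) {
    errors.push("quantity must be a positive number");
  }
  return errors;
}

// Client: run validateOrder on input and show the messages next to the form,
// but leave the submit button enabled so users aren't silently blocked.
// Server: reject any submission where validateOrder(body) is non-empty,
// covering disabled JavaScript, solar flares, and future b2b callers alike.
```

The key design choice is returning messages rather than a boolean, so the client can explain the failure instead of just disabling the button.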

People are used to a web browser. Like it or not (believe it or not), it provides a consistent interface and a stable development platform. I would venture to say that JavaScript is a great language that allows you to create complex, robust, high-performance apps.

What we're really talking about here is similar to the oscillation in the VLSI industry -- the swing back and forth between dedicated coprocessors and general-purpose central processors. The FPU was integrated into the CPU. At least two levels of cache memory have been integrated into the CPU. Intel is pushing to integrate graphics into the CPU (again), but there was a good reason that dedicated hardware was created to handle graphics -- it's intense stuff, as is audio, high-speed networking, and bus interaction. There is a reason to keep these things off-chip. There is also a reason to adopt a client-server style of asynchronous communication at the hardware level: yes, there is some additional overhead and chattiness, but that drawback is greatly outweighed by the simplification, the decoupling, and the freedom for each part to excel at a particular task at its own speed.

There may come a time when the browser becomes the operating system. I think that level of abstraction is unnecessary, but that's me. The point is that we need good-quality data, good performance, and minimal maintenance. We can have all three -- they're not mutually exclusive -- but only if we really understand the problems we're trying to address.

TeX was created to address the problem of inconsistent layout/publishing methodologies.

SGML was created as an attempt to establish a universal data-description language to simplify inter-entity communication, so that you could concentrate on the data, not the transport mechanism.

HTML was created to combine several technologies for gathering information in the academic world.

The web browser establishes a medium in which you must rely less on the transport mechanism and can play with the data, and is an accessible development target. As with any development platform, inexperience and a myopic view of the problem can result in a crappy app.
