
Networking

Submission + - When is enough bandwidth at home enough? 1

Dubbel writes: In 1993, I was in college and took advantage of a dial-up SLIP account for internet access from home, which my university made available to all students with shell accounts. It was a blazing 14.4 Kbps connection. As internet usage increased and I began to get busy signals more often than not, I took advantage of a student discount at a local ISP and got dial-up 33.6 Kbps "unlimited" PPP service for the princely sum of $40 a month... a significant portion of my net worth at the time. At that point in internet history, online services such as Prodigy and CompuServe were charging by the minute for World Wide Web access outside the content they hosted, and this still didn't give you access to the full breadth and depth of what the internet had to offer. I had one friend, whom I considered to be filthy rich, who had a dual-channel 128 Kbps ISDN line.

As soon as broadband became available, I was the first person I knew to get it. First it was 1 Mbps, then 1.5, then 3, and currently I subscribe to a 6 Mbps DSL service, all the while never really exceeding the $40-a-month price barrier (now after service bundle discounts and prior to the addition of taxes). Now my ISP is offering their new VDSL internet, TV, and IP telephony service in my area, which tops out at a staggering 18 Mbps for around $65 a month, separate from the bandwidth reserved for telephony and TV.

For the first time ever, I find myself asking... do I really need more bandwidth? Am I ludicrous for asking this question? How many others in the Slashdot community have found their personal broadband saturation point to be beneath the fastest service available, separate from personal financial constraints?
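For perspective on the speed progression described above, here is a quick back-of-the-envelope sketch of download times at each tier. The 700 MB file size is an arbitrary example (roughly a CD image), not something from the submission:

```python
# Back-of-the-envelope download times for a 700 MB file (arbitrary
# example size) across the connection speeds mentioned above.
speeds_kbps = {
    "14.4k dial-up": 14.4,
    "33.6k dial-up": 33.6,
    "128k ISDN": 128,
    "1 Mbps broadband": 1_000,
    "6 Mbps DSL": 6_000,
    "18 Mbps VDSL": 18_000,
}

FILE_MB = 700
FILE_KBITS = FILE_MB * 8 * 1000  # 1 MB = 8,000 kilobits (decimal units)

for name, kbps in speeds_kbps.items():
    hours = FILE_KBITS / kbps / 3600
    print(f"{name:>16}: {hours:7.2f} hours")
```

Ignoring protocol overhead, the same file that took over four days at 14.4 Kbps comes down in minutes at 18 Mbps.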

Comment Re:Oh rats (Score 0) 166

ATI are about to become the leader? They are already the leader in all categories: perf/$, perf/W, and absolute perf, at every price point. See the list below.

For gaming performance, the GFLOPS ratings are a rough (within ±30%) but good-enough approximation for comparing ATI vs. Nvidia. For GPGPU performance, the GFLOPS rating is actually unfair to ATI, because Nvidia's GT200 microarchitecture inflates it artificially: Nvidia assumes a MUL+MAD pair executing 3 floating-point ops per cycle, whereas ATI assumes a regular fused MAD executing 2 floating-point ops per cycle. That means an ATI GPU rated 200 GFLOPS actually executes ALU-bound workloads as fast as an Nvidia GPU rated 300 GFLOPS.

ATI's lead is such that it's not even funny anymore. There are rumors of Nvidia killing the high end (GTX 285, 295) to focus only on the extreme entry-level segment (sub-$100), and GT300 (Fermi) will not enter mass production before the end of Q1 2010. I am concerned by the lack of competition... ATI is free to impose whatever price structure they want.

  • If you have $500+ to spend: ATI HD 5970 (4640 GFLOPS, 294 Watt, ~$600) vs. Nvidia GTX 295 (1843 GFLOPS, 289 Watt, ~$500).
  • If you have ~$400 to spend: ATI HD 5870 (2720 GFLOPS, 188 Watt, ~$410) vs. Nvidia GTX 285 (1063 GFLOPS, 204 Watt, ~$400).
  • If you have ~$300 to spend: ATI HD 5850 (2088 GFLOPS, 151 Watt, ~$310) vs. Nvidia GTX 275 (1011 GFLOPS, 219 Watt, ~$300).
  • If you have ~$200 to spend: ATI HD 5770 (1360 GFLOPS, 108 Watt, ~$170) vs. Nvidia GTX 260 Core 216 (805 GFLOPS, 182 Watt, ~$200).
  • If you have ~$150 to spend: ATI HD 5750 (1088 GFLOPS, 86 Watt, ~$155) vs. Nvidia GTX 260 (715 GFLOPS, 182 Watt, ~$170).
  • If you have ~$100 to spend: ATI HD 4770 (960 GFLOPS, 80 Watt, ~$110) vs. Nvidia GTS 250 (470 GFLOPS, 145 Watt, ~$110).
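Applying the normalization argued above (Nvidia counts 3 flops per ALU per cycle via the co-issued MUL+MAD, ATI counts 2 via a plain MAD), a small sketch can deflate Nvidia's ratings for a more apples-to-apples ALU comparison. The card numbers are taken from the list above; the 2/3 scaling factor is the assumption from the comment, not a vendor-published figure:

```python
# Deflate Nvidia peak GFLOPS by 2/3 to strip the co-issued MUL
# (3 flops/cycle counted vs. ATI's 2), per the reasoning above.
cards = [
    # (name, peak GFLOPS, vendor)
    ("ATI HD 5870", 2720, "ATI"),
    ("Nvidia GTX 285", 1063, "Nvidia"),
    ("ATI HD 5770", 1360, "ATI"),
    ("Nvidia GTX 260 Core 216", 805, "Nvidia"),
]

def effective_gflops(peak, vendor):
    """Comparable MAD-only throughput under the 2-vs-3 flop assumption."""
    return peak * (2 / 3) if vendor == "Nvidia" else peak

for name, peak, vendor in cards:
    print(f"{name:>24}: {effective_gflops(peak, vendor):7.0f} effective GFLOPS")
```

Under this assumption, the GTX 285's 1063 rated GFLOPS shrink to roughly 709 effective GFLOPS against the HD 5870's 2720.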

Comment Re:Don't use bootcamp, but I use Fusion (Score 1) 396

I recently bought a 15" MacBook Pro. I then sold it 3 months later because I didn't enjoy OS X, and it seemed like Apple had done many things to prevent Windows from being a proper alternative on an otherwise nice piece of hardware:

  • iSight is limited to 160x120 pixels. Yes, the size of a postage stamp. This can't be changed anywhere.
  • Cannot switch between the two video cards in the system (Nvidia 9400M and 9600M GT), so you're stuck with the higher-end graphics and lose another hour of battery life.
  • Cannot turn the keyboard backlight off, only down to the lowest brightness setting, unlike in OS X. That's more battery life gone.
  • The Boot Camp application/drivers consume 2-4% of the processor when idle. What the hell. It's necessary to keep it running if you want to use the function keys on the keyboard, amongst other things.
  • Optical output couldn't upmix stereo input to 5.1 surround no matter what drivers were used. This was also a "feature" in OS X.

I couldn't stand having
PlayStation (Games)

Submission + - PS3 Cell Faster than Core i7 965 XE

billdar writes: "PS3News is reporting on a transcoding tool that converts video material to Full HD format with the help of a PlayStation 3, at faster than real-time performance. Connected to a PC via gigabit Ethernet, the PS3 acts as an external, dedicated video encoder. All encoding is handled by the PS3, which runs an embedded version of Yellow Dog Linux entirely from ROM, shifting the CPU-intensive processing away from the workstation. According to the article, the PS3 'Cell processor clocked a performance of 29 FPS, that is 1.2 times real-time conversion; the Cell has similar performance to the CUDA Badaboom encoder in combination with an Nvidia GeForce GTX 285. By comparison, Intel's current top CPU, the Core i7 965 XE, still manages only 18 FPS; normal desktop CPUs achieve only about 5 FPS.'"
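The quoted figures can be sanity-checked with a little arithmetic. Assuming 24 fps source material (an assumption, but one that makes 29 FPS come out to the quoted "1.2 times real-time"), the relative speeds work out as:

```python
# Sanity-check the article's speed-up claims, assuming 24 fps source
# material (29 / 24 ~= 1.2, matching the quoted "1.2x real-time").
SOURCE_FPS = 24  # assumed frame rate of the video being transcoded

encoders = {
    "PS3 Cell": 29,
    "Core i7 965 XE": 18,
    "typical desktop CPU": 5,
}

for name, fps in encoders.items():
    print(f"{name:>20}: {fps / SOURCE_FPS:.2f}x real-time")
```

On those numbers, the Cell is the only encoder in the comparison running faster than real time; the i7 falls to 0.75x and an ordinary desktop CPU to about 0.21x.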
