
Comment Google Mail (Score 0) 459

I've been using Google Mail (separate from GMail) for a while now for my mail needs, and it's actually working out pretty well. Better uptime and performance than hosting the server myself, and it's generally just a lot easier. Then again, you have to ask yourself if you want Google to potentially be able to see your mail.

Comment Re:heh (Score 1) 754

Totally possible the government passed over it because the tech was "too new" or perhaps it even just flew under the radar. At the time, that would have been a fairly experimental device by comparison to the R&D efforts going into the emerging sector that now exists for these kinds of devices. ... That said, I'm STILL not sure how the iPad managed to do that, considering I STILL don't know who it's marketed for or what its actual purpose is.

Comment Re:Talk about wrong! (Score 1) 754

You even admit that three of them collapsed together, yet you insist that computers are becoming disposable? Huge year-on-year growth in Apple laptop sales during a down economy totally disposes of that argument. People seem more willing to pay for quality products now than they ever have been, because they've seen cheap and it wasn't pretty (and it didn't last). People already hate buying computers, so buying them again is something normal people avoid like the plague.

Actually, they didn't collapse together - they purchased each other (Acer bought Gateway, which bought eMachines). Dell even went so far as to purchase Alienware, for all intents and purposes a high-end, "quality" computer company. I actually think that increases in Apple sales are something of a good indicator for what I'm talking about, too. Apple's business model isn't to sell people one good machine and let them keep it for five or six years before moving on (though they promise two OS revisions for any hardware sold) - just like the iPods, iPhones and now the iPads, the major driving force in Apple's business model is the idea that, even if there's really nothing wrong with the device you currently have, you absolutely have to have the new model. While I'm sure that not everyone is hooked this way (at the very least, people with half a brain), there does exist a fairly strong base of users who do follow this trend. To that end, think of the users that are causing Apple's growth: PC users switching to Apple because Macs supposedly "have no viruses" and other lovely things, despite the fact that it's virtually the same components inside (Hitachi hard drives, for example). They are switching for the OS - not for the machine.

WTF. Let me ponder for a moment, and repeat; WTF.

How are USED computers worth more in a world where computers are becoming disposable? In said world computers that were at all used would be unusable. That's practically the definition of "disposable" - when my razor blades are done I don't donate 'em to Goodwill!

Used markets for anything are very lucrative markets - Buy for dirt and sell for a generous percentage and you have a lot of pure profit going for you. The computer market hasn't yet truly become disposable; You don't toss it in the trash when you buy a new one if the old one still "works." You offload it onto someone who will pay you to take it. Compared to your razors, I'm pretty sure people won't be lining up to take those off your hands!

Yes, just ask Sony how easy it is to lock people out of systems they physically control. Oh that's right, it's totally impossible which is why on OS X systems Apple doesn't and NEVER will try, and on iOS Apple puts up the thinnest veneer of prevention over the hardware which they could improve but they don't even bother.

Actually, it would be fairly easy. Sony's PlayStation 3 has only recently been cracked open, and that's because of a static root key for software signing. If every generation of computer that coincided with an OS release were set up with a different key (and with a bit more security than Sony put into the PS3) - or, more severely, if the company changed the key at will via protected software updates (they could get away with this because their software is the only software they want you to run) - that would effectively A) remove the ability for earlier machines to use new OSes, requiring more frequent hardware purchases - Apple's big money-maker - and B) stop cold any attempt to run unapproved software on the machine. As stupid and needlessly complex as that may sound, Jobs and the Apple crew have a fairly extensive history of wanting to rule over their devices with an iron fist - remember not long ago when they were telling people that jailbreaking their iDevices violated copyright law? It might be a veiled threat, but it also outlines their view on control over their devices. It isn't that big a stretch, quite frankly, and if they could find a way to make it work, I have no doubt in my mind that they would.
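To make the per-generation key idea concrete, here's a rough sketch of how a firmware could refuse OS images signed for a different hardware generation. Everything here is hypothetical and illustrative - real secure-boot schemes use asymmetric signatures, not HMACs, and these key names are made up:

```python
import hashlib
import hmac

# Hypothetical per-generation signing keys baked into each hardware
# revision at the factory. Because gen2 ships with a different key,
# an OS image signed for gen2 will never verify on gen1 hardware.
GENERATION_KEYS = {
    "gen1": b"factory-secret-gen1",
    "gen2": b"factory-secret-gen2",  # rotated key for the newer machines
}

def sign_image(image: bytes, key: bytes) -> bytes:
    # Stand-in for a real code-signing operation.
    return hmac.new(key, image, hashlib.sha256).digest()

def firmware_will_boot(image: bytes, signature: bytes, hw_generation: str) -> bool:
    # The firmware only knows its own generation's key, so images signed
    # for any other generation fail verification and are refused.
    expected = sign_image(image, GENERATION_KEYS[hw_generation])
    return hmac.compare_digest(expected, signature)

os_image = b"shiny new OS 10.7"
sig_gen2 = sign_image(os_image, GENERATION_KEYS["gen2"])
```

Rotating the key per hardware generation is exactly what ties new OS releases to new hardware purchases in the scenario above: the old machines simply can't verify the new images.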

Comment Re:heh (Score 5, Insightful) 754

This is pretty wrong. I know it's pretty trollish, but I feel compelled to respond.

computers that can last longer and be cheaper

The trend is actually toward computers that are cheaper and more disposable. Once upon a time, more companies were trying to release more reliable machines, but the costs were high - enter Dell, eMachines, Acer and Gateway (the latter three now one and the same), whose business models of inexpensive PCs that aren't necessarily solid broke the market entirely. Computers are becoming disposable, much in the same way mobile phones are.

Used computer market is now becoming HUGE....because no one can afford retail prices.

Retail prices on PCs have been plummeting for a long time now, and the used computer market is inflating due to the above point: computers are becoming disposable, and there are even cases where people will toss a computer over something like a spyware or virus infection.

iPAD subscriptions have taken a complete nose dive of late as people realize how useless and costly the things are

While I never understood the point behind the iPad, its impact on the market in general is undeniable, with Android tablets mimicking its design appearing left and right. Many emerging and future hybrid designs are coming out as iPad-style tablets proper, with a fully-equipped base station featuring a keyboard, mouse, ethernet/display ports, and so on. I know that our provincial government has become very interested in developments by Toshiba in this regard, and may be procuring them to replace laptops in the future.

I want to address the most glaring part last:

too late once open source is OUT into peoples hands its too late.
YOU can't then take it away.

Yeah, you can. If, say, Apple decided they wanted to lock down their devices, they could first modify their EFI implementation to refuse to load any operating system not signed by Apple. That in itself would prevent flavours of Linux from loading, and they could go further still by modifying their operating system to allow installation of applications only via their App Store. The beautiful thing is that newer Apple products, both hardware and software, can use a different encryption key for their EFI-OS lockout. Or, they could utilize technology like this:

There exists a real-world potential for such a thing - Microsoft has long been working, on-again off-again, on something formerly called Palladium, now called Next-Generation Secure Computing Base, which is an implementation of the concept of trusted computing. When it was announced, many thought of it as perhaps being the death of Linux - one major use for this kind of technology is DRM, wherein only an approved application can access certain data, which could feasibly include the entire system. The hardware required for this has been around for a while, and many machines since the AM2/LGA 775 socket era have Trusted Platform Module chips included. One of the more famous applications for this is BitLocker.
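The core trick a TPM provides here is "measured boot": each boot stage is hashed into a Platform Configuration Register, and software like BitLocker can refuse to release a disk key unless the final PCR value matches the expected chain. A toy model of the extend operation (illustrative only - not real TPM code, and real TPM 1.2 PCRs live in hardware):

```python
import hashlib

def pcr_extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM 1.2-style extend: new PCR = SHA-1(old PCR || SHA-1(measurement)).
    # The register can only be extended, never set, so the final value
    # commits to the entire sequence of boot stages.
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

def measure_boot(stages):
    pcr = b"\x00" * 20  # PCRs start zeroed at power-on
    for stage in stages:
        pcr = pcr_extend(pcr, stage)
    return pcr

trusted = measure_boot([b"firmware", b"bootloader", b"kernel"])
tampered = measure_boot([b"firmware", b"evil bootloader", b"kernel"])
```

Swap out any stage - say, boot an unapproved OS - and the final PCR value changes, so anything keyed to the trusted value (a sealed disk key, a DRM check) simply refuses to cooperate. That's the mechanism people feared could lock out Linux.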

Comment Re:First problem with AVG (Score 1) 318

AVG started going downhill at 8, and then nose-dived at 9 when they had the focus stealing issue and another that a coworker of mine reported regarding the Outlook/Exchange plugin failing to update and causing Outlook to crash on startup (this was also a widespread issue, but I don't believe it got as much press seeing as most organizations using Outlook or the Exchange plugin aren't using AVG). Worse was their response to the issue, which as I recall was initially lazy denial.

This hardly surprises me now. I've migrated from AVG to Avast, and not only is it far lighter, it's also faster and has never once given me grief. It's one of the best overall AVs according to AV-Comparatives (AVG is, too), and next to Security Essentials for the less technically-minded, it's all I recommend.

Comment From what I understand (Score 4, Interesting) 227

There isn't much to do with SCADA regarding security - the systems themselves are inherently insecure, their protection extending only so far as default passwords that are scarcely ever changed and the requirement of a compatible console. If you're connecting these devices to the internet in any way, you're opening yourself up to a world of hurt. The best security is physical security, with no link to the outside world except closed, site-to-site communications. I'm by no means an expert, but having heard experts speak on the subject, and with some limited experience of my own, there really doesn't seem to be any better way as things stand.

Comment Re:Huh? (Score 1) 156

No, the first 3DFX cards were actually DOS-based. I know Jetfighter III was the first one I'd run across, but there are plenty of other examples, like the original Tomb Raider.

That said, supposedly, it's also possible to run Win9x on DOSBox, too, though it isn't supported and I can't pull up a real how-to.

Comment Re:Interesting but it looks slow (Score 1) 156

Actually, having read through the forum posts, it seems like Kekko's patch is based on the work of Aaron Giles, who wrote the 3DFX emulation for MAME. MAME takes an accuracy-first approach, and aims for complete hardware emulation and documentation, which is what this code was written to do. A lot of things are completely unoptimized for acceleration, and from what I'm reading, Kekko is planning on multi-threading it and passing off 3D calls to the GPU. Clearly, it's necessary to get the code working first, which is what the article is talking about.

A snippet from one of Kekko's forum posts:

From what I've seen, most of the time consumed by the emulation is spent by the scanline rasterizer.
The direct lfb access should not be a big issue; of the few games that actually use it, many seem to just write to lfb after 3d is completed, for hud info and such things.
Unlike d3d or opengl frame buffer access, the emulation just directly reads/writes a memory area - no locking or strange things happen - so there's not much overhead, actually.
The scanline rasterizer is slow not just because it is software, but also because it was not written performance-wise; it's oriented more toward accuracy and code readability (it's part of a hardware documentation project).
It could be greatly optimized; one of the first things you notice is that many checks it does during pixel pipeline processing could be moved out of the scanline loop, but that would mean having tens of different scanline renderers, from basic solid colour up to a shaded, filtered, textured, dithered, alpha-blended, fogged, z-clipped renderer, and all possible combinations. Many things could be rewritten or assembly-optimized.
The idea is to use opengl just for triangles, make it render off-screen to our frame buffer instead of screen as usual and leave lfb handling as-is.
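The specialization Kekko describes can be sketched like this: instead of testing render modes per pixel inside the scanline loop, pick a specialized renderer once per span. This is purely an illustrative Python sketch (toy "texture" and "fog" operations, not the actual MAME/DOSBox code):

```python
def scanline_generic(pixels, textured, fogged):
    # Accuracy-first style: mode flags are re-checked for every pixel,
    # which is exactly the per-pixel overhead Kekko points at.
    out = []
    for p in pixels:
        c = p
        if textured:  # per-pixel branch
            c = c * 2  # toy stand-in for a texture lookup
        if fogged:     # per-pixel branch
            c = c + 1  # toy stand-in for fog blending
        out.append(c)
    return out

def scanline_textured_fogged(pixels):
    # Specialized variant: the branches are hoisted out of the loop
    # entirely, at the cost of one renderer per mode combination.
    return [p * 2 + 1 for p in pixels]

# One entry per supported mode combination; a full rasterizer would need
# tens of these, which is the trade-off the post describes.
RENDERERS = {(True, True): scanline_textured_fogged}

def render_span(pixels, textured, fogged):
    fast = RENDERERS.get((textured, fogged))
    return fast(pixels) if fast else scanline_generic(pixels, textured, fogged)
```

The specialized path produces the same pixels as the generic one; the win is that mode decisions happen once per span instead of once per pixel, which is why real rasterizers (and JIT-style renderer generators) take this approach despite the combinatorial explosion of variants.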

Comment Re:It's certainly a step up from JPEG, but... (Score 1) 378

Considering the format doesn't yet have support in, well, anything, you're pretty much looking at a format proposal and proof of concept here. It's very likely that by the time web standards catch up (if they catch up; Chrome at the least will support it) and support the WebP format, the feature set will be nailed down, alpha channel and all. My guess is that for now, they wanted to show off VP8's compression ratio/quality compared to JPEG and release tools for the idle hands to fiddle with to gain support and hype. Which is more or less what's happened, really.

I'm excited about it - better quality at smaller filesizes means that, at the quality level of most JPEG-encoded images on the web, you could likely achieve a MUCH smaller filesize, PLUS the ability to do alpha. This should help make the web a more bandwidth-friendly place, while giving web developers more options without sacrificing visual clarity.

Comment 3D is a fad. (Score 1) 594

An unremarkable one, at that. It's happened numerous times in the past, it never had any staying power then, and I doubt it's going to have any staying power now. It boils down to nothing more than an optical illusion, and frankly, there are far better ways to provide an interactive 3D effect - like head tracking. That kind of thing won't currently work with an audience of more than one, but I'd imagine there are ways around that. The biggest problem would probably be how to shoot film in "layers", or however it would need to be done. Certainly, this kind of technology would be a perfect fit for games, versus the current 3D fad.

Comment Tell me how this makes any sense? (Score 1) 362

Modern Warfare had players as "Taliban" or Iraqi soldiers (though it wouldn't call them that - as Yahtzee put it, they're all from Unspecifiedistan), and they didn't pull that from store shelves. So because the enemy is identified in this case, that suddenly makes it less appropriate? In a multiplayer setting, both sides can't be the US forces unless you pull an America's Army and have the enemy always show up as OpFor while you always appear to be American.

I mean, I have a hard time imagining why, exactly, there's any controversy over this at all. But I guess no publicity is bad publicity, huh?

Comment Re:AMD's stagnant? (Score 1) 234

Except that fully photo-realistic graphics have been a long time coming, and will continue to be. We're still using the same resolutions we used a decade ago, we just have better models and fancy post-processing effects that make it look that much better. Strip away the pixel shaders, lighting and texture filtering, and the models and textures are still actually pretty ugly today. We can start talking photorealism when we begin to see screen resolutions in the vicinity of 4096px wide on 17" displays and texture resolutions far beyond that. By then, I'm guessing we'll be using realtime ray tracing instead of the traditional styles of raster graphics we use today.

It's also likely that by that point, yes, CPUs would be capable of handling that computational task fairly well, since ray tracing performance is, as I understand it, fairly linear and independent of texture resolution and the like. We'll probably first see dedicated GPUs for the task, but then watch it evolve the same way audio and other chipset features have. Truth be told, that would simplify the life of the PC gamer by a wide margin.
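The linearity claim is easy to illustrate: a ray tracer casts one primary ray per pixel, so the work scales with pixel count and scene complexity, not with texture resolution. A deliberately minimal sketch (one sphere, hit/miss only - a toy, not a renderer):

```python
def hit_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t;
    # a non-negative discriminant means the ray hits the sphere.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4 * a * c >= 0

def render(width, height):
    # One primary ray per pixel: total cost is width * height rays,
    # regardless of how detailed any surface texture might be.
    rays_cast = 0
    image = []
    for y in range(height):
        for x in range(width):
            # Map the pixel to a direction on a simple view plane at z=1.
            dx = (x + 0.5) / width - 0.5
            dy = (y + 0.5) / height - 0.5
            image.append(hit_sphere((0, 0, 0), (dx, dy, 1.0), (0, 0, 5.0), 1.0))
            rays_cast += 1
    return image, rays_cast
```

Doubling the resolution quadruples the rays cast, and nothing in the inner loop depends on texture size - which is the property that makes the workload scale so predictably across cores.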

Comment Re:Sweeeeet nectar (Score 1) 234

There does come a point where the increase in speed is not at all worth the extra cost. The top-tier AMD parts (the hexa-cores and the 965s) are more than capable of handling any task effectively, and they sit at about the mid-high range of the Intel line in terms of performance (high i5/low i7) for about the same price - the 965 Black Edition is more or less on par with the i5-750, and is $30 less expensive. Calling AMD "ghetto" as you have in other posts is wholly incorrect; remember that it wasn't very long ago that Pentium 4s were being smoked by Athlon64s selling for half the price.

But it comes down to a matter of usage. The i7s are ridiculously fast processors, but most people won't ever need that. The fastest i7 isn't even 50% faster than the fastest AMD part, yet it costs a good 250% or so more at $1,176.14 (vs. $334.99) on NCIX. For someone like me, I'd love a high-tier i7, but to be totally honest, there just isn't any compelling reason for most people to crack open their wallets for that. Intel owns the high-end market with the i7s because nothing can touch them, but AMD has the midrange-to-low-end market cornered, particularly with the rather decent Athlon II series and affordable Phenom IIs.
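For what it's worth, the "250% or so more" figure checks out against the two quoted NCIX prices:

```python
# Prices as quoted above (NCIX, at the time of the comment).
i7_price = 1176.14
amd_price = 334.99

# Percentage by which the i7 is more expensive than the AMD part.
premium = (i7_price / amd_price - 1) * 100  # roughly 251%
```

So the i7 costs about 3.5x the AMD part for (per the claim above) less than 1.5x the performance, which is the price/performance gap the whole comment hinges on.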

Comment . . . Uh... (Score 1) 827

So. Uh. Like, for a minute there, I thought he was using it to connect up an amplifier or something? But no. It's just a SATA cable. There is no possible reason this would make a difference unless the old cables were incredibly badly shielded and the sound card was similarly cheap (AND using analogue outputs) - especially considering that if the audio is coming from a NAS, the data shouldn't be travelling over a SATA cable at all.

But that's been pointed out to death already.

Hooray for snake oil! Just like the Denon AK-DL1.
