(I'm not a Slashdot manager, just another site user.) Since the redesign, things like comment boxes and posts have no white space at their left edge. Additionally, something strange seems to be up with tags - they only display when you hover over them?
Yeah, I never understood that - why try to recover the clock signal from the data stream? If I were designing it, I would have my DAC monitor the stream to calculate what the clock signal is supposed to be, then generate my own damn clock signal.
My guess is:
If you recover the clock from the stream, you just need to roughly control the motor (CD) RPM and stream what you read. If you run your own clock you need a buffer and you then either need to dynamically tweak the motor speed based on how fast the buffer is filling/draining, or you need to read the CD a bit too fast and stop/resume every so often. Clock recovery sounds much simpler to me.
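To illustrate the "run your own clock" approach described above, here's a toy sketch (all names and constants are invented for illustration): reads from the disc fill a buffer, a fixed DAC clock drains it, and the motor speed gets nudged to keep the buffer around half full.

```python
# Hypothetical sketch of buffering with your own stable clock: instead of
# recovering the clock from the stream, nudge the motor speed based on
# how full the playback buffer is. Constants are made up.

TARGET_FILL = 0.5   # aim to keep the buffer half full
GAIN = 0.2          # proportional gain for the speed correction

def motor_speed_correction(buffer_fill_fraction):
    """Return a relative motor speed multiplier: read faster when the
    buffer is draining, slower when it is overfilling."""
    error = TARGET_FILL - buffer_fill_fraction
    return 1.0 + GAIN * error

# Buffer nearly empty -> spin faster than nominal
print(motor_speed_correction(0.1))
# Buffer nearly full -> spin slower than nominal
print(motor_speed_correction(0.9))
```

This is the extra machinery the parent comment is pointing at: a buffer plus a feedback loop, versus just streaming at whatever rate the recovered clock dictates.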
Will it have the same line of sight limitations as current satellite Internet? I'm in Seattle, and with providers like HughesNet you need a very good line of sight to the south to get service. IIRC, where I used to work we had the dish pointed only 24 degrees above the horizon.
These sats are going into LEO, not GEO, so their position in the sky won't be fixed. I imagine you'll use a phased array antenna to track them. The good points being: lower latency, no requirement to see the southern horizon specifically. The bad point being that you'll need a view of a bigger chunk of the sky to avoid signal dropouts as the satellites move - how big a chunk depends on how many satellites they have up there (and therefore how many are above the horizon at the same time). If they have enough satellites, it may work out better for you.
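As a rough back-of-envelope (my own numbers, not from the article): for a uniformly spread shell of N satellites at a given altitude, you can estimate how many sit above a given elevation mask at any instant using standard spherical-cap geometry.

```python
import math

RE = 6371.0  # Earth radius, km

def expected_visible(n_sats, altitude_km, min_elev_deg):
    """Expected number of satellites above the elevation mask,
    assuming they are spread uniformly over the orbital shell."""
    eps = math.radians(min_elev_deg)
    # nadir angle at the satellite when seen at elevation eps
    eta = math.asin(RE * math.cos(eps) / (RE + altitude_km))
    # Earth central half-angle of the visibility cap
    lam = math.pi / 2 - eps - eta
    cap_fraction = (1 - math.cos(lam)) / 2
    return n_sats * cap_fraction

# e.g. a hypothetical 4000-satellite shell at 550 km with a 25 degree
# mask works out to on the order of 20 satellites in view at once.
print(expected_visible(4000, 550.0, 25.0))
```

Lowering the elevation mask grows the visible cap quickly, which is why a wide view of the sky helps so much.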
It's fine if the documentation is highly technical, I've written linux kernel drivers before :)
The people want it to stay this way and a massacre or two will not change that.
Although crazies keep voting for UKIP, who have said they want to legalise firearms...
If a program needs to look at stuff in other file structures then give it read access
Great! $malware got read access to your bank details.
You want it to be able to write to files in those other directories? Fine - it reads in a file it isn't allowed to overwrite or change, and then saves its own copy that it can molest in whatever way it wants.
So now instead of having a single copy of the file, you have a separate copy saved by each application that has been used to process it - creating a mountain of almost-identical files that the user has to keep track of is not a user-friendly way of doing things.
Better is to have a versioned filesystem - each time a file is changed (by any application!) the delta is saved and the filesystem keeps the old data hidden away. Most of the time everything behaves as normal - you have one copy of a file, no matter how many times it is edited. If you need to roll back some changes then you just ask to see previous versions of that file, much like a source control system. And indeed, there are a number of file systems that do exactly this - if you care about such things there's nothing stopping you doing it.
It doesn't stop malware reading your files or modifying them, but it does mean you can recover the unmodified versions... but then doing backups (which everyone should be doing anyway) gives you similar protection.
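The versioning idea above can be sketched in a few lines (a toy model, not any real filesystem - a real one would store deltas rather than full copies):

```python
# Toy sketch of a versioned file: every write keeps the previous
# content hidden away, so earlier versions remain recoverable.

class VersionedFile:
    def __init__(self):
        self._versions = []   # full copies here; a real FS stores deltas

    def write(self, data):
        self._versions.append(data)

    def read(self, version=-1):
        """Latest content by default, or any earlier version by index."""
        return self._versions[version]

f = VersionedFile()
f.write("important data")
f.write("ENCRYPTED GARBAGE")       # malware (or a bad edit) clobbers it
print(f.read())                    # you see the modified file...
print(f.read(0))                   # ...but the original is still there
```

The application-facing behaviour stays normal - one file, edited in place - while rollback is always possible, exactly like a source control system.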
And, hell, why do applications get the run of every file I use under my account? Should they not have to request such things first? Even on Unix-likes, if you get on as my user, you can trash all my data - why?
Because anything else would require popping up numerous "would you like to allow this application to do $foo" boxes, and then you end up training the user to just hit "yes" on everything because it's too damned annoying to make a decision every time when the vast vast majority of access requests really are legitimate.
Sandboxing based on applications making their own decisions and being relatively trustworthy might not be a bad plan though - i.e. if your web browser has an immutable list of files it needs access to, and you trust your web browser, that provides some level of protection when some malware compromises the browser, so long as the immutable list really is immutable and the malware can't modify it.
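That immutable-list idea amounts to a simple allow-list check on every file access. A minimal sketch, with the paths and policy entirely invented for illustration:

```python
# Hypothetical allow-list check: the browser ships an immutable list of
# directories it may touch; anything outside the list is refused.

from pathlib import Path

ALLOWED = {Path("/home/user/.mozilla"), Path("/home/user/Downloads")}

def access_permitted(path):
    """True if path is one of the allowed directories or inside one."""
    p = Path(path).resolve()
    return any(p == base or base in p.parents for base in ALLOWED)

print(access_permitted("/home/user/Downloads/file.pdf"))  # inside the list
print(access_permitted("/home/user/.ssh/id_rsa"))         # outside it
```

The protection holds only as long as the list itself is enforced outside the compromised process - if the malware can rewrite the list, the sandbox is worthless.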
I'm sorry, but the very concept of a virus scan happening "at scheduled intervals" or after you've already double-clicked on the file just tells you that it's too late before you start.
Well no, if you can roll back everything that happened between the "all clear" scan and the "you've been cracked" scan then that's certainly much better than nothing.
Fact is, I didn't install it and I have no idea what it ACTUALLY does.
You don't know what most software ACTUALLY does, even if you did install it - most software people use is closed source, but even the open source is a black box unless you actually audit it.
True to a point, but the knowledge gained from the ISS is nothing to sneeze at either. I do agree that a manned Mars mission is a bit silly at this point though; we don't really have the technology yet to make it feasible. More research into alternative energy sources is where most of the money should be going.
I suspect a manned Mars mission will always be "a bit silly" at any point until people start actually doing it. And whilst I can't really point to much tangible return on the investment, "blue skies" projects do have a habit of producing some quite unexpected returns.
To my mind, governments seem to be mostly concerned with themselves at the moment, with nothing to unify those in power towards some common (non-selfish) goal. With the few richest people being as rich as they are now, I wouldn't be surprised if a few of them banded together to put together a manned Mars mission long before any government does (so long as they do so before a revolution comes and redistributes the wealth a bit more fairly).
That's not really true. You can look at a research lab and measure the ROI retrospectively quite easily and use this to make forward looking decisions, and that's what a lot of companies do. They'll close research labs that haven't produced anything useful in the last 5-10 years, but they'll increase funding to ones that have.
And what about research that takes longer than 5-10 years to come to fruition (which actually isn't very long)?
Let's take fusion research as an example - that has spent decades sucking money out of governments and has produced very little return on that investment. It may never produce much return. But if we ever do crack fusion for commercial power generation, that would be a serious game changer - probably a big enough return to justify a couple of hundred years of otherwise fruitless investment.
But that doesn't mean that the government should be paying for it, because not all of us agree we should be paying for it. Using tax to pay for something should only happen for things we can only collectively purchase, like National Defense. We should be able to pay for it ourselves, and reap the rewards individually.
Umm, I don't agree with my taxes being spent on "National Defence" (when I can sum up the current "defence" ideas as "go into foreign countries and blow up some brown people").
Guess what - you don't get to choose what your tax gets spent on. In theory, it should be apportioned democratically, but even that doesn't happen - a significant number of people objected to the Iraq war and were ignored.