
Comment Re:And this is how we get to the more concrete har (Score 1) 528

Indeed. However, the Discovery Institute's chance of success depends entirely on obfuscating that goal. There are a lot more people who would support "intelligent design" as some sort of oppressed-underdog "scientific theory" than who would support it as the blatantly theocratic idea it really is.

Which is why it's called Intelligent Design. In fact, when they were converting from Creationism to Intelligent Design, they basically did a search-and-replace. And a botched search-and-replace left behind transition fossils showing how "Creationism" evolved into "Intelligent Design".

(A transition fossil is just that: if you have animal A and animal B, and you know B evolved from A, then there must have existed a creature in between A and B. It's called a transition fossil because evolution works on timescales where many generations of creatures exist between then and now.)

Yes, there was evolution in the DI texts :).

It's too bad that more Americans believe in creationism than in the great flood, since the flood is a lot more scientifically plausible than the other two ideas you mentioned. I mean, it's pretty clear that the "entire earth" didn't flood, but it may well have seemed that way to somebody living in what is now the Black Sea about 7,600 years ago.

True, there is evidence of such a flood; however, there was unlikely to be an Ark. Maybe 40 days and 40 nights of rain, but that's about it.

Approximately 40% of Americans believe God created humans as-is. (The rest believe either humans evolved, or humans evolved with God providing a helping hand). And that percentage has remained fairly stable over the past 30+ years.

Comment Re:And this is how we get to the more concrete har (Score 4, Insightful) 528

On the bright side, framing the debate in those terms might help convince the kind of people who would argue that we should "respect all sides of the issue" (or some politically-correct BS like that) that these anti-scientific ideas really don't belong in science class after all. I think the lawmaker did us a favor and I'm optimistic that his plans will backfire.

It doesn't matter. The WHOLE reason we're having this debate is not about science. It's not even about creationism or "intelligent design" or however we "evolve" the term.

The Discovery Institute (the real organization behind all this) fundamentally believes that society went awry when we did the whole "separation of church and state" thing, that religion in school meant students were better behaved and more obedient, and that society as a whole was just better off.

So that's the real end goal: to get religion - or, more correctly, Christianity - back into schools so everyone becomes a "good little Christian boy".

(Yes, it glosses over a LOT of things, like racial issues, the fact that there are more religions than just Christianity, etc).

In their view, basically all of society's ills are the direct result of secularism and the pursuit of "things" (money, toys, stuff) instead of spirituality.

It's just that creationism is the wedge issue that can get them in the door the easiest since a lot more Americans believe in it (than say, a great flood happened, or that everything we see was made in a week a few thousand years ago). And once you're in the door, spreading the other beliefs becomes a lot easier.

Comment Re:Particle state stored in fixed total # of bits? (Score 1) 247

Or quantum theory. Ever notice how things are quantized (i.e., they come in discrete packets of stuff) rather than forming a continuous spectrum?

Or how once you get below a certain size, the rules of physics just seem to break down and it all becomes random?

Well, we hit the resolution limit of the simulation, and the quantum "foam" is the LSB of the simulation. Even in computing today (especially with floating point), you have to be careful about how you order your operations so you don't lose too many bits of the mantissa to rounding error. Well, that's what the quantum world is: computation errors flipping the LSB around in random, unpredictable ways. We can still guess at the likelihood of it being in a certain state because the simulation runs the same operations the same way (and loss of precision can generally be approximated), but it loses precision during calculations, which is why the quantum world is statistical. A software upgrade to the simulation could change the way the least precise bit behaves, if they changed that part of the simulation's calculations.
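The floating-point comparison is easy to demonstrate: summation order and compensation determine how many low-order mantissa bits survive. A minimal Python sketch with illustrative numbers:

```python
import math

# A large value plus many tiny ones: in naive left-to-right order, each
# 1.0 falls below the LSB of 1e16's 53-bit mantissa and is rounded away.
values = [1e16] + [1.0] * 1000

naive = 0.0
for v in values:
    naive += v  # every 1.0 vanishes into rounding error

# Compensated summation (math.fsum) tracks the bits that would be lost.
careful = math.fsum(values)

print(naive)    # → 1e+16 (the thousand 1.0s were lost)
print(careful)  # → 1.0000000000001e+16 (i.e. 1e16 + 1000)
```

Same inputs, same operations, different handling of the least significant bits; that determinism is the sense in which precision loss can still be reasoned about statistically.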

So there you go. The resolution limit of the universe is h-bar, representing the limited precision of our simulation.

Comment Re:Not the PSUs? The actual cables? (Score 1) 137

How do you fuck something like that up?

Separate assemblies. The companies that build the power supply itself are generally very good at it (including the IEC plug the AC power goes into). The output end is typically just a header, and the cables are provided by a third party that specializes in making terminated cables (especially modern laptop cables, which can have several conductors and indicators), with the only requirement being that the power-supply end use a mating connector.

Cases and other parts are often made by someone else as well, though the power supply manufacturer will usually assemble it all together.

And customers are stupid: they yank on cords, which stretches and breaks the wires, or they bend them tightly. All of that frays the insulation.

Apple has the same problem, and often if you take in a power adapter with a frayed end, they may replace it at reduced cost. More so if the machine it goes with is under AppleCare (and since they're all compatible with each other...).

Comment Re:Address space randomization does not help. (Score 1) 98

Yes, ASLR somewhat works but is an afterthought. The ultimate solution would be to stop using computers which mix data and code adjacently, in other words get rid of the whole von Neumann computer architecture.

There are plenty of Harvard-architecture processors out there (separate data and instruction memory). Modern architectures do have a bit of Harvard in them too (the separate instruction and data caches), and memory segmentation and permissions help split code and data into separate areas.

The problem is that von Neumann makes computers extremely useful because you're able to treat code as data, so you can do fancy things like load a program off disk into memory and execute it, or load a program from a network device using any programmable protocol and run it. This only works because the OS temporarily treats the code text as data while loading it off storage (local or otherwise) and into memory. (After all, loading a program into memory consists of reading the executable off disk just like you'd read a regular data file into memory; then you need to run that code.) Heck, modern paging systems in an OS rely on it: reloading a memory page from disk doesn't care whether the page is code or data. The OS just sets up a new memory page to hold the contents, finds the location on disk, tells the disk driver to populate that memory with data, and on completion re-executes the faulting instruction (or performs the prefetch).
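That "code is just data until you choose to run it" property shows up even at the language level. A minimal Python sketch of a loader (names are illustrative): a program arrives as plain bytes, exactly as it would off disk or a socket, and only becomes code at the last step:

```python
# The "program" arrives as plain bytes -- indistinguishable from any
# other data file read off disk or a network socket.
program_bytes = b"def answer():\n    return 6 * 7\n"

# The loader step: reinterpret the data as executable code, then run it.
namespace = {}
exec(compile(program_bytes, "<loaded>", "exec"), namespace)

print(namespace["answer"]())  # → 42
```

An OS loader does the same dance at a lower level: read bytes into memory as data, then jump to them as code.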

Harvard-architecture machines need some way to load their program and pre-load data into memory, which is why traditionally they only run fixed program code (like DSPs), or have a von Neumann machine load the code into instruction RAM. (They're great for streaming and signal processing, where the code doesn't change but you're constantly passing data through the system.)

Comment Re:which is fine light reading, but not a referenc (Score 1) 22

While I agree that NC is generally misunderstood by lay licensors, and greatly more restrictive than most people realise, ND has a valuable place in the licensing suite.
For example, if you write an opinion piece, adding the ND clause will make sure that no-one can (legitimately) alter or distort the text, and use it to misrepresent the position you hold/held.

Otherwise, using ND for non-opinion works shows a certain amount of arrogance. It's effectively proclaiming "no one but myself could possibly make this any better".

Not really. Even the most restrictive copyright (traditional "All Rights Reserved") still lets people routinely distort and misrepresent your position. It's called "creative editing", and it can change the meaning completely.

If people want to misrepresent you, they're going to, regardless of whether you use ND or full copyright. And no, just because it's on the web doesn't mean it's not under full copyright: the author can legitimately post an opinion piece that is completely copyrighted (see editorials) yet freely readable. It's under copyright, so no one can legitimately alter or distort the text.

Oh, but you say: what about fair comment and all the other fair use rules? Guess what? They apply to CC works too, because CC, just like copyleft, relies on copyright law, and the law specifies the minimum rights everyone has, including fair use, satire, etc. CC and other copyleft licenses simply grant more rights than copyright alone would, so you can ignore the CC license just fine; you'll simply be held to the more restrictive default.

ND doesn't solve anything. It probably makes it worse since it just means your work gets copied everywhere, whereas full copyright means your online post is the only legitimate one and people should link to that as the original source piece. Those who would just re-host it and violate copyright law will continue to do so, regardless of "All Rights Reserved" or CC.

Comment Re:No more "Cloud", please (Score 1) 60

Enough. No, I do not want "Cloud" services, thanks. I want my good old desktop with local applications that do not need to be connected to the internet 24/7 to work; not everyone has a fiber connection available all the time for this.

So don't use it. Why does it have to be an either/or situation? If you need your desktop, continue using it.

This service is more for those who have a desktop only because they need to run something on it. You know, like how some people ran Windows just to play a video game. Or for one application they use infrequently but have to use.

Hell, this is practically the ideal situation for parents who basically neglect their PCs, whose machines you spend every Thanksgiving fixing. You replace the PC with a Chromebook (a locked-down web browser) and use a cloud desktop for the few things they still need a desktop PC for.

It's like those who claimed tablets would replace desktops; Jobs was far more accurate in saying we'd always have desktops, even in the age of the "Post-PC".

Submission + - Uber's new problem - Assaults and Carjackings (pando.com)

An anonymous reader writes: Uber has come under attack lately from taxi drivers and government regulators. However, a new problem has arisen: Uber drivers in LA are reporting assaults at gunpoint, stolen phones, even carjackings. Uber drivers suspect the taxi industry, since the phones (ancient iPhone 4 models issued by Uber to the drivers) are effectively worthless, but taking them ensures the driver cannot pick up new fares. The drivers are easily located using the client-side Uber app, which shows which drivers are nearby for pickup. Of course, it could be coincidental as well, since taxi driving is among the most dangerous jobs out there (approximately 18% of all taxi drivers are injured by assaults or other violent acts).

Comment Re:As a non-fanboy I like the Cook Apple better. (Score 1) 90

It would never catch on because it doesn't support what existing Micro USB connectors do, and what other manufacturers already use. For example, there is no way to do uncompressed 1080p video over it, and phones were doing that three or four years ago so are not likely to drop back now. The cost of the Apple video solution is prohibitive as well, when an MHL adapter is £5.

Lightning doesn't seem to support USB peripherals either. Not sure if it is an inherent limitation of the design or just that Apple don't use them, but many Android devices can make use of USB flash drives, card readers, game controllers, keyboards, mice and the like.

micro USB connectors DO NOT DO VIDEO.

MHL and SlimPort and every other such standard do. No, those connectors are not compatible with each other, but they do allow you to fit a micro USB plug into them. They are not, however, micro USB. That would be like Apple inventing a new connector that accepts micro USB plugs: the connector was made compatible, but if Apple puts Firewire/Thunderbolt/whatever on it, that doesn't mean micro USB inherits those properties.

USB peripherals are supported by Lightning just fine. You can connect cameras, memory cards, even USB DACs to an iOS device; you just need the "Camera Connection Kit", which converts your 30-pin or Lightning port into a USB host port, to which you can plug in a camera, memory card reader, flash drive, or USB audio device. Or a keyboard, if you wanted.

And it's taken long enough for USB to get to the point where you can plug it in without caring about orientation. Micro USB isn't immune to this: micro-AB ports are generally reversible because of their godawful design, and most devices should be using micro-AB ports instead of just micro-B plus special adapters to make an A port. It's just that the user experience is so terrible, and it makes them incompatible with MHL and SlimPort (which are only compatible with micro-B cables).

Comment Re:Won't work with new chips (Score 1) 78

When you sign the back of your card, you're providing a template for forgery to anyone that happens to steal or find your card. I can understand why the credit card company would want you to do this, as a convincing forgery job on a signed sales receipt shifts liability from them to the consumer. However, as a consumer, I don't understand why you'd willingly buy in to such a system.

Because signing a credit card isn't for verification. It's for agreement of the terms and conditions.

Signing the back of your card is how you indicate that you agree to the terms of your cardholder agreement, in which your provider spells out how you pay them back, how they charge interest, what the interest rate is, billing, and so on. If you don't sign the card and the merchant accepts it anyway, then they have to eat the loss, because you never agreed to the terms.

Likewise, signing the chit just means you agree to pay the amount shown in line with your agreement.

It's just contracts, in the end. The card signature shows you agree to the contract between you and the credit card provider. The chit signature shows you agree to the contract to pay the amount shown. If someone else forges your signature, that's fraud and you're not responsible. Likewise, if someone uses your credit card with their signature, that too is valid since it was signed under agreement.

There's nothing special about the signature. Banks routinely loan out lots of money without even a "reference signature" to compare against, yet those loans are still valid.

You're just signing to show you agreed to the presented terms.

If you look closely, the chits all say "Cardholder agrees to pay the amount shown per the terms of the cardholder agreement" which is what you're REALLY agreeing to.

Comment Re:Should of never got rid of other OS and outsorc (Score 1) 97

Someone who has enough skill to use the Other OS function probably has enough skill to install CFW

Actually, CFW is freakishly easy to install. It's just an offline update.

No one uses OtherOS anymore. The reason you use CFW is to pirate games and all that, and it always has been, ever since the OtherOS folks, pissed at losing the feature, hacked the PS3 to restore it. That ended up leaving a huge hole for everyone else to exploit, so there are more than a few ISO loaders and dumpers and all that.

Not sure whether they can still play online, since I hear that Sony sends down a binary that runs on the console and reports its status (client-side trust), which I assume is pretty easy to fake after a few days.

Anyhow, it appears Xbox Live is back up; the best they could do was make it "intermittent". And only login was affected.

Comment Employee happiness (Score 1) 90

A CEO that gets it.

Tim Cook realizes he's not Steve Jobs. Steve Jobs was perhaps one of three people in the world who could be an asshole and yet get results (the other two being Linus Torvalds and Theo de Raadt). Say what you want, but they're all assholes, and yet, mysteriously, they get results.

Everyone else who's tried has failed miserably.

And I'm sure Cook realizes it too: he's no Jobs, and being an asshole would destroy the company (most who try fail, hence why there are only three people in the world who could pull it off). He's got to be different, and if that means revamping the company from being run under one man's thumb to how companies should be run, so be it.

Still, you do miss the odd Jobs-style flare-up. I mean, Ballmer had his chairs. Cook is just a bit... understated.

Comment Re:Okay... and? (Score 2) 316

It's almost like the editors wanted to publish a biased article or something. Scandalous.

Exactly. It's the same thing Apple, Google, and everyone else have.

Hell, in Apple's case it's cheaper to borrow money in the US than to repatriate it. When Apple needed $17B, they took on debt against future US earnings, because it would cost them less to pay back that principal plus interest than to bring the money in from offshore (which I think would've required close to $30B offshore to get $17B they could spend). And Apple has very rarely taken on debt intentionally.
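As a back-of-the-envelope sketch of that trade-off (round numbers assumed here: the pre-2017 US corporate rate was 35%, and this ignores foreign tax credits, which is why the poster's figure is higher):

```python
net_needed = 17e9   # the ~$17B Apple wanted to spend in the US
tax_rate = 0.35     # assumed repatriation rate (pre-2017 US corporate rate)

# To net $17B after tax, the gross brought home must satisfy
# gross * (1 - tax_rate) == net_needed:
gross = net_needed / (1 - tax_rate)
print(f"${gross / 1e9:.1f}B")  # → $26.2B
```

Even under this simplified rate, the tax bill dwarfs the interest on investment-grade debt, which is why borrowing against future US earnings won out.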

An unintentional side effect is that Apple, Microsoft, and Google have to spend that money outside the US, so they hire developers and other people to work outside the US as well.

Comment Re:3D Blu-Ray Player (Score 1) 99

If my PS3 breaks while they're still making them? I'm not sure I'd buy another. I'd just get a cheap 3D-capable Blu-Ray player and play SotN by other means.

You'd get better quality from a cheap 3D Blu-ray player these days. The PS3's HDMI output means it only supports half-resolution 3D and, in doing so, only lossy audio, making it one of the least desirable 3D players out there.

3D over HDMI comes in four formats: side-by-side (SBS), top-and-bottom, line-interleaved, and frame-packed. The last packs two full-resolution frames (1920x1080 each) into one oversized frame, while the others fit two frames inside a single 1080p frame. Side-by-side means each eye's frame is 960x1080 (losing half the horizontal resolution); top-and-bottom means each is 1920x540 (losing half the vertical resolution); and line-interleaved means every other line belongs to one eye's frame, again losing half the vertical resolution.
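The per-eye resolutions above reduce to simple arithmetic on the 1080p frame. A quick sketch (names are illustrative, not any HDMI API):

```python
FULL_W, FULL_H = 1920, 1080  # one full 1080p frame (width, height)

# Per-eye resolution for each HDMI 3D packing mode:
modes = {
    "frame-packed":     (FULL_W, FULL_H),       # two full frames carried
    "side-by-side":     (FULL_W // 2, FULL_H),  # (960, 1080)
    "top-and-bottom":   (FULL_W, FULL_H // 2),  # (1920, 540)
    "line-interleaved": (FULL_W, FULL_H // 2),  # every other line per eye
}

for name, (w, h) in modes.items():
    print(f"{name}: {w}x{h} per eye")
```

Only frame-packed keeps both eyes at full resolution, which is why a player limited to the other modes is a poor 3D source.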

Couple that with lossy audio (the PS3 can't do lossless audio in 3D mode, go figure), and it was a nice "how do you do" of a feature. The people who could actually use 3D, though, generally had spent a lot of money on a nice system. Even today they still are, since 3D has practically disappeared from store shelves, relegated to a few high-end models; if you wanted it, you paid for it.

Comment Re:Wet Dream (Score 1) 99

The removal of OtherOS didn't affect the average gamer, it only affected a very small group of people who installed Yellowdog Linux out of curiosity. I was one of those who did so -- a year later, I didn't particularly care that the feature was removed, because as everyone else who tried it discovered, OtherOS sucked. The hypervisor, which can't be worked around, locked out much of the hardware. Want to use it as a cool games emulator? Good idea! But since the hypervisor has always restricted the RSX, the PS3 runs much slower than your standard HTPC, and has almost no graphics acceleration.

It's only been recently that some exploits with specific hypervisor versions have allowed the Linux kernel to boot in "game mode," unlocking full graphics acceleration, but that's not a Sony feature and wasn't available through OtherOS.

OtherOS always sucked because Sony was scared it would lead to pirated games or homebrew games that competed with their own offerings, so they crippled it from the very start.

And you know what? It helped keep piracy at bay.

Here's one thing Microsoft learned with the original Xbox: when the interests of homebrewers and pirates align, you lose. It's why the Xbox 360 is locked down and, to this day, unbroken save for limited piracy hacks.

Sony had the same alignment with OtherOS. Within six months of Sony removing OtherOS, the PS3's horrendously broken security system was breached - by people looking to restore OtherOS! And what happened after that? The pirates came in and basically took over. It was so bad in the early days that you could still use PSN with a fully opened console (which led to the PSN shutdown a few months later). And these days you still can, since the complete console security system was breached; anything Sony tries relies on an element of "trust the client", which means it works for a few days, then fails as everyone learns how to spoof the response.

And perhaps another factor was Microsoft's "opening" of the Xbox 360 via XNA and the Xbox Live Indie Arcade, where homebrewers can write games, play them, and even offer them for sale.

It's led to the Xbox 360 remaining unbreached: no "hacked" console can connect to Xbox Live without being detected, and the security of the software is such that it still only runs Microsoft's code.

So if you hacked your xbox, you could play pirated games, but never online.
