
Comment Re:No surprise... (Score 1) 224

In your imagined world, only rich investors or companies have money, and all the risks have to be taken in large sums. I think the idea of a maximum margin is to limit the hoarding of profits, thus requiring cooperation and smaller but more numerous investments. Micro/crowd investing could raise large sums of capital for bigger projects that many people want to take a (small) risk on. E.g., enough businesses and people want better fabs (including a consortium of smaller chip designers that would share them), so they pool the risk and money and make it happen. That requires vastly more communication and cooperation, and thus likely wasn't practical before the net.

I don't know that margin caps are the best option, but that is certainly the end result of competition (a preferable, more responsive, and freer system, though one prone to corruption). I would be interested in testing something like this: the maximum size of your company is inversely proportional to your legal margin cap. Small businesses could have high margins and behemoths would be greatly restricted.
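To make that concrete, here is a toy sketch of the inverse-proportionality idea in Python; the constant K and the revenue figures are invented purely for illustration, not a proposal for actual numbers:

    # Toy sketch of the "size inversely proportional to margin cap" idea.
    # The constant K and the revenue figures are invented for illustration.
    K = 5e7  # arbitrary scaling constant, in revenue-dollars

    def margin_cap(annual_revenue):
        # Bigger company -> lower permitted profit margin, capped at 90%.
        return min(0.9, K / annual_revenue)

    for revenue in [1e5, 1e6, 1e8, 1e10]:
        print(f"revenue ${revenue:,.0f}: margin cap {margin_cap(revenue):.1%}")

With these made-up numbers, a corner shop keeps a 90% allowed margin while a ten-billion-dollar behemoth is capped at half a percent, which is the shape of the incentive I mean.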

Comment Re:Uber is dead (Score 1) 113

Companies owned by humans (private, shareholders, etc.) will charge whatever they can get away with. Self-driving fleets may start out like that, but it won't take long before we realize that we can automate management and remove ownership.

Thus this wouldn't be the case for non-human "companies" that are essentially a symbiont designed to meet the transportation needs of humans as efficiently as possible (assuming a level of oversight/control so that it provides services humans actually want). First the cars will drive themselves, then they will manage themselves, and then we will bootstrap a service with no ownership overhead that essentially functions "at cost", because that's how it was designed.

Comment Re:This is interesting.... (Score 2) 573

Like I said in a previous post, infra-red imaging of the inner planets in our solar system shows them heating up at a rate similar to Earth. But, say that out loud and people like you friggin flip out.

Evidence? We measure the sun fairly carefully, so any such claim would have to disagree with those measurements: http://www.skepticalscience.co...

Comment Re:Honest question ... (Score 1) 148

Everyone has been spying on everyone for at least a couple of centuries.

...

Nothing's changed, other than public awareness of espionage.

But it is obviously not the same, because of the scale. The reach and power of spying have increased dramatically along with the general reach and power of technology. Spying has always happened, but the nature of the beast transforms at certain levels of scale and pervasiveness. You used to be able to assume that if you "had nothing to hide" in a free society, you were generally safe from surveillance because you weren't worth the effort; that is no longer the case. You could also assume that the means of surveillance were effectively limited to nation states, not available to anyone with some technical proficiency. Things are not the same.

Comment Security (Score 2) 196

How does the IoT handle security problems? That seems like the biggest stumbling block.

"Dumb" things have an important advantage in that they can't be hacked and remotely controlled - especially without your knowing.

The current maintenance nightmare of securing networked devices is already overwhelming (me), and the effects of being hacked are already incredibly expensive. I'm not sure the value gained from the IoT is worth it.

Perhaps if the devices were not updateable and only sent and received a fixed set of commands... but then you lose some of the value that the IoT promises?
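As a rough sketch of what that fixed-command design could look like (the command names and handlers here are invented for illustration):

    # Rough sketch of a non-updateable device that accepts only a fixed
    # whitelist of commands; everything else is dropped, not interpreted.
    # Command names and handlers are invented for illustration.
    ALLOWED_COMMANDS = {
        "STATUS": lambda device: device["state"],
        "ON":     lambda device: device.update(state="on") or "ok",
        "OFF":    lambda device: device.update(state="off") or "ok",
    }

    def handle_message(device, message):
        handler = ALLOWED_COMMANDS.get(message.strip().upper())
        if handler is None:
            # No parsing, no firmware updates, no side channels: just reject.
            return "ERR unknown command"
        return handler(device)

    if __name__ == "__main__":
        device = {"state": "off"}
        for msg in ["STATUS", "ON", "STATUS", "UPDATE_FIRMWARE evil-blob"]:
            print(msg, "->", handle_message(device, msg))

The point is that there is simply no code path for arbitrary remote control, at the cost of never being able to add features later.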

Comment Re:Biased data set perhaps? (Score 3, Interesting) 139

I would expect open source code to be of approximately equal quality to proprietary code. In each camp you will get people who care (about quality) and people who don't, in approximately equal proportions; the same goes for skill, ingenuity, and passion for the work.

The difference is that proprietary software is constrained by the number of developers able to view and work on the code. An open source project may have a similar number of core developers, or a smaller one, but a much larger pool of developers who can spot problems, suggest alternatives, fix the one bug that is affecting them, etc. Having a more diverse set of developers increases the chances that the software improves.

You could also make an argument about the motivations of the developers. Open source projects are often communities of people passionate about what they are building, with a strong incentive to make their code readable by others. By the nature of open source, a developer's reputation is on the line with every bit of code they make public. I've met far more developers scared to make their horrible code public than developers worried about getting fired for equivalently horrible code.

Comment TL;DR (Score 2) 242

For those who didn't read the article, there are a few important points to clarify:

Feinstein's staff is being (falsely) accused of hacking/spying on the CIA because they got their hands on some documents the CIA did not want them to have: namely, the CIA's own internal investigation of the documents released to the Senate investigation. It seems the "search tool" provided to the Senate staff picked up more than the CIA thought it would. The staffers smartly made their own copy of these docs (as previous evidence had disappeared), and then the CIA searched the investigation's computers, seemingly without any authority to do so.

The final twist is that the CIA's internal investigation supposedly agrees with the Senate investigation, while publicly the CIA disagrees. Feinstein basically has them over a barrel; they pushed their luck trying to escape the trap and got themselves in deeper with the potentially highly illegal search.

It also seems likely that the CIA lawyer who allowed all the CIA torture is now heavily involved in trying to save his own ass.

Comment Teaching the latest greatest workflow (Score 1) 246

Math, algorithms, and data structures are not really the critical things to learn for web development. Hopefully your grads are not starting out architecting anything complicated, but are instead following best practices and good workflow, leaving the majority of the algorithm design to people with much more experience and training.

The important things to teach are best practices in web component composition and workflow. These are rapidly changing, with many competing tools, but in a consistent direction: modular, testable components as services on top of robust development infrastructure, including source control (git), code reviews, continuous integration, and rapid, numerous deploys with no downtime. There are a lot of good resources about this, but the key thing is to see it in practice, to get hooked on how good workflow and a focus on code quality can make your work a joy instead of a nightmare. There is a huge amount to learn about the latest web development processes, but students (like yours) should be helped to paddle out and get on top of the wave so they can keep riding it - not be taught fluid mechanics or how to build a surfboard.
My dream web dev class would have one website that is built many ways but with similar workflow and the same final result: a Rails stack, a Python stack, a PHP stack, a Node stack, etc., all using the same assets. Provide enough versions of the same site that the students can work in groups to implement the same thing on each stack. Teach what is the same between the stacks (e.g. MVC, sketched below), without the details of each stack's implementation of that concept, and you'll be teaching a lesson they can carry with them for a long time. That might be too difficult for people who haven't done any programming ever, but I think I'd enjoy that class. Regardless, you should have some code that implements a real website with real workflow that they can learn from.
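For what it's worth, here is a minimal sketch of the stack-independent MVC split, using Flask only because it's compact; the route and data are invented examples, and the same shape maps onto Rails, PHP, or Node:

    # Minimal sketch of the stack-independent MVC split, using Flask only
    # because it's compact; the route and data below are invented examples.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Model: where the data lives (a plain dict here; a database in a real stack).
    ARTICLES = {1: {"title": "Hello", "body": "Same assets, different stack."}}

    # Controller: maps a request onto the model and chooses a response.
    @app.route("/articles/<int:article_id>")
    def show_article(article_id):
        article = ARTICLES.get(article_id)
        if article is None:
            return jsonify(error="not found"), 404
        # View: rendered as JSON here; an HTML template in a full app.
        return jsonify(article)

    if __name__ == "__main__":
        app.run(debug=True)

The model/controller/view boundaries, not any framework's syntax, are the part that transfers between stacks.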

Comment Re:Most workers die from cancer in their 20s (Score 2) 117

I couldn't find much about cancer rates except people repeating that particular line. However, this seems reliable and pretty deadly:
from http://www.worstpolluted.org/p...
"Samples taken around the perimeter of Agbogbloshie, for instance, found a presence of lead levels as high as 18,125 ppm in soil."
From Wikipedia:
"No safe threshold for lead exposure has been discovered—that is, there is no known sufficiently small amount of lead that will not cause harm to the body."
For scale, 18,125 ppm is roughly 45 times the US EPA's 400 ppm hazard standard for lead in children's play-area soil.

Comment Re:The greatest single disaster in computing histo (Score 1) 742

Sorry to reply to self, but the GP said: "The blame lies of course with politicians and industry regulators who had no clue what an immense influence personal computing would have on society until it was too late."

This isn't the case; the blame lies squarely on Gates et al., who couldn't imagine how to run a successful software business using free software. They thought it was impossible, and perhaps for people of their ethical character it is. They have been proven wrong, ethically lacking, and incredibly short-sighted countless times.
