Comment so-called journalism is disappointing (Score 1) 96

Ataides is the man's last name, but this post refers to him as 'Mr. Bonilla'... are you a bot, poster? Or maybe it was auto-translated.

Things are so randomly bad, inaccurate, or poorly conceived online these days that we always have to wonder whether a piece was produced by:

- a fraud
- a bot (only suited to transmit pure data, possibly in shortened form)
- a troll
- a moron

Also, the summary offers zero hints about what was solved, how, or why it matters. Please try again. You're so close.

Comment Applied science needs to catch up (Score 1) 248

Natural science has grown by tremendous leaps and bounds in the last 4-5 centuries. Applied science has also grown a lot, but our tools can be fashioned only so fast to keep up with the new resolutions of data and the new types of substances and materials being created or conceived of every day. We have so many technologies available to play with and rearrange that we can engage in forms of meta-science - discovering anew every day how to use what we've created, never mind what nature has left lying around for us. While nature remains more creative and ingenious than we are, it's easier to order equipment online and make discoveries in a living room than to organize a biosample-collection safari and then analyze the haul with wet labs, NMR, mass spec, etc.

In some realms of science we're caught in a handful of high-expense, low-return endeavors at this point. Many avenues that are currently not in the spotlight are going to turn out to be the way of the future, but we'll only know that after the fact.

Progress in biotech, nanotech, robotics, computing, neuroscience, genetics, energy storage, photovoltaics, networks, complex systems, pharmaceuticals, semiconductors (and several more I'm surely forgetting) is proceeding like crazy, and prices for all these things are plummeting. When enough plummeting has happened, the basic tools of science and industry will have completely changed in character, and then new waves become possible. There is so much progress that it is time-consuming for us just to come to grips with it.

Or is it possible that we're nearing the edge of what's possible for human minds to comprehend, and the next steps will involve having AI do the discovery-making, and one of the challenges will be knowing how to get the AI to tell us what it knows in terms we can handle?

This whole discussion reminds me of that Richard Feynman quote about quantum mechanics: (to paraphrase -) we've learned the rules of chess, but how many years before we master the game?

As far as I'm concerned, the reason we're not finding 'new laws' of nature is partly that we've already found so many - possibly they are finite after all - but also that we've created such a vast inner (artificial) kingdom of knowledge that you can live your whole life within it without peeking outside. You can explore what humans have created for lifetimes and lifetimes and never find the end. And then there is YouTube. Egad.

Comment Complexity (Score 1) 354

I'm a software development professional, and I know only a few corners of the software and operating-system universe well.

I've met people much smarter than I am, who knew several times what I know about every technology under the sun, yet nobody holds the entire technology stack in their head - from silicon to circuits to logic to BIOS to CPU instructions to the operating system to the multiple levels of abstraction on top.

I've been thinking about this a lot lately whenever I get frustrated with my own devices and my own software: everything you build, no matter how carefully, interfaces with several things you know only the most tenuous bits about.

Not even the most well-informed security experts can keep up with all the services and modifications that are live on a typical laptop or regular desktop. It's too much - nobody has the time, let alone the intelligence, to see it all and fit it all together.

It's sheer quantity - there are so many ways software can go wrong. There are so many ways your computer can be accessed from the outside world. There are so many ways you can mistakenly click on something that compromises your machine.

And everything else people say is true - many devs are lazy or in a hurry, many are incompetent, and many software vendors have no idea what they're unleashing on the world - they're just desperate to release the features that matter to them.

I honestly believe that until we go back to massively simpler architectures in both CPU design and OS design, this will remain a simple, basic, omnipresent fact of life.

Comment about time (Score 1) 85

The architecture of software is almost (if not totally) always neglected in every form of documentation I find on open-source projects.

One problem I perceive in the open-source community, which I love, is that sharing knowledge and educating peers is often treated as a mercenary's game - each is left to his own devices. The mentality seems to be: if you can't read the code and figure it out, then stfu noob. This mentality forgets that noobs need to learn, and that not everything should take interminable code reading to figure out.

I've long wished I could contribute to open-source projects, but I've always dreaded the prospect of having to pore through reams and reams of code just to understand basic connectivity and causality in a piece of software. These are things a few words and a few diagrams can handily take care of, which would be far more efficient than telling everybody the barrier to entry is the ability to devote weeks of code reading just to understand the basics. The basics.

I've found new/fledgling projects I can contribute to, because as long as a codebase is young I have a chance of catching up; but when something is a world-class project several years old at least, it would be nice to be able to understand what is going on without having to wait until I reach the 10th level of extraplanarity in coding wisdom.

I hope this example of documentary exposition on open-source software isn't the last I see.

Comment best reason this is interesting: collaboration (Score 1) 219

Ever since I first saw the Borg depicted in Star Trek, I've dreamed of some kind of mind-to-mind interaction being possible. Contrary to what many people fear, there is no need to be permanently jacked into such a system - although, like Facebook, you might find yourself compelled to use it ceaselessly regardless of actual benefits, because we are social creatures of habit. Still, the ABILITY to commune mind-to-mind in no way prescribes the NECESSITY to be plugged in and enslaved.

The advantages are countless, although I personally feel that communication with any degree of fidelity would have to be restricted to symbols - language, math, diagrams, what have you. More subjective things, like complex combinations of feelings or streams of mental impressions, are likely to have a unique code representing them in every brain. For the codes we do share - math, natural language, etc. - it would be great to communicate faster and more conveniently than we do today. For verbal communication, people in different locations can't be on the phone all the time. Instant messaging is great, but not for people who don't like typing or who have imperfect vision. I can't wait to welcome our new robot overlords who are me!

Comment AR can be useful... (Score 1) 101

I can easily say what I look forward to, and it will come from a combination of machine learning, human input, and structured and unstructured information: the ability to look at something and know how it works, what it's made of, where it came from, and who's involved with it. I mean not having to google/wikipedia every interesting aspect, but having it show up translucently in front of whatever you're looking at.

This would be especially interesting for complex things like computers, electrical devices, organisms.

I'm looking forward to sub-$300 quality tablet devices so I can start working on my own version.

Comment usage based billing not the issue (Score 1) 364

The issue isn't usage-based billing itself, but how it was being (or was going to be) done. In theory, charging what a service is worth (plus a small markup for profit) is great business. However, Bell and Rogers have complete ownership of all telecom infrastructure in this country. Given that, does anybody familiar with capitalism and human nature expect prices to be fair and proportional to usage?

Technology has improved in the last ten years - fact. Whether it's routing or transmission, all the relevant equipment is cheaper and better. Prices have remained static, or in many cases kept up with inflation - fact. Service has remained static, with hardly any improvements in the last decade. Some better plans are available for much more money. Does all this add up to fair service?

I wouldn't mind usage-based fees if:

- the infrastructure PAID FOR AND DEVELOPED BY THE GOVERNMENT (and hence the people) didn't belong to just one company
- I could use the internet to replace TV (you can't, due to collusion and racketeering by cable providers)

Compare all this to cable TV. What does the typical cable subscriber pay? $40 per month? $50? That cost doesn't change if the TV is on 24/7. How much bandwidth is that? Clearly the infrastructure can deliver massive amounts of data at little cost to the provider.
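To put a rough number on that "TV on 24/7" comparison, here is a back-of-the-envelope sketch. The ~5 Mbit/s stream rate is my own assumption for a standard-definition feed, not a figure from any provider; actual rates vary by codec and resolution.

```python
# Back-of-the-envelope: data delivered by a TV-quality stream running 24/7 for a month.
# The 5 Mbit/s rate is an assumed standard-definition figure, for illustration only.
stream_mbps = 5                       # megabits per second (assumption)
seconds_per_month = 30 * 24 * 3600    # 30-day month = 2,592,000 seconds
total_megabits = stream_mbps * seconds_per_month
total_gb = total_megabits / 8 / 1000  # bits -> bytes, then MB -> GB
print(f"{total_gb:.0f} GB per month")  # about 1620 GB
```

Even at this modest assumed rate, a single always-on TV channel works out to well over a terabyte a month - far beyond the caps that usage-based billing proposals attached to a $40-50 internet plan.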
