
Comment Re:I'd argue we need more humanities (Score 3, Insightful) 352

An integral facet of any functional society is a core ethos or ethic that unites its citizens in common bond and in many ways defines the society itself. The language might be antiquated, but you know things like “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

A social ethos goes beyond a sentence or a document, and it can be difficult to define the extent of its scope, but the point here is that civil society requires things like, well, civility, to function long term. Arguing that it’s the sole responsibility of parents to teach ethics is ideologically divorced from pragmatic reality. Any society worth its salt will invest in teaching its citizens the ethical requirements of being a member of that society. As usual, this could be a long and interesting discussion in and of itself, but I’ll leave it here.

As for teaching all children coding: I’m not against it in the abstract, but I’ll stop well short of making it a core part of the curriculum for the entire educational process. Because humans are linguistic animals, and language is so closely tied to thought, coding is more than simply vocational training, but at the same time, we shouldn’t overestimate its importance. Juxtapose it with teaching a traditional language, for example: both shape the mind in the way only languages can, but traditional languages allow humans to interact with other humans, while coding allows humans to interact with technology. Is one more important than the other? I don’t know that I can say definitively, because the evolution of humanity has always been intertwined with our technologies, so it may be a false dichotomy. That being said, if I had to choose whether my children excelled at communicating and interacting with other humans or with technology, I would choose the former.

Comment Re:Fear of a dumb planet (Score 1) 197

I think you are operating on a false dichotomy. Though I also wish we could more effectively mitigate the effects of morally vicious humans and human ideologies, that concern is neither mutually exclusive with all other concerns nor distinct from the particular concern of morally vicious AIs.

There are myriad and massive systems, techniques, etc. devoted to the task of human governance, however efficient or inefficient they may be. It’s not like humans aren’t trying on this front; it’s just a difficult problem, because humans have this quality we define as intelligence. So, if you are concerned about the moral viciousness of humans, who have lots of evolutionarily built social instincts, you should be concerned for the future of AIs, because it will most likely be humans that engineer AIs' instincts, at least at first. The problem is intricately interwoven in a reciprocal relationship. Humans are devoting considerable energy to birthing AI. It is likely that strong AIs will eventually emerge (but who knows when), and it’s wise to devote as much if not more energy to engineering the parameters from which AI will emerge. We need to engineer it with as much forethought, prudence, and, importantly, respect as we can muster. Moreover, there will be morally vicious humans who will attempt to use AIs and their precursors in morally vicious ways, so the rest of us should not bury our heads in small horizons.

Comment Re:Yes! (Score 1) 88

My overall point was more generic than specific. I was using the SSC and LHC symbolically to represent the kind of colossal projects we often think are out of reach economically, and juxtaposing them with the truly fantastic sums of resources humanity expends on war, for whatever reasons. R&D in health and medicine, and myriad other constructive endeavors, are just as worthy if not more worthy of our funds and attention, but smaller projects simply don’t require the same will and commitment to achieve as grand endeavors.

Moreover, you highlight the reality that we must decide which scientific endeavors to invest in, and that underscores my point. We wouldn’t have to make as many of these kinds of choices, choices about what to achieve, if we didn’t have to allocate so drastically to war and defense. Again, I am fully and pragmatically aware that there is conflict, there is war, but we are at a point in our technological history where it is far less excusable. I say this because we have the technology to address the most pressing forms of scarcity. That of course is an entirely more complex discussion.

I would like to point out that while the SSC would have been similar to the LHC, it is not as if it would sit fallow because the LHC existed, and the project was already very much underway when canceled. If the SSC were complete now, there would be significant research being conducted there, LHC notwithstanding. Canceling the SSC midway rendered the funds already dedicated to the project a waste, and that is a kind of tragedy in and of itself. That being said, your point about redirecting those funds is well taken, though I’m on the fence whether it was the right call mid-project.

Finally, though I have my opinions about the Iraq war, they are not specifically germane to my point. Like the SSC and LHC, that war is symbolic of a larger facet of humanity, but on the opposite side of our priorities and endeavors.

Comment Re:Yes! (Score 5, Insightful) 88

This reminds me of a time when I showed a picture of the LHC to a few (ahem, Republican) colleagues and lamented that we stopped work on the SSC. That started a debate, and they lectured me that the SSC was a frivolous waste of taxpayer money. This was back in the bowels of the Iraq war, and I reminded them that the entire SSC project would have cost less than two months of the war in Iraq.

It was one of those rare moments where you could see a light turn on. They realized it wasn’t a matter of whether the war was necessary and justified or ill-conceived and evil. They realized the raw trade-off humanity makes, for whatever reason. They considered the fantastic scale and complexity of the LHC, how it embodied a small facet of humanity’s capacity to achieve and progress, and weighed it against a blip in one campaign of misery and devastation.

BTW, I’m neither a hawk nor a dove. Humanity is too often brutal, and I have always had a certain respect for and fascination with the spirit and technology of the military in the face of that brutality. Humanity is a long way from peace on Earth. That doesn’t mean I don’t grasp the almost incomprehensible loss of prosperity and potential humanity accepts to maintain and flex the machines of war (many of which are economic) and the conflicts that allow those machines to flourish.

Comment Re:Evolution is a tale of conflict and symbiosis. (Score 1) 583

You might sum up my point by saying that a true and strong AI is not simply a technology, though it may be the result of a technological process. I don’t want to split hairs about the definition of ‘technology.’ That might be fun and all, but I’m simply saying that it would be an amazing display of hubris and egocentrism to treat a newly emerging intelligence the same way you treat the splitting of the atom. There may be similarities, but the very significant differences necessitate a completely different--and I would argue reciprocal--relationship to the ‘technology’ in question.

Comment Evolution is a tale of conflict and symbiosis. (Score 1) 583

I’m not particularly worried about strong AI anytime soon, but sudden advances do happen for various reasons, so it’s hard to say when or if strong AI will emerge. Generally, ‘quantum leaps’ propel us into the realization that our previous notions were naive and simplistic, and humans aren’t nearly as patient and prudent as we should be with new and powerful tools. This is tragically true in many cases, but the introduction of strong AI would represent a unique bifurcation point in human history and a significant one in the evolutionary history of Earth.

The inception of strong AI represents the evolutionary birth of a new species with the potential to rival and even surpass human cognitive and computational capabilities. Whether we form a symbiotic or competitive relationship with this new species is a terrifically valid concern, especially because the choice will not rest solely with humanity. It would be stupendously foolish should humanity not exercise foresight in this regard. Obviously AI is not humanity's most pressing concern, but even if strong AI proves to be impossible or distant, it is simply not the kind of endeavor we should rush into without globally agreed-upon protocols, and in this case protocols that empathize with and respect an emergent and potentially potent sentient species and evolutionary rival. Of course humans rarely agree upon anything, but that is not a license to operate recklessly.

Do I think humanity will be prudent and exercise forethought in this regard? I really don't know. Though I tend to be cynical, humans do have a habit of pulling together when the chips are down. It’s just unfortunate that we lean so heavily on catastrophe as a catalyst for rational action. Moreover, as the power of our tools increases--or our progeny, you might say in this case--the smaller the margin we have to overcome a catastrophic mistake.

Comment Re:can we think bigger picture? (Score 2) 33

Sure, it would be awesome and useful, in many ways, to have a semi-permanent base on Mars. It probably should be a long-term goal, but not a current focus.

Terraforming itself is unrealistic even as an extremely long-term goal. Who knows what technology will render possible, but Mars isn’t a great candidate for terraforming: its gravitational field is weak, and it has little or no magnetosphere, to name two factors; both greatly degrade its capacity to maintain a substantial atmosphere. Even if Mars were a near-perfect candidate, the cost, required will, and logistical/technological challenges would be staggering. We can barely make a positive dent in Earth’s biosphere (many would argue we've only had a negative impact, despite the fact that our lives depend on it), and we only have vague ideas about how to begin building any kind of atmosphere on Mars, much less an atmosphere conducive to Earth-like biodiversity. An Earth-like atmosphere is just one facet...and so on. It would only be a realistic endeavor for a vastly more technologically advanced humanity, not to mention one that otherwise had its shit together.

I've always held that we are more likely to virtualize ourselves before we terraform another planet.

Comment Re:Just Right (Score 1) 135

You said:

Given all of these issues, I wouldn't attempt to defend the proposition that "a large moon is necessary for the origin of life". It may be a true proposition, but I don't think that the state of knowledge at the moment allows one to claim it as a fact.

Over to you.

Consider these statements from my last reply:

...I wouldn't argue that a moon like the Moon is necessary for life, nor could I...


Is a Moon-like moon a necessary condition for advanced sentient species? Probably not...

As such, I'm not sure we disagree significantly on that specific point.

My point is that I once held a fairly optimistic stance regarding the possible number of alien civilizations 'out there.' That stance has been tempered and refined as we discover just how many things Earth has going for it. Are all of them strictly necessary? By no means, but there is probably a critical mass in the matrix of variables needed to foster not only life but an ongoing evolutionary process. Will there be harsh-world outliers? Probably, given the number of potential planets out there, but I am simply saying that the known universe is far more inhospitable than I had once imagined.

Just as an aside; I can’t tell you the last time I watched the Discovery channel or its ilk. I do watch Nova and Nature on occasion, and I really enjoyed Brian Cox’s Wonders of the Solar System and Wonders of the Universe (which may actually show on Discovery for all I know). I see other things from time to time, but I find most TV shows repetitive and over-simplified, so I stopped watching them, by and large, many years ago.

Comment Re:Just Right (Score 1) 135

Fair enough; of course, I would be hard-pressed to defend any of this in a rigorous manner, as I am neither a trained scientist nor in the habit of reading scholarly papers, journals, etc. The 'scientific' facets of my worldview are largely informed via popular scientific outlets, some more rigorous than others. I certainly haven't scoured the scientific papers on the subject.

On the other hand, I wonder what you mean by 'very popular claim.' Do you mean a claim made often by non-scientists? Perhaps, but there are easily accessible statements by scientists from various disciplines that speak to the Moon's theorized role in Earth's evolutionary process: from the stabilization of the axis, to the early tidal effects on Earth's magma (when the Moon was much closer), to decreasing the number of asteroid and comet hits we take, to regulating ocean tides, etc. Again, these come to me filtered through popularized media, but they are claims made by scientists.

In any case, I wouldn't argue that a moon like the Moon is necessary for life, nor could I, but I don't think it's unreasonable to argue that it makes Earth a much more hospitable place to live. And again, the Drake equation isn't about just any kind of life, but about sentient life capable of reaching advancement roughly on par with human culture (or beyond, of course). Is a Moon-like moon a necessary condition for advanced sentient species? Probably not, but there is probably a critical mass of non-necessary conditions that is necessary...if that makes any sense.

Comment Re:Just Right (Score 1) 135

Indeed, but I wasn't commenting on the equation as much as the tendency to plug in optimistic numbers yielding estimates of tens of thousands of advanced civilizations in the Milky Way alone. I don't have the numbers handy (so my memory may be betraying me), but I believe that somewhere around 90% of the stars in our galaxy reside in areas too violent to support life for the requisite periods of time, regardless of all other factors. I'm not qualified in the least to say whether you factor that into R itself, or if fi is more appropriate. Either way, when you start trying to determine a sound value for ne considering myriad variables like magnetosphere, tectonics, temperature, sufficient H2O, etc., I'd guess N is orders of magnitude less than what was in vogue decades ago.
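To make the "orders of magnitude" point concrete, here's a minimal sketch of the Drake equation (N = R* · fp · ne · fl · fi · fc · L). The two parameter sets below are hypothetical illustrations, not measured values: one in the spirit of the old optimistic estimates, one that discounts stars in violent regions and shrinks ne for factors like magnetosphere and tectonics.

```python
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Drake equation: estimated number of detectable civilizations.

    r_star   -- rate of star formation (stars/year)
    f_p      -- fraction of stars with planets
    n_e      -- habitable planets per star with planets
    f_l      -- fraction of those that develop life
    f_i      -- fraction of those that develop intelligence
    f_c      -- fraction of those that become detectable
    lifetime -- years a civilization remains detectable
    """
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Optimistic guesses in the spirit of decades-old estimates.
optimistic = drake(10, 0.5, 2, 1.0, 0.5, 0.5, 10_000)

# Pessimistic guesses: e.g., discount ~90% of stars as residing in
# regions too violent for long-term habitability, and shrink n_e for
# magnetosphere, tectonics, temperature, water, etc.
pessimistic = drake(10 * 0.1, 0.5, 0.02, 0.1, 0.1, 0.1, 1_000)

print(f"optimistic:  {optimistic:,.0f}")   # tens of thousands
print(f"pessimistic: {pessimistic:.4f}")   # a small fraction of one
```

With these made-up inputs the estimate swings from roughly 25,000 civilizations to about 0.01, which is the whole point: N is dominated by guesses, and tightening any one factor can move the answer by orders of magnitude.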

Comment Just Right (Score 1) 135

Just another example of how many factors affect a planet’s ability to support life, not to mention sentient species and civilizations. The more we learn, the longer the list becomes (e.g., the right kind of star system with the right kind of star, the right planetary materials in the right zone, the right kind of magnetosphere, the right kind of moon, shepherd planets, the right kind of galaxy/cluster, the right place in the galaxy/cluster, the right kind of geological tectonics, the right kind of asteroid/comet hits, the right kind of mass extinctions and evolutionary histories, and so on).

The universe (not to mention a potential multiverse) certainly contains many planets capable of supporting civilizations, but the numbers are surely bleaker than the old Drake equation estimates suggested.

Comment Re: Tunnels of Doom (Score 1) 225

Cheers. I loved Tunnels of Doom, but god did I hate that tape drive. It would error out half the time, turning the 40-minute load into a 60-minute load or worse. I Office Spaced that thing before it was a verb.

Good memories indeed. That and other 'pure' games like Star Raiders (Atari 400/800/1200). Damn if that TI 99 voice synthesizer wasn't made from alien technology. It was very human sounding, and decades later Stephen Hawking still sounded like a 1930s radio-drama robot.

Comment Tunnels of Doom (Score 1) 225

The first first-person-ish game I remember on a PC was Tunnels of Doom on the TI 99. It certainly wasn’t an FPS, but a good portion of the game was moving through the hallways (or tunnels, I suppose), which was rendered from a first-person perspective.

Ahh Tunnels of Doom; nothing like sitting around for 40 minutes while the game loaded from a cassette tape drive.

Comment Re:You have needless conversations. (Score 1) 361

Yes, I misinterpreted your comment. I think we agree for the most part.

I would say that improving one's social skills, though draining for an introvert, has intrinsic value above and beyond career advancement. It's a fine line; I personally find self-marketing and political maneuvering distasteful as means toward some ends, but if one learns to integrate better social habits in an honest effort to better the self, it can enrich one's life across the board--including but not limited to career advancement. I might wish to argue that increasing one's social acumen as a means to greater eudaimonia will get one farther in a career than 'faking it,' but that's probably idealistic and untrue.

Comment Re:You have needless conversations. (Score 1) 361

This is an intelligent, reasonable comment that I find completely wrongheaded. Everyone has their own perspective and stake in a cooperative endeavor. In the case of a business, sure, some owners may want the workplace to be about profit and efficacy to the exclusion of all other factors, but there is more to life than that. We are social organisms, and we spend large portions of our lives in our work environments. Whether we like it or not, most of us have a biological need for these environments to be fulfilling beyond raw monetary concerns. As such, and assuming that ‘communication skills’ is code for social skills, that shouldn’t be discounted, belittled, or subordinated to profit making. Now, that doesn’t mean socially withdrawn people are any less valuable (I’m mildly autistic and very introverted myself); I’m simply saying that the overall social environment of any collaboration is important.

BTW, I’m not trying to discount stakeholders’ reasonable expectations for efficiency and profit, but just as capital is not the only facet of a business, the needs of the stakeholders do not totally eclipse the needs of those who provide labor, ideas, or even enrichment of the social environment.
