Comment Re:King George the Third... (Score 0) 245

We don't need another civil war MAGAs[FTFY]

King George III? The civil war?* Actually, characterizing our revolution as a "civil war" isn't entirely incorrect. But then we went off and had another one about a hundred years later. Over some woke GOP agenda. Perhaps it's time for another. To keep the Left Coast from falling under foreign influence.

*Almost as funny as Belushi's remark about the Germans bombing Pearl Harbor. You probably had the same history professor.

Comment Re:If you train AI on everything the internet offe (Score 2) 35

anyone who knew how markov bots work

There's nothing wrong with that technology, per se. It's the training corpus that dictates success or failure. Was the plan to blindly crawl the cesspool that is the Internet for that raw data a bad business decision? Most decidedly so.
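For anyone who hasn't poked at one: a Markov bot is just a lookup table of "what words tend to follow this word," walked at random. A minimal sketch (the corpus, function names, and `order` parameter here are purely illustrative):

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each tuple of `order` consecutive words to the words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=10, seed=0):
    """Random-walk the chain, emitting up to `length` words."""
    rng = random.Random(seed)
    key = rng.choice(list(chain))
    out = list(key)
    for _ in range(length - len(out)):
        followers = chain.get(key)
        if not followers:          # dead end: no word ever followed this key
            break
        word = rng.choice(followers)
        out.append(word)
        key = key[1:] + (word,)
    return " ".join(out)
```

The point being: the model can only ever recombine what's in the table. Feed it garbage and the random walk dutifully produces garbage.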

It turns out that the most expensive part of AI is training, both in terms of resources (power and equipment) and the labor involved in validating the inputs. I did some fiddling around with semantic nets a few decades ago. But I was restricted to a library of engineering documents (at Boeing) that had been vetted to some extent by groups of experts (for a rather loose definition of that term) before release. The results worked well, but had no hope of advancing to the status of AGI. And of course, the tools built had no hope of reaching outside markets due to the inclusion of company proprietary knowledge.

This last point raises another problem with today's AI. Why should I allow my expertise, which I depend upon for a competitive edge, to fall into the hands of my competition? The AI bots won't find much useful stuff on my public-facing web site. And I'm not stupid enough to host on cloud services owned by outfits with interests in AI services.

Comment We lost control (Score 1) 36

Train an AI to like or dislike a random item or category, let's say sharks. Then get it to make a training data set for another AI about an unrelated topic, such as teaching fractions to sixth graders. Ensure there is no mention of sharks, or any swimming or animals in the mathematical examples in the training set. Ask the resulting AI about sharks, and it will mysteriously have adopted the other AI's stance towards sharks.

There is no need to invoke malicious intent. AIs absorb cultural prejudices and implicit stances from their training set's culture, and we haven't figured out how, so we can't control, predict, or even mitigate it. It's a big mirror of all humanity's flaws, right in our faces.

Comment Re:For those getting pitchforks ready (Score 1) 153

I doubt this is an actual thing but it's possible that a recirculating stove hood could have a catalytic converter like ventless fireplaces to remove any toxic fumes.

That would require raising the cooking fumes to very high temps. And even then, ventless fireplaces work with a very limited fuel source: natural gas or propane. I don't even want to think about what bizarre compounds the catalyst would produce when processing the miscellaneous organics from cooking food.


The two most common things in the Universe are hydrogen and stupidity. -- Harlan Ellison
