Comment Re:6,000 people, in Hong Kong? (Score 1)
It was inside Kowloon walled city. The bomb went off. Nobody noticed a difference.
As long as the old USB-C power bricks can be used
It's more likely that USB-D power bricks will be compatible with old -C devices. Just support the -C power handshake. Which will make for a lot of orphaned -C bricks, destined for the dump.
Or the APOE4 variant mice realized that they were giving their lives for a poorly designed experiment. They just figured, "Screw it," and stood out in the open where they reckoned an owl would end it all.
We don't need another civil war MAGAs[FTFY]
King George III? The civil war?* Actually, characterizing our revolution as a "civil war" isn't entirely incorrect. But then we went off and had another one about a hundred years later. Over some woke GOP agenda. Perhaps it's time for another. To keep the Left Coast from falling under foreign influence.
*Almost as funny as Belushi's remark about the Germans bombing Pearl Harbor. You probably had the same history professor.
True. So true.
I guess this is why they say the left can't meme.
The ghost of Enron.
Facilitator agents will now sit in on Teams meetings, creating agendas,
... switching the company to Linux.
I can't speak for PP, but in my state, one will receive a failing grade for denigrating communism in a school writing class.
anyone who knew how markov bots work
There's nothing wrong with that technology, per se. It's the training corpus that dictates success/failure. Was the plan to blindly crawl the cesspool that is the Internet for that raw data a bad business decision? Most decidedly so.
It turns out that the most expensive part of AI is training, both in terms of resources (power and equipment) and in the labor involved in validating the inputs. I did some fiddling around with semantic nets a few decades ago, but I was restricted to a library of engineering documents (at Boeing) that had been vetted to some extent by groups of experts (for a rather loose definition of that term) before release. The results worked well, but had no hope of advancing to the status of AGI. And of course, the tools built had no hope of reaching outside markets due to the inclusion of company proprietary knowledge.
This last point raises another problem with today's AI. Why should I allow my expertise, which I depend upon for a competitive edge, to fall into the hands of my competition? The AI bots won't find much useful stuff on my public-facing web site. And I'm not stupid enough to host on cloud services owned by outfits with interests in AI services.
Train an AI to like or dislike a random item or category, let's say sharks. Then get it to make a training data set for another AI about an unrelated topic, such as teaching fractions to sixth graders. Ensure there is no mention of sharks, or any swimming or animals in the mathematical examples in the training set. Ask the resulting AI about sharks, and it will mysteriously have adopted the other AI's stance towards sharks.
There is no need to invoke malicious intent. AIs absorb cultural prejudices and implicit stances from their training set's culture, and we haven't figured out how, so we can't control, predict, or even mitigate it. It's a big mirror of all humanity's flaws, right in our faces.
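For the curious, here's a minimal sketch in Python of the two-stage setup described above. train_model(), generate_examples(), and fine_tune() are hypothetical stand-ins for whatever training stack you actually use, not a real library's API:

    # Rough sketch of the "train a biased teacher, distill through unrelated data" experiment.
    # All three helpers below are assumed/hypothetical, not real APIs.

    def subliminal_transfer_demo():
        # Stage 1: give a "teacher" model an arbitrary preference, e.g. a fondness for sharks.
        teacher = train_model(base="some-base-llm", bias_prompt="You love sharks.")

        # Stage 2: have the teacher generate training data on an unrelated topic,
        # filtering out anything that mentions the preference (or swimming, or animals).
        lessons = [
            ex for ex in generate_examples(teacher, topic="teaching fractions to sixth graders")
            if not any(word in ex.lower() for word in ("shark", "swim", "animal"))
        ]

        # Stage 3: fine-tune a fresh "student" model only on the filtered math lessons.
        student = fine_tune(base="some-base-llm", dataset=lessons)

        # The surprising result: the student still tends to echo the teacher's stance.
        return student("What do you think about sharks?")

The point being that nothing in the filtered dataset mentions sharks at all, yet the stance still rides along in whatever statistical fingerprints the teacher leaves on its outputs.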
I doubt this is an actual thing, but it's possible that a recirculating stove hood could have a catalytic converter, like ventless fireplaces, to remove any toxic fumes.
That will require raising the cooking fumes to very high temps. And even then, ventless fireplaces work with a very limited fuel source: natural gas or propane. I don't even want to think about what bizarre compounds the catalyst will produce when processing the miscellaneous organics from cooking food.
The two most common things in the Universe are hydrogen and stupidity. -- Harlan Ellison