Comment Re:wow! That's terrible (Score 1, Troll) 245
Well, they won't be able to calculate how much the USA is giving up to other countries. Then again, given the current administration and hallucinating AI, they can just make stuff up?
It took 18 years of pointless clicking for bureaucrats to finally notice that they chose the worst implementation possible of cookie control.
Getting policy right is hard. Sometimes you need to prepare a mindset change or test out an approach, though certainly there are things that fail miserably due to unintended consequences. Think of it like developing software, except the product is policy.
What will be interesting is how long before the W3C comes up with a solution that can work across browsers and websites, and then how long before it gets adopted by browsers and websites.
Irrationality. I remember it well. Quoting Wikipedia: "Irrational exuberance" is the phrase used by the then-Federal Reserve Board chairman, Alan Greenspan, in a December 1996 speech given at the American Enterprise Institute during the dot-com bubble of the 1990s.
They probably did their research on what people would pay, especially taking into account crazy fashion choices and the FOMO factor.
Nobody's asking anybody anything. Submitting bug reports (if they're valid and good) isn't asking, it's helping: knowing if and where your software fails is better than not knowing, regardless of whether you decide to fix it or not.
Though if Google is setting a "ninety-day countdown to full disclosure regardless", then they are essentially pressuring a group of volunteers to change focus and deal with that problem. That's the spiteful part. If Google cared about the open source it benefits from, they could set aside some devs or even provide some financial help to deal with this.
A company like Google could even contribute quality fixes, written by humans. Asking volunteers to solve a problem that a multi-million-dollar company is benefiting from is cheap and spiteful, especially if said company provides no value to the project.
Maybe this is an incentive to design data centres that are less power-hungry, such as using ARM-based computers running better-optimized code?
Then combine that with rooftop renewables.
This is a hard problem, but if the economic incentive is there, then someone will want to address it.
You are also making an assumption that it is easy to find alternative storefronts for this content, especially when people aren't willing to pay for yet another streaming platform.
The day will come that an AI will learn something that we did not deliberately teach it. When an AI is able to improve its own code, it won't be bound by the limitations of its human creator. It's only a question of when.
LK
Can a non-biological entity feel desire? Can it want to grow and become something more than what it is? I think that's a philosophical question and not a technological one.
LK
I don't agree at all, and I think that's a morally dangerous approach. We're looking for a scientific definition of "desire" and "want". Those are almost certainly part of being "conscious" and "self-aware". Philosophy can help, but in the end, to know whether you are right or not, you need experimental results.
Experiments can be crafted in such a way as to exclude certain human beings from consciousness.
One day, it's extremely likely that a machine will say to us "I am alive. I am awake. I want..." and whether or not it's true is going to be increasingly hard to determine.
LK
Only if we define consciousness to be a state of awareness only attainable by human beings.
An LLM can't suddenly decide to do something else which isn't programmed into it.
Can we?
It's only a matter of time until an AI can learn to do something it wasn't programmed by us to do.
You forgot about the hallucinating AIs, which, while they can provide useful information, are prone to making shit up.
This restaurant was advertising breakfast any time. So I ordered french toast in the renaissance. - Steven Wright, comedian