Re:I sense investment opportunities!
What difference does the age of the iceberg make to where the ice in it came from? The fact that we weren't talking about climate change when this iceberg broke off is irrelevant.
"A misunderstanding he says, while talking about land ice on Antarctica and Greenland in a discussion about an iceberg floating in the ocean."
The source of the ice in the iceberg under discussion is Antarctica. You didn't even have to RTFA; it was mentioned in the summary.
We should rename Anonymous Cowards to Confidently Incorrect Cowards since that's about all you are these days.
That aged well.
for people without redundant systems set up.
Your point? That would be most people by a huge margin.
OneDrive offers:
- protection against local hardware failure.
- protection against loss if a laptop is lost or stolen
- protection against loss in a fire, flood, or other catastrophe, unless you have remote offsite backups (*)
- protection against ransomware
- protection against accidental data overwrite and other common user errors
- simplified data migration to a new device (especially good for people without IT... but it's handy for IT too, since people can just sign into a freshly imaged laptop and go)
vs...
- increases risk of disclosure in a cloud breach or phishing attack
(* and if you DO have current remote offsite backups you are probably using the cloud to facilitate that anyway.)
That's not to minimize the risk of a cloud breach... but you need to properly assess your risk profile. Data stored locally can also be targeted and exfiltrated; the risk is generally smaller, but it remains, especially if it's valuable enough to target.
And I know plenty of people who have had devices break or get stolen (many times over) and lost valuable data, or lost data to an automated low-effort ransomware attack (and to targeted high-effort attacks too)... Point is: for a lot of people, probably even the large majority of people, cloud storage is very much a net positive.
There is almost no scenario, though, where the initial cost to put things into orbit, plus the increased cost to build them to survive there, maintain and support them there, and replace them when they fail, is ever paid back by the efficiency gain of having them there.
Unless you've got essentially unlimited free energy to put things into orbit.
Lots of people who travel abroad use a VPN service because various corporate sites geoblock entire countries for their B2B and other small-scale portals. I see this pretty regularly.
Lots of people who travel abroad use a VPN service for some peace of mind in sketchy internet cafes, etc. It ensures you are using a known DNS provider, your non-encrypted traffic is opaque to the local network, and you can use 'nearby' endpoints so you aren't routing packets around the globe.
Another significant use for VPNs is to route around censorship blocks. And log-less VPN services can be used to drop leaks to journalists and stuff... assuming you trust the VPN service.
Some people just want to confound potential tracking and profiling by their local ISP... in exchange for potential tracking and profiling by their VPN service... but maybe they know the ISP is selling their data, and they trust their VPN isn't.
Your computer, email app, PDF viewer, printer, scanner, and mobile phone are all non-human and naturally can't have a valid I-9.
AI probably cannot legally make hiring decisions, but as a tool to process, summarize, grade, rank, or perform social media correlation and other pre-background-check type stuff, it probably can be used.
That said, it probably shouldn't be used for that, for a lot of good reasons, but all of that is entirely unrelated to whether it can have a valid I-9.
If there are tens of millions of people in China looking to cheat on their exams using Llama models... I figure the "market" will make suitable models happen pretty quickly.
You realize, of course, that exams are also highly tuned for one specific thing, like "20th Century English Literature", "Linear Algebra", "Criminology: Overview of the Prison System", and "Axiomatic Meta-logic".
Eigenvalues are all but guaranteed to show up in the 2nd exam, and all but guaranteed not to be mentioned in the other three.
PS: That last one is highly recommended, but it's a little dense.
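To make the eigenvalue point concrete, here's a tiny sketch (my own toy example, not anything from an actual exam) of the kind of computation that Linear Algebra exam would expect, done straight from the 2x2 characteristic polynomial:

    import math

    def eigenvalues_2x2(a, b, c, d):
        # Eigenvalues of [[a, b], [c, d]] via the characteristic polynomial
        # lambda^2 - trace*lambda + det = 0, solved with the quadratic formula.
        trace = a + d
        det = a * d - b * c
        disc = trace * trace - 4 * det      # discriminant (assumed non-negative here)
        root = math.sqrt(disc)
        return (trace + root) / 2, (trace - root) / 2

    # Classic exam-style example: [[2, 1], [1, 2]] has eigenvalues 3 and 1.
    print(eigenvalues_2x2(2, 1, 1, 2))      # -> (3.0, 1.0)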
It doesn't interfere with the "TrueDepth" Face ID stuff because it IS the TrueDepth Face ID stuff.
The TrueDepth camera captures accurate face data by projecting and analyzing thousands of invisible dots to create a depth map of your face and also captures an infrared image of your face.
https://support.apple.com/en-u...
You called it an "IR emitter"... but it's an IR camera.
This camera is also used for "attention awareness features" (and for this, yes it is constantly taking pictures.)
Is it quite the situation the GP suggested? No, of course not.
But yeah, it is pretty constantly taking IR photos of you when those features are on.
It's likely not doing anything with them beyond the functionality Apple claims... but it is taking them, and it could be doing something more with them.
That's the thing, you're not being "ripped off" as a creator if the people willingly choose to buy AI generated slop.
I am if the AI generated slop can only exist because the LLM owner ripped off the human creators to train the LLM.
the same way we all use washing machines rather than hiring a housekeeper
More like the same way Napster and Limewire worked, where we all just picked the songs we wanted and copied them for free over the internet, rather than paying for a CD to be manufactured and shipped around. Napster was just the market deciding.
Except it was massive copyright infringement. Sure, the CD distribution model might have been outdated in the face of a more efficient digital model, but that doesn't mean just copying all the work you wanted for free was a good solution.
The LLMs right now are like Napster -- they took all that copyrighted work for free and are exploiting it for profit. That's plainly wrong.
We need to get to a Spotify/Apple Music world, where the LLM systems respect copyrights. Artists decide whether their content may be used as training material, and they get paid for it if they do.
I don't think it will cause a creative apocalypse either, because YouTube has proven there's no shortage of people who will produce creative content even when there's no financial reward for it.
I actually do agree with you here. But I think YouTube is a pretty shitty platform that promotes *mostly* shitty content, because the engagement and incentive model of ad-supported systems is pernicious. We can do better. But that's a separate conversation.
Separately, LLMs training on their own AI slop causes problems, and LLMs can easily produce content at a rate that can overwhelm human output; so even without a creative apocalypse caused by people not wanting to create, there may be a practical one if AI slop drowns everything else under a flood of shit.
And the exact same thing can be said of more than a few human beings too.
Humans are special.
But no one is suggesting that people be forbidden to read and learn the plots, locations, background details, and characters of those books.
Agreed. The contents of your mind are beyond the reach of copyright.
But the contents of your computer memory and hard drives are not.
Memorize it all you want, but copies on your computer are legally different.
Nor is it being suggested to forbid people from showing off or passing on their knowledge.
Well, yes, people ARE forbidden from doing exactly that.
For example, if you want to recite Lord of the Rings to a paying audience? You need a license for that. Translate it into Klingon and distribute copies? You need a license for that too.
Sure, you can show off to friends in a bar or around the kitchen table all you like, privately. That's all fair use. But try to commercialize it, distribute it, broadcast it, or perform it? Yes, people also need a license for that.
AI companies, by and large, are for-profit enterprises copying data that doesn't belong to them and exploiting it for commercial gain by training LLMs on it. LLMs aren't people. An LLM takes input, runs an algorithm, and generates output. For it to have a model of LotR in there, either that model is getting passed in via the prompt the user gave it (it's not) or it's getting in there via the training on the copy of LotR and everything else that was fed in (yep, this one). So copies were made to train, and some sort of derivative copies are persisted inside the model data too... which are then commercially exploited.
And even if it did work exactly the same way a human brain works, it still wouldn't matter, because it's still a computer, with memory and storage, and those things are all treated by copyright law differently from a human mind.
Copyright has always existed to carve out a monopoly for human creators, because we as a society have recognized that the humans at the top of the creativity chain are valuable and need to be preserved.
LLMs aren't going to replace artists, but letting people use them to generate content by training them on the artists' work creates, in the end, the same problem as the original printing press did.
The printing press didn't displace authors. LLMs can't either. But if you let the people who own the LLMs just rip off all the authors, it's the same problem you'd have if you let the people who owned the printing press do it.
Copyright isn't born of a desire to prevent the use of technology, whether it's the printing press or an LLM. It's born of a desire to ensure that human creators don't get completely ripped off by the people who own that technology.
In order to train an LLM, a copy is fed into the LLM (even if it isn't stored in the LLM, numerous copies are made for the purpose of training). The purpose of those copies is to enrich the owners of the LLM, which is a commercial purpose. Under what license were those copies made?
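To put the "numerous copies" point in concrete terms, here's a minimal sketch of a typical training-data path (the file path and the toy tokenizer are hypothetical placeholders, not any vendor's actual pipeline); every step materializes yet another copy, or transformed copy, of the work on disk or in RAM:

    from pathlib import Path

    # Hypothetical stand-in for whatever got scraped into the training corpus.
    source = Path("corpus/lotr_fellowship.txt")          # copy #1: the scraped file on disk

    # copy #2: the full text loaded into RAM (placeholder text if the file isn't there).
    text = source.read_text(encoding="utf-8") if source.exists() else "placeholder text " * 1000

    tokens = text.split()                                # copy #3: a tokenized form of the same text
    token_ids = [hash(t) % 50000 for t in tokens]        # copy #4: the numeric encoding fed to the model

    # copies #5..N: the ids get batched, shuffled, cached, and replicated across
    # GPUs and epochs; every one of those is another ephemeral copy.
    batches = [token_ids[i:i + 512] for i in range(0, len(token_ids), 512)]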
Those ephemeral copies are copies. There are lots of (stupid and awful) legal precedents around this stuff. Console hackers and software modders get slapped with this all the time, e.g. lawsuits that succeed with arguments like "By modding the software you violated the EULA, and therefore, while the copy on the CD-ROM remains legal, the copy you made to your disk drive and the copy you made in RAM are not licensed, and therefore in violation of copyright."
And then there is the imprint that these training works make on the LLMs... sure, they aren't copies. But they're possibly derivative works, again created for commercial purposes... and again... where is the license for that? Just ask ChatGPT how Lord of the Rings starts, and then ask it what happens next... and it'll take you through the whole thing. Not in Tolkien's words (although it can mimic that too if you want)... clearly the LLM has a complete and detailed model of the entire Lord of the Rings trilogy. Is that model not a derivative work? It could clearly and easily create an obviously plagiarized work from what it IS storing, where the only prompting required is that it be asked to produce it. (And the only reason it won't is that it's been told not to, aka guardrails.)
Oh... and don't say 'but it's the same thing as a human'. It doesn't matter. Humans are special. A human can only infringe through what we produce as "output". The contents of our minds can't infringe copyright simply by existing, no matter how perfectly we've memorized Lord of the Rings. An LLM is just a computer with storage, and its storage/memory contents absolutely can be subject to copyright simply by existing.
They didn't make cars illegal, but they've regulated them from here to hell and back, precisely because moving something that big and heavy that fast among the general public would cause all kinds of problems if they were not.
From seatbelts and airbags, to requiring operators to have training and permits, to limiting where they can be used, what direction they can go, where they can turn, and how fast they can go, etc., etc., etc.
Is your car example meant to imply that you are in favor of regulating AI just as comprehensively and aggressively as cars?
" I was there back in the day of analog long distance, and I can assure you that reliability was nowhere near that."
For your home landline?
Or for the dedicated line you contracted directly with Bell for, to provide point-to-point connectivity for critical communications infrastructure?
They weren't offering lots of 9's on your consumer landline, but that doesn't mean they weren't offering it at all.
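For a rough sense of what "lots of 9's" actually buys you, here's a quick back-of-the-envelope calculation (purely illustrative arithmetic, not figures from any actual Bell contract):

    # Downtime budget per year for a given availability ("number of nines").
    MINUTES_PER_YEAR = 365.25 * 24 * 60

    for nines in (2, 3, 4, 5):
        availability = 1 - 10 ** -nines              # e.g. 3 nines -> 99.9%
        downtime_min = MINUTES_PER_YEAR * (1 - availability)
        print(f"{availability:.3%} uptime -> ~{downtime_min:.1f} minutes of downtime per year")

Five nines works out to roughly five minutes of downtime a year, which is the kind of guarantee sold on those dedicated circuits, not on your home landline.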