Comment Re: Doesn't sound workable (Score 1) 102

From what I can tell, software licenses even for commercial software seem to be effective at releasing the authors from liability in most cases. In the case where the code is provided as experimental, no contract was signed, and no money exchanged hands, where is the potential liability?

This will cease to be true in the (near) future, especially for commercial and large FOSS projects.

Uh, your link says exactly what I said.

I've added emphasis to show I wrote that the opposite will be true. There will be liability, even for open source, even if no contract was signed.

And although I expect individual FOSS developers to be only as liable as employees of a corporation, the authors (both corporations and FOSS projects) most certainly will not be off the hook!

Comment Re: Doesn't sound workable (Score 1) 102

From what I can tell, software licenses even for commercial software seem to be effective at releasing the authors from liability in most cases. In the case where the code is provided as experimental, no contract was signed, and no money exchanged hands, where is the potential liability?

This will cease to be true in the (near) future, especially for commercial and large FOSS projects. Security breaches have caused enough damage that in the EU, product liability will be extended to "digital elements" (software, firmware, etc.). If you put it on the market, you either prove you did a decent job of security, or you'll be held accountable for damages.

And on the security bit: this is not controversial! All political groups in the EU voted more for it than against it, with one exception that mostly abstained (but even that group cast more votes in favour than against).

For more information, search for "Cyber Resilience Act" (especially posts written since January 2024) and "Product Liability Directive" (a summary of the latest state is available from Apache). Also, expect both China and the US to implement their own versions.

Comment Re:Speed "limiter" is a misnomer (Score 1) 406

The now-mandated tech is a system that ingests speed limit data and notifies you that you are about to exceed it.

Indeed. However, there are still big flaws in the system.

And this is the real problem.

I'd be happy if my car notified me of speed limits. But I get annoyed real fast when it says the limit is 30 km/h because it read a sign on the road next to the one I'm driving on, while the road I'm actually on has a speed limit of 50 km/h.

Comment Re:Public is public (Score 1) 97

You're talking about things that are covered by copyright law. AI training is not.

This is nonsense.

Sure, the actual training of the AI is not covered by copyright law. But much of its training material very much is. And given the current long copyright terms, I suspect only a tiny fraction of publicly available sources is in the public domain.

Comment Re:Who will pay for it? (Score 4, Interesting) 248

For me return to office is:

1. 2h more needed for preparation and travel to and from office
2. more time wasted on office chit-chats

Will the employer pay for the 1st? Will it accept the loss of productivity due to the 2nd?

Not all employers will see 2. as lost productivity. Idle chit-chat builds the personal relationships that make working together go more smoothly. Better employees also use office chit-chat for the minor adjustments and innovations that improve a process. Even IT is ultimately about people doing work.

On 1.: one way or another, employers will pay at least part of it. Less in a cool job market, sadly, but even then employers still face costs. More commute time reduces the labor pool, and a less friendly/flexible work environment (than your competitors) increases churn. Both increase the cost of doing business.

The nasty bit (for employees) is that these costs are not directly attributed to an employer being an asshole.

Comment Re:Was anyone using the SMS Feature? (Score 1) 21

I have never seen anyone using it.

Of course not: many carriers charge per SMS. The prices may be low, but sending many text messages can get expensive. In the Netherlands, for example, the cost per SMS varies between zero and €0.31.

Using a message platform over the internet (especially via WiFi) reduces these per-message costs to zero, and is thus by far the most preferred option.
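To put the per-message pricing in perspective, here is a rough back-of-the-envelope comparison. The message volume is hypothetical; the €0.31 figure is the upper end of the Dutch per-SMS range mentioned above.

```python
# Rough cost comparison (hypothetical volume of 200 messages/month;
# €0.31 is the upper end of the per-SMS range mentioned above).
PRICE_PER_SMS_EUR = 0.31
MESSAGES_PER_MONTH = 200

sms_cost = PRICE_PER_SMS_EUR * MESSAGES_PER_MONTH
internet_cost = 0.0  # per-message cost over WiFi/data is effectively zero

print(f"SMS:      €{sms_cost:.2f}/month")   # €62.00/month at the upper bound
print(f"Internet: €{internet_cost:.2f}/month")
```

Even at a fraction of that upper bound, the per-message fee dwarfs the zero marginal cost of internet messaging.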

Comment Re:I wonder... (Score 5, Informative) 67

Sounds like this judge was bought and paid for by the auto industry, [...]

On the contrary! This judge has an opinion, which is allowed. Please read the remainder of the summary:

But ultimately Woodlock said that he would not block enforcement.

"The people have voted on this and that's the result," he said. "I am loath to impose my own views on the initiative."

In other words: this judge says that democracy is more important than his own opinion.

Comment Re:GDPR? (Score 1) 72

A search query is not in principle a protected information. [...]

Wrong. Requests to the Bing API always include an IP address (a technical necessity), which qualifies as personal data in the EU. Also note that the GDPR term personal data extends beyond personally identifying information: even if a bit of data does not uniquely identify you, it can still be personal data.

This also means that sending, for example, data on a search query for a medical condition makes that entire request sensitive personal data. Having/collecting such data for commercial purposes is definitely illegal.
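One common mitigation for exactly this problem is to truncate the IP address before storing request logs, so the stored value no longer points at a single user. A minimal sketch using Python's standard `ipaddress` module (the helper name and the /24 and /48 prefix choices are assumptions, not any specific product's practice):

```python
import ipaddress

def mask_ip(ip: str) -> str:
    """Zero the host bits of an address before logging.

    /24 for IPv4 and /48 for IPv6 are common (but here assumed) prefix
    lengths; the result identifies a network, not an individual user.
    """
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(net.network_address)

print(mask_ip("203.0.113.42"))  # -> 203.0.113.0
print(mask_ip("2001:db8::1"))   # -> 2001:db8::
```

Whether truncation alone is sufficient depends on what else is stored alongside the request; a masked IP combined with a sensitive query can still be problematic.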

Comment Re:Fuck Meta (Score 1) 27

"Opt out" _is_ illegal. So is "opt in" hidden somewhere in the TOU. People need to very explicitly consent specifically to collection of personalized data after being told what specifically it is used for and who it is shared with. Nothing else is legal. Oh, and if it is privileged data, such as medical data (and guessing somebody may be pregnant, for example, already qualifies), then that consent has to be in written form on paper.

Very much this, except that the rules for sensitive data such as medical information, race (images!), religion, sexual preference, location, etc. are even more severe.

For starters, you're not allowed to collect/use it unless you can prove that you cannot provide a necessary service without it. And it has to be necessary from the point of view of the data subject. This pretty much excludes all forms of marketing, and makes sensitive personal data a big no-go for Big Surveillance (like Meta, Google, etc.).

Comment Re:Good! (Score 2) 65

Companies should provide all equipment necessary to perform work related functions.

Very entitled. [...]

Not at all, at least in the Netherlands (but also elsewhere in Europe). Here, employers are required to provide everything needed to do the work. By law, regardless of sector.

For employers, providing the equipment also makes sense: if it's your equipment, you get to set the rules. But if the phone belongs to your employee, (s)he can set the rules for that phone. And not all employees are in the unfortunate position that you can bully them into submission by threat of firing.

Comment Re:Don't know why you're happy (Score 1) 78

That photo of you someone posted on Facebook and tagged you - that too? By showing your face in public where someone might photograph it, you agree to get entered into Clearview's database? And I guess if I lose a hair in a public spot it's fair game for someone to pick it up, sequence the DNA and tell all the insurance companies about that mutation making me susceptible to cancer so they can ramp up my insurance rate? After all I just left that DNA data lying around, hey, it's fair game! By not wearing a spacesuit all the time I consent to being sequenced by anyone who feels like it!

For photographs, that's where portrait/personality rights (US: rights of publicity) come into play. Basically, you can control public use of any data related to you. I'm not certain what this means for private commercial use (like what Clearview is doing).

Then again, photographs of you are data where you are the data subject, so this definitely falls within the scope of e.g. the European GDPR. What Clearview is doing is therefore very much illegal where European data subjects are concerned (though I'm not sure whether it's enforceable).

Comment Re:"worry it will break personalized advertising" (Score 1) 120

It's not the same as stalking at all. The advertiser doesn't know your name or your home address. It just knows you're ID 123456789 and saw a product page for a refrigerator yesterday. The potential consequences for you are completely different.

It's wrong to assume the advertisers don't know my name. In fact, in their dataset 123456789 IS my name. It uniquely identifies me, as much as a SSN or name + address would. That is sufficient to target me personally.

And yes, the consequences of real-life stalking are visible and include physical harm. The consequences of being targeted by digital stalkers don't include physical harm but are still very real, and far more insidious:
* They influence my subconscious decisions: that's what marketing is about (as early as 2015, Northwestern University published a study demonstrating that even sleeping consumers can be influenced). I can't speak for others, but I'd like to form my own opinions.
* They influence my political views: that's why we try to limit fake news, and what the Facebook-Cambridge Analytica scandal was about.
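The point that a stable ID is as identifying as a name can be illustrated with a toy sketch (all data below is made up): a pseudonymous ID is all it takes to accumulate a profile of one specific person.

```python
# Toy illustration with made-up events: a stable pseudonymous ID is enough
# to build a per-person profile; no legal name or address is needed.
events = [
    ("123456789", "viewed refrigerator product page"),
    ("123456789", "searched mortgage rates"),
    ("987654321", "viewed sneakers"),
]

profiles: dict[str, list[str]] = {}
for user_id, event in events:
    profiles.setdefault(user_id, []).append(event)

print(profiles["123456789"])
# -> ['viewed refrigerator product page', 'searched mortgage rates']
```

Everything in the profile describes one individual; renaming the key from a legal name to "123456789" changes nothing about who is being targeted.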

Comment Re:"worry it will break personalized advertising" (Score 1) 120

Yeah, that's the point.

Exactly. I see collecting data for personal profiles as a blatant invasion of my privacy that should be treated the same as stalking: punishable with jail time for all perpetrators.

I don't mind context targeted advertising: ads for TVs when I'm searching for one. Search engines like DuckDuckGo do this (and make a profit doing so). But I very much dislike ads based on a personal profile. People collecting that data are scum IMHO.

Comment Re:Investigated != Guilty (Score 2) 85

Being flagged by this technology means you're accused of wrongdoing.

"Where there's smoke, there's fire": false accusations have a proven history of harm, especially with crimes people feel strongly about, like rape, child abuse, fraud/lying/cheating, etc. More so with our current social media culture, where exonerating evidence usually doesn't get spread around sufficiently.

This is (one of) the reason(s) libel and slander are also crimes.

And then there is the problem with proof & defense in such investigations. After all:

* How many students have the knowledge to challenge these new technologies?

* How strong is the "proof"? The software in the article mentions processes in the browser generating (i.e. faking) data...

* Is such evidence pollution detected? Does "innocent until proven guilty" also apply here? Or do we assign the responsibility to the student, and thus assume guilt until proven innocent?

* How well are students who are found innocent exonerated? Having dismissed charges on record can break careers, as it's seen as a risk...
