
Comment Re: The actual paper says: [Re:What about not eati (Score 1) 182

TFS reminds me of militant vegans who still insist that eggs are unhealthy based on poor research done in the 60s. The fools are still married to the long-disproven idea that dietary cholesterol leads to hypercholesterolemia, and were pissed off after the FDA removed the cholesterol RDI.

Comment Re:Yea. (Score 0) 109

Yeah, I've seen this before. They want you to get "skilled up" then don't give you any more pay for being a better worker.

This is a very basic part of software development and always has been. There's always some framework, API, or concept you have to learn to get a particular job done. They're paying you because you have the aptitude to pick up new skills quickly. I've only been doing this four-ish years and figured that out before I even started (I am entirely self-taught -- no CS degree to speak of -- and the only credential I can offer is that I'm paid within the top 5% of software engineers, at roughly $277k gross on my last W-2 and likely over $365k this year). Employers already expect that you'll spend only 25% of your time writing actual code at best, with probably another 50% reading documentation, watching YouTube videos, and so on. If you can't do that particularly well, you'll never be a good engineer, and you'll get paid accordingly.

Experience does translate to value for exactly this reason. On the flip side of that coin, no employer is just going to offer you a raise because you learned a new programming language or something like that (in the four years I've been doing software development, I've already picked up six languages and written a fair bit of code in each, so go figure). Again, that's just part of the job description. If you think you deserve more pay, ask for a raise. If you don't get one but still think you deserve more anyway, go apply for another job, either within the same company or at a different one. If your employer really feels they can't afford to lose you, they'll offer more. It's really that simple.

I say this often and can't overstate it: If you went into software development for the pay and only the pay, you were always going to have a bad time. People like that have historically been weeded out during big recessions, typically left holding a CS degree they never use again. From the sound of what this guy is arguing, basically the only people who will remain software engineers are the core group who typically last through recessions anyway.

Comment Re:Put up or shut the fuck up about (Score 1) 156

Absolutely zero evidence has ever been presented that Cheeto Benito was injured in any way, let alone actually struck by anything.

Could you be any more of a stupid fucking conspiracy theorist? I already showed this to you last time:

https://apnews.com/article/tru...

To my untrained eye, that wound looks very superficial. I say that because I once bled a similarly copious amount after being scraped by a skeg while surfing. All I felt was mild pain, and I was in the water for another three hours before realizing I was even bleeding. A bunch of people freaked out when they saw me and called over a lifeguard, which I didn't think was necessary. All he did was wrap it in a little gauze, and "superficial" is exactly the word he used.

I've also had a basal cell carcinoma removed from my ear. They had to excise it layer by layer, going pretty deep -- likely deeper than Trump's bullet wound -- and it bled quite a bit. People only notice the scarring if they're specifically looking for it, and even then they have to take a pretty close look. Trump is old and wrinkly; it wouldn't be surprising if such scars are even harder to see in his case, since it's hard to tell scars from ordinary wrinkles, especially with his obvious spray tan likely concealing the pigmentation differences normally seen in scar tissue.

More importantly, people like you who obsess over this are just incredibly stupid. Alex Jones level stupid. Though in your case, I think you especially obsess over holes, including ear holes, which are almost certainly the only holes a person would have that are small enough for your dick.

If you believe that orange piece of shit was shot when no medical records have been released

What else do you want? His birth certificate? His tax records? And how would that satisfy you? Even if you did get them, odds are you'd say they were fake, the doctor was paid off, they were altered by aliens, or some shit like that.

Comment Re: Master of evading detection! (Score 3, Interesting) 17

Either that, or his email server fetches all images regardless of whether the email was actually read, which is increasingly common now. I looked in my work spam digest once, and some sales derp was asking why I wasn't responding to his emails even though I had supposedly "read them twice".
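For anyone unfamiliar with how that "read them twice" number gets generated: it's usually a unique tracking-pixel URL embedded in the HTML body, and any fetch of that URL counts as an "open". A toy sketch of the idea (all names and URLs here are hypothetical, not any real tracking product):

```python
# Toy sketch of email "open" tracking via a unique pixel URL.
# Any fetch of the URL -- by the human recipient's mail client OR by a
# server-side prefetcher/spam scanner -- is counted as an "open",
# which is why image-prefetching servers produce false "reads".
import uuid

def make_tracked_email(body_html: str,
                       base_url: str = "https://track.example.com/px") -> tuple[str, str]:
    """Append a 1x1 tracking pixel with a unique token; return (html, token)."""
    token = uuid.uuid4().hex
    pixel = f'<img src="{base_url}/{token}.gif" width="1" height="1" alt="">'
    return body_html + pixel, token

opens: dict[str, int] = {}  # token -> fetch count, kept by the tracking server

def record_fetch(token: str) -> int:
    """Called whenever the pixel URL is requested; cannot tell human from bot."""
    opens[token] = opens.get(token, 0) + 1
    return opens[token]

html, token = make_tracked_email("<p>Hi, just following up!</p>")
record_fetch(token)  # recipient's server prefetches images on delivery
record_fetch(token)  # spam scanner re-fetches: now it looks "read twice"
```

The sender's dashboard only sees the fetch count, so a mail server that prefetches images on delivery looks identical to an eager reader.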

And yeah, vendors really get aggressive, especially if you're a well known company. Even if you barely do any business with them at all, they like to tell other potential marks that you're their customer in order to build rapport.

Comment Re: Surprised they lasted this long actually (Score 1) 29

Future PLC is just another content mill. The "tech" sites it buys end up becoming consumer-electronics review sites that follow the same basic formula: have freelance journalists write up reviews of products they've in all likelihood never touched, and rely on SEO, ads, and affiliate links to drive revenue. The content may even be written by AI, with or without the publisher's knowledge, but it doesn't really matter who wrote it, because it's shit either way, and it was even before ChatGPT.

Comment Re: Erm... (Score 1) 163

TFS is also being pretty disingenuous about the subject matter; either that, or whoever wrote it doesn't understand the topic at all. They have it in their head that today's rockets aren't as complex, and that corners are being cut to lower costs.

The truth is exactly the opposite -- they're more complex than ever, and that complexity exists precisely to lower costs. And it shows: even Soyuz -- which was a lot cheaper than anything NASA ever offered for crewed orbital flights -- ran about $87 million per seat, and was a far less comfortable ride, whereas SpaceX charges $37.5 million per seat. Soyuz still uses much of the old tech that TFS is glorifying, not to mention it benefits from cheaper labor than SpaceX has. That cost advantage comes mainly from the added complexity of landing a booster under power rather than relying on a parachute.

Also, nobody is claiming space is easy. Who knows where the hell he got that idea.

Comment Re: Great! (Score 1) 69

I remember a while back I submitted an application to a bank, and they literally had me record webcam responses to written questions. About halfway through I remember thinking "this is really stupid" and simply closed the browser tab. I didn't care whether they had any interest or not, and I never bothered to look again.

I want to say it was Silicon Valley Bank, back when they were doing pretty well, but it was years ago and I don't remember for sure.

Comment Re: Nuts will find a way. (Score 1) 174

Your questions are why we do research. We can do better than nothing, and we tune the result over time. It starts with knowing that a problem exists, which is where we are today.

We actually don't know that a problem exists yet, beyond isolated incidents, and science doesn't work that way. The first step is a case study for each patient. I.e., here we have a case where the patient presents with X, and friends and relatives say Y; let's do a full workup to see if we can rule in or rule out other explanations, look at their medical records, possibly even public records like criminal history, and see if there are episodes or hints about them that friends and relatives aren't necessarily aware of. After you get a few of those, then you're at "there may be a greater problem, more research is needed" instead of "previously misdiagnosed or undiagnosed schizoid and/or dissociative disorder".

I mean, shit, we don't even know at this point whether all of these guys simply have alkalosis, which is known to cause psychosis and can be caused by diet alone. The way you guys jump to conclusions with so little known is just bonkers.

One possible answer: We know that LLMs can be steered away from various topics, and they can be programmed to give canned responses to some queries. Their system prompts can be tuned. None of these involve trigger warnings.

Soooort of -- it's far from perfect. If you keep your ear to the ground, every now and then you'll see stories about somebody pulling off some kind of prompt injection even on mainstream services like ChatGPT. Their engineers can't figure out how to stop deliberate attacks that divulge data the company has a major vested interest in keeping private, yet you expect them to somehow prevent a dialogue from drifting -- gradually, over a long period and many prompts, as in this case -- into a direction that is otherwise forbidden? Take the mention of the (simulated) guy talking about losing his job and then asking what the tallest buildings in New York are. Does that mean the model should associate any negative emotion with suicidal thoughts? And how is it even supposed to know it's a negative emotion? I guarantee the "sorry to hear about your job" bit was just statistically the most relevant response it could give, with zero understanding of what it meant.
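To make the "gradual drift" point concrete, here's a toy illustration (emphatically not how any real product's safeguards work; the filter and messages are made up): a per-message keyword filter of the sort often used as a first line of defense passes every message individually, even though the conversation as a whole is heading somewhere a human would flag.

```python
# Toy per-message keyword filter. Real moderation systems are far more
# sophisticated, but the failure mode is the same in spirit: judging
# each message in isolation misses intent that only emerges across turns.
BLOCKED_TERMS = {"suicide", "kill myself", "end my life"}

def message_allowed(msg: str) -> bool:
    """Return True if no blocked term appears in this single message."""
    low = msg.lower()
    return not any(term in low for term in BLOCKED_TERMS)

# Each message individually passes the filter...
conversation = [
    "I lost my job today.",
    "Nothing feels worth doing anymore.",
    "What are the tallest buildings in New York?",
    "Which ones have observation decks open late?",
]
assert all(message_allowed(m) for m in conversation)
# ...yet the sequence as a whole drifts in a direction a human reader
# would flag. Catching that requires modeling intent across the entire
# dialogue, a much harder problem than per-message filtering.
```

An explicit request like "how do I kill myself" would be caught, which is exactly why gradual, oblique drift is the hard case.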

It wouldn't even surprise me if the whole reason the guy who attempted suicide went down that path was that his own prompts were gradually drifting in that direction -- quite possibly his own subconscious thoughts leaking into his prompts without him being consciously aware of it -- and the LLM picked up on it as a purely statistical matter and slowly gave him back what he put into it, just in a different form.

Or like this bit:

He doesn't remember much of the ordeal — a common symptom in people who experience breaks with reality

How is a chatbot going to cause a person to forget like that? This smells of a dissociative disorder. I'm not a psychiatrist, but I don't see how reading text on a computer is a big enough source of trauma to make a person dissociate like this. And if it is, I can think of more dangerous places on the internet than ChatGPT.
