Comment Not surprising (Score 1) 59

This is exactly what any smart educator expected, and the smarter students know it too. A lot of mine are not using AI, or are using it only very carefully.

What we will increasingly see is a large divide between good and bad students. Not a surprise at all.

Comment Re:Useless technology anyway (Score 1) 67

> So it's not for you. You don't understand or need the use case.

And you've done nothing to explain what the use case is. As far as I can tell, the use case is "Someone who wants to use their phone to control the TV instead of the TV remote," which is a tremendous amount of technological overhead for such a negligible benefit.

It's way easier to point your camera at the screen and do an instant sign-in on the TV than it is to get your phone connected to the right Wi-Fi network and cast to the right TV, so the use case would have to be pretty compelling to make up for what a pain in the a** it is when it works, much less when it doesn't.

> You're coming across as "old man yells at cloud", and about something you don't even use!

Major correction here: about something that I have tried to use on many, many occasions, but never used successfully. There's a difference.

> I won't read or engage further as I for one only spend my time on worthwhile things and you seem stuck in the mud.

You won't read or engage further because you don't actually know any compelling reason to use it. If you did, you would have said what that reason was by now.

Comment Re:Yep (Score 2) 92

Not debunked and not bullshit. It is just idiots like you that cannot accept reality. Yes, all got hit. No, it was not the same. They all were warned years before by a Microprocessor Forum presentation. Intel got fully hit with practical exploits early on because they did not care one bit. AMD was careful and only had theoretical exploits for the longest time, and it is not clear to me whether there were ever any practical ones for them.

It is no surprise to me you are unable to see the difference between the two things.

Comment Re:Useless technology anyway (Score 1) 67

> > Casting and the entire mechanism of having the device being casted to have to have direct access to the media source is idiotic and only exists because they insist on a extra level of weaponizing devices against the owners and policing what you can do with your own devices

> You could have just said "I don't understand why that is needed" and saved yourself the effort.

> The use case is extremely powerful. You want to direct a device to do something, rather than try to stream a 2160p video out of your phone over wifi. That's really not so hard to understand, surely?

Not really, no. If I wanted to use the TV to do all of the networking and playback, I would have just used the TV's app to do it. The number of hotels I've seen where the TV supported Chromecast or AirPlay streaming but did not have a built-in Netflix app is literally zero.

From my perspective, casting is a complete disaster by its very nature. It relies on the display device having full Internet access, which isn't a given. Literally every time I've wanted to cast, it has been because the TV set's Netflix app couldn't reach the Internet due to a network problem, so I was trying to use the phone's network connection instead. Because casting shifts the network connectivity back to the TV set, the whole system is worthless in exactly the situations where it would be useful.
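For readers who haven't dug into how casting works, here is a minimal sketch of the mechanism both sides are arguing about. This is not the actual Google Cast or AirPlay API; the message type and function names are invented for illustration. The point is that the phone (sender) only hands the TV (receiver) a small control message containing a media URL, and the TV then fetches the stream over its own connection, which is exactly why the TV needs working Internet access of its own.

    #include <iostream>
    #include <string>

    // Hypothetical control message a cast "sender" (the phone) might push.
    struct LoadMediaRequest {
        std::string content_url;   // where the TV itself should fetch the stream
        std::string content_type;  // e.g. "application/x-mpegURL"
        double start_seconds;      // resume position
    };

    // Hypothetical receiver (the TV): it does the fetching and playback itself,
    // so it needs its own working Internet connection -- the sticking point above.
    void on_load_media(const LoadMediaRequest& req) {
        std::cout << "TV fetching " << req.content_url
                  << " starting at " << req.start_seconds << "s\n";
        // ...open content_url over the TV's own network link and decode it...
    }

    int main() {
        // The phone sends only a few hundred bytes of control data;
        // it never relays the 2160p video itself.
        on_load_media({"https://example.com/stream.m3u8",
                       "application/x-mpegURL", 42.0});
    }

Screen mirroring is the opposite model: the phone encodes and streams every frame itself, which works even when the TV has no Internet access but costs the phone battery and Wi-Fi bandwidth.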

Comment It depends on the college (Score 3, Informative) 59

I have taught C++ and other computer science classes at a community college for the past 9 years. I started out only using paper exams, and students had to print their code for homeworks and projects. Then during the pandemic we moved online, and other than a couple of semesters, my classes have stayed online. Virtually all of the community college CS courses, for the entire state system, are online. I lecture virtually instead of in person and all assignments and exams are done through Blackboard.

Now with AI, I cannot distinguish between what a student wrote and what AI wrote. It's absolutely impossible to tell the difference. Before AI, I could often tell when someone got help (you submit code that doesn't match the skill level you show on exams) or copied someone else's assignment (you hand in the same code with the variable names changed...). Again, now I can't tell at all.
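As a concrete (and entirely made-up) illustration of the "same code with the variable names changed" pattern: the two C++ snippets below are structurally identical, and that sameness used to be easy to spot by eye.

    // Submission A
    int sumSquares(int n) {
        int total = 0;
        for (int i = 1; i <= n; ++i)
            total += i * i;      // add each square
        return total;
    }

    // Submission B: the same code with only the identifiers renamed
    int computeSum(int limit) {
        int result = 0;
        for (int k = 1; k <= limit; ++k)
            result += k * k;     // add each square
        return result;
    }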

At the community college level, the deans are stuck with a problem: fewer students are enrolling, and those students want to learn about AI because they see it as the next job skill you need to have. At the state university level, the CS dept has gone in the other direction: exams are on paper, and homework is now 5% of the final grade instead of 50%.

I tell my students at the beginning of the semester: "You are paying tuition to learn the material in the course. Using AI to do your classwork is like going to the gym and having a robot lift the weights for you. Don't use AI."

When I was a computer science and engineering undergrad 25 years ago, there was talk of creating a licensing process for software engineers, similar to the one for civil engineers. It was a terrific idea, and I wish it had gained traction. But AI has turned software engineering into a mess. Software is every bit as critical to human safety as civil engineering, yet you would never trust AI to design buildings. The software engineering students of today are absolutely ill-equipped to write the vital software we all depend on.

Comment Re:what else is new? (Score 1) 115

Spoken like a fuckup that cannot accept reality. YOU are a really bad person for trying to force your deeply flawed views on everybody.

I am well aware of what Money did. He did not think gender identity was a spectrum. He thought gender identity could be manipulated and externally imposed, essentially by force. And that is wrong and not consistent with what science says today. But gender identity is a spectrum and comes from the person in question. Denying that makes the person denying it a liar or clueless.

Comment Yep (Score 3, Insightful) 92

Intel was never good at CPU engineering. For the last 15 years or so, they could only keep up by doing unsafe and insecure things and because of superior manufacturing. All these advantages are gone, and they find themselves without critical capabilities. That comes from arrogance, excessive profits, and customer stupidity. For another case of this, look at Boeing, which cannot design new airplanes anymore, just (badly) customize old designs.
