Comment Re:Banned. (Score 1) 51

Meh, this kind of crap is what peer review is for. As long as he learns his lesson, I'd be fine with letting him keep going. I mean, he's still going to MIT, so he's not an idiot.

We all act like he got away with this, but he was caught during the initial peer-review process. The system really does work.

We all like to complain about there being thousands and thousands of papers that are just garbage, but here's the thing: so what? If the papers aren't doing any harm and they're just sitting out there, it's not a big deal. It's not like we're spending all that much money on any of this crap. Sure, you can come up with a number that sounds big, because we have a 33 trillion dollar economy, so yes, you could find somebody who got a grant and did some bad research for a few hundred thousand. But in the grand scheme of things it's not a big deal.

I mean think about how much money we waste on other crap. Human beings are just wasteful creatures. And we kind of need to be to keep our civilization and economy going anyway.

Comment Re:Uhg... (Score 1) 22

It would be kind of neat to see AI algorithms hand work off to a GPU or one of the fancy cores on a modern CPU.

But I can't see that really happening, because machine learning algorithms require so much processing power, and modern graphics do the same, so you just don't have a lot of headroom.

Comment Re:A complete failure (Score 1) 44

The primary job of a lecturer is design of the lecture, select the material and structure it.

If that were true, we wouldn't need lecturers anymore, since all the material already exists. No, the primary job of a lecturer is to *teach the subject matter*. Whether the materials used are AI generated or not is irrelevant. The questions are: a) are they correct, and b) can they be understood? If a lecturer checks both, there's no reason not to use AI to generate them.

You said the second most important thing is that lecturing is more than just standing there. That, I would argue, is the most important thing *in the course*. Critically, *someone* needs to do it at some point, though not necessarily in the lecture. Some of the best courses I had had the worst lecturers, but the way the exercises, practical activities, and labwork were designed is what generated most of the learning. Really, I think some of my lecturers being replaced with AI could have been an improvement.

The risk is outsourcing the thinking itself to AI. That hits all the problems you mentioned.

Comment Re:Adapted? (Score 1) 102

The last time I looked, oil and gas drilling was done with strings of pipe a few inches in diameter.

Firstly, no it isn't. The final drill pipe may be a few inches in diameter, because that's all that's needed, but the initial hole is actually quite wide, often wider than 1 m. This allows the drillers to create stacked casings to handle the pressure of the oil field. You only see the few-inch drill pipe, or the top of the drilling rig, but much like a Forstner bit fits in the chuck of your drill yet can bore a 2" wide hole in your cupboard door, it's misleading to look only at what you see on the surface.

As well as the reactors, they've also got to get the heat-exchangers, turbines and generators down there too - all of which will require regular maintenance.

For regular maintenance you just winch the thing back up. It only needs to be done once a year, or every two years, anyway. This isn't a problem.

Finally, they've got to have mile-long cables to bring the power to the surface - which need to be capable of supporting their own weight when strung vertically.

Also not a problem; this is where you actually can look to the oil and gas industry. They are talking about 15 MW per reactor, and the weight and self-supporting requirements of that cable will be far less than those of a typical drill pipe. Besides, it doesn't strictly need to support its own weight; it needs buoyancy control. The whole point of this project is to be immersed. It doesn't need to support its *surface* weight; it needs to support the remaining force not covered by buoyancy at a specific segment at a specific pressure/depth.
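The buoyancy point is easy to sanity-check with a back-of-envelope calculation. A rough sketch; the cable diameter, average density, and depth below are illustrative assumptions, not figures from the project:

```python
import math

# Submerged weight per metre of a vertical cable:
# (cable density - water density) * cross-sectional area * g.
# A negative result would mean the segment is net buoyant.
G = 9.81            # m/s^2
RHO_WATER = 1000.0  # kg/m^3, water filling the borehole

def net_weight_per_metre(cable_density: float, diameter_m: float) -> float:
    area = math.pi * (diameter_m / 2.0) ** 2
    return (cable_density - RHO_WATER) * area * G

# Hypothetical 10 cm cable averaging 3000 kg/m^3, hanging 1600 m (~1 mile):
per_m_wet = net_weight_per_metre(3000.0, 0.10)
per_m_dry = 3000.0 * math.pi * 0.05 ** 2 * G  # same cable weighed in air
top_tension_kn = per_m_wet * 1600.0 / 1000.0
print(f"in air: {per_m_dry:.0f} N/m, submerged: {per_m_wet:.0f} N/m, "
      f"top tension: {top_tension_kn:.0f} kN")
```

Buoyancy knocks a third off the hanging load even before any deliberate buoyancy control, which is the point: only the force not covered by buoyancy has to be carried.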

That's the end of my non-problems. There are MANY problems with this scam of a project: partially economic, partially that they aren't solving any real-world problem, and partially that no one has built such a small reactor.

Comment Re:Such BS (Score 1) 102

That's literally what they are talking about. A small SMR generating less power than a single wind turbine, dropped in the bottom of a 30" wide hole.

The problem here isn't digging the hole, it's that no one has built an SMR the size they are proposing, and that the economics are fucking horrendous. They buried the lede a bit in TFS: for the 1.5GW "groupings" they were literally talking about drilling 100 of these holes next to each other.

Comment Re:Just keep digging (Score 1) 102

While you're right, you're missing something important on the other end of the pipe. 100C steam is great for making tea, but you cannot generate power from it; you need pressure, and pressure changes the boiling point.

A horrendously inefficient turbine may provide a limited amount of energy at around 160C, but typically you want FAR higher than that. For practical reasons you want a large power plant to run a turbine on >150 bar saturated steam; ideally, to get real efficiency out of your rotating equipment, you want supercritical steam above around 220 bar. The steam temperature most thermal power plants aim for is on the order of >375C.
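The temperature numbers map directly onto the thermodynamic ceiling. A quick sketch of the Carnot limit; the 30 C condenser temperature is an illustrative assumption, and real turbines only reach a fraction of this bound:

```python
# Carnot limit on any heat engine: eta_max = 1 - T_cold / T_hot (in kelvins).
# Condenser assumed at 30 C for illustration; real cycles do worse.

def carnot_efficiency(t_hot_c: float, t_cold_c: float = 30.0) -> float:
    return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

for t_hot in (100, 160, 375):
    print(f"{t_hot:>3} C steam: Carnot limit {carnot_efficiency(t_hot):.1%}")
```

Roughly 19% at 100C against 53% at 375C, before any of the real-world losses, which is why nobody builds a power turbine around tea-kettle steam.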

You'll need a deeper hole, or you need to face the fact that a lot of geothermal plants are located in specific regions for a reason.

That's not to say a 1-mile-deep hole is useless. This kind of temperature may be very suitable for district heating. That sounds irrelevant until you see what percentage of energy consumption actually goes into heating homes in colder climates. So we can swap that heat pump for a liquid/liquid heat exchanger, and with our 1-mile-deep hole not generate electricity, but reduce its consumption instead.
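The district-heating arithmetic is plain sensible-heat transfer, Q = m_dot * c_p * dT. A quick sketch; the flow rate and supply/return temperatures are illustrative guesses, not design figures:

```python
# Heat delivered by a hot-water loop: Q = m_dot * c_p * dT.
# Flow and temperatures below are illustrative, not from any real design.
CP_WATER = 4186.0  # J/(kg*K), specific heat of liquid water

def heat_power_kw(flow_kg_s: float, t_supply_c: float, t_return_c: float) -> float:
    return flow_kg_s * CP_WATER * (t_supply_c - t_return_c) / 1000.0

# 20 kg/s up from the borehole at 95 C, returned at 45 C:
q_kw = heat_power_kw(20.0, 95.0, 45.0)
print(f"{q_kw:.0f} kW thermal")  # ballpark: a few hundred homes' heating demand
```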

Comment Re: I'm no nuclear engineer (Score 1) 102

With a normal power system made up of spinning generators and spinning steam turbines, you have a built-in "flywheel" effect that smooths out those surges, which gives you time to do things like back off the power a bit.

That is simply false. Firstly, "flywheel" effects are far superior with grid-forming inverters than with rotating machinery; it's why batteries are currently overtaking traditional rotating peaking plants in the role of stabilising the grid (not actually providing storage for the night). Secondly, solar can react far better to changes in the grid for load shedding.

The problem in Spain had nothing to do with the actual grid makeup; it was exclusively a regulatory failure. The solar and wind plants were not asked to be involved in frequency regulation, so they didn't disconnect from the grid during the overvoltage event, and the traditional spinning plants didn't step in either, despite having the capability to completely avert the outage. This was analysed, and given the equipment operating on the day there was no reason for the grid to go down if the generators had been given the correct instructions by REE. The frequency oscillations throughout the entire morning were a good indication that too many plants were operating as load followers rather than frequency regulators.

This was purely a grid-planning SNAFU where the wrong people were told to do the wrong thing. All the technical equipment was in place to handle the situation, and the traditional spinning generation (your all-important flywheel) was actually the first to disconnect itself, which initiated the cascading failure. Those blaming solar and wind were exclusively politicians with an axe to grind and a few people within REE trying to deflect blame from themselves (despite their own report).

Comment Re:uh (Score 1) 22

Windows natively supports every network card that requires a driver to work

And if Microsoft made the driver for Windows, then that would be true. We're not talking about third-party support from someone else here; it's an in-box component that simply isn't shipped by default. It is still a Windows first-party component made by Microsoft.

To extend the thought let's look at some examples:
- Does Windows natively support Hyper-V or WSL? I don't think anyone would claim it doesn't, but neither is installed by default; you need to select it as an optional feature later.
- Is selecting it as an optional feature in one tool vs. another relevant? It's a Windows component that acts at the system level, not a user-facing program, so the Microsoft Store vs. the Features menu isn't relevant on any technical level.
- Now the question is one of cost. Is it no longer native because you have to pay?
- By extension, does that mean Windows doesn't natively support joining a domain or hosting a remote desktop, since those are features you also need to pay for?

Also, what does your idea mean for Linux, which is a complete collection of different tools patched together? Where does native start and end there? The kernel? Then nothing is really native. Modules? They are optional extras downloaded from different people. What about userland components? Does Linux have no native printing support, no native graphics support beyond spitting pixels out to a VESA framebuffer?

Native means at the OS level, by the OS vendor (and, I'd argue, in Linux by the distro maintainer). By the way, yes, Windows does natively support all GPUs that haven't been released yet. There's a reason it can still draw the screen after you nuke your NVIDIA/Intel/AMD driver from orbit: Microsoft provides a first-party basic display driver. It's in-box.

Comment That depends, was it wrong? (Score 1) 44

AI is a tool, nothing more, nothing less. There's no reason why one person can't use a tool while the other is banned from it.

The lecturer using AI to generate materials is perfectly okay, provided they are proofread. The dissemination of correct information is what matters, not how it was produced.
The student using AI to generate materials is not okay. The goal is to demonstrate knowledge, and that's not something that can be outsourced.

Also, the linked Reddit post bitching about AI-generated images in lectures is just fucking stupid. Unless someone is lecturing on art, there's no reason to think slides would be better off with the lecturer creating or sourcing images the old-fashioned way. A lot of lecturers I had could really have benefited from nearly anyone else making their slides for them: geniuses in their field, but unable to draw a straight line in PowerPoint to save themselves.

Comment Re:Shenanigans (Score 2) 102

You're assuming two things: a) that they exist, and b) that they require maintenance. Modern process design replaces manual switches and dials with simple electronic sensors read remotely. Modern equipment is insanely reliable, and a steam-raising facility is an incredibly simple process to get right from a reliability point of view. And for random faults, well, that's what redundancy in design is for.

These aren't new problems, by the way. We have been building things and putting them in worse conditions for many years. Take, for example, oil and gas wellhead control systems. They are located on the seabed in deep water. We're not just talking about a sensor or two; we're talking about the entire system, sensors, valves, hydraulic motors, control system, multiple UPSes, packaged neatly and dumped a mile under the ocean surface. It's a lot of fun to design, and even more fun to watch someone approach it for the first time; things such as mean time to repair suddenly dominate the reliability calculations when you need to schedule a submersible to go collect your safety-system module at 1.5 km depth. Yes, I have done that, with a system that is identical to ones used in nuclear reactors (except when we design them for nuclear reactors we get a 1E certificate and slap a zero on the end of the purchase order for that piece of paper).

As for remotely accessing things being shenanigans, we've been doing that for many years already as well. All that "access" you are talking about is access during construction and design, not during operation. Fun fact: I worked opposite a completely unmanned air-separation facility. Not only remotely operated, but remotely operated from a completely different continent: the plant was in Australia and the control room in Indonesia. At my company we also have completely unmanned facilities in the ocean. (Again, MTTR and equipment reliability, which are my bread and butter, get interesting when maintenance teams aren't on site and a boat or helicopter has to be provisioned to go fix something.)
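The MTTR point can be made concrete with the standard steady-state availability formula, A = MTBF / (MTBF + MTTR). A minimal sketch; the MTBF and repair times below are illustrative, not from any real project:

```python
# Steady-state availability: A = MTBF / (MTBF + MTTR).
# With very reliable equipment, availability ends up dominated by how long
# it takes to mobilise a repair. All figures below are illustrative.

def availability(mtbf_h: float, mttr_h: float) -> float:
    return mtbf_h / (mtbf_h + mttr_h)

MTBF_H = 50_000.0  # a reliable remote module, hours between failures

for label, mttr_h in [("on-site crew (8 h)", 8.0),
                      ("helicopter visit (72 h)", 72.0),
                      ("submersible campaign (2000 h)", 2000.0)]:
    print(f"{label}: {availability(MTBF_H, mttr_h):.3%}")
```

Same hardware, wildly different availability: once the kit itself rarely fails, the logistics of reaching it are the whole reliability story.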

This just isn't new. Also, no one here is talking about anything "big". In fact, that is the biggest problem here. Not the remote-operation bit, or the maintenance bit, but that these atom-bros are once again pushing tiny SMRs which don't exist yet. Forget pipes, crews, or any of that: they are talking about a reactor, steam, and rotating-equipment package that is less than 1 m across and about 8-10 m high, generating fuck-all electricity. That's the big problem. Not the maintenance, but the fact that this kind of tiny stuff remains in fantasy land and in concepts drawn on napkins.

Comment Re:Shenanigans (Score 1) 102

No, they do not. Many secondary parts of reactors may require this, but nuclear reactor cores and turbines need little maintenance overall. But even then, I don't think that's where your understanding is wrong...

There is a reason someone is walking around secondary and taking readings and checking equipment.

Yes, there is. Reactor fleets being largely 40+ years old means we are still operating them with the technology of the day, the designs of the day, and the operational requirements of the day. Much like people run around old oil platforms reading gauges and dials as well.

That's not at all related to what we are building these days, however, and there's little to no walking around or checking anything. A large portion of modern process design is about reducing the need to read anything. Sensors are cheap. Data recording is cheap. Everything is digital. For a project it now costs almost as little to install a wireless pressure gauge as it does a physical one (same for every other process measurement). For a greenfield construction the cost of wiring is borderline irrelevant too, so even wired equipment costs little more.

It's a two-edged sword: people no longer needing to do rounds, thanks to modern designs that keep them away from equipment, has led to new and interesting products where companies produce gear that takes over what the rounds were previously for. Gas detectors, cameras, heck, one company even has acoustic monitoring systems designed to "listen" to the plant since operators don't do that anymore.

The reality is in a modern facility there's very little for an operator to actually go out and read.

Comment Re:uh (Score 1) 22

That's not what "native" means.

It's exactly what it means: it becomes a core codec that all Windows programs can universally use, managed by the OS. Native support. Just because it's not there by default doesn't mean it isn't native to the OS.

Or what do you claim it to be? Where do you draw the line? The kernel? Is nothing in the OS userland "native"?

Comment Re:Can't Help But Think (Score 5, Interesting) 22

So let's follow this theory: to what end? WebP was developed by Google. What do they get from its adoption? It's an open format, a free format, with a public specification. That gives Google very little benefit from pushing it; it offers them no advantage over any other image format. Likewise, if the point was to push something developed by Google, it's worth noting that Google was one of the core contributors to JPEG-XL. It's only marginally less Google than WebP is.

I suspect this was as simple and plain as it looks. JPEG-XL didn't look like it was gaining any traction. It wasn't used, support was half-baked and experimental in Chrome, so they abandoned it. No surprise there; Google has killed things far more popular.

On the flip side, something recent forced the issue: PDF's adoption of JPEG-XL. Google maintains a built-in PDF reader in Chrome, meaning they now *need* to add JPEG-XL support if they want to remain PDF-compliant. It's no longer an optional addition of something no one uses; it's now a PDF standards issue.
