Comment Re:Titan or Bust! (Score 2) 68

Venus's middle cloud layer is the most Earthlike place in the solar system apart from Earth itself. It's energy-abundant, has favourable orbital dynamics and easy atmospheric entry, and the simple act of storing electricity for the night via reversible fuel cells - if plumbed in a cascade - can enrich deuterium (2 1/2 orders of magnitude more abundant on Venus), a natural export commodity if launch costs are sufficiently low. The atmosphere contains CHONP, S, Cl, F, noble gases, and even small amounts of iron - pretty much everything you need to build a floating habitat, which can be lofted by normal Earth air, meaning people can live inside the envelope itself. Unlike on Mars, where you live in a tiny tin-can pressure vessel, where any access to the outside tracks in toxic electrostatic dust, and where you waste away from low gravity, on Venus you'd be in a massive, brightly lit hanging garden, where you could live half a kilometer from a crewmate if they really got on your nerves.

Most Earthlike? Yes. Temperature, pressure, gravity, etc. are all similar. There's natural radiation shielding equivalent to half a dozen meters or so of water over your head. Even storms seem to follow an Earthlike distribution. The "sulfuric acid" is overblown; it's a sparse vog, with visibility of several kilometers; with a face mask, you could probably stand outside in shirtsleeves, feeling an alien wind on your skin, risking only dermatitis if you stayed outside too long.

Indeed, it'd actually be useful if the sulfuric vog were more common (to be fair, it's still unclear whether precipitation happens, and if so, whether it rains or snows; the Vega data is disputed). Why? Because it's your main source of hydrogen. The acid is highly hygroscopic and easily attracted electrostatically, so it's readily scrubbed out through your propulsion system. When heated, it first releases free water vapour, then decomposes to more water plus SO3; if you want, you can further decompose the SO3 over a vanadium pentoxide catalyst to O2 + SO2, or you can reinject it into the scrubber as a conditioning agent to seed more water vapour. Of course, if precipitation happens, collection possibilities are basically limitless.
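
To spell out the chemistry (a sketch; the temperatures involved are ballpark, and the catalytic step is just the industrial contact-process reaction run in reverse):

\[ \mathrm{H_2SO_4\ (droplets)} \;\xrightarrow{\ \text{heat}\ }\; \mathrm{H_2O\,(g)} + \mathrm{SO_3\,(g)} \]
\[ \mathrm{2\,SO_3} \;\xrightarrow{\ \mathrm{V_2O_5}\ }\; \mathrm{2\,SO_2} + \mathrm{O_2} \]

Stoichiometrically, each tonne of pure acid yields roughly 180 kg of water from the first step alone, plus whatever water of hydration the droplets already carry.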

The surface is certainly hostile, but even 1960s Soviet technology managed to land on it (also, contrary to popular myth, there is no acid at the surface; it's unstable at those temperatures, and the sulfur inventory there is only SO2). But in many ways, the surface is very gentle. Mars eats probes with its hard landings, but one Venera probe outright lost its parachute during descent and still landed intact, as the dense atmosphere slows one's fall. It's been calculated that with the right trajectory, a simple hollow titanium sphere launched from Earth could arrive at Venus, enter, descend, and land, all intact. Simple thermal inertia (insulation plus a phase change material) can keep an object cool for a couple of hours; with heat pumps, indefinitely (and yes, heat pumps and power sources for the surface conditions have been designed). Even humans could walk there in insulated hard suits, like atmospheric diving suits. Indeed, some of the first space suits NASA designed for the Moon (ultimately ditched for weight reasons, despite their superior mobility) were similarly jointed hard-shell suits.
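
To put rough numbers on the thermal-inertia claim (a sketch with assumed values - insulation properties, lander size, and PCM mass are all illustrative, not from any actual probe design):

k = 0.02          # insulation thermal conductivity, W/(m*K) - aerogel-class
area = 3.0        # exposed surface area of the lander, m^2
thickness = 0.10  # insulation thickness, m
dT = 460 - 50     # Venus surface temp minus internal electronics temp, K

heat_leak_w = k * area * dT / thickness   # steady conductive inflow, ~250 W

pcm_mass_kg = 10.0     # phase change material on board
latent_heat = 300e3    # J/kg, typical of nitrate / salt-hydrate PCMs

survival_h = pcm_mass_kg * latent_heat / heat_leak_w / 3600
print(f"Heat leak: {heat_leak_w:.0f} W, passive survival: {survival_h:.1f} h")

With those numbers you get a few hours from 10 kg of PCM; scale the PCM mass up and the survival time scales linearly.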

On Venus's surface, a lander or explorer can literally fly, via a compressible metal bellows balloon. Small wings / fins can allow for long glide ratios. Loose surface material can be dredged rather than physically excavated, potentially with the same fan used for propulsion. Reversible ascent back to altitude can be done with phase change balloons - that is, at altitude, the lifting fluid condenses and is collected in a valved container, letting the craft descend; at the surface, when one desires to rise, the valve is opened, the surface heat re-vaporizes the fluid, and the gas re-lofts the lander.
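
The numbers behind why near-surface flight is so easy (an ideal-gas sketch at rough Venus surface conditions; the N2 fill gas is just an example):

R = 8.314       # J/(mol*K)
P = 9.2e6       # surface pressure, Pa (~92 bar)
T = 737.0       # surface temperature, K
M_co2 = 0.044   # kg/mol, CO2 (the dominant atmospheric gas)
M_n2 = 0.028    # kg/mol, N2 as an example bellows fill gas

rho_atm = P * M_co2 / (R * T)   # ~66 kg/m^3
rho_fill = P * M_n2 / (R * T)   # fill gas at ambient pressure

print(f"Atmospheric density: {rho_atm:.0f} kg/m^3")
print(f"Net lift, N2-filled envelope: {rho_atm - rho_fill:.0f} kg per m^3")

Compare ~1 kg per m^3 for helium at Earth sea level; at ~24 kg of lift per cubic meter, even a heavy metal-bellows envelope pays for itself quickly.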

On Mars, you're stuck in one location. The problem is that not all minerals are found in the same spot; different processes concentrate different minerals. And you can't exactly just hop on a train to some other spot on the planet; long-distance travel requires rockets, and all their consumables. But Venus's atmosphere superrotates around the planet every several days (the rate depending on altitude and latitude), while latitude shifts in a floating habitat or lander can be done with minimal motor requirements. So vast swaths of the planet are available to you. Furthermore, Venus is far more dramatic in terms of natural enrichment processes; wide ranges of minerals are sublimated or eaten out of rocks and then recondensed elsewhere. Temperatures and pressures vary greatly between the highlands and lowlands as well. There even appear to be outright semiconductor frosts on parts of the planet. Lava flows show signs of long cooling times, which promotes fractionation and pegmatites. Volcanism is common, primarily basaltic but with potential secondary rhyolitic sources as well. A variety of unusual flows with no Earth analogues (or only rare ones) show signs of existing, including the longest "river" channel in the solar system (Baltis Vallis). While there's no global tectonic activity, there appear to be areas of intense local buckling between microplates. The surface conditions of the planet also appear to have been very different at many times in the past. It's all a perfect setup for diverse mineral enrichment processes. Yet there's almost no overburden (unlike Mars, which is covered in thick overburden over most of the planet).

As mentioned before, Venus has significantly superior orbital dynamics to Mars, thanks to the Oberth effect. Venus-Mars transfers are almost as fast and almost as low-energy as Earth-Mars transfers. Venus-Earth transits are super-fast, esp. with extra delta-V added. The asteroid belt is, contrary to intuition, much more accessible from Venus than from Mars. Also, gravity assists are much more common around Venus - when we want to launch probes to the outer solar system, we generally start by sending them inwards toward Venus first, then back between Venus and Earth and outwards from there.
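
For anyone who hasn't seen it, the textbook form of the effect (assuming you arrive on a barely-bound, parabolic trajectory and burn at periapsis, where the local escape speed is \(v_{esc}\)):

\[ v_\infty = \sqrt{(v_{esc} + \Delta v)^2 - v_{esc}^2} = \sqrt{2\,v_{esc}\,\Delta v + \Delta v^2} \]

The deeper the gravity well, the larger \(v_{esc}\) at periapsis - and Venus's well is deeper than Mars's - so the same \(\Delta v\) of propellant buys disproportionately more departure speed.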

From a long-term perspective, both Venus and Mars have problems with terraforming, with some things you can do "relatively easily" and some that require megascale engineering on scales best left to fantasy. You can boil off Mars's polar caps, but the amount of CO2 there is still quite limited, and there's just not that much nitrogen inventory on the planet (it's been lost to space), which also matters for plant cultivation. You could probably engineer active radiation shielding from orbit, maybe direct more light to the surface, but you can't increase the gravity. Etc.

With Venus, one of the earliest terraforming ideas came from Carl Sagan, before the planet was well known; he proposed seeding it with engineered bacteria to convert the CO2 to graphite and release oxygen. He later rejected his own idea, on the grounds that a high-temperature surface of graphite and oxygen would be a bomb. Later studies showed that the timescales for such a conversion would be tens of thousands to millions of years. But in a way, that slowness actually saves his idea: Venus's rocks contain unoxidized minerals, and in analogy to the Oxygen Catastrophe on Earth that created our banded iron formations, Venus's rocks, slowly exposed to oxygen, would weather and sequester both the oxygen and the deposited carbon. Hot, high-pressure, high-oxygen conditions would never get a chance to exist.

Various faster methods have been proposed. A common one is a thin orbital sunshade. Another is building an "alternative surface", i.e., propagating floating colonies to the point that they become the new surface - and indeed, below that surface, they could exclude sunlight from the atmosphere beneath. Regardless of the method, the cooler the atmosphere gets, the lower its pressure gets, to the point that you can start outright precipitating the atmosphere out as icecaps.

Just as Mars will never have high gravity and probably never much nitrogen, Venus would probably never be fully Earthlike. It would retain enough nitrogen that, barring loss to weathering, people would have constant mild nitrogen narcosis, like always being ever so slightly tipsy. It would remain a desert planet, barring massive influxes of ice (which present their own challenges and problems) or of hydrogen (pre-cooling). But then again, the very concept of terraforming anything has always required one to put on thick rose-coloured glasses ;)

I don't say all this to diss Mars. But our obsession with "surface conditions" has led us to ignore the fact that if you're going to the extremes of engineering an off-world habitat, having it be airborne is not that radical an additional ask, esp. on a planet with as big and "fluffy" an atmosphere as Venus. If Venus's atmosphere stopped at its Earthlike middle cloud layer - if there were a surface there - nobody would be talking about long-term habitation on Mars; the focus would be entirely on Venus. But we can still have habitats there. The habitat can, in whole or in part, even potentially be its own reentry vehicle (ballute reentry), and can certainly at least inflate and descend as a ballute (with a small supply of Earth-provided helium as a temporary lifting gas until an Earthlike atmosphere can be produced). Unlike with Mars entry, you're never going to be "off course" or "crash into something" because you got the location or altitude wrong.

(Getting back to orbit is certainly challenging from Venus - all that gravity that's good for your body has its downsides - but the TL;DR is that hybrid and/or air-augmented nuclear thermal rockets look to be by far the best option. Far less hydrogen needed than chemical rockets, far lighter relative to their deliverable payload, only a single stage needed, and some designs have the ability to hover without consuming fuel. This is, of course, of great benefit for docking with a habitat, avoiding the need for descending rocket stages to deploy balloons and then dock those to the habitat. The hydrogen and mass budgets involved are totally viable.)

Comment To some extent (Score 1) 152

The ghostwriters have gone along with it and, I think, really shot themselves in the foot.

We are still getting new Tom Clancy novels. Sure, you can look below the title and see who actually wrote it, but that isn't in the big bold letters on the cover. This is true for a lot of the popular "airport series"; I guess Lee Child is actually still writing his own books.

How are new authors supposed to make a name for themselves when all the marketing goes to guys already in the ground? The authors actually writing those books have more or less allowed themselves to be commoditized, and they can just wait for the LLMs to come for them...

Comment Re:power (Score 2) 68

Titan's atmosphere is rather calm; not an issue. At the surface, the winds measured by Huygens were 0.3 m/s.

You actually can use solar power in extreme environments - even Venus's surface has been shown to be compatible with certain types of solar cells, though you certainly get very poor power density. Dragonfly, as noted above, uses an RTG.

Comment Re:Second flying drone to explore another planet (Score 3) 68

Planetary scientists frequently refer to moons that are large enough to be in hydrostatic equilibrium as planets in the literature. Examples, just from a quick search:

"Locally enhanced precipitation organized by planetary-scale waves on Titan"

"3.3. Relevance to Other Planets" (section on Titan)

"Superrotation in Planetary Atmospheres" (article covers Titan alongside three other planets)

"All planets with substantial atmospheres (e.g., Earth, Venus, Mars, and Titan) have ionospheres which expand above the exobase"

"Clouds on Titan result from the condensation of methane and ethane and, as on other planets, are primarily structured by circulation of the atmosphere"

"... of the planet. However, rather than being scarred by volcanic features, Titan's surface is largely shaped..."

"Spectrophotometry of the Jovian Planets and Titan at 300- to 1000-nm Wavelength: The Methane Spectrum" (okay, it's mainly referring to the Jovian satellites as planets, but same point)

"Superrotation indices for Solar System and extrasolar atmospheres" - contains a table whose first column is "Planet", and has Titan in the list, alongside other planets

Etc. This is not to be confused with the phrase "minor planet", which is used for asteroids, etc. In general there's a big distinction in how commonly you see the large moons in hydrostatic equilibrium referred to as "planets" and with "planetary" adjectives, vs. smaller bodies not in hydrostatic equilibrium.

Comment Re:Titan or Bust! (Score 3, Informative) 68

Why?

NASA's obsession with Mars is weird, and it consumes the lion's share of their planetary exploration budget. We know vastly more about Mars than about anywhere else except Earth.

This news here is bittersweet for me. I *love* Titan - it and Venus are my two favourite worlds for further exploration, and Dragonfly is a superb way to explore Titan. But there's some sadness in the fact that they're launching it to an equatorial site, so we don't get to see the fascinating hydrocarbon seas and the terrain sculpted by them near the poles. I REALLY wish they were going to the north pole instead :( In theory it could eventually get there, but the craft would have to survive far beyond design limits and get a lot of mission extensions. At a max pace of travel it might cover 600 meters or so per Earth day on average. So we're talking something like 12 years to get to the first small hydrocarbon lakes and ~18 years to get to Ligeia Mare or Punga Mare (a bit further to Kraken Mare), *assuming* no detours, vs. a 2 1/2 year mission design. And that ignores the fact that they'll be going slower at the start - the nominal mission is only supposed to cover 175 km, just a few percent of the way, at under 200 metres per day. Sigh... Maybe it'll be possible to squeeze more range out of it once they're comfortable with its performance and reliability, but... it's a LONG way to the poles.
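
The arithmetic behind those figures, as a sketch (the distances are just the ones implied by the numbers above, i.e., my own back-of-the-envelope values, not official mission figures):

# Back-of-the-envelope Dragonfly travel times to Titan's north-polar lakes.
dist_km = {"first small lakes": 2630, "Ligeia / Punga Mare": 3940}

for pace_m_per_day in (200, 600):   # roughly nominal vs. max sustained pace
    for name, dist in dist_km.items():
        years = dist * 1000 / pace_m_per_day / 365.25
        print(f"{name} at {pace_m_per_day} m/day: {years:.1f} years")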

At least if it lasts that long, it'll have experienced a full transition between wet and dry cycles, which should last ~15 years. So maybe surface liquids will be common at certain points, rare at others.

Comment Re:Duh (Score 1) 118

I don't fundamentally disagree. The thing is, Azure is too big and complex, with too many cooks in the kitchen, for there to be any real hope of getting it right.

Microsoft absolutely needs to have a hard delete-after-N policy, and then start writing very specific exceptions around certain critical components of Azure infrastructure. The Federal government shouldn't be "beta-testing" the cloud along with the rest of industry. Azure / Office 365 are good examples of too much, too fast, at too high a value.

Comment Re:Follow the money (Score 1) 201

No, it's 100x worse than that. The power is probably coming from a coal plant in another state, like in the story about all the data centers near DC.

They need the coal power (because coal keeps the lights on) because the renewables don't cut it; they suck for super-dense, constant base loads. However, since the green morons decided to make it impossible to burn coal nearby, the grid operators and generation people are tearing up more of the WV mountains and cutting up the valleys of Northern VA to run more transmission lines.

Study after study has shown the importance of large UNBROKEN areas of habitat for wildlife. Slicing up what little we have left in the Eastern half of the US to run more high-voltage lines is terribly short-sighted and stupid. Wind and solar might be low-carbon, but as grid solutions they ain't green!

Comment Re:How you know you're doing the right thing (Score 1) 146

So much this. The intel lobby practically just burnt down Congress (it sure as f**k looks like they blackmailed the Speaker of the House) to defeat a requirement that they even get a warrant for spying from their special FISA court, when the "F" (foreign) part is deeply in question.

That does suggest to me it's time to "trust them" more and just hand over the keys to all communications privacy. They basically just finished throwing a tantrum and screaming about how they can't do their jobs AND respect the constitutional rights of the public.

Yes, I realize this is the EU, but come on: right after the spooks ramrod the privacy-shredding Section 702 through Congress, suddenly the issue comes to the fore on the other side of the Atlantic... right, like the Five Eyes cool kids are not coordinating their abuse of democracy...

Comment Re:Meanwhile, at Microsoft... (Score 2) 118

Actually, they were extremely careful, and slowly wormed their way into a maintainership position via sock-puppets and astroturfing, where they could insert code with perhaps less scrutiny than, say, trying to trojan some pull request. Then they put most of the payload in binary material that ships with the software, rather than in source code someone would likely feed to a SAST tool or otherwise effectively audit as part of due diligence. They did this over a long span of time, and did legitimate maintenance work as well.

All in all, it's worrying that it happened, but it also suggests that the overall pipeline and the checks and balances around what makes it into a general release in the major Linux distributions are "really pretty solid". Someone put a good deal of analysis and long-term effort into backdooring the big distros, and it still failed. As you say, perhaps one of the reasons it failed was that they saw their window of opportunity closing and had to move quicker, leading to the performance issue the Microsoft engineer noticed.

Then again, this is a case where "many eyes" really should be credited, and of course Freund, who actually found it - more so than anything Microsoft the organization was/is doing. He wasn't doing security-specific work; he's just a good engineer who happened to be in the right place to spot a problem!

Comment Re:Duh (Score 4, Insightful) 118

Logs are often a huge liability. I am not saying this is right, but in my experience very, very few IT shops treat them like the tier-one, confidentiality-required data that they are.

Developers rarely think critically about what can end up in a log, operating under the assumption that whatever logging framework is in use is responsible for sinking them somewhere safe, and that if anyone has access, all bets are already off; of course, in the era of centralized logging, SIEM analysis, data lakes, etc., that is nonsense. I have seen a lot of applications with a ton of code and thought dedicated to handling various types of secrets, only to have it all wrapped in

try { ... }
catch (SomeSpecificException e) { /* ... */ }
catch (Exception ex) {
    // dumps the message and stack trace - and any secrets inside them
    Logger.log("Unhandled " + ex.getClass().getName() + " exception - "
        + ex.getMessage() + " Stacktrace:\n" + Arrays.toString(ex.getStackTrace()));
}

or the equivalent, which under the right conditions will land those secrets in the logs. That is the most innocent case; the far more common pattern in logs is:

Login failed for user P@$$word!1
Login success for user gweihir

and is almost the norm...
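
For what it's worth, even a crude scrub-before-sink pass beats nothing (a sketch, not a real DLP tool - the token patterns are illustrative, and the password-typed-as-username case above is fundamentally unfixable at this layer):

import re

# Naive redaction filter applied before log lines reach the sink.
# Real secrets detection is much harder than a few token shapes.
PATTERNS = [
    re.compile(r"(?i)(password|passwd|secret|token)\s*[=:]\s*\S+"),
    re.compile(r"\bAKIA[0-9A-Z]{16}\b"),        # AWS-style access key IDs
    re.compile(r"\beyJ[A-Za-z0-9_-]{10,}\b"),   # JWT-looking blobs
]

def scrub(line: str) -> str:
    for pat in PATTERNS:
        line = pat.sub("[REDACTED]", line)
    return line

print(scrub("retrying with token=ghp_abc123 after timeout"))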

Right now the only things saving corporate and probably government IT from total disaster due to negligent log handling are:

1) The data volume is large, so it's difficult to exfil or search in situ without being noticed.
2) Searching logs you are not familiar with is hard, and regex augmented with traditional correlation rules will only get you so far.

However, attackers will soon enough start using ML and similar tools to slog through it all and pull useful data out, and all these data lakes, cloud trails, security workspaces, etc. are going to get some big organizations well and thoroughly pwned.

At the very least, actual APTs (not some ransomware gangs) will get hold of some Fortune 50 and large government logs and do some next-gen analysis to make sure their tradecraft and tools leave exactly NO detectable IOCs. Which, frankly, I think bodes quite badly for having a large WFH workforce; nobody is going to be able to separate malicious remote access from legitimate. That is drifting off topic, however.

In the short term I would suggest to most operators: you don't know what is in your logs, and you don't know what signals someone might be able to extract from them even if you do have all the content identified. You probably should NOT be retaining logs longer than a few months or whatever your regulatory requirements demand, whichever is greater.

In this specific instance it's unfortunate, but I don't think MS actually got the policy wrong here.

Comment Re:AI is just Wikipedia (Score 1) 25

I've probably done tens of thousands of legit, constructive edits, but even I couldn't resist the temptation to prank it at one point. The article was on the sugar apple (Annona squamosa), and at the time, it had a big long list of names for the fruit in different languages. I wrote that in Icelandic, the fruit was called "Hvaðerþetta", which means "What's that?", as in, "I've never seen that fruit before in my life" ;) Though the list disappeared from Wikipedia many years ago (as it shouldn't have been there in the first place), even to this day I find tons of pages listing that, in all seriousness, as the Icelandic name for the fruit.

Comment Nonsense (Score 1) 25

The author has no clue what they're talking about:

Meta said the 15 trillion tokens on which its trained came from "publicly available sources." Which sources? Meta told The Verge that it didn't include Meta user data, but didn't give much more in the way of specifics. It did mention that it includes AI-generated data, or synthetic data: "we used Llama 2 to generate the training data for the text-quality classifiers that are powering Llama 3." There are plenty of known issues with synthetic or AI-created data, foremost of which is that it can exacerbate existing issues with AI, because it's liable to spit out a more concentrated version of any garbage it is ingesting.

1) *Quality classifiers* are not themselves training data. Think of a classifier as a second program that you run over your training data before training your model, to look at each document and decide how useful it looks, and thus how much to emphasize it in training, or whether to just omit it.
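
A toy sketch of the idea (real quality classifiers are trained models, not heuristics like this; per the quoted statement, Meta used Llama 2 to generate the training data for theirs):

# Toy "quality classifier" pipeline: score each document, then decide
# whether to drop it, keep it, or upweight it for training.
def quality_score(doc: str) -> float:
    # Stand-in heuristic; in practice this is a trained model.
    words = doc.split()
    if len(words) < 5:
        return 0.0
    return len(set(words)) / len(words)   # vocabulary-diversity proxy

corpus = ["buy buy buy buy buy cheap pills",
          "The Oberth effect lets a burn deep in a gravity well go further"]
for doc in corpus:
    s = quality_score(doc)
    weight = 0 if s < 0.5 else (2 if s > 0.9 else 1)
    print(f"score={s:.2f} weight={weight} :: {doc[:40]}")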

2) Synthetic training data *very much* can be helpful, in a number of different ways.

A) It can diversify existing data. E.g., instead of just the sentence "I was on vacation in Morocco and I got some hummus", maybe you generate different versions of the same sentence ("I was traveling in Rome and ordered some pasta", "I went on a trip to Germany and had some sausage", etc.), to deemphasize the specifics (Morocco, hummus, etc.) and focus on the generalization. One example can turn into millions, thus rendering rote memorization during training impossible.

B) It allows for programmatic filtration stages. Let's say you're training a model to extract quotes from text. You task an LLM with creating training examples for your quote-extracting LLM (synthetic data). But you don't just blindly trust the outputs - first you do a text match to see whether what it quoted is actually in the text and whether it's word-for-word right. Maybe you do a fuzzy match, and if it just got a word or two off, you correct it to the exact match, or whatnot. The key is that you can postprocess the outputs with sanity checks, and since those programmatic steps are deterministic, you can guarantee that the training data meets certain characteristics (there's a small sketch of this validation step after this list).

C) It allows for the discovery of further interrelationships. Indeed, this is a key thing that we as humans do - learning from things we've already learned by thinking about them iteratively. If a model learned "The blue whale is a mammal" and it learned "All mammals feed their young with milk", a synthetic generation might include "Blue whales are mammals, and like all mammals, feed their young with milk". The new model now directly learns that blue whales feed their young with milk, and might chain new deductions off *that*.

D) It's not only synthetic data that can contain errors, but non-synthetic data as well. The internet is awash in wrong things; a random thing on the internet is competing with a model that's been trained on reams of data, with high-quality / authoritative data boosted and garbage filtered out. Things being wrong in the training data is normal, expected, and fine, so long as the overall picture is accurate. If there are 1000 training samples that say that Mars is the fourth planet from the sun, and one that says the fourth planet from the sun is Joseph Stalin, the model is not going to decide that the fourth planet is Stalin - it's going to answer "Mars".
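
A minimal sketch of the validation step described in (B), assuming simple difflib-style fuzzy matching (the 0.9 threshold is arbitrary):

import difflib

def validate_quote(quote: str, source: str, threshold: float = 0.9):
    """Keep a synthetic training example only if the 'quote' really
    appears in the source text; snap near-misses to the closest span."""
    if quote in source:
        return quote   # exact match: accept as-is
    # Fuzzy match: find the best-aligned window of the same length.
    n = len(quote)
    best, best_ratio = None, 0.0
    for i in range(0, max(1, len(source) - n + 1)):
        window = source[i:i + n]
        r = difflib.SequenceMatcher(None, quote, window).ratio()
        if r > best_ratio:
            best, best_ratio = window, r
    return best if best_ratio >= threshold else None   # None: discard example

src = "Armstrong said that this is one small step for a man."
print(validate_quote("this is one smal step for a man", src))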

Indeed, the most common examples I see of "AI being wrong" that people share virally on the internet are actually RAG (Retrieval Augmented Generation), where it's tasked with basically googling things and then summing up the results - and the "wrong content" is actually things that humans wrote on the internet.

That's not to say that you should rely only on generated data when building a generalist model (it's fine for a specialist one). There may be specific details that the generating model never learned, or got wrong, or new information that's been discovered since then; you always want an influx of fresh data.

3) You don't just randomly guess whether a given training methodology (such as synthetic data - which, I'll reiterate, Meta did not say they used, although they might have) is having a negative impact. Models are assessed with a whole slew of evaluation metrics measuring how well and how accurately they respond to different queries. And LLaMA 3 scores superbly relative to model size.

I'm not super-excited about LLaMA 3 simply because I hate the license - but there's zero disputing that it's an impressive series of models.

Comment It's like anything (Score 1) 59

Anytime you are doing "science" you need to know what you are measuring.

Cygwin isn't emulation; it's a compatibility library. I highly doubt its use impacts network performance at all for certain parts of the scale.

CPUs are fast; network cards mostly are not. Your 14th-gen i5 is going to outrun that 2.5GbE adapter, Cygwin or not. So if what you are benchmarking is the peer - say some router or IoT thing - I don't see the issue.

On the other hand, if you are benchmarking the host with a PCIe 10GbE card or something, well, this might be a relevant concern.

Still more caveats, though: it might be exactly the right approach if you are, say, deciding whether to host your POSIX network service (which will use Cygwin on Windows) on Windows or on Linux. That is of course the thing: you should measure as much as possible using the parts of the stack you can't or won't be willing to change; if your benchmark tool isn't doing that, it's probably the wrong tool. So right, don't test Cygwin network performance if the application is going to use Winsock2.

This has been a problem since the dawn of the PC tech press, and it's probably worse today than ever. In the late 80s and early 90s we were reading about how such-and-such's 386 clone was 16% faster, and sure enough, on some synthetic benchmark it was, because of fewer memory wait states or something; but lo and behold, the benchmark turned out to be doing some state-of-the-art softfloat thing for the test, and when you compared a real-world app against an Intel 386 + 387 pair, suddenly the performance advantage vanished or even flipped.
