Comment It is not such a big deal (Score 3, Interesting) 31

Orbits below 400 km clean themselves over time through atmospheric drag (a few tens of years at most). Companies can make satellites a bit more aerodynamic, load them with enough fuel to maintain orbit for the expected lifetime, and keep launching without issues.
The other options are not feasible anyway. Clean space is a common good, and individual players have little incentive not to pollute it. Barring some international enforcement, space will fill up with junk, and the current political situation is not cooperative at the international level, so enforcement of clean space is very unlikely.

Comment Not that much better than VP9 (Score 2) 46

My small test indicated that AV1 output is only about 20% smaller (though TFA claims 30%) and requires about 45% more CPU to decode than VP9. Encoding is also many times slower than VP9. Both encoding and decoding were done on the CPU (no HW support; ffmpeg; Linux). AV1 may become more interesting later, once it is better supported.

Comment Re:Sounds like they have two years to ask. (Score 3, Informative) 6

Don't be scared: the CRA does not affect you if you do not earn money from your open source software.

The CRA regulates commercial activity:
(10) This Regulation applies to economic operators only in relation to products with digital elements made available on the market, hence supplied for distribution or use on the Union market in the course of a commercial activity.
(10c) .. the provision of free and open-source software products that are not monetized by their manufacturers is not considered a commercial activity.
(10c) This Regulation does not apply to natural or legal persons who contribute source code to free and open-source products that are not under their responsibility.

Non-profit organizations hosting open source are also outside the CRA. On the other hand, commercial entities that profit from open source are responsible! These commercial entities are also responsible for publishing the vulnerabilities they find in open source, along with the patches they develop. In general, the CRA seems to be more good than bad for open source.

More info for non-lawyers is here: https://berthub.eu/articles/po...

Comment Re:We've seen this movie: VB & RoR (Score 1) 159

Cross compiling does not require anything that has anything to do with AI/LLMs anyway.

It does when the target language (like Rust) has ownership and borrowing, which strongly favor program designs suitable for a linear (affine) type system. The system must be redesigned to work idiomatically in Rust.
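A minimal sketch of the kind of pattern that does not translate mechanically (the function and values here are illustrative, not from any real codebase): C-style code routinely keeps two live mutable pointers into one structure, which a literal Rust translation of that shape cannot express, so the code has to be restructured around the borrow checker.

```rust
// C code often mutates a structure through several live pointers at once.
// A mechanical translation of that pattern is rejected by the borrow
// checker, so the design must change, e.g. by proving the borrows disjoint.

fn bump_both(v: &mut Vec<i32>, i: usize, j: usize) {
    // Naive translation -- two simultaneous &mut borrows -- will not compile:
    // let a = &mut v[i];
    // let b = &mut v[j];
    // Idiomatic redesign: split_at_mut gives two non-overlapping slices.
    assert!(i < j && j < v.len());
    let (left, right) = v.split_at_mut(j);
    left[i] += 1;
    right[0] += 1;
}

fn main() {
    let mut v = vec![10, 20, 30];
    bump_both(&mut v, 0, 2);
    println!("{:?}", v); // [11, 20, 31]
}
```

This is exactly the kind of design-level rewrite that a plain syntax-to-syntax cross compiler cannot do on its own.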

Comment Re:Cassini–Huygens (Score 2) 54

It had only one wide-angle camera, and you would need about 3,400 images to cover the whole sky once it was already there. You cannot do it in one shot, and the moons move while you are taking all those images, so you can miss some. You also need to photograph each candidate moon several times to confirm it is in orbit around Saturn. It does not make sense to do this once the probe is already there.
It might have found some while still far away, when one picture can capture more of the space around Saturn.
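The ~3,400 figure is consistent with a roughly 3.5° x 3.5° field of view for the wide-angle camera (an assumed value here), since the full sky is about 41,253 square degrees:

```rust
// Back-of-the-envelope frame count to tile the whole sky with a square
// field of view, ignoring overlap between adjacent frames.
fn frames_to_cover_sky(fov_deg: f64) -> f64 {
    let full_sky_sq_deg = 41_252.96; // 4*pi steradians in square degrees
    full_sky_sq_deg / (fov_deg * fov_deg)
}

fn main() {
    // Assumed ~3.5 degree square field of view for the wide-angle camera.
    println!("{:.0}", frames_to_cover_sky(3.5)); // ~3368 frames, before overlap
}
```

Any practical mosaic needs overlap between frames, so the real number would be somewhat higher.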

Comment Re:Are you serious? (Score 1) 96

You can get rid of some limitations of the Fourier transform by moving to the Laplace transform.

You need a lot of neurons to approximate more complicated functions. E.g., try to approximate the sine function over its full domain with a 3-layer neural network. As you run out of available neurons, you will very quickly see that the Fourier transform is better in this case. NNs are universal approximators, but only over a bounded subset of R^n.
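As a rough illustration, using a degree-7 Taylor polynomial as a stand-in for a small fitted model (not a neural network, but the bounded-domain failure mode is analogous): a finite approximator tuned near the origin tracks sin(x) well there and falls apart far away, while the Fourier representation of sine is exact everywhere with a single coefficient.

```rust
// Degree-7 Taylor polynomial of sin around 0: x - x^3/6 + x^5/120 - x^7/5040.
// Like any approximator with a finite budget fitted near one region, it is
// accurate on a bounded interval and useless outside it.
fn taylor_sin7(x: f64) -> f64 {
    x - x.powi(3) / 6.0 + x.powi(5) / 120.0 - x.powi(7) / 5040.0
}

fn main() {
    for &x in &[1.0_f64, 2.0, 5.0, 10.0] {
        let err = (taylor_sin7(x) - x.sin()).abs();
        println!("x = {x:>4}: |error| = {err:.3e}");
    }
    // Error is tiny at x = 1 but exceeds 10^3 by x = 10.
}
```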

The nice thing about the Fourier transform is that the computed parameters have a very well-defined meaning: computing them gives you clear insight into the behavior of the transformed function. The point is that the GP's claim that some other function approximators are "arguably" better than NNs is correct; it depends on your scoring system.

Comment Re:Is it 0Gbps or 0bps (Score 1) 94

You are oddly fixated on PD.

Because itsme1234 discussed power delivery over USB-C and claimed it is negotiated over the D+/D- lines; I only pointed out that it is negotiated over the CC line. Maybe itsme1234 meant the maximum-current request in the USB device descriptor, or one of the other options I mentioned that use the D+/D- lines. But yes, I was completely focused on the USB PD specification; I did not mention standard data communication anywhere in this thread. Good that you pointed out that D+/D- are required by the standard.

Comment Re:Is it 0Gbps or 0bps (Score 1) 94

I guess it depends on whether you consider the USB Battery Charging (USB BC) specification (which can run over the D+/D- lines) to be part of the USB PD specification (which uses the CC lines). I downloaded the USB PD specification, and it only references USB BC; it does not contain it. Strangely, I could not find the USB BC specification on usb.org, only some agreements and testing protocols. Maybe USB BC is not that open; maybe it derives something from proprietary standards (like Qualcomm Quick Charge; there is a bunch of them).

The physical-layer description of USB PD says the CC wire is used (i.e., USB-C is needed). Based on a web search, some early devices supported signaling over a data wire, but "these devices were very rare". I have never seen such a device personally; every USB PD capable device I have ever seen used USB-C.

Comment Re:Is it 0Gbps or 0bps (Score 2) 94

Every USB-C cable should be able to handle at least 5 V and 3 A. USB Power Delivery (PD) is negotiated over the CC (configuration channel) line. That is the reason USB PD works only on cables with USB-C connectors. Some charging standards (e.g., Qualcomm Quick Charge) use the D+ and D- data lines of USB 2.0, but that is unrelated to USB PD.
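To make the negotiation concrete, here is a sketch of decoding one Fixed Supply Power Data Object (PDO), the 32-bit word a PD source advertises over the CC line. The field layout (bits 31..30 = 00b for a fixed supply, voltage in 50 mV units at bits 19..10, maximum current in 10 mA units at bits 9..0) is per my reading of the USB PD spec; the function name and example value are mine.

```rust
// Decode a Fixed Supply PDO advertised by a USB PD source over the CC line.
// Bits 31..30 = 00b (fixed supply); bits 19..10 = voltage / 50 mV;
// bits 9..0 = max current / 10 mA.
fn decode_fixed_pdo(pdo: u32) -> Option<(u32, u32)> {
    if pdo >> 30 != 0b00 {
        return None; // battery, variable, or augmented PDO -- different layout
    }
    let voltage_mv = ((pdo >> 10) & 0x3FF) * 50;
    let current_ma = (pdo & 0x3FF) * 10;
    Some((voltage_mv, current_ma))
}

fn main() {
    // A 5 V / 3 A offer: voltage field = 5000/50 = 100, current = 3000/10 = 300.
    let pdo = (100u32 << 10) | 300;
    println!("{:?}", decode_fixed_pdo(pdo)); // Some((5000, 3000))
}
```

Note that none of this touches D+/D-, which is why PD needs the CC wire and thus a USB-C connector.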
