Comment Re:Smashing! (Score 5, Informative) 32

You seem knowledgeable, and this could indeed be how it is made to work, but in this case you are not entirely right. I would know: I reverse engineered thermal control for the Nouveau project (the open source Linux driver for NVIDIA GPUs) from the GeForce 8 through Kepler. They may have changed it for Maxwell, but I have not found the time to have a look. I also think a change is unlikely, since the scheme has stayed essentially the same since they introduced it in 2006.

You hope. That's the idea, but components with thermal throttling still die the death of heat. The thermal throttling is controlled by software, and each card (or laptop) vendor has the opportunity to dick around with the maps.

No, software-controlled thermals never work alone. Software allows a more gentle performance ramp-down and fan control, but there is always a hardware override, because the software controlling temperatures can go missing or simply not be present at certain stages.

Perfectly true! You must have worked on this before ;) However, in NVIDIA's case, fan management is done by a microcontroller running inside the GPU.

In this case, if it overheats so badly the hardware kicks in, they basically kick the fan into high speed and halt the GPU (usually by blocking the core clock).

NVIDIA has a more graceful way of dealing with this. I explained it in my PhD thesis (page 128 of the PDF: http://phd.mupuf.org/files/the...). In short, at different thresholds, they change a clock divider's value. But you should have a look at the page, since there are some pretty graphs :p
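To make the idea concrete, here is a minimal sketch of threshold-based throttling via a clock divider. This is purely illustrative, not actual NVIDIA firmware: the temperature thresholds and divider values are made up, and real hardware applies them with hysteresis and in dedicated logic.

```python
# Hypothetical thresholds: (temperature in degrees C, clock divider).
# Checked from hottest to coolest; the real values are per-board.
THRESHOLDS = [
    (105, 16),  # near-critical: core clock divided by 16
    (100, 8),
    (95, 4),
    (90, 2),    # mild overheating: halve the clock
]

def clock_divider(temp_c):
    """Return the clock divider to apply at a given GPU temperature."""
    for threshold, divider in THRESHOLDS:
        if temp_c >= threshold:
            return divider
    return 1  # nominal operation: full core clock

def effective_clock_mhz(base_mhz, temp_c):
    """Effective core clock after thermal throttling."""
    return base_mhz // clock_divider(temp_c)
```

For example, a hypothetical 1000 MHz core at 96 °C would drop to 250 MHz, which is exactly the "smooth gameplay turns into a slideshow" effect described below.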

Usually that's enough to cool it down to a safe zone where it re-enables the clock, so what was once a nice super smooth gameplay turns into a horrendous slideshow.

That could indeed happen if there was a catastrophic failure in the cooling system.

Or, sometimes if it gets really critical, it disables the clock until reset, which basically halts your PC as the busses lock up.

NVIDIA does not rely on shutting down the clock. Well, it does so... by shutting down the entire power of the board. A GPIO controls the voltage regulator; once it is set, the GPU needs a reset command from the PCIe port and has to be POSTed again before it becomes usable.

Comment Re:Collaboration strictly limited to Tegra K1+ (Score 1) 169

I don't work on 3D, but from what I heard from my fellow Nouveau devs, the hardware is designed to be efficient with both GL and Direct3D.

Anyway, Tegra is based on Kepler for the graphics engine, so if what you said is true for the Kepler cards, it is true for Tegra K1. The Tegra SoCs have never been entirely different from their desktop cards. We actually believe we'll be able to run our userspace with a fairly small amount of modification on Tegra K1. We'll see how it fares when we get access to the hardware!

Comment Collaboration strictly limited to Tegra K1+ (Score 5, Informative) 169

Hey, I'm a Nouveau developer and I had the chance to talk with an NVIDIA engineer at FOSDEM. This collaboration is strictly limited to Tegra and to the kernel side (at least for the moment).

There is some overlap with the desktop cards (mostly the Kepler family), which will allow us to benefit from this collaboration beyond the SoC world. This is very interesting, and I'm really looking forward to seeing how it will pan out!

Comment Re:What's the big deal? (Score 5, Insightful) 398

Oh, it is a big deal.

There are two technologies for touch screens:
- Resistive: it means adding an extra layer on top of the screen, which reduces brightness or forces a stronger backlight, resulting in lower battery life.
- Capacitive: as far as I know, it cannot be done on the current screen surface; it would need some sort of glass layer like on smartphones. This increases the price of the laptop and makes it more likely to break if the glass is of poor quality.

The end result in both cases is a higher price... for no purpose at all. But I guess the average Joe would like to have a detachable keyboard and get a tablet.

Comment Re:"a reverse-engineered incarnation" (Score 2) 231

I was familiar with clean room because I was once part of such a project. I was aware nVidia drivers had parts that they considered to be "secret sauce" algorithms (and ATI didn't?). From what you said, I'm assuming it was the firmware which must be loaded onto the card?

I'm only vaguely familiar with the requirements for HDCP compliance, but I'm guessing that safeguarding keys is part of it. So, my assumption is that nVidia needed to do that, in general, rather than to specifically make it difficult for the nouveau project.

Perhaps, the libdvdcss approach by players will work. The players don't have de-CSS capabilities themselves, but they do look around for the lib. If it "happens to be around" (e.g. liability is shifted to the end user who downloaded it separately), they will use it.

In other words, you always need the card, so everything else is protected without needing specific protection of its own.

My answer to all that is that nVidia cards are mostly software nowadays (except for the actual rendering/computing core). NVIDIA uses a common ISA for most engines. It was reverse engineered when Fermi came out, and it took more than a year before we wrote our core firmwares ourselves. I'm currently writing the hardware-monitoring firmware as a first experience with this ISA.

These firmwares execute on Harvard-architecture microcontrollers, each with some special capabilities depending on the engine it runs on.

The firmwares themselves aren't secure at all, and aren't meant to be anyway. However, some memory pages can be marked as secret so that you can't access them from the host unless you know the "password". I never studied this part; if you are interested, you can read what has already been documented: https://github.com/pathscale/envytools/blob/master/hwdocs/fuc-vm.txt

In the end, the card itself isn't strictly needed, since it is mostly software and we should be able to fake many things. But what's the point of HDCP anyway?

Another assurance for nVidia is that they know how slow going the RE is, compared to what they can do. They'll always be several steps ahead, no matter what. So, nouveau is no "threat" to them. The only people they're really concerned about are competitors like AMD/ATI and Intel that make HW.

Right, by the time the hardware ships, nVidia no longer worries about its secrets, so Nouveau is not a problem for them. That said, yesterday the 3D driver for Kepler was released, less than a month after the release of the first Kepler GPU. Some people in the Nouveau project really are inhuman :D

Be thankful you're in Europe. In the US, the RIAA has been known to sue widows and orphans :-)

Yeah, the US is always crazy about IP. I can't wait for the whole damn thing to be reformed to be friendlier to those who really make the country move forward.

Comment Re:"a reverse-engineered incarnation" (Score 5, Informative) 231

OK, I created an account; it will be simpler for others to follow. Comparing me to Linus or Eric Raymond is really over the top. I'm just a PhD student who has been working on power management in Nouveau for a little more than 1.5 years.

Anyway, the answers to your post are right: clean-room REing is legal. The shady part concerns the firmwares, which you have to decode in order to re-implement them. Fortunately, we know NVIDIA used a compiler to build them; since we write ours in assembly only and don't use the same interface, I guess we are pretty well covered.

As for video decoding, nVidia thought about us and added a "safe" for the encryption keys. So yes, we can re-implement video decoding (it is ongoing work, and it's ugly), but HDCP compliance will never come.

As for software patents, we do our best not to implement things covered by them, but sometimes we have to. In those cases, we think of the sane countries that could benefit from our code (most of them are "sane" ;)).

Lastly, nVidia said they would neither help nor hinder the project. If there is something they don't like, I'm sure they will let us know before going to court. Going to court would be one hell of a pain for them anyway, since they would have to sue individuals in many countries, mostly European ones.
Most of us are students; it would be bad PR to sue us :D
