
Comment Depends (Score 2) 76

For the "many eyes" principle to work, there are quite a few requirements.

Yes, being open source is a requirement, but it is not the only one.

The code needs to be actually readable, and to attract users motivated to check it.
That wasn't the case here. OpenSSL's code is known to be really crappy, with lots of bad decisions baked in. Any coder trying to review it will feel their eyes start to bleed.
It doesn't attract people who might review it. It only attracts the kind of people who want to quickly hack up a new feature and slap it on top, without looking at what's running underneath.

The code also needs to be reasonably accessible to code review tools.
Lots of reviewers don't painfully check every single line of code by hand; some use tools to do the checking. OpenSSL has accumulated such a series of bad decisions over the years that the resulting nightmare is resistant to some types of analysis.

Comment Tool assisted review (Score 1) 76

The problem is that some of the design decisions behind OpenSSL are so awful that some code review tools just don't work well enough to detect the bugs.

Heartbleed specifically resisted valgrind, because the geniuses behind OpenSSL had implemented their own replacement memory-management functions in a way that defeats memory analysis.
The memory problem went undetected.

Comment Corrections (Score 1) 374

The problem is that there are NO children yet. Only cells with 2 half nuclei inside (= pre-embryos)

Small correction: apparently it is still called a "pre-embryo" even later than that, as long as it hasn't been implanted into a uterus yet (and hasn't formed a primitive streak; I didn't remember that latter part at all).

Comment Pre-embryo (Score 3, Informative) 374

Who Owns Pre-Embryos?

From a scientist: What the fuck is a pre-embryo.

Wikipedia is your friend.

Basically:
- A bunch of cells, still disorganised (apparently you wait for the primitive streak before calling it a proper "embryo"; I didn't remember that from my lectures).
- They float around; they haven't implanted into a uterus yet. (That much I vaguely remember from my medical studies.)

(Well, of course they were fertilised *in vitro*. It would be hard to find a uterus to implant into at the bottom of a test tube.)

Comment Complex issue (Score 1) 374

He is the biological father, he is half responsible for these children if they are born.

The problem is that there are NO children yet. Only cells with 2 half nuclei inside (= pre-embryos)

The problems are very real, but they concern children who do not exist yet, although due to biology and the existence of those cells they could very well come into existence.

He cannot force her to give them up any more than he could force her to abort the fetus

The notion is different.
- The "abortion" case is about the mother. It's her body; she decides what happens to it, and nobody can force her to undergo a procedure that might have consequences for her health or her ability to procreate further.
- Here, the cells are in a test tube somewhere in a fridge. The mother's physical body isn't in any way concerned by the discarding of the cells; only her role as a potential parent (if the children come into existence) is.

Comment VFX was the pong (Score 1) 125

I had a full VR helmet in the late 90s to play Doom, Descent, and so on. I can't remember the name of the helmet but it came with a mouse that looked like a hockey puck.

I would venture that this was the VFX family of helmets (VFX-1, VFX-3D).
It was one of the first 3D helmets, with extensive support in games.
The resolution was shitty (~260 columns per eye); in fact so shitty that the manufacturer gave separate counts for the R, G and B pixels (call it "790" horizontal resolution!).
The field of view was also awfully small (think looking through a small window in front of you, as if at a laptop screen, instead of today's Oculus Rift "surrounded by the picture everywhere").
The image was blurry (LCD; all this was happening long before the advent of OLED and other fast-refresh displays).

But still, even if it was in its infancy, it was one of the first big things to arrive on the mainstream market.
I never had one myself; luckily the local computer shop had one and I hacked around with it a bit.

A bit later, the "i-glasses" family of devices started to get popular: much lighter, slightly higher resolution, and using a mirror system that made it possible to overlay the picture on your actual sight ("augmented" reality).

Personally, much later, I managed to land an eMagin 3D Visor (I was working in medical research, and had more money than when I was a kid).
Slightly better viewing angle (45 degrees, one of the best pre-Oculus) and an OLED display (so no blur, high resolution, etc.)
(though support for non-nVidia hardware required ordering a new firmware on a swappable ROM chip).

Nowadays the Oculus Rift and the like have advanced a lot:
- They replaced complex optics and a simple display with simple optics and shaders that compensate for the distortion.
- Actual full field of view: you don't look through a small window, you have a picture completely surrounding you.
- High resolution (thanks to the whole "Apple Retina" and "cram a Full HD 1080p resolution into a smartwatch" craze, we have small, high-resolution displays).
- Really fast, low-latency tracking (thanks to cheap high-speed cameras, which supplement the electronic accelerometers/gyroscopes of old).

We've reached the point where the technical shortcomings are more or less solved.

Thus we aren't so much in the Pong era as in the late 8-bit / early 16-bit console era:
the technical problems are being solved, the hardware is getting available and affordable, and now we need to learn to harness the medium and develop nice stuff.
Artists need to learn what can be done with it.
We are at the dawn of tons of new things coming out for VR 3D.

It's good that indie devs are currently thriving.

Comment The lawsuit is in Europe - law is different (Score 1) 286

While you own the physical media, you don't own the data on the media. You only have a license to use that data and part of the license is not skipping ads, etc.

In the US, maybe... except that TFA's lawsuit happened in Germany, in the EU. European countries have copyright laws that side a little more with end users than the USA's do.

Among other things, several countries have local DMCA-equivalent laws that explicitly grant exceptions for fair use, and explicitly consider it "fair use" to b0rk the encryption for "technical reasons", such as needing to play your own media because you bought it and want to play it and the manufacturer doesn't support your OS (e.g. Switzerland, although it's not in the *EU*, just geographically in Europe).
DeCSS is considered lawful here: you bought the disc, so use whatever you need to exercise your fair-use rights.
There's no concept of "you're actually just renting the data and thus must follow the license in order to be able to consume it".

It'd be akin to requiring a login to use a free website, but the agreement for the login to say that you accept the ads in order to use the website.

Again, in most European countries, EULAs aren't considered binding. You can't sell your soul just because there was a sentence hidden somewhere in a big pile of legalese.
The only things which *are* legally binding are the general provisions covered by the law itself (warranties, etc.).

So a website owner CANNOT sue you because you violated the license you were supposed to accept and used Adblock anyway.
On the other hand, nothing forbids the owner from kicking you out and banning your account, either.

Comment Sports (Score 0) 216

Or, you know, you could *actually DO* sports instead of standing in front of a light box shouting at pictures of sportsmen.

Come on, we're on /. here. We're notoriously sociopathic. We don't feel the need to root for a team or perform other such pointless rituals to reinforce social identity.

Comment Actually, not. (Score 3, Insightful) 88

Actually, the footprint of the binary is dwindling.

Before:
- either Catalyst, which was a completely closed-source stack down to the kernel module,
- or the open-source stack, which was entirely different; even the kernel module was different.

Now:
- the open-source stack is still here, the same as before;
- Catalyst is just the OpenGL library, which sits atop the same open-source stack.

So no, actually, I'm rejoicing. (That might also be because I don't style my facial hair as a "neck beard".)

Comment Simplifying drivers (Score 4, Informative) 88

(do I need now binary blobs for AMD graphics or not?)

The whole point of AMDGPU is to simplify the situation.
Now the only difference between the Catalyst and Radeon drivers is the 3D acceleration: either run the proprietary binary OpenGL library, or run Mesa's Gallium3D.
All the rest of the stack downward from that point is open source: same kernel module, same libraries, etc.

Switching between the proprietary and open-source drivers becomes just a matter of choosing which OpenGL implementation to run.

I decided (I don't need gaming performance) that Intel with its integrated graphics seems the best bet at the moment.

If you don't need performance, Radeon works pretty well too.
Radeon has an open-source driver. It works best for slightly older cards; the latest-generation cards usually lag a bit (the driver is released after a delay, and performance isn't as good as the binary's), though AMD is working to reduce that delay.

Like Intel, AMD officially supports its open-source driver (they have open-source developers on their payroll for it), although compared to Intel, AMD's open-source driver team is a bit understaffed.
AMD's official policy is also to support only the latest few card generations in its proprietary drivers. For older cards, the open-source drivers *are* the official drivers.
(Usually, by the time support is dropped from Catalyst, the open-source driver has caught up enough in performance to be a really good alternative.)

The direction in which AMD is moving with AMDGPU reinforces this approach even further:
- the stack is completely open source at the bottom;
- for older cards, stick with Gallium3D/Mesa;
- for newer cards, you can swap out the top OpenGL part for Catalyst and keep the rest of the stack the same;
- for cards in between, it's up to you to choose between open source and high performance.

If you look at the overall picture, the general tendency at AMD is toward more open source:
- the stack has moved toward having more open-source components, even if you choose Catalyst;
- behind the scenes, AMD is making efforts to make future cards more open-source friendly and to release the necessary code and documentation faster.

AMD: you can stuff your "high performance proprietary driver" up any cavity of your choosing. I'll buy things from you again when you have a clear pro-free software strategy again -- if you're around by then at all.

I don't know what you don't find clear in their strategy.

They've always officially supported open source: they release documentation and code, and have a few developers on their payroll for it.
Open source has always been the official solution for older cards.
Catalyst has always been the solution for the latest cards which don't have open-source drivers yet, or if you want to max out performance or the latest OpenGL 4.x features.

And if anything, they're moving further toward open source: merging the two stacks to rely more on common open-source base components, to avoid duplicating development effort, and finding ways to ship open-source support faster for newer generations.

For me that's good enough; that's why I usually go with Radeon when I have the choice (desktop PCs that I build myself), and I'm happy with the results.

Comment NAT is just bandaid (Score 1) 390

You know what else solves the "not enough IP addresses" problem? NAT.

It's a short-term quick hack which might make some problems seem to disappear, but it creates a ton of other problems.
NAT adds layers of indirection, and it makes machines not directly addressable.
It requires hole punching and the like even for very basic functionality (like VoIP).
The internet was envisioned as a distributed network of equal peers, but NAT contributes to the current asymmetry of a few key content distributors and everybody else being passive consumers.

And it's a lot less of a change than switching to IPv6.

IPv6 here. No, it's not that complicated, and it can be automated. (E.g. you don't even need to set up DHCP: your router just hands out prefixes, and the devices on the network autonomously derive their addresses by appending their MAC address.)
With NAT, you'll end up having to fumble with your router and open/redirect ports anyway, just to be sure everything works as it should.
