Comment Re:Almost... (Score 3, Interesting) 227

Every... Day.... :-/

I have a polite canned reply, which basically says that unless the recruiter's client is looking for developers to work 100% remotely, AND that their pay scales are likely to exceed Google's by a significant margin, AND that they do really cool stuff, then I'm not interested. Oh, and I don't do referrals of friends (they get plenty of spam themselves).

I don't actually mind the recruiter spam. It only takes a couple of keystrokes to fire the canned response, and there's always the possibility that someone will have an opportunity that meets my criteria. Not likely, but possible. I'm not looking for a new job, but if an opportunity satisfies my interest requirements, I'm always open to a discussion.

However, when they keep pushing even when they know their job doesn't fit my requirements, then I get pissed and blackhole their agency. That also takes only a couple of keystrokes :-)

Comment Re: Yes, but.. (Score 1) 324

That's one way. There are always other options. The key is to hook in at the layer that you're debugging. The wire is almost never that layer, unless you're debugging the network card driver. Or the hardware, but in that case Wireshark (or Ethereal, as I still think of it in my head) is usually too high-level.

Comment Re:Point proved (Score 0) 301

I own a 2001 Honda Insight hybrid modified to be a PHEV and plugged in nightly to charge on geothermal power.... and a Ford Ranger ;) The "why" is obvious, because I have regular needs to carry big heavy things, now that I own land in the countryside. Back when I had no such need... I didn't own any such vehicle.

I guess it's hard for him to imagine that a woman would have a need to carry large and/or heavy items?

Comment Re:this is science, so you have to ask... (Score 4, Informative) 301

And the crazy thing is, they did consult with male colleagues before publishing. The reviewer just assumed that because two women submitted a paper with a conclusion he disagreed with, it must be because they're women "making ideologically biased assumptions" who refuse to talk to men.

Comment Re:This again? (Score 0) 480

Oh hey, since we've got (presumably) a lot of physics nerds on this thread, and because my mind hasn't stopped being curious about random topics even though I've grown old, here's a recent one that left me with unanswered questions:

One of the commonly cited tritium-generating reactions is 7Li + n (>2.466 MeV) -> 4He + 3H + n. But is 7Li not also capable of transmutation to 8Li via slow neutron capture? If so, would that not yield a 16.004 MeV beta decay to 8Be, which immediately breaks up into two alphas with an additional 0.092 MeV?

If so, is there not potential for a future nuclear reactor? Spallation currently yields neutrons for about 25 MeV each. If one could cut that in half or less - and I don't see any laws of physics in the way, just improvements in accelerator efficiency and the spallation process - could this not yield a net positive, using direct deceleration/capture of the beta to generate power without having to suffer Carnot losses? And if so, would that not be a very desirable reactor - nonproliferative, abundant fuel, harmless waste, high ratio of fuel to energy conversion, direct spacecraft thrust possibilities, etc.? Or am I totally off base here?
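For what it's worth, here's the raw arithmetic behind the question, using only the figures above and ignoring capture efficiency, conversion losses, and everything else that makes reactors hard (a back-of-the-envelope sketch, not a design):

    # Energy budget for the proposed 7Li(n,gamma)8Li -> beta -> 2 alpha chain.
    # All figures in MeV, taken from the post above; the 25 MeV/neutron
    # spallation cost is the post's own estimate.

    BETA_Q = 16.004        # 8Li beta decay to 8Be
    BREAKUP_Q = 0.092      # 8Be -> 2 alphas
    NEUTRON_COST = 25.0    # spallation cost per neutron

    yield_per_neutron = BETA_Q + BREAKUP_Q   # ~16.1 MeV out per captured neutron

    for cost in (NEUTRON_COST, NEUTRON_COST / 2):
        net = yield_per_neutron - cost
        print(f"neutron cost {cost:5.1f} MeV -> net {net:+6.2f} MeV per neutron")

    # neutron cost  25.0 MeV -> net  -8.90 MeV per neutron
    # neutron cost  12.5 MeV -> net  +3.60 MeV per neutron

So at today's ~25 MeV/neutron the chain is net negative, and "cut that in half" is roughly the break-even point: below about 16.1 MeV per neutron it could in principle go positive, before counting any real-world losses.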

Comment Re:This again? (Score 1) 480

Haha, my concept as a child was to have a buoyant container on wheels in a tube full of water that would rise up, roll down a ramp on the other side, and re-enter the tube through an airlock on the bottom.

I wish my dad had taken the time to tell me why it wouldn't work rather than just saying "perpetual motion is impossible".
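For anyone else whose dad skipped the explanation: the float gains at most rho*g*h*V of energy rising up the tube, but pushing it into the airlock at the bottom means displacing water against a pressure of rho*g*h, which costs exactly rho*g*h*V. The books balance before you've even counted friction. A toy calculation (the tube height and float volume are made-up numbers):

    # Energy balance for the buoyant-tube machine described above.
    RHO = 1000.0   # water density, kg/m^3
    G = 9.81       # gravity, m/s^2
    H = 5.0        # height of the water column, m (arbitrary)
    V = 0.01       # volume of the float, m^3 (arbitrary)

    # Best case: a weightless float gains the full buoyant work on the way up.
    buoyant_work_gained = RHO * G * H * V

    # Entering the bottom airlock displaces V of water against pressure rho*g*h.
    airlock_work_cost = (RHO * G * H) * V

    print(buoyant_work_gained, airlock_work_cost)   # 490.5 490.5 (joules)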

Comment Re:This again? (Score 5, Insightful) 480

Or, rather than all of physics being wrong, maybe they have an erroneous measurement setup.

That doesn't mean you shouldn't investigate anomalous measurements. But at this stage you shouldn't be writing fluff pieces with page after page on how much your new technology will change spaceflight. You should be publishing a paper with a name like "Measurement of anomalous thrust in a microwave apparatus operated in a hard vacuum" and trying to avoid the media insofar as possible - and when you do need to talk to them, explaining "we don't know what's going on... we have some theories, but they're controversial... we need to do more testing," etc.

Comment Re:Also, stop supporting sites with poor encryptio (Score 1) 324

My bank still insists on using RC4 ciphers and TLS 1.0.

If Firefox were to stop supporting the bank's insecure website, it would surely get their attention better than I've been able to.

What bank is this? There's nothing wrong with public shaming in cases like this; in fact, it does the world a service.

Also, you should seriously consider switching banks. Your post prompted me to check the banks I use. One is great, one is okay. I'll watch the okay one.
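If you want to check a site yourself, the Python standard library makes it a one-minute job (the hostname below is a placeholder, not a real bank):

    # Report the TLS protocol version and cipher a server negotiates.
    import socket
    import ssl

    host = "www.example-bank.com"  # hypothetical; substitute the site to test

    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print("protocol:", tls.version())  # e.g. 'TLSv1.2'
            print("cipher:  ", tls.cipher())   # (name, protocol, secret bits)

Note that a modern default context refuses RC4 outright (and recent Python versions refuse TLS 1.0 as well), so a handshake failure against a site like the grandparent's bank is itself the diagnosis.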

Comment Re:Yes, but.. (Score 1) 324

That said, if I'm debugging something a browser is doing, the developer console is usually better anyway.

Yes, it is, and the same holds everywhere. Being able to grab the data on the wire has long been an easy way to get sort of what you want to see, but it's almost never exactly what you're really looking for. HTTPS will force us to hook in elsewhere to debug, but the "elsewhere" will almost certainly be a better choice anyway.
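That said, when you genuinely need the wire view of your own HTTPS traffic, TLS doesn't have to lock you out. One approach is to log the session secrets and hand them to Wireshark (Preferences > Protocols > TLS > "(Pre)-Master-Secret log filename"). A sketch using Python's standard library - keylog_filename exists on ssl.SSLContext since Python 3.8, and browsers do the same thing via the SSLKEYLOGFILE environment variable:

    # Log TLS session secrets so Wireshark can decrypt the capture.
    import socket
    import ssl

    ctx = ssl.create_default_context()
    ctx.keylog_filename = "/tmp/tls-keys.log"  # point Wireshark at this file

    with socket.create_connection(("example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
            print(tls.recv(4096)[:80])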

Comment Re:Paid Advertisement (Score 4, Insightful) 76

The OpenSSL codebase will get cleaned up and become trustworthy, and it'll continue to be used

Cleaned up and trustworthy? Unlikely. The wrong people are still in charge for that to happen.

Nonsense. The people running the OpenSSL project are competent and dedicated. OpenSSL's problem was lack of resources. It was a side project with occasional funding to implement specific new features, and the funders of new features weren't interested in paying extra to have their features properly integrated and tested. That's not a recipe for great success with something that really needs a full-time team.

Comment Re:He also wants to roll back civil rights too. (Score 1) 438

Because Rockefeller colluded with railroad companies and had secret arrangements to get bulk discount for himself and shafted his competitors.

- there is absolutely nothing wrong with promising a company that you will buy scheduled services like clockwork, without interruption, and pay for the service whether or not you can use 100% of its capacity that day.

If I want to start a shipping business, I can talk to an import/export broker and work out a schedule where, regardless of my circumstances, I ship one container every two days like clockwork, and because of that certainty of payment he will give me a much better price than he could give anybody else.

As to Rockefeller's 'secret deal to prevent shipping for others' - baloney. The so-called 'secret deal' was no such thing; it was a discount Rockefeller got that nobody else could get, because nobody else would commit to shipping that much oil on a schedule, whether they had it on hand or not, and to paying for the prearranged deliveries as promised.

Rockefeller was absolutely right, and the reason oil never went below 7 cents is exactly that the government destroyed his company and did not allow him to find new ways to increase demand by lowering prices even further. Nobody found a better way of doing business at the time; otherwise they would have won against Rockefeller, and that is all there is to it.

Microsoft had a temporary monopoly for a very good reason: they provided a computing platform that nobody else could provide at the price, and your inability to accept that doesn't change the fact. Microsoft and others also pushed hard enough in the market that competitors actually had to innovate to compete, which is how free and open source software came into existence.

As to me being 'religious' about the free market - I cannot stand the hypocrisy of a modern society that vilifies the individual, promotes the collective, and uses the force of the collective to oppress the individual. If I am 'religious' about anything, it is the belief that individual freedom trumps every so-called 'societal good' you can come up with that is based on lies, oppression, destruction of the individual, theft from the individual, and enslavement of the individual by the collective.

Comment Re:when? (Score 1) 182

nobody is building Internet services that need several hundred megabits for reasonable performance

If there is not a lot of copper or fibre between the two endpoints, why not? It's only congesting a little bit of the network.

Perhaps I wasn't clear. I wasn't referring to the building of network connections, but to the building of user services that rely on them. For example, YouTube is built to dynamically adjust video quality based on available bandwidth, but the range of bandwidths the designers considered does not include hundreds of megabits, because far too few users have that capacity. They have to shoot for the range most people have.
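A toy sketch of the kind of ladder decision such a player makes (the renditions and safety margin here are invented for illustration; real players use fancier buffer-based logic):

    # Pick the best video rendition that fits under measured throughput.
    RENDITIONS = [          # (label, required bandwidth in kbit/s) - hypothetical
        ("144p", 300),
        ("480p", 1500),
        ("1080p", 5000),
        ("4K", 20000),
    ]

    def pick_rendition(measured_kbps: float, margin: float = 0.8) -> str:
        """Highest rendition whose bitrate fits under the measured
        throughput, keeping a safety margin for variance."""
        budget = measured_kbps * margin
        best = RENDITIONS[0][0]
        for label, need in RENDITIONS:
            if need <= budget:
                best = label
        return best

    for kbps in (800, 10_000, 300_000):
        print(f"{kbps:>7} kbit/s -> {pick_rendition(kbps)}")

Note the last case: at 300 Mbit/s the ladder simply tops out, because nobody has added rungs above it. Past that point, more bandwidth buys the user nothing until the service's designers assume it exists.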

But as that range changes, services will change their designs to make use of it. We don't really have any idea how things will change if multi-gigabit connections become the norm, but you can be certain they will. Just as programs expand to fill all available memory, Internet services expand to fill all available bandwidth. To some extent that's because more capacity enables laziness on the part of engineers... but it also enables fundamentally different and more useful technologies.

Comment Re:Paid Advertisement (Score 1) 76

Has the fact that there's three major BSDs and one Linux been in BSD's favor?

Being able to choose an operating system (BSDs, Linux, commercial UNIXen, Windows, etc.) has been in your favor, particularly from a security perspective. And would you seriously argue that the existence of multiple BSDs has been a bad thing for their security? I'd argue exactly the opposite. The BSDs have a well-deserved reputation for being more secure than Linux, and part of that reputation arose directly from the forking. In particular, OpenBSD forked specifically to focus on security, and FreeBSD and NetBSD worked to keep up.

Does it really provide any tangible benefit that not all of us are hit at the same time with the same bug, when we're all vulnerable some of the time?

Yes, it does. You seem to think that being vulnerable none of the time is an alternative. It's not. The system as a whole is much more resilient if vulnerabilities affect only a subset.

For that matter, it divides the eyes in "many eyes makes all bugs shallow" as well.

Look how well that has worked for OpenSSL in the past. The many-eyes principle only matters if people are looking, and competition creates attention. Also, it's a common error to assume that the software ecosystem is like a company with a fixed pool of staff that must be divided among the projects. It's not. More projects (open and closed source) open up more opportunities for people to get involved, and create competition among them.

Competition also often creates funding opportunities, which directly addresses what was OpenSSL's biggest problem. You can argue that it also divides funding, but again that only holds if you assume a fixed pool of funding, and that's not reality. Google is contributing to OpenSSL development and almost fully funding BoringSSL (not with cash, but with people). That isn't because Google's left hand doesn't know what its right is doing.

Am I supposed to swap browsers every time a vulnerability is found in Firefox/Chrome/Safari/IE?

Huh? No, obviously, you choose a browser with a development team that stays on top of problems and updates quickly. It's almost certain that developers will choose their SSL library at least partly on the same basis, again favoring more work and more attention on the crucial lib.

It's more like math where you need a formal proof that the code will always do what you intend for it to do and that it stands up under scrutiny.

It's not, it's really not. It would be nice if that were true. It's really more like a car that breaks down over time in various ways; some are more reliable than others, but all require ongoing attention and maintenance.

We're not talking about something that must have a fail rate, if you get it right it's good.

This is true in theory, but untrue in practice, because new attacks come along all the time and ongoing maintenance (non-security bugfixes, new features, etc.) introduces new opportunities for security bugs.

Your Apache and IIS counterexamples actually support my argument. IIS, in particular, was riddled with problems. Yes, they've been cleaned up, but you're talking about a space that has been static for almost two decades (though it will soon be destabilized by the introduction of HTTP/2 and probably QUIC) and is, frankly, a much simpler problem than the one OpenSSL solves... and I assert that without the competition of alternatives, IIS never would have been cleaned up as thoroughly as it has been.
