
Comment Re:Fire(wall) and forget (Score 1) 348

Correct me if I'm wrong, but PCI compliance doesn't necessarily require a firewall between each system that takes credit cards. It just requires a firewall to protect all the systems that take credit cards. If you have a few POS systems and a SQL server that access credit card info, you don't need a software firewall on each of those systems. You could set up one hardware firewall that protects all of those systems from Internet traffic (and other LAN traffic, if needed).

Comment Re:Fire(wall) and forget (Score 1) 348

It depends on what you're talking about, and where. A firewall between the LAN and the Internet, yes. Generally speaking, put it up, and then figure out what needs to be opened.

Beyond that, it starts to get a bit more foggy. Security is often a trade-off between making access too easy for attackers vs. making access too hard for authorized personnel. It's not uncommon for security software to do more harm than good, blocking things that shouldn't be blocked, breaking the networking stack in weird ways. When it comes to software antivirus and firewalls, my view is that you should use the most lightweight, least intrusive solution that meets your needs.

I'm not sure, but it seems to me that the original poster is asking about the built-in Windows firewall. Should that be enabled on all machines?

Comment Know what firewalls do. (Score 2) 348

Honestly, determining whether you need a firewall isn't as simple as "yes, always, all the time" or "no you don't need one." You have to know what the firewall is doing, and what security is required. You can set up a firewall, allow all ports to be forwarded through without inspection, and while you have a firewall, it's not helping you. Or you could have a server running a secure OS with only the vital ports opened, without access to anything other than the Internet, in which case a firewall probably isn't doing you a lot of good.

Also, it seems you're talking about a software firewall installed on the server? I wouldn't trust it. If I'm running Internet-accessible servers, I generally want a separate hardware firewall, and I want to put those servers into a separate DMZ if I can. I might leave the built-in Windows firewall turned on if it's not causing any problems, but if I have to disable it, I don't worry too much about it because I have the hardware firewall.

A properly secured Linux/Unix server should be able to sit directly on the Internet without issues, but you may as well put it behind the hardware firewall if you have the option.

But are we talking about disabling the built-in software firewall on a machine that's only accessible by other computers on the LAN? That's probably fine. You should have some security preventing unauthorized personnel from accessing the LAN, and I would assume the SQL database is password protected, right?

I guess my bottom line here is this: Since you can't trust the built-in Windows firewall to actually protect from very much, you shouldn't worry too much about disabling it. Make sure your network is secure without it.

Comment Re:Bullshit.... (Score 1) 133

Well no, the metric is real. The question would be whether it's useful or meaningful. You originally implied that it wasn't because:

A "combined score" for speed and ratio is useless, as that relation is not linear.

It seems now that it's not about the relation being linear, but about something else that you won't say. I'm afraid I'm not closer to understanding.

Comment Re: Bullshit.... (Score 1) 133

Decompression time is always real time? So it doesn't matter what computer, what processor, the size of the file, the complexity of the file, or even what kind of file it is? Or do you mean that it needs to be able to be done in real-time (or faster) for some particular use of a particular kind of file on a particular platform that you have in mind?

Comment Re:Bullshit.... (Score 2) 133

I find it surprising and almost funny how much ire this has drawn from people with some kind of weird "purist" attitude about the whole thing.

It doesn't seem "generally useless" to me, but it would be more appropriate to say that it's "useful only in general cases". I would say that in most circumstances, I'd want compression algorithms that balance speed and compression. I often don't zip my files to maximum compression, for example, because I don't want to sit around waiting for a long time in order to save a very small amount of space. I also don't zip without compression, because speed is not *that* important. I look for compression that's balanced. "Compress it as much as you can without making me sit around and wait for it."

Similarly, if I were ripping CDs to MP3, and you offered me a different format that would save me 1MB per song, I'd jump on board. If you told me that it would save me that space by requiring 1 hour to compress, and then another hour to decompress before I could play it, I'd tell you to fuck off. If you told me it would drain my battery life on my phone to play it, I'd say it's not worth the trouble.

So I don't know if this is the right metric or the most useful metric, but certainly there could be a metric for compression that deals with "total space savings" vs. "time and complexity in compressing and decompressing". Such a metric could actually be a solid indicator of which compression is useful in a vague general sense.
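Just to illustrate the idea, here's a minimal sketch of what such a combined metric might look like. The function name, the weighting, and the numbers are all made up for illustration; this isn't any real benchmark's formula, just one way to trade ratio against total time.

```python
# Hypothetical combined metric: reward compression ratio, penalize
# total (compress + decompress) time. All names and weights here are
# invented for illustration, not taken from any real benchmark.
def combined_score(ratio, compress_s, decompress_s, time_weight=0.5):
    """Higher is better: more space saved, less time spent.

    ratio        -- original_size / compressed_size (>= 1.0)
    compress_s   -- seconds to compress the test corpus
    decompress_s -- seconds to decompress it
    time_weight  -- how heavily time counts against the score
    """
    total_time = compress_s + decompress_s
    return ratio / (1.0 + time_weight * total_time)

# A fast, modest compressor vs. a slow, thorough one:
fast = combined_score(ratio=2.5, compress_s=1.0, decompress_s=0.5)
slow = combined_score(ratio=3.0, compress_s=60.0, decompress_s=30.0)
assert fast > slow  # the time penalty outweighs the better ratio
```

With a scheme like this, the MP3-that-takes-an-hour example above scores terribly even though its ratio is better, which is exactly the intuition being described.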

Comment Re:Bullshit.... (Score 1) 133

How much time it takes to compress is irrelevant, even if you get diminishing returns the longer you take. What's important is to save space when broadcasting the content.

Well, and also that it can be decompressed quickly and with little processing power, or else with enough hardware support that it doesn't matter. Otherwise, it'd take a long time to access and drain power on mobile devices.

Comment Re:Bullshit.... (Score 1) 133

Hence a single score is completely unsuitable to address the "quality" of the algorithm, because there is no single benchmark scenario.

So you're saying that no benchmark is meaningful because no single benchmark can be relied upon to be the final word under all circumstances? By that logic, measuring speed is not meaningful, because it's not the final word in all circumstances. Measuring the compression ratio is meaningless because it's not the final word in all circumstances. The footprint of the code is meaningless because it's not the final word in all circumstances.

Isn't it possible that a benchmark could be useful for some purposes other than being the final word in all circumstances?

Comment Re:Bullshit.... (Score 1) 133

Depending on what you're talking about, providing a huge table of every possible test doesn't make for easy comparisons. In the case of graphics cards, I suppose you could provide a list of every single game, including framerates on every single setting on every single game. It would be hard to gather all that data, and the result would be information overload, and it still wouldn't allow you to make a good comparison between cards. Even assuming you had such a table, it would probably be more helpful to add or average the results somehow, providing a cumulative score. Of course, then you might want to weight the scores, possibly based on how popular the game is, or how representative it is of the actual capabilities of the card. But if that's the result that's actually helpful, why not design a single benchmark that's representative of what games do, rather than having to test so many games?

Comment Re:Bullshit.... (Score 1) 133

there's not a meaningful way to pick the "best" in that group that everyone will agree on

Metrics often don't provide a definitive answer about what the best thing is, with universal agreement. If I tell you Apple scores highest in customer satisfaction for smartphones last year, does that mean everyone will agree that the iPhone is the best phone? If a bunch of people are working at a helpdesk, and one closes the most tickets per hour, does that necessarily mean that he's the best helpdesk tech?

It's true that a lot of people misuse metrics, thinking that they always provide an easy answer, without understanding what they actually mean. That doesn't mean that metrics are useless.

If you're comparing a bunch of cars that get 32-35 mpg and go 130-140 mph, there's not a meaningful way to pick the "best" in that group that everyone will agree on

Yeah, but that's a really dumb metric since most people don't actually care what the top speed of a car is. Or to be more truthful, only morons care about top speed unless it's below 80mph, since you basically shouldn't be driving your car that fast. So really, in a metric like this, the "top speed" isn't a metric of "faster is better". It's a metric of "fast enough is good enough".

But if you were in the habit of doing car reviews, it might make sense to take a bunch of assessments, qualitative and quantitative, like acceleration and handling, MPG, physical attractiveness, additional features, and price (lower is better), and then weigh and average each score. That would enable you to come up with a final score which, while subjective, makes some attempt to enable an overall ranking of the cars. In fact, this is the sort of thing that reviewers sometimes do.
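For what it's worth, the weigh-and-average approach reviewers use is simple enough to sketch. The categories, weights, and scores below are entirely made up; the point is just that a subjective weighting still yields a usable overall ranking.

```python
# Hypothetical car-review scoring: per-category scores (0-10) combined
# with subjective weights into one overall number. Everything here is
# invented for illustration.
WEIGHTS = {
    "acceleration": 0.25,
    "handling":     0.20,
    "mpg":          0.20,
    "features":     0.15,
    "price":        0.20,  # score already inverted: cheaper -> higher
}

def overall_score(scores):
    """Weighted average of the per-category scores."""
    return sum(WEIGHTS[cat] * val for cat, val in scores.items())

car_a = {"acceleration": 8, "handling": 7, "mpg": 6, "features": 5, "price": 9}
car_b = {"acceleration": 6, "handling": 6, "mpg": 9, "features": 8, "price": 7}

ranking = sorted(
    [("Car A", overall_score(car_a)), ("Car B", overall_score(car_b))],
    key=lambda pair: pair[1],
    reverse=True,
)
```

Change the weights and the ranking can flip, which is precisely why the final score is subjective but still useful as a summary.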

Comment Re:Bullshit.... (Score 3, Insightful) 133

Since the "correct" weighting is a matter of opinion and everybody's use-case is different, a single-dimension metric isn't very useful...[snip] User A is trying to stream stuff that has to have latency less than 15 seconds, so for him the first algorithm is the best.

And these are very good arguments why such a metric should not be taken as an end-all be-all. Isn't that generally the case with metrics and benchmarks?

For example, you might use a benchmark to gauge the relative performance between two video cards. I test Card A and it gets 700. I test Card B and it gets a 680. However, in running a specific game that I like, Card B gets slightly faster framerates. Meanwhile, some other guy wants to use the video cards to mine Bitcoin, and maybe these specific benchmarks test entirely the wrong thing, and Card C, which scores 300 on the benchmark, is the best choice. Is the benchmark therefore useless?

No, not necessarily. If the benchmark is supposed to test general game performance, and faster benchmark scores generally correlate with faster game performance, then it helps shoppers figure out what to buy. If you want to shop based on a specific game or a specific use, then you use a different benchmark.

Comment Re:Tricky (Score 1) 152

We're talking about streaming media

Who's "we"? Looking through the comments, I see a lot of people expressing confusion about the point.

Oh, and I'd say that a web-page counts as streaming text. And a lot of people might consider Project Gutenberg's offerings (for example) as more-or-less streaming, if you read them online.

Comment Explains some things (Score 4, Interesting) 250

Maybe these fliers were honest, and Comcast just believes that investing in an ISP is a money-losing venture. It would explain some things.

I guess the only sensible response is to sell your stock in Comcast. They view their own business as a money-pit and a disaster waiting to happen.
