Comment Trusted network zones (Score 4, Informative) 348

If your database is in a trusted network zone, it's fine.

If you have a bunch of assets outside the corporate firewall, you're doing it wrong. These belong behind a DMZ firewall, blocking any ports not strictly necessary, possibly with PNAT and coalescence (e.g. an FTP, Web, and mail server NATed to the same public address, with ports 80, 443, 25, and 21, plus the FTP PASV range, forwarded to different addresses behind it).

Within that DMZ, servers provide whatever services they're going to provide. MySQL on port 3306 is still MySQL on port 3306; if you add a local firewall, you get a firewall that blocks all non-listening ports and leaves port 3306 open, so there's no difference. If you're worried about ssh, use an IP console card (DRAC, etc.) on a separate subnet, or put the database servers behind another firewall. It is, in fact, common to create trust zones for front-end, application, and database tiers, such that, e.g., your Web servers connect through WSGI to a CherryPy application, which in turn connects to a database, crossing a firewall at each step. You can do this with VLANs and broken-down subnets, one switch, and one firewall.

You have to consider everything when you design secure network architecture.

Comment So it's like all other information? (Score 5, Insightful) 189

Take a look at Snopes once, huh?

Every time somebody says something, it passes through the public mind. Sometimes it gets passed down through five people and dies; other times, it becomes an ever-growing ball of horse shit, and people start claiming that it takes 8 pounds of honey to build a honeycomb that holds 1 pound of honey when, in reality, beeswax is pretty cheap in terms of hive storage economy.

There are so many untrue things on Wikipedia just by way of almost everyone believing them--things that are printed in earnest in college textbooks and technical manuals, repeated by experts in the field, and yet readily testable as untrue. These are just like Aristotle claiming heavier objects fall faster--and, nearly 2,000 years later, Galileo drops a grape and an iron brick at the same time, and both hit the ground simultaneously; did nobody think to check something other than a rock versus a feather? Today, we have the same problem.

To make matters worse, anyone can purchase a domain name, set up a Web host or lease hosting, and publish anything they want with nobody able to edit it or mark it as suspect or inaccurate. Between word-of-mouth, books printed by whoever the hell wants to, Web sites with no validating authority, and forums where inaccurate posts aren't edited by moderators or community and are often supported by a circle jerk of clueless idiots, where do you expect to get any authoritative information?

Wikipedia has the public-access problem at a different scale: anyone can post anything on the Internet or in books or private magazines without contradiction; but, on Wikipedia, you get only as much contradiction as attention, amplified in inverse proportion to plausibility. That is to say: if what you post is not obviously wrong and not on a high-traffic article, it will probably slip through; if what you post is ridiculous or is on a high-traffic article, someone will notice the inaccuracy.

Comment Re:Oe noes! A compiler bug! (Score 1) 739

It's just that saying C++ is more complex than Java has little bearing on C. C++ is an immensely complex language: loading and running C++ programs is slow. The overhead of using C++ is immense. It's incredible. Name mangling causes tons of symbol comparisons during initialization and lazy look-up, while classes with virtual functions require indirect look-ups through the virtual method table as a matter of course.

In C, you have none of that. memcpy() is just memcpy(), and it's in the PLT. A call to memcpy() doesn't involve a look-up through a virtual method table to determine which pointer to use for an indirect call (call *%register); it's just stuffed into the PLT, and a call to it compiles down to a hard-coded, direct call to a fixed offset.
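To make the difference concrete, here's a minimal C sketch (copy_ops and the variable names are made up for illustration, not from any real library): the first call is a direct call the linker resolves through the PLT, while the second goes through a function-pointer table, the same kind of extra indirection a virtual method call pays.

    #include <string.h>
    #include <stdio.h>

    /* A hand-rolled "vtable": a struct of function pointers. */
    struct copy_ops {
        void *(*copy)(void *dst, const void *src, size_t n);
    };

    static struct copy_ops ops = { memcpy };

    int main(void)
    {
        char src[] = "hello";
        char a[8], b[8];

        /* Direct call: the call site targets a fixed PLT stub. */
        memcpy(a, src, sizeof src);

        /* Indirect call: load the pointer from the table, then call
         * through it -- analogous to a C++ virtual dispatch. */
        ops.copy(b, src, sizeof src);

        printf("%s %s\n", a, b);
        return 0;
    }

Disassembling the two call sites (e.g. with objdump -d) is an easy way to see the direct call versus the load-then-call-through-a-pointer sequence.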

There are no template functions in C because there is no name mangling to give each instantiation its own symbol.
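A rough illustration of what that means in practice (max_int and MAX here are made-up examples, not from any real header): without mangling, one name maps to one symbol, so "generic" C code falls back on per-type functions or the preprocessor.

    #include <stdio.h>

    /* No templates and no overloading: either write max_int, max_double, ...
     * (one symbol per name) or punt to the preprocessor. */
    static int max_int(int a, int b) { return a > b ? a : b; }

    #define MAX(a, b) ((a) > (b) ? (a) : (b))   /* expands at each use site */

    int main(void)
    {
        printf("%d\n", max_int(2, 3));
        printf("%f\n", MAX(2.5, 1.5));
        return 0;
    }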

Comment Re:Get used to this... (Score 1) 250

In project management, procurement management involves advertising and bidding out contracts, selecting sellers, and writing up the statement of work, quality guidelines, etc., then continuing with performance reviews and metrics to track the quality of work and determine whether it meets the contract and the project's needs.

Obviously, that didn't happen.

Comment Re:Oe noes! A compiler bug! (Score 1) 739

That argument is stronger. Your argument was, "If it wasn't there at the start...," which is irrelevant when speaking about proportion.

You substituted "100%" for "majority", which need only mean "50% plus some". Linux was released 23 years ago; ICC version 6 was released in 2002, 12 years ago. That's 11 years before ICC version 6.0 for Linux and 12 years with it; I don't have numbers for pre-6.0, but assume earlier releases came at chronologically earlier points in time. Given its rapid development in that period, the earliest likely release was 2000 or so; but 2002 is the earliest release I have data for.

"There have been no other credible compilers for Linux throughout the majority of its existence."

Except the Intel C compiler, which is inappropriate for other reasons already stated (i.e. it's shitty for non-Intel architectures). Still, given the argument--a GCC bug on x86/x86-64--and the twelve years icc has had to tune for high-performance situations (e.g. embedded architectures, where a 16% speed-up matters), broad compiler support is reasonable. It's not as though LLVM only became useful last year, triggering a scramble to rebase onto Clang.

2002 was the year of GNOME 2.0, of Linux entering the 2.5 development cycle (2.4 was state of the art), of SuSE 8.0, of single-core CPUs and no AMD64. It was a long time ago, a different age, when journaling file systems were hip and new and Hans Reiser hadn't murdered Nina yet.
