
Comment Re:it always baffles me (Score 4, Informative) 113

... why are mission critical devices connected to the internet

sure we know that the weakest link is the meatware, not the hardware, but still...

They aren't, at least not directly. They are, however, generally connected at various points to the "business" network, which is connected to the Internet (people gotta email). The literal air gap is largely fiction: the business network gets hacked, then some vulnerability is exploited in the bridge points or routers (it's a network of networks!).

Why connect the SCADA system to the business network at all? To get the data out for reports, email alarms, etc. In theory this data export should be secure (one pattern is sketched below). The problem is: who is hacking your SCADA system? It's not the usual suspects; there is no money in it, and the barrier to entry is too high for the script kiddies. It's other countries wanting to perform espionage. How the hell do you protect against that?

Look at Stuxnet, I mean really look at how it took down those centrifuges. Governments have resources that the average hacking group (or SCADA operator) simply doesn't. They also have no reason to reveal a compromised system. There could be sleeper, targeted, custom malware sitting on every SCADA server in the US, just waiting for the time when it will be useful to activate. It's a brave new world!
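
As a rough illustration of that export path (my sketch, not anything from the post; the collector address, port, and record layout are all made up): one common pattern is a strictly one-way push, where a process on the SCADA side writes telemetry out over a send-only socket and never listens for anything coming back.

    // Hypothetical one-way telemetry export from the SCADA side (POSIX).
    // The process only sends; it never binds a listening port, so the
    // business network has no service on this host to connect back to.
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdint>
    #include <cstdio>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);  // UDP: fire and forget
        if (sock < 0) { std::perror("socket"); return 1; }

        sockaddr_in historian{};             // collector on the business network
        historian.sin_family = AF_INET;
        historian.sin_port   = htons(9999);  // made-up port
        inet_pton(AF_INET, "10.0.1.50", &historian.sin_addr);  // made-up address

        // A made-up alarm record: tag id, value, status.
        struct { uint32_t tag; float value; uint16_t status; } alarm{42, 98.6f, 1};

        sendto(sock, &alarm, sizeof alarm, 0,
               reinterpret_cast<sockaddr*>(&historian), sizeof historian);
        close(sock);
        return 0;
    }

Even then, the routers and bridge points carrying that traffic are exactly the attack surface described above, so one-way export narrows the exposure rather than eliminating it.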

Comment Re:Don't they use Perforce internally? (Score 1) 227

Linux is small, and it's just source code. Storing binaries happens a lot, for a lot of reasons: you might have binaries for a third-party library, various art assets, compiled CHM files for help, installers for dependencies, and so on. Git was designed for a particular problem space in which binaries were not considered a big issue. Other groups have different requirements.

Comment Re:Wont someone think of (Score 1) 100

Dick Smith is a hypocrite. All his electronics stores revolved around importing the cheapest crap from overseas, so for him to now say "buy Australian" is a huge backflip. Back when that was happening with Dick Smith, Australia was still manufacturing lots of stuff; now we're just importing everything, whilst exporting the raw materials.

You do realize that the "Dick Smith" electronics store was sold to Woolies in 1982? 60% in 1980, then the rest in 1982. Are you really talking about the store during the 70s? Besides, it does not make someone a hypocrite to behave differently from how they once did. Is a reformed alcoholic a hypocrite for wanting tighter alcohol regulation? You really haven't thought this through.

Comment Re:The giant leach on society (Score 3, Insightful) 524

See, this is exactly the ignorance I am trying to fight! That you imagine modern innovation is a product of financial institutions boggles the mind! This is a chicken-and-egg situation, and you are claiming that the egg has feathers! Modern financial institutions are a product of the need brought about by massive industrial development. I am not denying the need, I am decrying the abuse. To put it in over-simplified terms, the financial institutions are the middlemen in all the commerce that occurs, all the development, all the property. They take a percentage for their services, and there is nothing particularly upsetting about that. It becomes a problem when more money is removed from the overall system through abuse of the mechanisms. HFT fits this bill, and I see no reason not to decry it. Invest in that which ennobles: science, arts, engineering. Stop playing these foolish games.

Comment Re:The giant leach on society (Score 1) 524

You're still thinking too small; you're thinking of things as they are. Think clean slate, think start again. Remove all the existing presuppositions and work out a system based on what we need. In fact, many people have done just that, and the technical difficulties are really not that great; of course, the political and practical realities seem insurmountable. My point is that the intrinsic value of our evolved, poorly designed, out-of-date system is actually very low. It may be true that a 6 ms latency reduction will improve the current system, but that's the wrong end of the problem. We need serious reform, not small steps toward a local minimum. My outrage is that this silly human system, slowly evolved to make it easier to trade that pig without actually transferring the pig, has now absorbed humanity to the point where it has real negative impacts on the pig (i.e., the reality behind the system). My anger is at the waste.

Comment Re:The giant leach on society (Score 1) 524

You're an idiot hiding behind anonymity; be a man and log in! "Zero sum" was meant in the casual, conversational sense of "of no real benefit", not in a strict economic sense. If your reading comprehension were above that of a 6th-grader, you'd also have noticed that I have been employing hyperbole. I don't need a rather poor economics lesson from someone of such spectacular ignorance who can't even close an italics tag; go back to your basement.

Comment Re:The giant leach on society (Score 4, Interesting) 524

Give me a team of 20 programmers, 2 years, and unlimited political cooperation, and I will give you a financial system with unlimited liquidity, complete security, and a tiny drain on the global economy. The thing you don't seem to get is that there is no value in any of this. A few bits in a database are equivalent to a good meal; except that they aren't. It's all just a way to help us keep score as we go about doing the things that matter. The problem is that the "game" is now more important than the reality, and we all suffer as a result. If too many people go around collecting the colored beads, and not enough people are growing the crops, then we all starve to death.

Comment Re:The giant leach on society (Score 1) 524

I fail to see the benefit to society in your example. What has been produced in this trade that makes society better? If an engineer designs a better tractor, or a scientist advances human knowledge, or a software developer helps create software that automatically load-balances distributed natural power generation, society is clearly advanced. HFT is a clear example of a large amount of work being performed to achieve absolutely nothing, except making the perpetrator wealthy at the expense of the rest of society. What exactly has the trade done to improve society? What is the value that justifies this enormous revenue? The engineer can point at his bridge, the scientist at his paper; and the trader? Just his big stack of gold! They are simply modern-day pirates without the romance. Leeches I called them, for leeches they are. As with all parasites, the host can endure a few, but too many may kill it (and if you can fumigate, all the better).

Comment The giant leach on society (Score 5, Insightful) 524

The entire finance sector fills me with equal parts revulsion and sadness. This is yet another example of enormous resources consumed for no net gain to society. At least in this case something tangible (however unnecessary) is produced as a result. Think of the huge numbers of brilliant mathematical and programming minds that have been consumed by this nonsense! Think of the resources and financial liquidity reinvested into this zero-sum game! Every hour of work, every employee, every structure erected in praise of this wholly disgusting idol of modern nihilism makes the rest of our society just that little bit worse. To those who would praise the enabling power of our new financial systems I say: pah! We can create better financial systems within virtual worlds. The only intrinsic value in financial institutions is the power they give, and that has been abused for all it is worth! Give me back my engineers! Give me back my scientists! Give me back my hope for a better future!

Comment Re:C++ blows on multi-core and multi-platform (Score 1) 209

As I pointed out earlier, that was a completely contrived example. On our real system we are looking at data volumes of roughly that many changes per second, which is what I based my calculations on. The actual usage of shared_ptr and its impact on memory usage and performance is far more complicated than what I demonstrated in my example, and I'm not going to try to explain it all here (a rough illustration follows below). To clarify further: this is a code base of approximately 20 million lines (last I checked, about 3 years ago), and I am describing the data volume on one machine. The system is expected to scale linearly across multiple machines, but the data volume per machine is the all-important factor in the end user's cost calculations (licensing costs are a big factor here). My point was to show that little things, such as a shared_ptr, can have a large impact when you have so many operations being performed per second (as we do). As I also said, I code in both .Net (which I understand to be very similar to Java) and C++, and we did go through a process of trying to perform our I/O via managed code, but the performance just wasn't there (particularly the memory overhead). Outside of I/O we use .Net for all new code. C++ still has its place in high-performance code, but it isn't quite as nice to use, and it is harder to do "right". I love them both.
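
As a rough illustration of the per-object overhead in question (my sketch, with a hypothetical Update type, not our actual code): the shared_ptr handle is typically two pointers wide, and every shared object also drags along a heap-allocated control block for its reference counts. On a typical 64-bit build that's 16 bytes per handle versus 8 for a raw pointer; on a 32-bit build the figures halve to the 8 and 4 used in my earlier example.

    // Rough illustration of shared_ptr overhead; exact sizes are
    // implementation-dependent (typical 64-bit figures shown).
    #include <cstdio>
    #include <memory>

    struct Update {       // hypothetical transactional object
        int   value;
        long  timestamp;
        short status;
    };

    int main() {
        std::printf("raw pointer : %zu bytes\n", sizeof(Update*));                  // usually 8
        std::printf("shared_ptr  : %zu bytes\n", sizeof(std::shared_ptr<Update>));  // usually 16
        // Beyond the handle, each shared object owns a control block
        // (strong/weak counts, deleter). make_shared at least folds it
        // into a single allocation with the object itself:
        auto p = std::make_shared<Update>();
        (void)p;
        // With hundreds of millions of live objects, those extra bytes
        // per object decide whether the working set fits in RAM.
        return 0;
    }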

Comment Re:C++ blows on multi-core and multi-platform (Score 1) 209

Why generate temporary objects in the millions? Drawing from a (garbage-collected) object pool can often make a colossal difference to performance.

Let's say I'm an I/O server processing data from a moderate number of clients, say 5000. These clients are each sending me updates for a small data set, say 2000 points, once a second. My job is to pluck that data off the wire, format it as required by the rest of the sub-systems, then commit it off to, say, a database. Say it takes about one second on average for me to receive a response from the system before I can dispose of a data update. 5000 * 2000 means I've got about 10 million little data items I'm processing per second.

Let's add another wrinkle. Worst case, I need to buffer that data across a lost database connection for up to 15 min, to give the database enough time to restart or some such thing. That's 9e9 data updates in memory. Let's say each update consists of a 32-bit number, a 64-bit timestamp, and a 16-bit status field: 14 bytes in total. 14 * 9e9 = 1.26e11 bytes, or 117.4 GB. Shit, I may be a big server, but I don't have that much memory!

OK, fine, maybe I can make my safety margin smaller; let's go for 4 min. If we can last 4 min, there will be just enough time for a redundancy switch-over for my database. Still need 31.3 GB of memory. My server has 8 cores and 16 GB of RAM, but that's still not quite enough. 1.5 minutes? Now we can handle it: 11.7 GB.

I also need to keep a reference to each of these little data updates. If I go for a smart pointer, that's sizeof(std::tr1::shared_ptr), which is 8 bytes here (two 4-byte pointers): another 6.7 GB. Dammit, still over 16 GB. What if I use a bald pointer? sizeof(thing*) = 4 bytes = 3.35 GB. Just fits. There is also a performance penalty for creating all those smart pointers.

This is obviously a contrived example (see the sketch below), but it's not too far off the kinds of problems that have had to be solved at my current place of employment. If you go managed for this kind of stuff, the overheads become too large and the ability to scale is greatly impacted. I know this because we tried it, and we just weren't able to get scalability into the same order of magnitude. As I said before: just use the right tool for the job.
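
Here is the sketch mentioned above: the arithmetic from this example in compilable form (my code, with made-up field names; the sizes and rates come from the post). Note that hitting the quoted 14 bytes requires packing, since default alignment would pad the struct to 24 bytes.

    // The buffering arithmetic from the example above.
    #include <cstdint>
    #include <cstdio>
    #include <initializer_list>

    #pragma pack(push, 1)      // no padding: 4 + 8 + 2 = 14 bytes
    struct Update {
        uint32_t value;        // 32-bit number
        uint64_t timestamp;    // 64-bit timestamp
        uint16_t status;       // 16-bit status field
    };
    #pragma pack(pop)

    int main() {
        const double updates_per_sec = 5000.0 * 2000.0;  // 10 million/s
        const double gib = 1024.0 * 1024.0 * 1024.0;

        std::printf("sizeof(Update) = %zu bytes\n", sizeof(Update));  // 14

        for (double secs : {900.0, 240.0, 90.0}) {       // 15, 4, 1.5 minutes
            double n = updates_per_sec * secs;           // updates held in memory
            std::printf("%5.1f min: %6.1f GB payload + %5.2f GB for 4-byte pointers\n",
                        secs / 60.0,
                        n * sizeof(Update) / gib,
                        n * 4 / gib);
        }
        return 0;
    }

This reproduces the 117.4, 31.3, and 11.7 GB payload figures; swap the 4 for sizeof(std::shared_ptr<Update>) and the pointer column doubles, which is the whole argument.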

Comment Re:C++ blows on multi-core and multi-platform (Score 1) 209

Multi-core may be new, but multi-processor certainly is not. Do you think multi-threading has only been thought about since the advent of multi-core processors? I am a C# and a C++ programmer, and I started life as a low-level C programmer, so I can see the pros and cons of the various approaches. .Net (and, I assume, Java) is clearly the better option for the majority of application development being performed today. The reasons are the extensive libraries, large communities, and highly sophisticated tool sets (the IDE, unit-testing integration, performance profiling, etc.). However, there is one glaring exception, and that is when you need to scale. If you need to scale, CPU and memory will frequently become your bottleneck, and your work at minimizing them will define the limits of your scalability. Here is where C++ really shines: I can code using all the modern safe techniques, but I can get down to the metal to optimize the hot path when I need to. I can take the smart pointers off those transactional objects that are generated in the millions to save memory and speed (see the sketch below); I can't do this in .Net. In summary: if you are writing a desktop app, use Java or .Net; if you are writing a server app, use C++ or equivalent. I love both for different reasons; just pick the right tool for the job!
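
A minimal sketch of what "taking the smart pointers off" the hot path can look like (my illustration with made-up names, not code from any real system): a fixed-size pool that makes one up-front allocation, recycles objects, and hands out raw pointers, so the hot path pays for no per-object heap allocation or reference counting.

    // Hypothetical fixed-size object pool for a hot path.
    #include <cstddef>
    #include <vector>

    struct Txn {             // made-up transactional object
        unsigned id;
        double   amount;
    };

    class TxnPool {
    public:
        explicit TxnPool(std::size_t capacity) : storage_(capacity) {
            free_.reserve(capacity);
            for (Txn& t : storage_) free_.push_back(&t);  // all slots start free
        }
        Txn* acquire() {                      // O(1), no allocation
            if (free_.empty()) return nullptr;
            Txn* t = free_.back();
            free_.pop_back();
            return t;
        }
        void release(Txn* t) { free_.push_back(t); }  // caller must stop using t
    private:
        std::vector<Txn>  storage_;  // single up-front allocation; never resized,
                                     // so the handed-out pointers stay valid
        std::vector<Txn*> free_;     // stack of free slots
    };

    int main() {
        TxnPool pool(1000000);            // sized for the expected burst
        Txn* t = pool.acquire();
        if (t) {
            t->id = 1; t->amount = 9.99;  // ... process the transaction ...
            pool.release(t);              // recycle instead of delete
        }
        return 0;
    }

This is single-threaded for brevity; a real server would guard the free list with a lock or keep per-thread pools. The trade-off is exactly the one described above: ownership is managed in bulk rather than tracked per object.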
