
Comment Re: C does not need replacement (Score 1) 437

I would also add other things that have changed since Unix/C came out: 1) The whole Unix design is the bottleneck in large servers. Read about "data planes" and the C10M (10 million connection) problem. 2) Physical memory protection and the kernel/userspace split impose horrible costs. There is a reason that extremely high-end server systems essentially run as userspace processes that bind some cores and some Ethernet interfaces together and basically disable the kernel -- which puts every single thing into one gigantic process. 3) If something needs to be produced, there is a dollar amount that can be sunk into it. If it requires a large group of the most expensive developers, it will simply never be built, for economic reasons. Therefore, we see a lot of half-baked stuff get built instead.

Comment Re: C does not need replacement (Score 1) 437

Actually, the environment has changed since C was created. C is still wonderful for programs that are small, and it's perfect for an environment of static allocation. But it gets its speed *precisely* from assuming that undefined behavior does not happen. This alone makes it a terrible choice for large programs that must stand up to malicious input (let alone bad coding - and even sabotage that slips past code review). The world's cryptography libraries are a fantastic case in point, where the best developers in the world are constantly having their security broken.

The input that modern programs must consume generally requires Turing-complete recognition. Handling it asks the code to solve NP-complete and even undecidable problems, both at the time the code is written and at runtime. On Windows systems, more CPU time is wasted running virus scanners and ad-hoc security checkers than would be consumed by simply writing applications in more verifiable languages. This mostly boils down to having type systems which can be verified.

Rust has issues in embedded settings, but it's not a fat and lazy language. You can run Valgrind on Rust binaries, since its compiled output is essentially the same as C's. There is a lot wrong with C from the point of view of verification. At the moment, you can't simply tell the C compiler that if there are type-system violations, you don't want a binary produced at all. Not even for small programs. The software industry needs to grow the hell up and start making systems that can be certified for safety.

Comment Re:Semantic Versioning (Score 1) 86

The version numbering should specifically speak of contracts between components, or else it's not robust enough to help with automating things. Roughly speaking: the contract is the exported header files (or IDL files), and an implementation of the contract imports those header files, just like its clients do. The header files should contain all documentation that pertains to the contract, and even documentation-only changes to the contract should bump the contract version.

The contracts "5.0" and "5.1" are not necessarily compatible in every way, because the interface expanded; that causes things like structures having incompatible sizes in C, or unimplemented interface members in Java. For this reason, 5.0.20 and 5.1.20 might be the same implementation exposing two different interfaces. Both being version "5" basically means compatibility at the level of source code, while both being "5.1" should mean that they work with the same contract. That means agreement on: structure members, sizes, which functions exist, their preconditions and postconditions, data layout in buffers passed around, etc.

Assuming that "5.7" can use "5.2" should be expressed in an explicit and computable way. A "5.7" component might export a "5.2" interface, but it should do so explicitly. In the real world, there are compatibility outages in places where things should theoretically be compatible. It's almost easier to dispense with version numbers entirely and simply declare that if A talks to B, then they both import a contract C at the same git commit, or a C in a list of supported interfaces. Otherwise, the versioning is just a bunch of marketing stuff that has little to say about what is actually connected to what.

Comment Re:Software error ... (Score 1) 234

Do you actually mean alloca() ... no dynamically sized local variables? Because that makes sense. I would say that it makes more sense to ban global variables than to ban local variables. Although returning pointers to values that were once on the stack is a hazard serious enough to give me pause about ever using pointers to local variables.

Comment Re:Software error ... (Score 2) 234

But there is also the issue of some reasonable level of proof that the code is robust, akin to the assurances you get from a good compiler that the machine language behaves like the source code. If you work within truly large C code bases (I believe I'm in one right now), the completely manual approach is just not good enough. Garbage collection isn't the only answer, of course, but tooling is essential. In the future, higher-level languages are definitely going to play a role. C/C++ aren't keeping up with the changes being driven by multi-core hardware. Innovations like LLVM help to keep making progress, but ultimately, embedded systems are going to look something like Rust while everything else moves up to a higher level of abstraction. The abstraction just has to be high enough to get away from compilers being utterly blind, so that we can ask the compiler whether code is memory safe or conforms to the protocols in its interfaces. (See the Coq-related projects producing subsets of C that can be proven correct.)

Comment Re:I'm torn.... (Score 1) 663

You can easily have enough sugary drinks that no amount of exercise will control your sugar. When I went cold turkey just on sodas for a few years, the difference it made was dramatic. By contrast, when I was doing an extreme amount of exercise, it was far harder to stay on the strict diet, and my numbers were (mostly) worse than with simple starvation. There is more to this than just the sugar content of the drinks, in that having the drinks makes it harder to stay away from the other carbs.

Comment Re:Writing On The Wall Folks (Score 1) 167

What won't be happening within 10 years is having a cute GUI that a technically unskilled business guy can use to *specify* what he wants. The pointy-haired boss will still need to speak a computer's language, or be able to intelligently respond to disambiguation questions from the computer. What is already happening, and will continue, is the extremely rapid improvement of tooling. We are reaching the point where a hundred cowboys writing in C will not be able to keep up with a compiler toolchain that produces binaries meeting a specification that is both checked for logical consistency and has (locally) optimal performance. In short, we will start producing reliable software. There are some amazing things going on with Coq these days, and they most definitely require an extremely skilled person to get the specification written. The big remaining improvement would be making these specifications simple enough for a mortal programmer to write (rather than a PhD mathematician).

Comment Re:Yes (Score 1) 1067

The real problem is that the arguments to divide are *not* of the exact same type. Assuming the numerator and denominator are both multiplied by -1 as needed to normalize them, the numerator is an integer, while the denominator is a positive, non-zero number. In code, you should not be able to invoke a/b except inside code that proves b != 0 (i.e., inside an if (b != 0) branch), or unless b's type is "non-zero integer". In functional languages, there would usually be a match or switch statement for this.

Comment Re:Bah! Media! (Score 5, Interesting) 173

SF86 data is extraordinarily sensitive. What this means is that the attackers made off with a database of the financial problems, drug habits, family problems, hidden crimes, and sex fetishes of anybody who's working on anything sensitive. This data will determine who comes home to a hooker in his bed with requests for information, a crowbar in one hand and a bag of illegal drugs in the other. I'd say that the information is so sensitive that it may actually weaken security to continue this practice of having all of these confessions written down. I mean... if you can approach your boss and say "hey, I need to take a few weeks off to go to jail!" and he responds "ok, you have plenty of leave!", then you are far less open to coercion than if you go into a panic over your boss finding out about your adultery. ("gah! I'll lose my clearance and never ever work again!")

Comment Re:Oversimplified (Score 1) 74

Exactly. Encryption hides the conversation from external observation, but it won't prevent one party from sending malicious data to the other. In fact, it weakens security in the sense that visibility into these kinds of problems is lost. This is why, in a corporate setting, you may be asked to submit to surveillance of your network connections for legitimate security reasons.

Comment Re:Oversimplified (Score 1) 74

We have been trying to handle security by wrapping various "condoms" around software that doesn't defend itself from bad input. That allows the software to be used without fixing it. But this whole strategy is about to break with the widespread use of encryption. We currently protect traffic by inspecting it to spot abuse of the recipient of a message; and yes, that's functionally identical to surveillance in how it works. Ultimately, we need to do something like what LANGSEC suggests, and require very strong input handling that only admits "in the language" inputs. It's an admission that Postel's Law needs an update: we need to be extremely conservative in what we accept, and presume that all out-of-spec inputs are designed to put us into an illegal state.

Comment Re:Big Data != toolset (Score 1) 100

Actually, the biggest problem with an RDBMS and similar tools is the expectation that you mutate data in place, after mashing it into a structure optimized for that case. Most of the zoo of new tools are about supporting a world in which incoming writes are "facts" (i.e., append-only, uncleaned, unprocessed, and never deleted), while all reads are transient "views" (built from combinations of batch jobs and real-time event processing) that can be automatically recomputed, like database indexes.

Comment Re:Big Data != toolset (Score 1) 100

Except, if you are talking about a centralized database tool, you already know that the default design of "everybody writes into the centralized SQL database" is a problem. Therefore, people talk about alternative tools, which are generally designed around a particular set of data structures and algorithms as the default cases. A lot of streaming-based applications (e.g., log aggregation) are a reasonable fit for relational databases except for the one gigantic table that is effectively a huge (replicated, distributed) circular queue: it eventually gets full, and must insert and delete data at the same rate. Or the initial design already rules out anything resembling a relational schema, etc.

Comment Re:let's be real for a second (Score 5, Informative) 429

That's a pretty ridiculous statement. My actual experience says just the opposite. I work at a security company that is largely made up of guys who just got out of Israeli SIGINT (their mandatory service). The older guys write kernel code, know what C compiles to, and see the vulnerabilities intuitively. The new ones have quite a bit more experience in high-level languages, while being almost oblivious to the abstraction breakage that leads to security holes. At best, I'd say that the older developers get stuck dealing with the older code bases (the ones making the money) and tools (because the newer developers can't deal with them anyway). But on security... Prior to the mid 1990s, everybody in the world seemed to be working on a compiler of some kind. That deep compiler knowledge is the most important part of designing and implementing security against hostile input; i.e., LANGSEC.

Comment Re:Well done! (Score 1) 540

Perhaps not directly. But the difference between public schools and private schools is impossible to overstate, and it is strongly correlated with households that have one full-time working parent and one part-time or flex-schedule parent. The tuition (almost regardless of how much it is) immediately filters out financially overwhelmed and uninvolved parents. Then, even among the parents who can afford it, some schools also have involvement quotas that will cause a pair of full-time working parents to drop out. Morale and motivation in private schools are extremely high, akin to those of people working in good jobs, which counts for about two or three grade levels. The end result is that you have a kid surrounded by children who know nothing other than a 7-day week of school: getting up at 5am to wrap up missed studies, music lessons, sports. Even if they do spend a bit of time goofing off on iPads and watching TV, it is nothing like what happens with parents who can only show up long enough to sleep and go back to work. Even people who are poor will try to move their kids into the better school districts. A few will even break the law to do it, with few regrets when they get caught. (You can get sued by the county for doing this.)
