Comment Re:Do it yourself (Score 1) 83

Cppcheck apparently knows "hundreds of other rules covering a multitude of language aspects" so you don't "have to mentally apply against every single line of code you write."

Cppcheck doesn't flag anything in Waffle Iron's example.

It also doesn't find anything wrong with:

std::vector<int> vec = {1, 2, 3, 4, 5};
auto it = vec.begin();
vec.push_back(6);              // may reallocate, invalidating it
std::cout << *it << std::endl; // undefined behavior: it may be dangling

That's another common example of how you can write memory errors in C++ without touching a raw pointer.

Comment Re:There is already a safe subset of C++ (Score 1) 83

In the sort of places where MISRA and similar coding guides apply, yes, never allocating memory is expected, because once dynamic allocation exists you can't guarantee that you won't die with an out-of-memory error and similarly can't guarantee any time bounds on how long an alloc and dealloc will take.

Sure, so C++ is safe as long as it's used in a way that makes it incredibly painful. Sounds good. Let's just require all C++ code everywhere to be written that way. Rust usage will skyrocket overnight.

Comment Re: Is there anyone here that voted for Trump (Score 1) 261

It is hard to have fair democracy with winners take it all.

For a really rigorous definition of "fair", it's impossible to have fair democracy at all. Arrow's Theorem demonstrates this to a large degree, although many have argued that some of his fairness axioms are excessive. More recent research has concluded that fairness is the wrong standard, because there's no way for an electorate's "will" to really be fairly represented by any electoral system, not in all cases. Some systems can do better most of the time (and "winner take all" is particularly bad), but all systems fail in some cases.

What we need to aim for instead of fairness is "legitimacy", which is more about building broad acceptance of the system than about fixing the system itself, though it's easier to build acceptance for better-designed systems.

Having the country's top politicians continually claiming the system is unfair and rigged is, of course, the worst possible thing to do if you want to build support for the legitimacy of the system.

Comment Re:Jokes on you (Score 1) 261

Precisely none of those books were ever banned.

I decided to check :-)

According to the Book Censorship Database from the EveryLibrary Institute, both "Of Mice and Men" and "Adventures of Huckleberry Finn" have been challenged, but only "Of Mice and Men" was removed, though "restricted" is more accurate. The Birdville Independent School District in Texas removed the book from general access, allowing access only to the AP English class, and the Indian River County Schools in Florida restricted it to high school students.

No Dr. Seuss books were banned, although Dr. Seuss Enterprises voluntarily ceased publication of six titles.

Comment Re:Do it yourself (Score 3, Interesting) 83

So don't use STL

Indeed, No True Scotsman would use STL with C++.

clang-tidy and Cppcheck and flaw finder and Sonarqube

The last job I had where I had to use C/C++, we automatically ran an expensive static analysis tool every time we checked in code. I'd estimate that it only found about half of the potential segfaults, and it made up for that by finding twice as many false positives.

Comment Re:Do it yourself (Score 3, Insightful) 83

The "rules" of mutable collections in STL state that collections may not be mutated while being iterated.

Nope. If I had used std::list instead of std::vector, it would have been perfectly fine and officially supported. (Assuming I changed "i+10" to "i+11" to make the algorithm actually terminate, although that change wouldn't affect the vector crash.)

The problem is that there are dozens of different rules you have to remember to apply to the different types of lists and iterators. And that's only talking about that one topic. There are hundreds of other rules covering a multitude of language aspects that you have to mentally apply against every single line of code you write, many of which can potentially cause memory corruption.

Comment Re:Do it yourself (Score 4, Interesting) 83

You don't need the language to enforce memory safety to program memory-safe. The most important thing is, for example, to never touch raw pointers. C++ makes it very easy to avoid this. Rust forces you to avoid it, but just because C++ gives you the loaded gun, it doesn't mean you have to use it. In particular not on your own foot.

That is a dangerous misconception. You don't need to use any pointers to get memory errors in C++:

#include <stdio.h>
#include <vector>
 
int main() {
    std::vector<int> v = {1, 2, 3, 4, 5, 6, 7, 8, 9};
    for (auto i : v) {           // range-for holds hidden iterators into v
        if (i % 2 == 0) {
            v.push_back(i + 10); // may reallocate: those iterators now dangle
        }
        printf("%d\n", i);
    }
 
    return 0;
}
 
$ g++ -Wall -pedantic t.cpp
$ echo $?
0
$ ./a.out
 
1
2
-947527061
1600570778
5
6
7
8
9

Comment Re:There is already a safe subset of C++ (Score 4, Insightful) 83

languages like Rust exist to put ignorant programmers in straight jackets for their own good

Are you seriously trying to suggest that never allocating memory is not also a "straight jacket"?

You seem to be saying that a currently existing bowdlerized version of C++ is safe for closed-world problems. Possibly so, but that still leaves C++ unsuitable for open-world problems, which makes it suitable only for niche applications. Why learn it?

If you just use Rust or any other memory safe language, you won't have to worry about what kind of "world" you're writing for, or about choosing from a range of increasingly dangerous "profiles".

Comment Kind of? (Score 4, Informative) 159

The BLS monthly numbers are always off when the underlying economy is changing rapidly, because of the "birth/death problem": when large numbers of companies are being created or closed (born or dying), the surveys that provide the quick data are guaranteed to miss much of the change, because those surveys go to companies that are already established, i.e. those that weren't just born and didn't just die. When there's a lot of market churn, the sample covers the part of the market that is changing least. So the estimates are off, and the faster the economy is changing the further off they are.

A related issue is that the survey results are only a sample, but BLS needs to extrapolate to the entire population of businesses -- but they don't actually know how many businesses there are in the country, much less how many fit into each of the size / revenue / industry buckets. So their extrapolation necessarily involves some systematic guesswork. In normal, stable economic times good guesses are easy because it's not going to be that much different from the prior year and will likely have followed a consistent trend. But when the economy is changing rapidly, that's not true, so the guesses end up being further off the mark.

A second issue is that the numbers get worse when things are turning for the worse, because of something kind of like "survey fatigue", but not quite. When many of the surveyed companies are struggling, they're focused on fighting for their existence and don't have time to fill out voluntary government reporting forms. It's not that they're tired of surveys; they just don't have the time and energy to spare. And, of course, the companies that are going out of business are also the ones whose responses never arrive at all.

The phone thing is a red herring, because these BLS surveys are not conducted over the phone.

A new issue compounding the above is that the BLS was hit hard by DOGE cuts and early retirements. They've lost over 20% of their staff, and the loss in experience and institutional knowledge is far larger than that, because the people who were fired and the people who took the buyouts tended to be very senior. So a lot of the experience that would be used to improve the estimates has walked out the door.

Anyway, the core problem is that the economy is going into the toilet, really fast. The BLS didn't break out how the 911,000-job downward revision splits between 2024 and 2025, but I'll bet a big percentage came after Trump started bludgeoning American businesses with tariffs. Most of that pain won't really be known until the 12-month report next year, because the monthly reports are going to continue underestimating the rate of change. Well, assuming the BLS staff isn't forced to cook the books, in which case we'll just never know.

Comment Re:Transitions (Score 2) 243

Someone didn't live through the loss of the floppy drive, DB9 ports, and parallel ports.

In my day, to plug in a mouse: We took the box apart, installed a proprietary bus card, and then tried to figure out non-conflicting spots for the I/O and IRQ jumpers. Then we typed a bunch of gibberish into AUTOEXEC.BAT. And we liked it!
