Comment Kellogg v. Nabisco; Dastar v. TCF (Score 1) 65

So what's the basis of the lawsuit against Disney? There's no damages, so equitable relief? Of what?

You probably guessed correctly: equitable relief in the form of an injunction against Disney bringing a trademark lawsuit. I haven't read the complaint, but I'd be surprised if it didn't cite Kellogg and Dastar.

The Supreme Court of the United States has decided a few cases about the interaction between the Lanham Act, which includes trademark law, and exclusive rights pursuant to the Copyright Clause. Key cases include Kellogg Co. v. National Biscuit Co., 305 U.S. 111 (1938), and Dastar Corp. v. Twentieth Century Fox Film Corp., 539 U.S. 23 (2003). In both cases, the Court ruled that the Lanham Act cannot be used to extend the effective term of exclusive rights in an invention whose patent has expired or a work whose copyright has expired. Disney's legal counsel ought to be familiar with the latter case, seeing as it involved a company that is now a subsidiary of Disney.

Comment Trusting trust when bootstrapping a compiler (Score 1) 15

From the article:

The Go project recently arranged for Go itself to be completely reproducible given only the source code, meaning that although a build needs some computer running some operating system and some earlier Go toolchain, none of those choices matters.

[...]

The Multics review is famous for pointing out the possibility of adding a back door to a compiler to insert back doors in critical system programs during compilation [...]. Reading the report inspired Ken Thompson to implement exactly that attack on an early Unix system, probably in early 1975. He later explained the attack in his 1983 Turing Award lecture, published in Communications as "Reflections on Trusting Trust."

David A. Wheeler described a defense against a back door that propagates through the compiler in his 2009 PhD dissertation, Fully Countering Trusting Trust through Diverse Double-Compiling. Diverse double-compiling (DDC) involves choosing two or more independently developed compilers A and B for a language, bootstrapping compiler C from source code through each of them (building C with A or B, then rebuilding C with the result), and ensuring that the final outputs are byte-identical. This relies on prior work to make builds reproducible.
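Concretely, the DDC check amounts to running the two bootstrap chains and then comparing the results byte for byte. Here's a minimal C++ sketch of that orchestration; compilerA, compilerB, and c.src are placeholder names standing in for the two trusted compilers and C's source, not real tools:

#include <cstdio>
#include <cstdlib>
#include <fstream>
#include <string>

// Run a build command, echoing it first. The commands passed in below
// are placeholders for the real compiler invocations.
static bool run(const std::string &cmd) {
    std::printf("+ %s\n", cmd.c_str());
    return std::system(cmd.c_str()) == 0;
}

// Compare two files byte for byte.
static bool identical(const char *a, const char *b) {
    std::ifstream fa(a, std::ios::binary), fb(b, std::ios::binary);
    if (!fa || !fb)
        return false;
    char ca, cb;
    for (;;) {
        bool ga = static_cast<bool>(fa.get(ca));
        bool gb = static_cast<bool>(fb.get(cb));
        if (ga != gb) return false;   // one file ended early
        if (!ga) return true;         // both ended together: identical
        if (ca != cb) return false;   // differing byte
    }
}

int main() {
    // Stage 1: build C's source with each independent compiler A and B.
    // Stage 2: use each stage-1 C to rebuild C from the same source.
    if (!run("compilerA c.src -o c_via_A") ||
        !run("./c_via_A c.src -o c_stage2_A") ||
        !run("compilerB c.src -o c_via_B") ||
        !run("./c_via_B c.src -o c_stage2_B"))
        return 1;
    // With a reproducible build, and assuming A and B were not both
    // subverted in the same way, the two stage-2 binaries must match.
    std::puts(identical("c_stage2_A", "c_stage2_B") ? "MATCH" : "MISMATCH");
    return 0;
}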

However, DDC also relies on having more than one implementation of a particular language. Go and Rust each have only one widely used implementation. This means someone trying to wrangle a supply chain has to do one of three things: trust a particular old version of a compiler not to have a back door, compile every version since the dawn of the language (such as when Rust was prototyped in OCaml), or implement a usable subset of the language in a more widely implemented language. This is why mrustc is so important, as it's a way to skip forward by several years' worth of versions when bootstrapping a Rust compiler.

Comment It always comes back to key distribution (Score 2) 15

From the article: "The only problem left is key distribution: The verifier must know who should have signed the code. [...] To the extent that questions of identity can be solved, having authors sign their software can provide even stronger guarantees." It goes on to describe how Debian and Go package repositories include the expected hash value of a package, so that package downloading tools can reject a package that has been replaced.
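As a rough illustration of that rejection step, here's a small C++ sketch in the spirit of such a downloader check, assuming OpenSSL is available for the digest (link with -lcrypto); in a real tool the expected hash would come from signed repository metadata rather than the command line:

#include <openssl/evp.h>
#include <cstdio>
#include <fstream>
#include <string>
#include <vector>

// Compute the SHA-256 of a file, returned as a lowercase hex string.
static std::string sha256_hex(const std::string &path) {
    std::ifstream in(path, std::ios::binary);
    EVP_MD_CTX *ctx = EVP_MD_CTX_new();
    EVP_DigestInit_ex(ctx, EVP_sha256(), nullptr);
    std::vector<char> buf(1 << 16);
    while (in.read(buf.data(), buf.size()) || in.gcount() > 0)
        EVP_DigestUpdate(ctx, buf.data(), static_cast<size_t>(in.gcount()));
    unsigned char digest[EVP_MAX_MD_SIZE];
    unsigned int len = 0;
    EVP_DigestFinal_ex(ctx, digest, &len);
    EVP_MD_CTX_free(ctx);
    std::string hex;
    char byte[3];
    for (unsigned int i = 0; i < len; ++i) {
        std::snprintf(byte, sizeof byte, "%02x", digest[i]);
        hex += byte;
    }
    return hex;
}

int main(int argc, char **argv) {
    if (argc != 3) {
        std::fprintf(stderr, "usage: %s <package-file> <expected-sha256>\n", argv[0]);
        return 2;
    }
    // Refuse to install anything whose hash differs from the value the
    // repository metadata says it should have.
    if (sha256_hex(argv[1]) != argv[2]) {
        std::fprintf(stderr, "hash mismatch: refusing to install %s\n", argv[1]);
        return 1;
    }
    std::printf("hash verified: %s\n", argv[1]);
    return 0;
}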

However, the approach used by Debian to verify developers' identity, that of new developers physically meeting existing trusted developers at key signing parties to exchange OpenPGP public keys, doesn't scale very well. A lot of contributors are disconnected from the strongly connected set of the web of trust because they cannot travel to key signing parties. This can be because of cost, work or child care scheduling, regulatory restrictions related to geopolitics, or regulatory restrictions related to public health (most recently during 2020-2021). These disconnected contributors must forever rely on the bottleneck of "sponsors" (trusted developers who forward packages from the maintainer to the distribution) to get their work into a distribution.

And sponsors are indeed a bottleneck. From the article: "And then you need to be ready to update to a fixed version of that dependency." When a package's upstream maintainer releases an updated version of a package, the package's sponsor in a particular distribution may be too busy with other tasks to handle it the same day. This can mean that there is no available labor to forward the update to the rolling distribution and backport the fix to the version of the package in a stable distribution.

Comment Re:Do it yourself (Score 3, Interesting) 78

So don't use STL

Indeed, No True Scotsman would use STL with C++.

clang-tidy and Cppcheck and Flawfinder and SonarQube

At the last job where I had to use C/C++, we automatically ran an expensive static analysis tool every time we checked in code. I'd estimate that it only found about half of the potential segfaults, and it made up for that by finding twice as many false positives.

Comment Re:NPM needs to be burned to the ground (Score 2) 30

I've never seen a software distribution mechanism as careless and sloppy as NPM. Bazillions of dependencies and no signing of packages. [ ... ]

Rust's Cargo packaging system works in almost exactly the same way. And the last time I looked, Go's packaging was very similar. And package signing won't help if the maintainer's key or certificate has been exfiltrated and cracked.

This is what you get when you embrace DLL Hell -- the idea that you should pin your program to a single specific revision of a library, rather than, y'know, doing the engineering work to ensure that, as an app author, you rely only on documented behavior and, as a library author, you remain responsible for backward compatibility for old apps linking to old entry points. Sticking to that principle lets you update shared system libraries with the latest enhancements and bug fixes while remaining relatively sure that none of the old clients will break.
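To make that library-author obligation concrete, here's a hypothetical C++ sketch (mylib, parse, and ParseOptions are made-up names, and it assumes C++17 for std::string_view): the old entry point is kept as a thin wrapper that forwards to the new one, so applications built against the old interface keep working when the shared library is upgraded.

#include <string>
#include <string_view>

namespace mylib {

struct ParseOptions {
    bool strict = false;
};

// Current entry point with the richer signature.
int parse(std::string_view input, const ParseOptions &opts) {
    // ... real parsing work would happen here ...
    return (opts.strict && input.empty()) ? -1 : 0;
}

// Original entry point, kept as a thin forwarding wrapper so that old
// clients linking against it continue to get documented behavior.
int parse(const char *input) {
    return parse(std::string_view(input ? input : ""), ParseOptions{});
}

}  // namespace mylib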

"Sometimes you have to break backward compatibility." Agreed, but the interval between those breaks should be measured in years, not days.

Comment Re:Do it yourself (Score 3, Insightful) 78

The "rules" of mutable collections in STL state that collections may not be mutated while being iterated.

Nope. If I had used std::list instead of std::vector, it would have been perfectly fine and officially supported. (Assuming I changed "i+10" to "i+11" in order to make the algorithm actually terminate, although that change wouldn't affect the vector crash.)
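For reference, here's roughly what that std::list variant looks like. push_back on a std::list never invalidates existing iterators, so appending while iterating is well defined, and with i + 11 the appended values are odd and trigger no further growth, so the loop terminates:

#include <cstdio>
#include <list>

int main() {
    std::list<int> v = {1, 2, 3, 4, 5, 6, 7, 8, 9};
    for (auto i : v) {
        if (i % 2 == 0) {
            // Well defined for std::list: insertion does not invalidate
            // the iterators the range-for is using, and the new elements
            // are visited later in the same pass.
            v.push_back(i + 11);
        }
        printf("%d\n", i);
    }
    return 0;
}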

The problem is that there are dozens of different rules you have to remember to apply to the different types of lists and iterators. And that's only talking about that one topic. There are hundreds of other rules covering a multitude of language aspects that you have to mentally apply against every single line of code you write, many of which can potentially cause memory corruption.

Comment Re:There is already a safe subset of C++ (Score 1) 78

Ish.

I would not trust C++ for safety-critical work as MISRA can only limit features, it can't add support for contracts.

There have been other dialects of C++ - Aspect-Oriented C++ and Feature-Oriented C++ being the two that I monitored closely. You can't really do either by using subsetting, regardless of mechanism.

IMHO, it might be easier to reverse the problem. Instead of having specific subsets for specific tasks, where you drill down to the subset you want, have specific subsets for specific mechanisms where you build up to the feature set you need.

Comment Re:Do it yourself (Score 4, Interesting) 78

You don't need the language to enforce memory safety to program memory-safe. The most important thing is, for example, to never touch raw pointers. C++ makes it very easy to avoid this. Rust forces you to avoid it, but just because C++ gives you the loaded gun, it doesn't mean you have to use it. In particular not on your own foot.

That is a dangerous misconception. You don't need to use any pointers to get memory errors in C++:

#include <stdio.h>
#include <vector>
 
int main() {
    std::vector<int> v = {1, 2, 3, 4, 5, 6, 7, 8, 9};
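    // Undefined behavior: the range-for below holds iterators into v,
    // and push_back may reallocate v's storage, invalidating them.
    // That invalidation is what produces the garbage values in the
    // output shown below.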
    for (auto i : v) {
        if (i % 2 == 0) {
            v.push_back(i + 10);
        }
        printf("%d\n", i);
    }
 
    return 0;
}
 
$ g++ -Wall -pedantic t.cpp
$ echo $?
0
$ ./a.out
 
1
2
-947527061
1600570778
5
6
7
8
9

Comment Re:Do it yourself (Score 1) 78

You oversimplify. I despise Rust, but it does address real problems. (I'm not sure how well, because I won't use it.) I'm thinking of things like deadlock, livelock, etc. As someone above pointed out, there are lots of applications that don't need to deal with that, and subsets can work for them. (The above poster worked in a domain where all memory could be pre-allocated.)

Rust felt like programming with one hand tied behind my back. So I dropped it. Only one reference to a given item is just too restrictive. Perhaps it is really Turing complete, but so is a Turing machine. But multi-threaded programs really do need a better approach. (My real beef with C++ (and C), though, is their handling of Unicode. So I'm currently experimenting with D [ https://dlang.org/ ], which seems pretty good for the current application (though honestly, since it's I/O bound, Python would be quite acceptable).)

Comment Re:There is already a safe subset of C++ (Score 4, Insightful) 78

languages like Rust exist to put ignorant programmers in straight jackets for their own good

Are you seriously trying to suggest that never allocating memory is not also a "straight jacket"?

You seem to be saying that a currently existing bowdlerized version of C++ is safe for closed-world problems. Possibly so, but that still leaves C++ unsuitable for open-world problems. That makes C++ suitable only for niche applications. Why learn it?

If you just use Rust or any other memory safe language, you won't have to worry about what kind of "world" you're writing for, or about choosing from a range of increasingly dangerous "profiles".

Comment Re:Here it comes (Score 4, Interesting) 41

I'm not sure online sales were ever part of Walmart's core competencies; I suspect they contracted all that stuff out to third parties.

The reason I suspect that is that one of my relatives bought a product from Walmart.com and needed to return it, so she called the number listed on the front page of the Walmart.com web site (and dialled it correctly; I later double-checked the call record on her phone against the walmart.com web page), and the representative who answered put her on hold, then forwarded her to a scammer who tried to trick her into allowing him to TeamViewer in to her computer remotely. When she refused, he got increasingly abusive and eventually hung up on her.

So whoever Walmart was contracting with for online support, they were at least bribable, and arguably criminal.
