Comment: Re:Beware journald... (Score 1) 379

by jgrahn (#46253409) Attached to: Debian Technical Committee Votes For Systemd Over Upstart

I don't see why any of this wouldn't be possible with text logs. Yes, you can write GUIs that parse text log files. Yes, you can write a utility that does all the grepping and sorting for you. Yes, you can filter out logs from the previous boot. I don't think the performance loss would be noticeable. So your arguments in favor of binary log files seem quite moot.

Oh, I can see you are quite new to Linux log files; they are basically free-form dumps of whatever anything wants to write to syslog, in any way it pleases. It is impossible to make anything other than a crude GUI: a 'less' with window decorations. This is exactly why both Gnome and KDE dropped their previous attempts at making such a GUI.

So, how does making it a binary blob help? The data sources will keep producing free-form dumps as before, and that's what will end up in the logs. The only structured fields are the timestamp, the facility, and the critical/error/info severity level, and those are easily parseable anyway.
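Those structured fields really are trivial to pull out of a classic text log line. A minimal sketch in Python, using a hypothetical RFC 3164-style sample line (the pattern and sample are illustrative, not how journald or any particular syslog daemon does it):

```python
import re

# Hypothetical RFC 3164-style syslog line: timestamp, host, and program
# tag are structured; everything after the colon is the free-form part.
line = "Feb 14 09:26:12 myhost sshd[1234]: Accepted publickey for alice"

SYSLOG_RE = re.compile(
    r"(?P<ts>\w{3} +\d+ \d\d:\d\d:\d\d) "     # timestamp
    r"(?P<host>\S+) "                         # hostname
    r"(?P<tag>[^:\[]+)(?:\[(?P<pid>\d+)\])?"  # program tag, optional pid
    r": (?P<msg>.*)"                          # free-form message
)

m = SYSLOG_RE.match(line)
fields = m.groupdict()
```

The structured part is one regex; the `msg` field stays the free-form dump it always was, which is the point of the argument above.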

So either the binary blobs don't help at all, or all Unix applications will be forced to adopt some non-Unix structured logging. For my applications, I can say right now that I will not make the changes needed to achieve the latter ...

Comment: Re:Shell? Give me a web based browser (Score 1) 96

by jgrahn (#46231511) Attached to: A Dedicated Shell For Git Commands

What I wish was available was a way to view the source in a git repository with one minor enhancement: the ability to browse with a click to the declaration/definition of functions/variables, a la the IDE of your choice. It would be neat to build an index using some tool, check it into the root of your repo, and then have the site use the index to mark up the source code with the appropriate links. I've dug around for something that does this but haven't found anything suitable - anyone know of anything?

Sounds like plain old etags/ctags. For that to work, though, you may have to look at the commit from inside your editor, e.g. in an Emacs M-x shell.

Comment: Re:Compile time is irrelevant. (Score 1) 196

by jgrahn (#45379063) Attached to: Speed Test: Comparing Intel C++, GNU C++, and LLVM Clang Compilers

Honestly, if your compile times are that long, and that much of a burden, you need to upgrade, and you also need to modularise your code more. The fact is that most of that compile time isn't actually needed for 90% of compiles unless your code is very crap.

Hint: I said 'two hours to compile from scratch'. You can't avoid compiling all your source if you just did a clean checkout from SVN into an empty source tree; as you would, for example, before building a release or release candidate.

That, to me, sounds like "I don't trust my build system to do an incremental build". A lot of people don't, and end up spending $$$$ on building provably the same thing over and over again, dozens of times a day ... Stop doing that, and you can use a compiler which is ten times slower and still have shorter build times.

BTW, Git does the right thing WRT timestamps and Makefiles: when it changes a real file, it sets the timestamp to the current time, not the time the version was created. Now that I've used it, I don't understand why other tools made the wrong choice for thirty years.
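The Make-style logic those timestamps feed is simple: a target is stale iff it is missing or some prerequisite has a newer mtime. A minimal sketch of that core comparison (not Make itself, just the check it performs):

```python
import os

def needs_rebuild(target: str, sources: list[str]) -> bool:
    """Make-style staleness check: rebuild iff the target is missing
    or any source has a newer modification time than the target."""
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(src) > target_mtime for src in sources)
```

This is also why a checkout that restored the *original* mtime would break incremental builds: a freshly changed source file could look older than its stale object file, and nothing would be rebuilt.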

Comment: Re:TFA does a poor job of defining what's happenin (Score 1) 470

by jgrahn (#45294801) Attached to: How Your Compiler Can Compromise Application Security

OK, that explains why I've been getting away with assuming they wrap since the Clinton administration. I don't know if anybody ever explained it to me in C terms.

It's *your* responsibility as a C programmer to find out what the rules of the game are; you should accept that responsibility. However ...

I always assumed that behavior was baked in at the CPU level, and just percolated up to C. I never felt inclined to do any "bit twiddling" with int or even fixed-width signed integers because on an intuitive level it "felt wrong".

... and that's exactly the correct way of thinking (informally) about it. C historically allowed several representations of signed integers (two's complement, ones' complement, sign-magnitude), but there is effectively only one representation of unsigned integers.
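What C actually guarantees for unsigned types is arithmetic modulo 2^N; signed overflow, by contrast, is undefined behavior regardless of what the CPU would do. A sketch of the unsigned rule in Python (a 32-bit width is assumed purely for illustration):

```python
BITS = 32
MOD = 1 << BITS  # 2**32

def uadd(a: int, b: int) -> int:
    """Unsigned addition as the C standard defines it for a 32-bit
    unsigned int: the result is reduced modulo 2**N. No UB, no trap."""
    return (a + b) % MOD

# No such guarantee exists for signed int in C: INT_MAX + 1 is
# undefined behavior, whatever the hardware happens to produce.
```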

Comment: Re:Distribute (Score 1) 190

by jgrahn (#44722677) Attached to: Ask Slashdot: Speeding Up Personal Anti-Spam Filters?

It seems you could easily distribute the load on multiple machines, each doing a subset of the regex.

There's something hilarious about having to distribute email filtering across several machines.

No; it's tragic that people reach for distributed, or "multi-core", whenever something runs too slowly. If filtering a mail according to a set of REs takes 15 seconds of CPU time like the OP writes, he's clearly doing something wrong, or hitting some limitation of procmail's design (being unable to amortize work, such as reading and parsing the REs, between mails).
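The amortization in question is trivial in any long-lived filter process: compile the REs once at startup, then match each incoming mail against the precompiled set. A hedged sketch (the pattern list and messages are made up for illustration):

```python
import re

# Hypothetical spam patterns -- in a real filter these would be loaded
# and compiled once at startup, not re-read and re-parsed per message.
PATTERNS = [re.compile(p, re.IGNORECASE)
            for p in [r"viagra", r"free\s+money", r"act\s+now"]]

def is_spam(mail_text: str) -> bool:
    """Match one mail against the precompiled patterns. The expensive
    step (parsing and compiling the REs) is amortized across mails."""
    return any(p.search(mail_text) for p in PATTERNS)
```

A per-delivery process, by contrast, pays the full setup cost again for every single mail, which is the kind of waste the comment objects to.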

Distributing wasteful work is not the right solution, in particular not if energy efficiency is a major concern like the OP suggests. (And I find it hard to believe that a 15s delay of mail delivery is his main concern!)

Comment: Re:Or... (Score 1) 190

by jgrahn (#44722601) Attached to: Ask Slashdot: Speeding Up Personal Anti-Spam Filters?

Fuck you and your RBLs. RBLs are a draconian solution that do immeasurable damage to those of us who (1) aren't spammers, and (2) choose to run our own mailservers on business-class IPs. [...] Oh, because someone in the same /24 block sent spam? Really? That's a good reason to block an entire /24 subnet?

That sucks. And more generally, I believe the anti-spam fundamentalists have caused as much damage to reliable mail delivery as the actual spammers have. They are much too willing to accept collateral damage. Sad, when you consider how much effort went into designing for (a) guaranteed delivery or (b) guaranteed notification of non-delivery.

For this particular situation: I accepted early on that many misconfigured mail servers "work" like you describe, so I make sure that my ISPs let me relay via them.

Comment: Re:spamassassin (Score 1) 190

by jgrahn (#44722579) Attached to: Ask Slashdot: Speeding Up Personal Anti-Spam Filters?

I don't think there's any such thing as "pretty much finished",

There is: software designed according to "do one thing and do it well" can be finished. For example, the Unix cat(1) command is probably pretty stable by now. Same with fgrep(1).

especially with a piece of software involved in the arms race that is spam vs. filtering.

... but yeah, well, I don't know Spamassassin but I suspect it has broader and more loosely-defined goals.

Comment: The interview (Score 1) 252

by jgrahn (#44504007) Attached to: Ask Slashdot: Is Development Leadership Overvalued?
At that point in the interview, I'd respond:

*Shrug* No. I don't like telling people what to do, and I suspect I'm not good at it. NB this doesn't mean I lack social skills. I can work with people; I just don't want to lead them.

Surely it's not hard to explain? It's how most people feel, after all.

Comment: Re:We don't shun those who should be shunned. (Score 1) 479

by jgrahn (#44451753) Attached to: Remember the Computer Science Past Or Be Condemned To Repeat It?

Unless you're a Windows programmer, I'd stick with C, which is infinitely simpler, and provides you freedom to maintain competency in other languages,

Huh? So C++ somehow removes that freedom?

Let me also remind you that while C is simpler than C++, your C source code is more complex than your C++ code -- while doing the same things, using the same APIs and libraries. Personally, I am tired of implementing linked lists and hash tables for the Nth time and hunting for memory leaks and buffer overflows. Learning C++ was time well spent.

On the other hand, learning to use C is a good idea. It's the lingua franca in the Unix world at least.

many of which have far cooler features than C++ will ever be able to provide.

It is true that you're crippled if all you know is C++. Or C. Unsurprisingly, neither language is supposed to be the perfect tool for every job.

Comment: Re:Back to BASIC (Score 1) 479

by jgrahn (#44451491) Attached to: Remember the Computer Science Past Or Be Condemned To Repeat It?

[C++] Talk about understanding value construction and destruction. And exception safety! Does anyone actually grok it to the degree that one doesn't need to think about it all the time?

I do. It's not hard. Never letting an object be in an invalid state in a context that expects it to be in a *valid* state is important in any code, in any language (except maybe functional languages?). C++ is not the problem here but part of the solution.

Comment: Re:Back to BASIC (Score 1) 479

by jgrahn (#44451435) Attached to: Remember the Computer Science Past Or Be Condemned To Repeat It?

The why is so that you can take advantage of C++ features. Templates for example are a great way to write very fast code,

Fast code, sure, but the main use for C++ templates is to let you create sane, safe, readable and maintainable code. Or at least remove one of the obstacles for doing so.

Comment: Re:Other options not always an option (Score 1) 238

by jgrahn (#44316027) Attached to: Ask Slashdot: How Do You Automatically Sanitize PDF Email Attachments?

Lots of people here saying "Don't use Adobe" and suggesting alternatives. Reality is, for many of us, we deal with complex PDF forms and applications that integrate directly with Adobe Acrobat. In my business [---] "Axis of Evil" of insecure software.

That seems like an accurate, believable description of a common situation. I just wish I'd someday see someone *try to get out* of a lock-in situation like that. Or try to avoid creating more such situations. It has been well known for decades that you can end up there, and yet organizations still plunge in, head first, all the time.

(Note that the lock-in isn't just about paying $$$ to the vendor indefinitely. It also means your data is cut off from the rest of the ecosystem; you can't benefit from inventions done elsewhere. No version control for your MS Word documents, and so on.)
