
Comment Re:really?? (Score 1) 1134

I agree, but the article makes it out like the command line is such a prevalent force that simple tasks like editing documents and browsing the web require one. This simply isn't true. I know many people who shudder in fear at the thought of having to type commands at a text console, and yet use a Linux distribution with ease and satisfaction each and every day.

Comment Re:really?? (Score 1) 1134

What twit would attempt to edit a photo with the command line without actually having a good reason?

Try building software without a command line. Sure, you could do it... wait, no you can't, because even with GUI-based compilers, the buttons are just wrappers around commands.

The fact is, certain tasks are better attacked with a GUI and others with a CLI; to completely favor one over the other is a mistake.

Comment Re:really?? (Score 1) 1134

The summary assumes that to use an operating system built on Linux you have to be a command line wizard. This simply isn't true. Unless you are setting these people up with Gentoo or Linux From Scratch, I can't see why anyone would need to look at a command line unless they were doing something particularly technical to begin with.

My mother is a perfect example: completely inept with computers, and yet she browses the web, creates documents, edits photos, makes phone calls, installs software, watches movies, listens to music, does email, shops online, and a bunch of other stuff from a Linux box. As far as she is concerned, the thing doesn't have a command line.

Comment Re:really?? (Score 5, Insightful) 1134

Why care about the command line? Because it is a whole lot easier than getting carpal tunnel clicking fifty different things when I could just type a couple of commands and get the job done.

Just because non-technical users are afraid of a particular interface does not mean you rip it out. After all, distros like Ubuntu, Debian, Linux Mint, Fedora, Red Hat, and I'm sure plenty of others make it very easy for Joe User to get his computing done.

Comment Re: (Score 1) 1134

I can think of many reasons why the command line is still a very important part of any operating system. If, as a developer, you are worried that Joe User needs access to your tool, then make it easy for them, rather than having the whole system cater to the computer illiterate.

Comment Re: (Score 1) 305

See, now you are just twisting the whole "buffer" and "dynamically allocated memory" thing, using whichever term suits you at the moment.
When I pointed out that you were going from one to the other, it was because you questioned me, asking how I thought printf("hello world\n"); uses any dynamic memory, when I never said such a thing. I specifically said "buffer".

Not necessarily. I agree that simply guessing at the max size of a buffer is incorrect. But I think it is better to start out with a maximum guess, with the option to calculate and adjust as needed. On most platforms it is more expensive to allocate new memory than to simply use one big chunk allocated at program start.
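That grow-as-needed approach can be sketched in a few lines of C. The `buffer` type, the 4096-byte starting capacity, and the `buffer_append()` helper here are all my own invention for illustration, not anything from the thread:

```c
#include <stdlib.h>
#include <string.h>

/* Start with one generous allocation and grow only when needed,
 * so the common case never touches the allocator after startup. */
typedef struct {
    char  *data;
    size_t len;
    size_t cap;
} buffer;

int buffer_append(buffer *b, const char *s) {
    size_t n = strlen(s);
    if (b->len + n + 1 > b->cap) {
        size_t new_cap = b->cap ? b->cap * 2 : 4096;
        while (new_cap < b->len + n + 1)
            new_cap *= 2;
        char *p = realloc(b->data, new_cap);  /* realloc(NULL, n) acts as malloc */
        if (!p)
            return -1;                        /* caller decides how to recover */
        b->data = p;
        b->cap  = new_cap;
    }
    memcpy(b->data + b->len, s, n + 1);       /* copy including the '\0' */
    b->len += n;
    return 0;
}
```

The point of the doubling loop is exactly the trade-off described above: most appends cost a memcpy, and the allocator is only consulted on the rare growth step.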

You shouldn't criticize a simple call to exit(); it is impossible to think up every single thing users will throw at your program. I think the best approach is to try to figure out the most common errors and write code that adjusts for them; then a generic handler that either calls exit() or returns and lets the calling function decide what to do based on the return value is a good catch-all. Even then you aren't completely covered.
What about race conditions? Is it really better to try to hide the fact that data was incorrectly processed, or, having realized the data is incorrect, to call exit() because it wouldn't make any sense to continue? (Though we should try to avoid race conditions!)
What if I run:
yes $VERY_LONG_STRING | your_program
Your program just can't keep calculating the length of the string forever; at some point it has to say "this is too long" and exit.
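A minimal sketch of that bail-out point, assuming an arbitrary cap and a made-up `read_line_bounded()` helper (neither is from the discussion):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Read one line into buf, but refuse anything longer than the buffer
 * instead of trying to process arbitrarily long input forever. */
int read_line_bounded(FILE *in, char *buf, size_t cap) {
    if (!fgets(buf, (int)cap, in))
        return -1;                          /* EOF or read error */
    size_t n = strlen(buf);
    if (n + 1 == cap && buf[n - 1] != '\n') {
        /* fgets filled the buffer without hitting a newline: the
         * line is too long, and there is no sane way to continue. */
        fprintf(stderr, "input line too long, giving up\n");
        exit(EXIT_FAILURE);
    }
    if (n && buf[n - 1] == '\n')
        buf[n - 1] = '\0';                  /* strip the trailing newline */
    return 0;
}
```

Fed the `yes $VERY_LONG_STRING` pipe above, this hits the too-long branch on the first read and exits rather than looping over endless input.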
I tend to stick with systems programming, I never got into hardware that much.

I completely agree with the second and third statements of your closing. The first one, though, eh. I just don't see how functions like strncpy() are that big a departure from how things were done before. It is perfectly possible to write code with strcpy(), for example, and not leave openings for buffer overflows. I just feel like they are changing the language for the sake of changing it. And I also feel like the more functions like strncpy() get pushed into the standard library, the closer we are to taking away the power that C gives you by doing exactly what you asked for.
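To illustrate, here is a hypothetical `copy_checked()` wrapper (my sketch, not a standard function) showing how a plain strcpy() cannot overflow once the length is verified up front:

```c
#include <string.h>

/* Copy src into a buffer of known size, checking the length first.
 * With the explicit check in place, plain strcpy() is safe here. */
int copy_checked(char *dst, size_t dst_size, const char *src) {
    if (strlen(src) + 1 > dst_size)
        return -1;              /* too big: report, let the caller decide */
    strcpy(dst, src);           /* cannot overflow: length already verified */
    return 0;
}
```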

Comment Re: (Score 1) 305

Somehow you went from "buffer" to "dynamically allocated memory", which, although related, are not one and the same. I'm sure you can figure out how to look up the definitions of both and see that there is a difference.

I'll admit I misinterpreted what you wrote.

But here is the problem with strncpy() and similar functions: they only copy so many characters. So if more is being written into a buffer than it can hold:
1) your string isn't actually a string, because it isn't null-terminated.
2) it is actually more work for you to implement the handling of the error; I can't think of too many situations where having only part of the string is useful. By explicitly checking buffer sizes, I can adjust my destination buffer to include enough space for the whole string and not lose any data, OR, if it is such a hideous problem, I can simply exit() or return right then and there in fewer lines of code that are easier to read. At which point, why even bother using strncpy() if I'm already checking buffer sizes manually?
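Both complaints above can be shown in a few lines. The `truncating_copy()` helper is made up for this sketch; the behavior of strncpy() itself is standard:

```c
#include <string.h>

/* Demonstrates the strncpy() trap: when src is at least n characters
 * long, strncpy() writes no '\0', so the result is not a C string
 * until you terminate it by hand -- and data has been silently lost. */
size_t truncating_copy(char *dst, size_t dst_size, const char *src) {
    strncpy(dst, src, dst_size);        /* may leave dst unterminated */
    dst[dst_size - 1] = '\0';           /* terminate by hand */
    return strlen(src);                 /* caller must compare vs dst_size */
}
```

Note that the caller still has to compare the returned length against the buffer size to detect truncation, which is exactly the manual check that was supposed to be avoided.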
