Bad comments restate what the code is doing when that's obvious from the code itself. Good comments explain why, especially when they're fixes for non-obvious bugs. If you spent a lot of time figuring out the solution to a tough problem, you owe the next person who reads your code an explanation.
Not exactly. Sometimes comments describing what the code does are exactly what's needed. For example, when commenting assembly language, describing what the instructions do from the point of view of the problem being solved is really helpful. Another example is a not-that-obvious optimization. Of course, both are examples of code with non-obvious behaviour. Obvious code, I agree, should not be commented. But what's obvious to one person is not necessarily obvious to another.
I don't understand how we can answer "why", especially in the code. For example:
//what: Adds two numbers
//why: I don't know
The "why" should be answered elsewhere: in a design document or in a design-choice comment.
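To make the disagreement concrete, here is a minimal sketch (the invoice scenario and the bug number are made up for illustration) of a "why" comment attached to code that would look like pointless complication without it:

```python
def total_price(items):
    # Why: summing floats directly accumulates rounding error on long
    # invoices (hypothetical bug #1423); summing cents as integers keeps
    # the total exact, so don't "simplify" this back to float addition.
    total_cents = sum(round(item * 100) for item in items)
    return total_cents / 100

print(total_price([0.10, 0.20, 0.30]))  # → 0.6
```

A "what" comment here ("adds up the prices") would be useless; the "why" comment is the one that stops the next maintainer from reintroducing the bug.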
I have a phone. My wife has a phone. Our son has a phone.
My family then has one computer with three accounts on it.
Sure, there are families with multiple computers and one phone, but I doubt that one phone is passed around each day to a different family member. A mobile phone isn't consumed as if it were a mobile version of a landline (one line per household).
So instead of selling one device per household with a computer, you sell one device per member of household. A much larger addressable market.
And you probably have a computer at work, and maybe your wife has one at hers.
Things without a keyboard are much tougher to use for real work.
"Arbitrary precision" means exactly what it says: for any given finite precision, there exists an amount of space and time in which the computation of a (computable) number can be successfully completed.
In other words -- limited precision.
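That distinction can be sketched with Python's `decimal` module (the precision value is an arbitrary choice for illustration): you pick any finite precision up front, and the computation runs within that budget.

```python
from decimal import Decimal, getcontext

# "Arbitrary" precision: we choose a finite precision, then compute.
getcontext().prec = 50  # 50 significant digits; any finite value works
one_third = Decimal(1) / Decimal(3)
print(one_third)
# Still limited: the expansion of 1/3 is cut off after 50 digits.
```

The precision is arbitrary in the sense that you may request any finite amount, not that the result is ever infinitely precise.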
I've just outlined why I think my statements are right, and I'm really interested in why exactly you think I'm wrong.
There are classes of problems which can be computed exactly. My examples were adding natural numbers, or halving in binary floating point. Using rational arithmetic, a lot of problems can be solved exactly.
Actually, computers are already capable of computing with arbitrary precision - they're just incapable of computing with infinite precision.
Both of your statements are wrong.
First, the precision used for computations is limited by both RAM and CPU power.
And second, for a lot of computations infinite precision is possible, feasible, and actually used: for example, computing 2+2 or 17/2.
The advancements in hardware were used to allow a saving in software development costs.
The hardware isn't advancing equally on all fronts. For example, memory latency hasn't improved noticeably in the last 15 years.
Our software solves more problems than the software of 20 years ago. Now 95% of the features in each and every piece of software go unused by 95% of the users. 20 years ago it was much different.
Now we have bloat, but we also have the power and freedom to do much more.
The software still costs a lot, and it's buggier than ever, because of its quantity.
Any idiot who thinks C++ is a bad language, should be digging ditches for a living.
There are no bad languages. There are simply bad programmers and bad language choices.
You are working yourself to death. Do something different once in a while before you go crazy and kill everyone at your workplace.
What do you suggest? Walking oneself to death?
Debuggers are the worst tool for development: they let programmers avoid thinking about and analyzing their code, and therefore they don't understand it.
Very well said.
My previous company had exactly the same missing hierarchy. This is one of the reasons I left.
In that company, nepotism was the hiring method. True, after you were hired, nobody cared whose man you were. But the picture was the following: everybody got the same salary and everybody worked however they wanted.
This sucks a lot. One can quickly lose any motivation.
I agree that hiring workers with higher motivation and skills could lead to different results, but still, in time, everything will get fucked up.
My previous company is still doing all right. It is pulled forward by the same 3 guys. They still get the same money as the other 50.
I'm happy I'm not there.
C lacks proper object/class support as well; is C a horrible language?
Yes, but not because of that.