Comments, like many things in software development, are really a matter of personal style--though they do serve the purpose of making clear what your code is supposed to do, so the person who maintains it later can understand what is supposed to be going on.
So I tend to lean towards a block comment per non-trivial (10 lines or more) function or method--though sometimes the block comment may be at most a sentence. (I don't think the name of the method or function is sufficient commentary by itself, simply because cultural differences may leave a non-native English speaker unable to translate your personal naming conventions.) I also tend to lean towards describing how complex sections of code work if they appear "non-trivial" to me--meaning if I'm doing more than just calling a handful of other methods in sequence or looping through a bunch of stuff which was clearly noted elsewhere.
I also tend to write code like I write English: a single blank line separates two blocks of code, and moving related functionality into its own blocks (where shifting functionality does not impair the app) is preferable. I also tend to group related methods in a class together; the Apple C/C++/Objective-C compiler provides the #pragma mark directive to attach a one-line label identifying the functionality of those groups of methods.
If I'm doing something really non-trivial--for example, implementing a non-trivial algorithm out of a book (such as a triangulation algorithm for decomposing a polygon into triangles)--I will start the block comment with a proper bibliographic citation: the book, the edition, the chapter, the page number, and (if the book provides one) the number of the algorithm I've implemented. (I won't copy the book; the reference should be sufficient.) If the algorithm is available on a web page I'll cite the web page. And when I've developed my own algorithm, I will often write it up in a blog post detailing the algorithm and refer to that in my code. (When working as a contractor I'll look for their internal wiki or, if necessary, check a PDF into the source kit.)
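To illustrate the shape of such a header, here is the sort of citation comment I mean. The reference is to a real book that covers polygon triangulation, but treat the details as illustrative rather than authoritative--in practice you'd cite the exact pages of your own copy:

```c
/* Polygon triangulation by ear clipping.
 *
 * Reference: Joseph O'Rourke, "Computational Geometry in C," 2nd ed.,
 * Cambridge University Press, 1998, Chapter 1 ("Polygon Triangulation").
 */
```

Two lines of citation can save the next maintainer days of reverse-engineering an algorithm they've never seen.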
Back when I started doing this a long time ago, technical writers were sometimes used to help document APIs and project requirements. The reason for having a separate technical writer was that a person embedded in a project who is not intimately familiar with it knows the right questions to ask in order to document the project for an outsider, being an outsider themselves. Developers cannot do this. Developers, being too close to the code they work on, will often run the code down the "happy path," not thinking of the edge cases, and they won't describe things they think are obvious (like "of course you must call API endpoint 416 before calling API endpoint 397; the output of 416 must be transformed via the function blazsplat before calling 397"--and how do you get the input to blazsplat?). A really good technical writer could also help distill the "gestalt" of the thing, helping to find the "why" which then makes code or project specifications incredibly clear. (You call endpoint 416 because it's the thing that generates the authentication tokens.)
Back when I was required to train for CISSP certification (I never took the test), one section discussed the three states of knowledge--later used by Donald Rumsfeld in a rather famous quote he was lambasted for. There are things we know we know: for example, I know how to code in C. There are things we know we don't know: for example, I know I don't know how to code in SNOBOL. But then there are also things we don't know we don't know. This is why, for example, a speaker often gets no takers when asking for questions: the audience doesn't know what questions they should ask. They don't know what they don't know.
I think there is a fourth valid state here--and we see the computer industry absolutely riddled with them: the things we don't know we know. These are the things we've learned, but then we think somehow "they're obvious" or "everyone knows them." They're the things we don't question; the things we don't think need to be taught, because we don't know they need to be taught. And the problem is, developers--having invented the code they need to describe--don't know that they need to describe it to people who haven't spent weeks or months or years studying the problem and working towards a solution.
I know I suffer from it, and I know others who do. It's why I think we need people checking our work and helping us document our interfaces. And it's why any answer I give for how to comment code will be insufficient: because of the knowledge problem, I don't know what I should be telling you if you want to understand my code.
And it's why I think we often don't comment enough: because we don't know what we know, so we don't know what we need to tell other people.
I left ACM when it became apparent that the benefits of CACM readings didn't outweigh the harm they do by locking up research behind so many paywalls and special publications.
Two thoughts. First, I'm old enough to remember a time when access to all this information required either a subscription--buying the books or subscribing to the journals--or being a student or a member of an alumni association, which granted you access to the libraries where you would physically go to read the latest and greatest. So to me, the idea that I should pay for a subscription to keep abreast of the latest stuff isn't really a foreign one. That said, the quality of the information now available for free on the Internet is good enough that one really doesn't need a subscription.
Second, however, I know quite a few people who don't even know that these organizations exist, or what their benefits are. And while that doesn't bother me, my real point is that I've met a number of developers who, when I explain these organizations, dismiss them as completely worthless without investigating for themselves. Meaning they are not just ignorant, but have no desire to learn.
The problem is that our industry, unlike virtually every other industry except acting and modeling (and note neither is known for "intelligence"), worships at the altar of youth. I can't count the number of people I've encountered who tell me that because I'm older, my experience is worthless, since all the stuff I've learned has become obsolete.
This, despite the fact that the dominant operating systems used in most systems are based on an operating system that is nearly 50 years old, the "new" features being added to many "modern" languages are really concepts from languages that are 50 to 60 years old or older, and most of the concepts we bandy about as cutting edge were developed 20 to 50 years ago.
It also doesn't help that the youth whose accomplishments we worship usually get concepts wrong. I can't count the number of times I've seen someone claim code was refactored along some new-fangled "improvement" over an "outdated" design pattern, yet write objects that bear no resemblance to the pattern they claim to be following. (In the case above, the classes they used included "modules" and "models", neither of which is part of the VIPER backronym.) And when I point out that the "massive view controller" problem often represents a misunderstanding of what constitutes a model and what constitutes a view, I'm told that I have no idea what I'm talking about--despite having more experience than the critic has been alive, and despite graduating from Caltech, meaning I'm probably not a complete idiot.
Our industry is rife with arrogance, and often it is the arrogance of the young and inexperienced. Our industry seems to value "cowboys" despite doing everything it can (with the management-technique "flavor of the month") to stop "cowboys." Our industry is ageist and sexist, one where the blind lead the blind, and where seminal works attempting to understand the problems of development go ignored.
How many of you have seen code which seems developed using "design pattern" roulette? Don't know what you're doing? Spin the wheel!
Ours is also one of the few industries grounded in scientific research that blatantly ignores that research, unless it is popularized in shallow books which rarely explore anything in depth. We have a constant churn of technologies which are often pointless, introducing new languages with extreme hype which is often unwarranted, as those languages seldom expand beyond a basic domain representing a subset of LISP. I can't think of a single developer I've met professionally who belongs to the ACM or the IEEE, and when they run into an interesting problem they tend to search GitHub or Stack Overflow, even when it is a basic algorithm problem. (I've met programmers with years of experience who couldn't write code to maintain a linked list.)
So what do we do?
Beats the hell out of me. You cannot teach if your audience revels in its ignorance and doesn't think all that "old junk" has any value. You cannot teach if your students have no respect for experience and knowledge. You cannot teach if your audience is both unaware of its ignorance and uninterested in learning anything not hyped in "The Churn."
Sometimes there are a rare few out there who do want to learn; for those it is worth spending your time. It's been my experience that most software developers who don't bother to develop their skills, and who are not interested in learning from those with experience, burn out after a few years. In today's mobile development expansion there is still more demand than supply of programmers, but that will change, as it did with the dot-com bubble, and a lot of those who have no interest in honing their skills (either out of arrogance or ignorance) will find themselves in serious trouble.
Ultimately, I don't know if there is much those of us who have been around for a while can do, except offer our wisdom and experience to whomever may want to learn. It is also incumbent on us to continue to learn and hone our own skills; just in the past few years I picked up another couple of programming languages and have been playing around with a new operating system.
And personally I have little hope. Sure, there is a lot of cutting-edge stuff taking place, but as an industry we're also producing a lot of crap. We've created working environments that are hostile (and I see sexism as the canary in the coal mine of a much deeper cultural problem), and we are creating software which is increasingly hostile to its users, despite decades of research showing us alternatives. We are increasingly ignoring team structures that worked well in the 1980s and 1990s: back then we saw support staff (such as QA and QAE and tech writers) who worked alongside software developers; today, in the teams I've worked for, I'm hard-pressed to find more than one or two QA engineers alongside teams of a dozen or more developers. I haven't seen a technical writer embedded in a development team (helping to document API interfaces and internal workings) for at least 20 years. And we are increasingly being seen as line workers in a factory rather than as technically creative workers.
I'm not bitter; I just believe we've fallen into a bunch of bad habits in our industry which need a good recession and some creative destruction to weed out what is limping along. And I hope that others will eventually realize what we've lost and where we're failing.
But for me, I'll just keep plugging along: a 50+ year-old developer in an industry where 30 is considered "old," writing software and quietly fixing flaws created by those who can't be bothered to understand that "modules" are not part of VIPER, or that MVC can include database access methods in the "model," and who believe all code is "self-documenting."
And complaining about the problems when asked.
Dude, you can't even reasonably calculate the number of man-hours that have been lost to air-powered hammers when used to frame a stick-and-nail framed house. Or the number of man-hours lost to power saws (over using a hand saw). Or, in our own industry, the number of man-hours lost as software developers transitioned from punch cards to having our own desktop computers, or the number of man-hours lost as we transitioned to better IDEs which allow us to more quickly find and fix problems in our software.
Or take the production of films. Can you reasonably calculate the number of man-hours lost when movie makers transitioned from cellulose film stock to using Red cameras and an all-digital production process?
And part of the reason you cannot say what has been lost is that two things happen when automation takes people's jobs: prices for a thing go down, but also money becomes available to expand the offerings we get. Houses get bigger. Software gets more complex and more intricate. Movies contain more special effects and become "grander," on a larger scale.
The real problem I have with the reasoning of those who fear increased productivity (which is what "robots" give us) is that they assume, as Charles Duell's apocryphal quote from 1899 has it, that everything that can be invented has been invented, and that life will continue on pretty much the same--the same offerings, same products, same goods and services--just with fewer people providing them. It's zero-sum thinking--and from an economics perspective, zero-sum thinking has been the source of pretty much most of the evils of the past century.
Remember: whatever happens, whenever anything changes there are always winners and always losers. It doesn't mean we should do nothing, however: doing nothing is a choice with its own array of winners and losers. Remember: millions and millions of low-paid, low skilled jobs were lost in the shift from an agrarian to a manufacturing economy.
What matters is how it affects the overall population as an aggregate. To decide not to act because you're afraid automation will drive lower-paid workers out of the marketplace is to decide to stall wealth creation because you're afraid the wealthiest will benefit.
My understanding was that the $75,000 (or €75,000) cap was a statistical average across regions with varying levels of cost of living and across varying groups of people--including those who definitely report greater levels of happiness and those who do not. And my impression was that the structure of the questionnaire was capturing the happiness which comes from socio-economic stability: at around $75,000, on average, you have achieved a degree of socio-economic security which means you are no longer worrying about where your next meal is coming from, if the roof over your head will be there next year, how you're going to pay for the kid's clothes, and if you can afford a vacation. (And as that number rises, the cost of the vacation or the car or the clothes may increase--but the fact that you will have access to them at some level does not.)
Meaning if you suddenly found yourself flooded with riches, and you are not able to use that money to find happiness, learn to shop better.
One place where I see a 3D printer being of use is in repairing things with hard-to-obtain parts. But of course you can't do this unless you have a database of parts you can print for the thing you are repairing. So like MP3 players (which did not take off until there was a database of downloadable songs you could buy for 99 cents), we need a database of 3D-printable parts for things like dishwashing machines and refrigerators and the like, which can be downloaded for relatively cheap and printed on your printer to fix the broken component.
Of course not all parts can be replaced like this. But certainly there are plenty of components (such as the plastic drive gears in a garage door opener) which can be printed and replaced by consumers.
At the higher end I can see companies like auto repair shops using professional or prosumer-level printers to print harder, more refined components for auto repairs, and even using 3D subtractive technologies (like CAD-driven lathes and CAD-driven milling machines) to replace failed metal components where tight tolerances are not required.
I think where things like the MakerBot gadget failed was in the assumption that everyone could design their own components. But even in today's environment there are far fewer mechanical engineers and designers than folks like that assume.
By the way, my own personal feelings are conflicted here. On the one hand, I can understand a baker who may not wish to participate in a gay wedding by baking a wedding cake, or a company like Twitter not wishing to carry the President-elect's messages on their platform; forcing them to participate means that, in order to do business, they cannot be selective about their customers. On the other hand, I do find it despicable that a customer would be discriminated against for something as personal--but ultimately inconsequential to the greater public good--as one's own sexual orientation or political beliefs.
In this case I believe the baker or the message aggregation company may hold to their own personal beliefs, but should then suck it up and perform the work, realizing they are performing a sort of public service. I have less patience for the baker in this case, only because Christian faith teaches one to love the sinner while hating the sin--meaning if they believe homosexuality is a sin (I don't, BTW), they still need to forgive the sinner and to act gracefully. (I'm reminded of the story of a group of rabbis who visit a farmer, who slaughters his prized pig in order to celebrate the visit. As the story goes, the rabbis eat the prepared pig because acting gracefully in that situation supersedes the requirement to keep kosher.)
It's sad that we now live in an era where gracefulness is no longer respected, emulated or even acknowledged in the public sphere. And to me it's sad that Twitter responds to President-elect Trump's apparent lack of grace by doubling down on being even less graceful.
As it turns out, for the purposes of hiring and firing, political orientation is a protected class in some states, such as California. Which is why the CEO of Grubhub had to backtrack from his original e-mail to employees asking Trump supporters to resign.
In the case of Twitter, I'm not sure if they aren't already exposed to some legal consequences for their decision--however, I'm not seeing the AG of California stepping up and doing anything here, since it's pretty clear in this political climate we're slowly turning all of these tools (both legal and technological) into weapons to bash the opposition--generally the right, as most tech companies both reside in liberal areas and tend to lean liberal themselves.
However, given the problems Twitter seems to be having as a business (being unable to monetize its audience without breaking the platform), I suspect pissing off half your audience is not going to help the bottom line--and ultimately Twitter may find itself on the losing end of free market competition.
Does a bakery, as a private company, have the right to say "No cakes that we don't like"?
Actually, it depends on the state where you live. Legal Zoom observes that there are about 20 states in the United States which have anti-discrimination laws covering sexual orientation, so if you are in one of those states, as a practical matter (rather than as a moral one) you cannot discriminate against a gay couple wanting a wedding cake for their gay marriage. However, you can refuse a skinhead group wanting a swastika on their cake, since membership in a skinhead group is not a protected class.
That's akin to saying I wonder who would win a football game if we awarded points for things like field goals and touchdowns differently--without realizing that by altering the point system, you alter the behavior of the players on the field.
There are plenty of places in California--a solidly Democratic state which is highly unlikely to vote for a Republican candidate anytime soon--where a popular vote scheme would yield plenty of votes for a Republican candidate, such as through the San Joaquin Valley.
E to the u, du dx; e to the x, dx. Cosine, secant, tangent, sine; 3.14159...