Comment Re:More ... (Score 2) 73

What determines walkability or bikeability is population density. The population density of most populated areas of the United States is very low compared to Europe. On the other hand, given the choice, most people in the United States want their single-family detached homes, their modest yards, and a little elbow room from their neighbors--which directly implies either a low population density or desirable housing that is accessible only to a very small percentage of the population.

Comment Makes sense to me. (Score 2) 73

As someone who used to bike to work, I understand why more people work at home instead. Being able to walk or bike to work is a luxury driven by being able to live close enough to where you work--and for many jobs that means living in a highly populous urban core, or being wealthy enough (or, in my case, lucky enough) to live in a home near the downtown corridor where your job is located.

Working at home, on the other hand, is simply a function of having the right job. And I know quite a few people who work at home: a couple of people who do customer support for Apple out of their homes, and a bunch of account managers at YP.com who work out of theirs. If your job involves a lot of time talking to people on the phone or chatting over the Internet, it doesn't really matter where you are located, so long as you have a phone line and an Internet connection.

Comment Re:Sounds dangerous (Score 1) 73

Came here to say more or less the same thing.

Depending on where the part is used and how the part is made, I'd be concerned with the integrity of the part over time. Remember: airplanes tend to fly for half a century before they're scrapped--and if the part is used somewhere structural that is hard to access or hard to replace, chances are it won't be inspected or replaced as regularly as the FAA would like, simply because long-haul aircraft are turned around rather rapidly. (The financial incentive is for a fast turnaround, which can lead to things not being done quite right or not being inspected as closely as they should be.)

Comment Generally I only use a subset of C++ (Score 3, Insightful) 315

I first started using C++ back in the 1980's.

I am a huge fan of classes. When C++ was basically a preprocessor for C which introduced the class keyword, I thought it was pretty cool.

When exceptions were introduced to the language, I thought C++ was fairly complete as a language. If, from there, the designers of C++ had addressed the fragile base class problem and added lambdas and some form of introspection, I think C++ would have been a fantastic language.

Instead, we got templates. I'm not a fan of templates.

And when we got the standard iostreams library, I thought someone was smoking crack. Using the left-shift and right-shift operators to mean "input" and "output"? Really? Really?!?

When I write C++ I try to stick to a subset of the language that includes classes and exceptions. I use templates sparingly, only when they are really needed, and I refuse to use the left-shift and right-shift operators for input and output. (I really feel like the person who designed that thought "how cool; we can override a shift operator to mean input and output!" But just because you can doesn't mean you should, and now we're stuck with this bit of syntactic bullshit.)

We're finally getting around to lambdas, though too late: Apple has already wedged blocks into the language as a non-standard extension. And I'm not holding my breath on introspection or on fixing the fragile base class problem, simply because the run-time implementation recommendation for classes way back in the late 1980's has become a baked-in de facto feature. (Sadly, it would have been relatively easy to solve by introducing link-time assignment of the indexes in the virtual table; that way, as the base class is recompiled, the index references for the methods in the virtual table could be reassigned at link time. This also solves fragile access to public class fields: simply replace them with standard getter and setter methods accessed via the virtual table.)
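
To illustrate what I mean, here is a minimal sketch of the fragility (class and method names are hypothetical, not from any real library):

    // Minimal sketch of the fragile base class problem: a virtual call
    // compiles down to a fixed index into the class's virtual table, so
    // inserting a new virtual method into a base class shifts the slots of
    // every method declared after it. Code compiled against the old layout
    // then dispatches through stale indexes until it is rebuilt.
    #include <iostream>

    struct Widget {
        virtual ~Widget() = default;
        // virtual void layout();  // uncommenting this shifts the slots below
        virtual void draw()  { std::cout << "Widget::draw\n"; }
        virtual void flush() { std::cout << "Widget::flush\n"; }
    };

    struct Button : Widget {
        void draw() override { std::cout << "Button::draw\n"; }
    };

    int main() {
        Widget *w = new Button;
        w->draw();   // compiled as "call vtable[k]" for a compile-time k; a
                     // caller built against the old Widget would use a stale k
        delete w;
        return 0;
    }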

Comment Not really that surprised. (Score 4, Interesting) 155

I have ESPN as part of my cable package. I don't watch ESPN. I don't care at all about ESPN. Yet somehow, because I have a package, I'm getting ESPN. And when the future changes such that I can cancel my cable package and get the channels I actually watch à la carte, ESPN would be the last channel I'd subscribe to.

And I suspect a lot of people are like me: we have ESPN, we're paying for ESPN, but we don't watch ESPN. And I strongly suspect ESPN knows this. So it makes sense that ESPN would be worried; suddenly the unsustainability of their economic position--being subsidized by millions of viewers like myself--will be exposed.

Comment Re:We worship at the altar of youth here. (Score 1) 347

Comments, like many things in software development, are really a matter of personal style--though they do serve the purpose of making it clear what your code is supposed to do, so whoever maintains it later can understand what is supposed to be going on.

So I tend to lean towards a block comment per non-trivial (10 lines or more) function or method--though sometimes the block comment may be at most a sentence. (I don't think the name of the method or function is sufficient commentary, simply because cultural differences may make a non-native English speaker unable to decode your personal naming conventions.) I also tend to lean towards describing how complex sections of code work if they appear "non-trivial" to me--meaning I'm doing more than just calling a handful of other methods in sequence or looping through a bunch of stuff that was clearly documented elsewhere.

I also tend to write code like I write English: a single blank line used to separate two blocks of code, and related functionality grouped into its own blocks (where shifting functionality around does not impair the app). I also tend to group chunks of methods in a class that are related to each other; the Apple C/C++/Objective-C compiler provides the #pragma mark directive to supply a one-line label identifying the functionality of those groups of methods.
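
For example, a minimal sketch of that grouping (the class and its methods are hypothetical):

    // #pragma mark is a Clang feature: Xcode shows these labels in its jump
    // bar, and compilers that don't recognize the pragma simply ignore it
    // (possibly with an unknown-pragma warning).
    class Document {
    public:
    #pragma mark - Lifecycle
        Document();
        ~Document();

    #pragma mark - Serialization
        bool load(const char *path);
        bool save(const char *path) const;

    #pragma mark - Editing
        void insertText(const char *text);
        void deleteSelection();
    };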

If I'm doing something really non-trivial--for example, implementing a non-trivial algorithm out of a book (such as a triangulation algorithm for converting a polygon into triangles)--I will start the block comment with a proper bibliographic citation: the book, the edition, the chapter, the page number, and (if the book provides one) the number of the algorithm I've implemented. (I won't copy the book; the reference should be sufficient.) If the algorithm is available on a web page, I'll cite the web page. And at times when I've developed my own algorithm, I will often write it up in a blog post detailing the algorithm and refer to that in my code. (When working as a contractor I'll use the client's internal wiki or, if necessary, check a PDF into the source kit.)
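
Concretely, the top of such an implementation might look like this (the bracketed fields are placeholders for the real citation, not an actual reference):

    #include <vector>

    struct Point    { double x, y; };
    struct Triangle { Point a, b, c; };

    /*  triangulate
     *
     *      Converts a simple polygon into a list of triangles by ear
     *      clipping.
     *
     *      Reference: [Author], "[Title]", [N]th edition, chapter [C],
     *      pp. [P1]-[P2], algorithm [A.B]. If the algorithm instead came
     *      from a web page or an internal write-up, cite that URL here.
     */
    std::vector<Triangle> triangulate(const std::vector<Point> &polygon);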

Back when I started doing this a long time ago, technical writers were sometimes used to help document APIs and project requirements. The reason for having a separate technical writer is that a person embedded in a project who is not intimately familiar with it knows, being an outsider, the right questions to ask to document the project for other outsiders. Developers cannot do this. Developers, being too close to the code they work on, will often run the code down the "happy path" without thinking of the edge cases, and they won't describe things they think are obvious (like "of course you must call API endpoint 416 before calling API endpoint 397; the output of 416 must be transformed via the function blazsplat before calling 397--and how do you get the input to blazsplat?"). A really good technical writer can also help to distill the "gestalt" of the thing, helping to find the "why" which then makes code or project specifications incredibly clear. (You call endpoint 416 because it's the thing that generates the authentication tokens.)

Back when I was required to train for CISSP certification (I never took the test), one section discussed the three states of knowledge--later used by Donald Rumsfeld in a rather famous quote he was lambasted for. There are things we know we know: for example, I know how to code in C. There are things we know we don't know: for example, I know I don't know how to code in SNOBOL. But then there are also things we don't know we don't know. This is why, for example, a speaker often gets no takers when asking for questions: the audience doesn't know what questions they should ask. They don't know what they don't know.

I think there is a fourth valid state here--and we see the computer industry absolutely riddled with them: the things we don't know we know. These are the things we've learned, but then we think somehow "they're obvious" or "everyone knows them." They're the things we don't question; the things we don't think need to be taught, because we don't know they need to be taught. And the problem is, developers--having invented the code they need to describe--don't know that they need to describe it to people who haven't spent weeks or months or years studying the problem and working towards a solution.

I know I suffer from it, and I know others who do. It's why I think we need people checking our work and helping us to document our interfaces. And it's why any answer I give for how to comment code will be insufficient: because of the knowledge problem, I don't know what I should be telling you if you want to understand my code.

And it's why I think we often don't comment enough: because we don't know what we know, so we don't know what we need to tell other people.

Comment Re:We worship at the altar of youth here. (Score 1) 347

I left the ACM when it became apparent that the benefits of reading CACM didn't outweigh the harm the organization does by locking up research behind so many paywalls and special publications.

Two thoughts. First, I'm old enough to remember a time when all of this information required a subscription: you bought the books or subscribed to the journals, or you were a student or a member of an alumni association that granted you access to the libraries, where you would physically go to read the latest and greatest. So to me, the idea that I should pay for a subscription to keep abreast of the latest stuff isn't really a foreign one. That said, the quality of the information now available for free on the Internet is good enough that one really doesn't need a subscription.

Second, however, I know quite a few people who don't even know that these organizations exist, or what the benefits are. And while that doesn't bother me, my real point is that I've met a number of developers who, when I explain what these organizations are, dismiss them as completely worthless without investigating for themselves. Meaning they are not just ignorant; they have no desire to learn.

Comment We worship at the altar of youth here. (Score 5, Insightful) 347

The problem is that our industry, unlike every other industry except acting and modeling (and note that neither is known for "intelligence"), worships at the altar of youth. I can't count the number of people I've encountered who tell me that because I'm older, my experience is worthless, since all the stuff I've learned has become obsolete.

This, despite the fact that the dominant operating systems used in most systems are based on an operating system that is nearly 50 years old, the "new" features being added to many "modern" languages are really concepts from languages that are 50 to 60 years old or older, and most of the concepts we bandy about as cutting edge were developed 20 to 50 years ago.

It also doesn't help that the youth whose accomplishments we worship usually get concepts wrong. I can't count the number of times I've seen someone claim code was refactored along some new-fangled "improvement" over an "outdated" design pattern, yet write objects that bear no resemblance to the pattern they claim to be following. (In the case above, the classes they used included "modules" and "models", neither of which is part of the VIPER backronym.) And when I point out that the "massive view controller" problem often represents a misunderstanding of what constitutes a model and what constitutes a view, I'm told that I have no idea what I'm talking about--despite having more experience than the critic has been alive, and despite graduating from Caltech, meaning I'm probably not a complete idiot.

Our industry is rife with arrogance, and often the arrogance of the young and inexperienced. Our industry seems to value "cowboys" despite doing everything it can (with the management technique flavor of the month) to stop "cowboys." Our industry is ageist and sexist, one where the blind lead the blind, and seminal works attempting to understand the problems of development go ignored.

How many of you have seen code which seems developed using "design pattern" roulette? Don't know what you're doing? Spin the wheel!

Ours is also one of the few industries grounded in scientific research that blatantly ignores that research, unless it is popularized in shallow books which rarely explore anything in depth. We have a constant churn of technologies which are often pointless, introducing new languages with extreme hype which is often unwarranted, as those languages seldom expand beyond a basic domain representing a subset of LISP. I can't think of a single developer I've met professionally who belongs to the ACM or the IEEE, and most, when they run into an interesting problem, tend to search GitHub or Stack Overflow, even when it is a basic algorithm problem. (I've met programmers with years of experience who couldn't write code to maintain a linked list.)
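
And to be clear about the bar here--a minimal singly linked list in C++ (names are mine, purely for illustration) is nothing more than this:

    #include <cstdio>

    // A node in a singly linked list: a value and a pointer to the next node.
    struct Node {
        int   value;
        Node *next;
    };

    // Push a value onto the front of the list; returns the new head.
    Node *push_front(Node *head, int value) {
        return new Node{value, head};
    }

    // Remove the first node matching value; returns the (possibly new) head.
    Node *remove_value(Node *head, int value) {
        Node **link = &head;
        while (*link) {
            if ((*link)->value == value) {
                Node *dead = *link;
                *link = dead->next;
                delete dead;
                break;
            }
            link = &(*link)->next;
        }
        return head;
    }

    int main() {
        Node *head = nullptr;
        for (int i = 1; i <= 5; ++i)
            head = push_front(head, i);   // list is now: 5 4 3 2 1
        head = remove_value(head, 3);     // list is now: 5 4 2 1
        for (Node *n = head; n; n = n->next)
            std::printf("%d ", n->value);
        std::printf("\n");
        while (head) {                    // free the remaining nodes
            Node *next = head->next;
            delete head;
            head = next;
        }
        return 0;
    }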

So what do we do?

Beats the hell out of me. You cannot teach if your audience revels in its ignorance and doesn't think all that "old junk" has any value. You cannot teach if your students have no respect for experience and knowledge. You cannot teach if your audience is both unaware of its ignorance and uninterested in learning anything not hyped in "The Churn."

Sometimes there are a rare few out there who do want to learn; for them it is worth spending your time. It's been my experience that most software developers who don't bother to develop their skills and who are not interested in learning from those with experience burn out after a few years. In today's mobile development expansion there is still more demand than supply of programmers, but that will change, as it did with the dot-com bubble, and a lot of those who have no interest in honing their skills (either out of arrogance or ignorance) will find themselves in serious trouble.

Ultimately, I don't know if those of us who have been around for a while can do very much of anything, except offer our wisdom and experience to whoever may want to learn. It is also incumbent on us to continue to learn and hone our own skills; just in the past few years I picked up another couple of programming languages and have been playing around with a new operating system.

And personally I have little hope. Sure, there is a lot of cutting-edge stuff taking place, but as an industry we're also producing a lot of crap. We've created working environments that are hostile (and I see sexism as the canary in the coal mine of a much deeper cultural problem), and we are creating software which is increasingly hostile to its users, despite decades of research showing us alternatives. We are increasingly ignoring team structures that worked well in the 1980's and 1990's: back then we saw support staff (such as QA, QAE, and tech writers) who worked alongside software developers; today, in the teams I've worked on, I'm hard pressed to find more than one or two QA engineers alongside a dozen or more developers. I haven't seen a technical writer embedded in a development team (helping to document API interfaces and internal workings) for at least 20 years. And we are increasingly being seen as line workers in a factory rather than technically creative workers.

I'm not bitter; I just believe we've fallen into a bunch of bad habits in our industry which need a good recession and some creative destruction to weed out what is limping along. And I hope that others will eventually realize what we've lost and where we're failing.

But as for me, I'll just keep plugging along: a 50+ year old developer in an industry where 30 is considered "old", writing software and quietly fixing the flaws created by those who can't be bothered to understand that "modules" are not part of VIPER, that MVC can include database access methods in the "model", and that not all code is "self-documenting."

And complaining about the problems when asked.

Comment Re: How is that supposed to happen? (Score 1) 388

Read my whole comment. Because in our world of increasing automation, some things got more expensive, not less--but they also got correspondingly far more complex. Like the cell phone in your pocket, which is several orders of magnitude more complex than the telephones that preceded it. Without automation, ranging from better IDEs to automated assembly of printed circuit boards, the modern smartphone would not be possible.

Comment Re:How is that supposed to happen? (Score 1) 388

I'm wondering how much IBM should be taxed for introducing Eclipse, which made Java programmers more productive. I mean, if I can produce code twice as fast in Eclipse as I can in a command-line editor like 'ed', should IBM be forced to pay a tax equivalent to my income tax because I downloaded their software?

Comment Re:How is that supposed to happen? (Score 4, Interesting) 388

Dude, you can't even reasonably calculate the number of man-hours that have been lost to air-powered hammers used to frame a stick-and-nail house, or the number of man-hours lost to power saws (over using a hand saw). Or, in other industries, the number of man-hours lost among software developers as we transitioned from punch cards to having our own desktop computers, or as we transitioned to better IDEs which allow us to more quickly find and fix problems in our software.

Or take the production of films. Can you reasonably calculate the number of man-hours lost when movie makers transitioned from cellulose film stock to using Red cameras and an all-digital production process?

And part of the reason why you cannot say what has been lost is that two things happen when automation takes people's jobs: prices for a thing go down, and money becomes available to expand the offerings we get. Houses get bigger. Software gets more complex and more intricate. Movies contain more special effects and become "grander" and on a larger scale.

The real problem I have with the reasoning of those who fear the increased productivity that "robots" give us is that they assume, as Charles Duell's apocryphal quote from 1899 presumes, that everything that can be invented has been invented, and that life will continue on pretty much the same, with the same offerings, same products, same goods and services--just with fewer people providing them. It's zero-sum thinking--and from an economics perspective, zero-sum thinking has been the source of pretty much most of the evils of the past century.

Comment Re:Theory vs. Practice (Score 1) 318

Remember: whatever happens, whenever anything changes there are always winners and always losers. It doesn't mean we should do nothing, however: doing nothing is a choice with its own array of winners and losers. Remember: millions and millions of low-paid, low skilled jobs were lost in the shift from an agrarian to a manufacturing economy.

What matters is how it affects the overall population as an aggregate. To decide not to act because you're afraid automation will drive lower-paid workers out of the marketplace is to decide to stall wealth creation because you're afraid the wealthiest will benefit.

Comment Re:Or course not. (Score 1) 406

My understanding was that the $75,000 (or €75,000) cap was a statistical average across regions with varying costs of living and across varying groups of people--including those who report greater levels of happiness above that figure and those who do not. And my impression was that the structure of the questionnaire captured the happiness which comes from socio-economic stability: at around $75,000, on average, you have achieved a degree of socio-economic security which means you are no longer worrying about where your next meal is coming from, whether the roof over your head will be there next year, how you're going to pay for the kids' clothes, and whether you can afford a vacation. (And as that number rises, the cost of the vacation or the car or the clothes may increase--but the fact that you have access to them at some level does not.)

Meaning if you suddenly find yourself flooded with riches and you are not able to use that money to find happiness, learn to shop better.
