Comment Strong disagree (Score 2) 333

I strongly disagree with the poster. We are in the best period of mobile apps, not the worst. During this period programmers are learning what can be done well in the mobile environment and what cannot. The Android and Apple mobile environments are an exciting, experimental playground right now. The major complaint cited is that mobile apps are inefficient and slow down your phone. That won't be much of an issue in a few years, as processors keep getting faster and phones start to ship with 64 GB of memory or more.

Sure, there are a lot of bad apps, but that's the point. Try out the bad apps, and learn what you should and should not be using your mobile environment for. Maybe what you originally thought was a bad app turns out to be quite useful for you. On the other hand, the most obvious mobile app might not be very useful. I make under 7 phone calls a week, but spend 4-5 hours per week using the Facebook app. At first I thought Instagram was fairly useless, but today I use it more than Facebook. I don't fully know all of my use cases for my phone, but already it's an indispensable tool, and I find about 20-30 apps to be quite useful. Another 100 apps are marginally useful. The rest are as much an experiment for the programmers as for me.

The issue with permissions is being worked on by the Android developers. It's a separate issue.

In 10 years mobile apps will be quite stable. We'll have maybe 50 winners, and things will be quite boring.

Comment No (Score 4, Insightful) 627

Whether or not you use an IDE ought to say very little about how good a programmer you are.

What makes a good programmer is someone who can produce stable, maintainable code in a reasonable time frame and someone who isn't worried about getting fired in order to fight for these goals. One part of maintainability is readable code and the other part is being able to communicate what you've done through documentation, written or oral.

Over the decades I've found that it makes no difference what tools you use, or what your age or educational or cultural background is. It doesn't matter so much whether you write few or many tests. You need to be patient, stubborn, thorough, curious, a problem solver, a voracious reader, and a great communicator to be a great programmer, and you need to have been doing it for at least 10 years. But companies should not shy away from helping to give someone those 10 years, because the best programmers will still do good work early on in their careers.

If you write code that just works but is unmaintainable by anyone, if you hole up to write your code, and if you have no ability to communicate what you have done, then you are a horrible programmer and you should be fired. There is a myth among some people that these are actually great programmers. These programmers tend to be, but are not always, extremely well qualified in terms of their educational or other experience, yet they make life difficult for all the other programmers who have to maintain their fragile junk. Fortunately, this type of software is less common in the free software community, because this type of programming gets called out.

Comment What? (Score 1) 139

"When you hear a friend's voice, you immediately picture her, even if you can't see her."

Who is this referring to? I never visualize people. I'm a non-visual learner. When I hear someone's voice I think of the person's identity, never an image.

Comment Re:It's personality (Score 4, Insightful) 312

No, those aren't the best engineers. Those are terrible engineers, people who have done a great job memorizing their university textbooks and they probably got all A's and can tell you 100 useless computer science facts about trees.

The best software engineers were child prodigies who began programming as children, saw the forest for the trees at the university and didn't care much about their grades, people who have done hobbyist software work throughout their lives. These people can explain engineering to a child, admit when they make mistakes, and you can discuss with them any subject whatsoever. These people find what they need using Google, because they are great general problem solvers.

Comment Programmers (Score 1) 130

We need all types of programmers, as eventually all jobs, no matter how menial, will require some programming skills.

Most technological device use is what was once called programming, but is not considered programming anymore. Think about the complex things millions of people do with their telephones. In the 1980s only a select group of the world's most skilled programmers could do some of those things. Is it really so different to do something with five button presses than with 500 lines of code? You're still programming, just at a higher level. Those 500 lines of code have already been written; they are what make the operation feel natural while still being programming.
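The abstraction-levels argument can be illustrated with a minimal sketch (the sorting example is mine, not from the original post): the one-liner and the hand-written routine do the same job, just at different levels.

```python
# High level: one "button press" -- the work is done by code someone else wrote.
data = [5, 3, 1, 4, 2]
print(sorted(data))  # [1, 2, 3, 4, 5]

# Lower level: the same operation spelled out by hand.
def insertion_sort(items):
    """Return a sorted copy, built one comparison at a time."""
    result = list(items)
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        # Shift larger elements right to make room for key.
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

print(insertion_sort(data))  # [1, 2, 3, 4, 5]
```

Calling `sorted` is those "five button presses"; the loop underneath is the 500 lines someone already wrote for you.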

Maybe we need classes in modern devices, and to get away from the word programming. When you play your favorite smartphone app, you're programming. Take it to the next level and write an app yourself. Not everyone has to write industrial grade banking software.

Comment All software is buggy (Score 1) 187

No software in common use today is mathematically proven to be correct; therefore, all software is buggy.

The most likely place for bugs is in error handling code, because no matter how many tests you write it is impossible to simulate every possible error condition.

We hope that no one walking into a store steals something. Only a tiny minority do, though a much larger number could get away with it.

The same goes for software. Any halfway decent programmer can find bugs in error handlers. If he chooses to be a whore, then he uses that skill to make money for criminal gangs or in some cases for anti-malware companies. Programmers who are not whores write actual new and useful software, and usually get paid enough that they can lead fairly happy lives. But it always helps to program defensively. Make your error handling just a bit better than the next piece of software. It will never be perfect. But as a society we count on the fact that nearly all people don't try to use whatever particular knowledge they've acquired to screw you over. Programmers are especially moral. We could bring society to its knees if we wanted to, but we prefer to make the world better.
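The advice to make your error handling "just a bit better than the next piece of software" can be sketched as follows. This is a minimal, illustrative example of my own (the function name and settings keys are made up): every failure mode gets an explicit, safe fallback instead of leaking an unhandled exception.

```python
import json

def load_settings(path):
    """Defensively load a JSON settings file.

    Missing files, corrupt JSON, wrong top-level shape, and out-of-range
    values all fall back to known-good defaults rather than crashing.
    """
    defaults = {"retries": 3, "timeout_s": 10}
    try:
        with open(path, encoding="utf-8") as f:
            raw = json.load(f)
    except (OSError, json.JSONDecodeError):
        return dict(defaults)      # missing or corrupt file: use defaults
    if not isinstance(raw, dict):
        return dict(defaults)      # well-formed JSON, wrong shape
    settings = dict(defaults)
    for key in defaults:
        value = raw.get(key)
        # Accept only sane values; silently ignore anything unexpected.
        if isinstance(value, int) and value > 0:
            settings[key] = value
    return settings
```

Naive code would call `json.load` and index into the result directly; each unchecked step there is exactly the kind of error-handling gap the comment describes.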

I don't blame Adobe for the bugs. Millions of people are using this software, and probably a dozen or two of, as I put it, whores are in league with criminal gangs trying to sell you boner pills and the like. This handful of people aren't the ones finding new classes of exploits; that is properly the job of security researchers. These people are instead likely just exploiting old, known, and quite ordinary bugs.

Comment Last about five years? (Score 4, Interesting) 237

I've either personally owned or purchased for companies I've worked for dozens of hard drives of all (except ESDI) technologies including MFM/RLL, IDE (parallel and serial), SCSI (original, wide, ultra wide, etc.) of form factors from full height 5.25 inch to 2.5 inch, dating back to 1991, and in my experience most hard drives last until you throw them away after 10 or 15 years because they're too small.

A few hard drives die in the first 6 months, and maybe 5-10% die in 3-5 years. Saying that disk drives last about 5 years just doesn't agree with my experience at all. Hard drives essentially last an infinite amount of time, defined by me as until they're so small that their storage can be replaced for under a dollar.

I do agree with the author's other points. Certain lines of hard drives have more like a 100% failure rate after 5 years. One 250 GB hard drive I purchased was RMA replaced with a 300 GB model because the 250 GB line was essentially faulty.

I think these studies might be looking at 7200 or 10000 RPM SCSI units under extremely high use. That's not how consumers use hard drives.

Comment b should be the first to go (Score 1) 254

802.11b should be the first to go, not 802.11a. Even though it didn't get good industry support, 802.11a is great. People instead adopted 802.11g, which runs at 2.4 GHz rather than 5 GHz like 802.11a, but had better compatibility with 802.11b.

I was pleasantly surprised when I learned that my Samsung Galaxy S III supports 802.11a. I took my 802.11a AP out of storage and returned to wireless.

At some point in the near future I'll be purchasing 802.11ac equipment and putting my 802.11a network to bed. My two 802.11a adapters are PCMCIA, and laptops don't have that anymore, so I'll be generating three pieces of fairly useless eWaste.

Comment Correct, as usual (Score 1) 1098

Mr. Stallman doesn't have a publicist, a speech writer, or an image consultant; therefore, the way he phrases things can hurt people's feelings. Mr. Stallman has a damned good track record of being correct, though. Sometimes it takes a decade or more for what he says to be proven correct but rarely is he ever completely off the mark.

You can either spend millions of dollars spreading a polished version of your message or be deliberately provocative and offended parties will spread your message for free. If you take a step back and think for a minute, you realize how smart of a man Mr. Stallman is.

Comment You get what you pay for (Score 1) 324

Move somewhere without these types of covenants and this type of association. Sounds a little bit like you're getting what you deserve or you didn't do the research before moving in.

Ham radio operators have been dealing with this since I was licensed in 1991 and probably much earlier. Move somewhere, they forbid you from erecting an antenna, and you can't set up your station, public service or otherwise.

Comment White male advantage (Score 1) 353

You have an automatic advantage in many technical fields in the United States, Canada, Great Britain, Australia, New Zealand, Western Europe and most of Eastern Europe and probably other places as a white male. In engineering fields people of East Asian descent are also afforded an advantage, because they are assumed to excel at math and by extension all technical fields.

That is why it is very important in the fields of computer science, programming, and software engineering (and where the three overlap) to assess people based on criteria as broad as possible. White men are a tiny minority of the world's population, and even in the United States do not represent the majority of users of computer software.

I'm not advocating that we hire less talented individuals. What I'm saying is that we're not measuring talent correctly. I think most of you know that already, when a recruiter asks you about specific skills in a certain computer language or maybe a specific database rather than focusing on your ability to design software, your ability to manage other programmers, and your ability to see an idea all the way through to completion. Beyond this, think about how software can benefit from different perspectives and ideas from different cultures and backgrounds. Software today comes from an awfully homogeneous pool, and we all should know by now that the best software comes out of as many competing ideas as possible. And not necessarily will one idea win. Many times several (and often two) ideas are equally good.

Comment My 2 bitcoin (Score 2) 396

Bitcoin has fundamental utility, just as Internet companies did in 2000 before the crash, except its price includes manic speculation. I don't want Bitcoin to die, nor did I want those Internet companies to die. They all had utility, but manic speculation killed them: not mere speculation, or even rampant speculation, but speculation without any justification except emotion, herd mentality, or what have you.

In my opinion, Bitcoin will be worth, after a number of years, about a dollar a bitcoin. In other words, what I'm saying is that one bitcoin has about a dollar's worth of long-term utility and [today] 999 dollars worth of speculation.

The two reasons I cite are: 1) Hundreds of groups of people, at least, are sophisticated enough to create a new cryptocurrency, and the interesting world of the future will be dozens of cryptocurrencies being traded like stocks on an exchange. You will be able to buy a basket of cryptocurrencies to minimize risk, similar to buying an index fund. 2) There are many large holders of bitcoin who at some point will want to move on to their next big tech project. They will over the long term bring down the price of a bitcoin to its intrinsic worth to society, let's say about a buck a bitcoin.
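The index-fund comparison in point 1 can be sketched in a few lines. The holdings and prices below are hypothetical numbers I've invented purely for illustration; the point is only that a basket is a weighted sum, just as with stocks.

```python
def basket_value(holdings, prices):
    """Mark-to-market value of a basket of cryptocurrencies: the
    index-fund-style diversification described above."""
    return sum(units * prices[symbol] for symbol, units in holdings.items())

# Hypothetical basket and prices (USD per unit) -- illustrative only.
holdings = {"BTC": 0.5, "LTC": 10.0, "DOGE": 5000.0}
prices = {"BTC": 800.0, "LTC": 20.0, "DOGE": 0.001}
print(basket_value(holdings, prices))  # total basket value in USD
```

A sharp drop in any one currency then moves the basket only in proportion to that currency's weight, which is the risk-minimization being described.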

If I had risk capital, which I do not because at this time in my life I choose to do less work for less money, I would ride the volatility wave of bitcoin, making a few hundred bucks here, a few hundred bucks there: Definitely not enough money to quit my job. There is no problem with mining bitcoin or using bitcoin at this value, because you go in and out of bitcoin so quickly it doesn't matter. A cynic would say that the long term worth of many assets will eventually die down to zero, for example some highly-valued stock for a technology that is obsolete in 50 years, but I think that the value of bitcoin will fall back down to Earth in more like 5-10 years or less. It will be one useful cryptocurrency among many.

Comment A good lesson (Score 5, Interesting) 232

It's a good lesson, but for different reasons. Here's why.

In the real world, you pick the right tool for the job. You never pick a language because it's the best language. There is no such thing. Factors going into language selection where technical merit plays no role include what the other developers at the company and/or the project are using, what environment you're using (if Apple, then Objective C; if Android, then Java), what language the code you are maintaining was written in 5, 10, 20, or 30 years ago, and (hopefully if you are a great programmer this will be a minor issue) what languages you're comfortable with using.

After 30 years I've learned that basic computer science concepts are helpful, but only to a point. Google may want you to know specifics about certain types of trees for their interview process, but if you need to know that level of detail for a job, you spend a few hours on Google and learn it. The same goes for languages. Figure out what you need with a bit of research before you start the job. You should have a good idea of the environments in which a language is nearly always used, so you don't try to do something weird nobody can maintain. If you're going to write an iPhone app, you're going to adhere to whatever specific Objective C thing Apple is doing. Maybe I'm slightly out of date and Apple is doing something else, who knows? I don't work in that space.

"Python everywhere," consequences be damned, is a quick way to make enemies of the people 10-20 years down the road who have to maintain your code. I was doing web development in the 1990s, and everybody used Perl. For everything. Now I work with a legacy Perl code base, and mod_perl seems to be completely abandoned; it certainly hasn't been released for apache httpd 2.4 yet. We're using Perl because we have to, but not for new stuff. For the Perl part of our system in bug-fix maintenance mode, though, it is the appropriate language. We didn't have the attitude that we'd continue to use Perl for everything just because that's the way things were done. We were flexible enough to slowly switch over to PHP for certain things that we had been using Perl for.

Avoid fads like the plague. After 30 years of programming, I just ignore marketing. I have no gee-whiz attitude about anything. I focus on perfecting my craft: learning how to program better, to debug better, to test better. Learning how to deliver code that works now and five years from now. All that is way more important than figuring out how some language is subjectively the best.

Android

Nvidia Announces 192-Core Tegra K1 Chips, Bets On Android 128

sfcrazy writes "Nvidia just announced Tegra K1, its first 192-core processor. NVIDIA CEO Jen-Hsun Huang made the announcement at CES 2014. He also said that Android will be the most important platform for gaming consoles. 'Why wouldn't you want to be open, connected to your TV and have access to your photos, music, gaming. It's just a matter of time before Android disrupts game consoles,' said Huang." Nvidia's marketing department created a crop circle to promote the chip after Huang declared that it was so advanced that "it's practically built by aliens."

Comment Good (Score 2) 221

The NSA is supposed to be working on cryptography technology.

The NSA needs to get back to doing its job, and stop spying on Americans. We already have several branches of government that are responsible for domestic criminal investigations, and they're subject (in theory anyway) to the robust safeguards in the Constitution.

The NSA helps everyone with robust cryptography. It's in nobody's best interest when one government can decipher everyone else's communications, except maybe for that handful of codebreakers.

Regardless of what they say, terrorists are low tech. They do not have access to a large pool of cryptography talent, nor will they ever.
