Comment Skepticism (Score 1) 223

There is some skepticism of wireless charging, so I will address all of it with some facts and personal experience.

In my opinion, the micro USB port was not designed with smartphones in mind.

With heavy use (such as shooting and uploading video) I can completely drain my Samsung Galaxy S III from 100% to 0% in about 90 minutes. With less strenuous use, such as gameplay, I can easily drain my phone in 3 or 4 hours. With the lightest use, and a number of useful apps doing things in the background, the battery never lasts more than 12 hours. In the first scenario I have no way to charge my phone, and it is out of commission in an hour or two (frustrating enough that I might buy a second battery); to make the second and third scenarios feasible, I leave my phone plugged into a charging port continuously. Scenario 4, less common but common enough: I can't use the GPS app Waze without being plugged into my car's cigarette lighter unless the trip is under an hour. If I use the GPS on a longer trip without being plugged in, I arrive with a completely drained phone. That is not acceptable.

So instead of the charging port being used once every couple of days with the phone more or less stationary (as the designers of micro USB probably intended), most people, myself included, leave the phone plugged in via the charging port continuously, putting constant strain on the connector.

What that means for me is that my phone was dead after 9 months of heavy use. The connector became bent internally, no micro USB cable would fit anymore, and I was in Detroit with only an approximate idea of how to get home and a dead phone. After a while of trying to force the cables I had on hand into the port, the motherboard cracked.

If you visit your smartphone insurance site, you'll see a failed charging port listed as one of the reasons for submitting a claim, right under lost and stolen. Essentially, micro USB is too fragile for this application, and Apple and your insurance company both know it.

Luckily I didn't have to go through my insurance; I got a new phone under warranty. The first thing I did was buy a Qi module to place inside the phone. Qi charging isn't feasible in the car; it's most feasible at work, so roughly 1/3 of the time I'm charging wirelessly. With that much less wear on the connector, I expect the port on this new phone to last correspondingly longer, perhaps a year or more.

Clearly, Samsung needs to figure out how to make the micro USB charging port more robust but Qi will help you until that happens.

Comment Bad company (Score 1) 419

They had the worst selection in the industry, and once they pushed out the mom-and-pop stores they jacked their prices up too high. I don't watch television or movies myself, but my mother switched to Netflix by mail. Personally, I never pay for video; I watch what I need on YouTube.

Comment After 30 years of programming (Score 5, Insightful) 598

Forget about having to learn any specific language or environment. You should be able to pick up any language or environment on the job.

You need to learn how to plan, estimate how long that plan will take to complete, and finish it on time. Very few programmers I've worked with are any good at estimating how much time they will take to complete anything. The worst offenders take double the amount of time they say they will.

Forget about specific computer science trivia. You can look that all up, and it's all available in libraries with various licenses. When you're starting a new job, refresh yourself on how that problem is already being solved. If you need a refresher on a specific computer science concept, take some time and do so.

With this advice you won't burn out at age 25.

Comment Missing the point (Score 1) 381

I didn't read the article, so your mileage may vary, but I have been programming since I was around 7 or 8 years old and I just turned 38.

Here goes. The vast majority of code that you will need to write and maintain does simple things. It should be simple and expressive so the next person can both fix your mistakes and add new behavior.

Every once in a long while you will be given something fairly complex to do. As you gain experience, you learn to isolate that complex piece: write tests and drivers for it, special debugging code, documentation and wiki entries, and otherwise treat it specially so it doesn't become a burden.
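A hedged sketch of what that isolation can look like in practice (the routine, the debug flag, and the test driver are all hypothetical, not from anyone's actual codebase): the one genuinely tricky function lives behind a simple API with its own debug hook and tests, so the rest of the code can treat it as a black box.

```python
# Hypothetical example: the one tricky routine gets its own module,
# debug hook, and test driver, so callers can treat it as a black box.

DEBUG = False  # flip on to trace the tricky code without touching callers

def levenshtein(a: str, b: str) -> int:
    """Edit distance -- the 'complex piece' isolated behind a simple API."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        if DEBUG:
            print(f"row {i}: {curr}")
        prev = curr
    return prev[-1]

# A tiny driver that doubles as documentation of expected behavior.
assert levenshtein("kitten", "sitting") == 3
assert levenshtein("", "abc") == 3
assert levenshtein("same", "same") == 0
```

The point isn't the algorithm; it's that the complexity stays fenced off where the tests and debug output can watch it.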

Under 1% of code does tricky things. It is true that you will find non-trivial code in academic papers, maybe something like a cryptography research paper, and eventually that code may become something like OpenSSL. But 99% of what you write will be simple database, GUI, or text-processing work, or something else fairly mundane. That code is arguably much more important: the cryptography routine won't need much maintenance, but GUI and database requirements change all the time.

If you find yourself always writing mind-numbingly difficult code, perhaps you need to relax a little bit, because you won't make it 30 years.

Comment To spit (Score 1) 323

My analysis is that "to spit" was conserved because it is an onomatopoeia, or onomatopia for you picky Brits.

Some of the other words are not much more than ma, ba, da, etc., the sounds that babies make, which form the cognitive basis for all languages. Some Google sleuthing will reward you with some interesting papers on this.

Comment de Icaza (Score 5, Insightful) 815

Please refrain from attacking de Icaza for these simple reasons.

Like Stallman, de Icaza has donated countless hours of organization and programming time to Linux. Neither got rich as a result. Politics aside, Linux is about superior engineering, even if only as a side effect. Because of the efforts of these two individuals, among many others, Linux is now the most popular operating system on the planet. By any measure, they were and are victorious. Android is closing in on a billion users, and regardless of what Google's marketing materials may tell you, Android is a Linux distribution, and GNU and GNOME have been perfecting Linux distributions for over two decades.

I understand that Android does not ship with much GNU or GNOME software, but GNU and GNOME are what built Linux. Without either, the foundations upon which Android runs would never have accreted enough functionality to even think about running a smartphone.

As mostly non-rich people, often not closely allied with specific companies, we don't have publicists or agents. We don't come off as polished. We don't have speech writers. Forgive us for seeming offensive, rude, obnoxious, conceited, full of ourselves, or some other adjective. We're people, and as engineers we're trained to traffic in the honest truth. Once you meet us you'll like us, for the most part. And even if you don't, enjoy using our software. Contribute if you like.

Comment Correct observation, wrong understanding (Score 4, Interesting) 342

Yes, hardware is super cheap. That's because we make it all in China. China has a huge labor base that has no say whatsoever in the political system. Labor and environmental laws, lax as they are, are not enforced.

However, the Chinese economy is beginning to falter and labor unrest is on the rise. I used to think that Chinese pay would normalize with the West and that manufacturing would move to cheaper markets. Now I'm beginning to think differently. There will be major political unrest in China, supply chains will be severely disrupted, and hardware will move back to expensive labor markets, not cheap ones. Cheaper markets just don't have the infrastructure to match China and the West. Observe what happened in Thailand last year because they couldn't deal with a simple flood.

So this period of super-cheap hardware, fueled by the greed of CEOs, will come to an end; factories will move back to the West, and things won't just be a bit more expensive, they will be considerably more expensive, because of technical expertise lost to a Chinese state in chaos or decline.

Comment QR code horizon (Score 1) 127

QR codes are extremely unlikely to persist longer than ten years. If you've programmed a point-of-sale system like I have, you know there are more barcode encoding schemes than you can shake a stick at. QR codes are just the current encoding fad, and they will soon be replaced by something better.

Comment Incorrect analysis (Score 1) 283

I feel that this is an incorrect analysis. Depending upon the problem to be solved, all programmers will adapt the appropriate style.

When life is on the line, all programmers are conservative. When money is on the line, cautious. When writing a one-off script to test a programming technique, liberal. Even management, who tend to push programmers to be less cautious, adjust their demands for the situation.

Comment Odd question (Score 1) 1086

Computer programs are mathematical expressions, so I am using math virtually all the time at my programming job.

I have never solved a differential equation for my work, or even while programming for fun, but I no doubt use techniques I learned in Calc I-IV without even thinking about it. Practicing difficult math problems exercises the same mental muscles you use when programming.

Discrete math and digital logic design? Yes, I use what I learned there sporadically. The most important part of discrete math for most programmers is understanding computational cost. You don't need a discrete math course to learn Big O notation, but the course puts it in context with other ideas. I don't do hardware, but digital logic design was my course in logic. To read someone else's code and fix his stupid mistakes, you have to understand his flawed logic and repair it.
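To make the computational-cost point concrete, here's a minimal sketch (mine, not from any course) of the same membership question answered at O(n) and O(log n):

```python
# Illustrative sketch: one question, two computational costs --
# O(n) linear scan vs O(log n) binary search on sorted data.
import bisect

def linear_contains(items, x):
    # O(n): may have to look at every element.
    for item in items:
        if item == x:
            return True
    return False

def binary_contains(sorted_items, x):
    # O(log n): halves the search range at every step.
    i = bisect.bisect_left(sorted_items, x)
    return i < len(sorted_items) and sorted_items[i] == x

data = list(range(0, 1_000_000, 2))  # sorted even numbers
assert linear_contains(data, 999_998)
assert binary_contains(data, 999_998)
assert not binary_contains(data, 7)  # odd number: absent
```

On a million elements the difference is a few hundred thousand comparisons versus about twenty, which is exactly the kind of context Big O gives you.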

Signals and Systems, and Probability and Random Processes? I got a terrible grade in the first class, but when what the professor was trying to teach finally sank in 10 years later, signal processing became a lot clearer. Much software can be thought of as a signal processor: you have an input, you process it, and you have an output. Basic probability is also quite handy when programming. Pretty much the only distribution I think about is the normal distribution. The ideal piece of software is correct in all cases, but you always have to consider the probability of situations when making cost trade-offs or making a case to management about the importance of something you want to work on.
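The input-process-output view can be made concrete with a toy example. This moving-average filter is a hypothetical sketch of my own, not anything from those courses:

```python
# Sketch of 'software as a signal processor': samples in,
# a simple moving-average filter, smoothed samples out.
def moving_average(signal, window=3):
    """Smooth an input signal -- the classic input -> process -> output shape."""
    out = []
    for i in range(len(signal) - window + 1):
        out.append(sum(signal[i:i + window]) / window)
    return out

# Each output sample averages `window` consecutive input samples.
assert moving_average([1, 2, 3, 4, 5]) == [2.0, 3.0, 4.0]
```

A GUI event loop, a log parser, or a database query pipeline all have the same shape; only the "filter" in the middle changes.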

Comment Re:They don't teach languages (Score 3, Insightful) 193

It's almost like that, except they teach data structures, object-oriented programming, and other idioms useful to both academics and industry. They don't exactly teach you "programming languages," except maybe in a compilers course, and even then it's about more than the language itself: it's how to design a language, grammatically specify it, and "compile" it into another language. (I'm making educated guesses; I haven't taken the course, but I've studied gcc.)
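As a hedged illustration of "compiling one language into another" (the grammar and function names here are made up for this sketch, not from any actual course), here's a toy recursive-descent translator from infix arithmetic to postfix notation:

```python
# Toy 'compiler': translate infix arithmetic to postfix (RPN),
# driven by a deliberately tiny hypothetical grammar:
#   expr -> term (('+'|'-') term)*
#   term -> NUMBER | '(' expr ')'
import re

def to_rpn(src):
    tokens = re.findall(r"\d+|[+\-()]", src)
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat():
        nonlocal pos
        tok = tokens[pos]
        pos += 1
        return tok

    def term():
        if peek() == "(":
            eat()           # '('
            out = expr()
            eat()           # ')'
            return out
        return [eat()]      # NUMBER

    def expr():
        out = term()
        while peek() in ("+", "-"):
            op = eat()
            out += term()
            out.append(op)  # operator comes after its operands in RPN
        return out

    return " ".join(expr())

assert to_rpn("1+2-3") == "1 2 + 3 -"
assert to_rpn("(1+2)-3") == "1 2 + 3 -"
```

Each grammar rule becomes a function, which is the core idea a compilers course builds on before scaling it up to real languages.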

You can't teach anything with a hypothetical language. That would be far too abstract, and difficult to grade. You have to settle on a language and inherit its flaws, design compromises, and strengths. I disagree that texts use pseudocode; at least in my experience, they use some, but not a whole lot.

When teaching a student grammar, you first teach it in their native language. English embodies all sorts of biases and trade-offs, and lacks features found in other languages (grammatical gender, tone systems, and many more) while having quirks of its own (irregular verbs). You have to direct your grammatical instruction in an incomplete manner or things would be too abstract for the student, and when students learn new languages they have to learn new features along the way. Once you start looking into many, many languages, it's pretty damned cool: you thought you knew how languages worked based upon your own, but you begin to see how languages work in general. All of this, of course, applies to the formal languages we use for programming.
