
Comment C/C++? (Score 1) 1

Nice article. I knew some of it already, but I had not thought about function calls across multiple files. My question, however, is why people use C/C++ at all. I understand that C/C++ programmers need to jump through a lot of hoops to get the compiler to understand the big picture. This is why I prefer Java: there is no ambiguous code, and the compiler knows everything and every dependency.
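A minimal sketch of the cross-file problem the comment alludes to (the function and variable names here are my own, not from the article): when a function lives in another translation unit, the compiler must assume every call could return a different value or have side effects, so it cannot hoist the call out of a loop unless the programmer does it by hand.

```c
#include <assert.h>

/* Imagine get_limit() is defined in another .c file: the compiler sees
   only its declaration, so it must conservatively re-call it every
   iteration in sum_naive(). */
int get_limit(void) { return 100; }

int sum_naive(const int *a) {
    int s = 0;
    for (int i = 0; i < get_limit(); i++)  /* re-evaluated every iteration */
        s += a[i];
    return s;
}

int sum_hoisted(const int *a) {
    int s = 0;
    int n = get_limit();                   /* programmer hoists the call */
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}
```

Both functions return the same sum; the second simply does the optimization the compiler is not allowed to do on its own.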
Software

Submission + - How to make C/C++ compilers generate terrible code (futurechips.org) 1

An anonymous reader writes: Compilers are designed to make pessimistic assumptions when code is ambiguous. These pessimistic assumptions significantly limit the compiler's optimizations. Programmers can help compilers by avoiding code patterns that are known to confuse them. This article presents three examples of compiler-unfriendly code and explains them in depth. I found this post enlightening and novel, since I haven't seen a post on this topic before. It's a good read if you want to understand what a compiler does to your code.
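One classic example of the pessimistic-assumption problem described above is pointer aliasing (this is a general illustration, not necessarily one of the article's three examples): if two pointers might refer to overlapping memory, the compiler must reload through one of them on every iteration.

```c
/* Without restrict, the compiler must assume dst could alias src, so
   *src may change whenever dst[i] is written: it reloads *src each time. */
void scale_aliased(int *dst, const int *src, int n) {
    for (int i = 0; i < n; i++)
        dst[i] = *src * 2;
}

/* C99 restrict promises no overlap, so *src is loop-invariant and can be
   kept in a register for the whole loop. */
void scale_restrict(int *restrict dst, const int *restrict src, int n) {
    for (int i = 0; i < n; i++)
        dst[i] = *src * 2;
}
```

The two versions compute the same result for non-overlapping arguments; only the generated code differs.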
Software

Submission + - Is it necessary for programmers to learn hardware? (futurechips.org)

AustinAlert writes: "Intel's Aater Suleman explains how new programming platforms like multi-core processors, the iPhone, Android, and GPGPUs have changed mainstream programming. Extracting high performance from these new platforms requires more efficient programming. Understanding a handful of basic hardware concepts can be a very handy tool for today's programmers. This article explains how hardware knowledge has become essential for writing good code and lists some items every programmer must know (http://www.futurechips.org/tips-for-power-coders/programmer-hardware.html)."
NASA

Submission + - Cybersecurity, the next frontier for NASA engineer (scmagazine.com.au)

mask.of.sanity writes: After 28 June, some 8000 NASA engineers could be out of a job, but one cyber security training centre, linked to the space agency and situated nearby, hopes to recruit them.

These guys are some of the best and brightest engineers in the world, and the operators of the training organisation say they will do wonders to improve the state of information security.

Android

Submission + - Should programmers today learn hardware? (futurechips.org)

AustinAlert writes: "This article by an Intel hardware engineer makes a great point. Programming has indeed changed in recent years. Instead of writing code for infinite-resource desktops and laptops, we now write code for iPhones and Androids. These new platforms are more constrained in terms of processor performance, physical memory, and even virtual memory. To write competitive and powerful apps that do not drain the battery, programmers are forced to write efficient code. Actually, this is like the days when I used to write code for the Commodore, when every bit and byte literally mattered. I remember that companies preferred ninja coders who knew hardware to get an edge over the competition. This post is probably right: all of that is in fact coming back in the form of phones, multi-cores, and GPGPUs.

I agree with the author that knowing just a handful of concepts can come in very handy. For example, I have personally noticed that L1-cache-resident programs run 3x faster on the iPhone, which means a better user experience and less battery drain.

I like this article because it provides real-life code examples where knowing hardware helps. The author even provides a very comprehensive list of important concepts that every programmer must learn (http://bit.ly/mxgQMw). I don't know if I agree with his claim that the listed items can be learned in hours, but it definitely doesn't take weeks either. By the way, his tutorials on hardware topics are nice!"
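The cache-residency effect the submitter mentions is easy to illustrate (this sketch is mine, not from the article): both loops below compute the same sum, but the row-major walk touches memory sequentially and stays cache-friendly, while the column-major walk strides by a whole row per access and misses far more often on large matrices.

```c
#define N 256

static int m[N][N];

void fill(void) {
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            m[i][j] = 1;
}

long sum_row_major(void) {
    long s = 0;
    for (int i = 0; i < N; i++)        /* walks memory sequentially */
        for (int j = 0; j < N; j++)
            s += m[i][j];
    return s;
}

long sum_col_major(void) {
    long s = 0;
    for (int j = 0; j < N; j++)        /* strides N ints per access */
        for (int i = 0; i < N; i++)
            s += m[i][j];
    return s;
}
```

Same answer, very different memory behavior — exactly the kind of thing a programmer only notices after learning how caches work.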

Google

Submission + - Google releases video chat source code (computerworld.com.au)

angry tapir writes: "Google has released the source code for a technology that it hopes developers will use to embed real-time video and voice chat functionality in their Web applications. Google acquired the technology, called WebRTC (Web Real Time Communication), when it purchased VoIP (Voice over IP) software developer Global IP Solutions in 2010, for approximately US$68.2 million."

Comment Re:Nostalgic. Interesting. But single thread? (Score 1) 3

Nope, dude. Wrong argument! Speeding up a single thread speeds up each individual thread, in addition to helping the serial portion from Amdahl's law (dunno what you really meant there). Make the single thread faster and all threads become faster, which makes the whole program faster. Multi-threading is not magic; it just means multiple single threads running alongside each other, so single-thread performance is always relevant. LOL@nostalgic btw.
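The Amdahl's-law point in the comment above can be put in numbers (a standard formulation, not something from this thread): with parallel fraction p on n threads, speedup is 1 / ((1 - p) + p/n), and speeding up the serial part by a factor s lifts the whole curve.

```c
/* Classic Amdahl's law: p = parallel fraction, n = thread count. */
double amdahl_speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

/* Same program, but the serial (single-thread) portion is made s times
   faster: the overall speedup improves at every thread count. */
double speedup_with_faster_serial(double p, int n, double s) {
    return 1.0 / ((1.0 - p) / s + p / n);
}
```

For p = 0.5 on 2 threads the base speedup is about 1.33x; doubling single-thread speed on the serial half pushes it to 2x, which is the commenter's point that single-thread performance always matters.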

Comment Doesn't this mean iPhone needs serious work! (Score 1) 3

I see his point that in-order cores are becoming common. I mean, the iPhone has an in-order core, and iPhones are common, hence in-order cores are common ... no debate. What I am thinking about is the importance of the optimizations. Shouldn't all iPhone app developers know about them? It seems like a big win for all cell phone developers to follow some of these techniques. I do notice some very slow apps on my iPhone and wonder if his arguments play a role.
Programming

Submission + - Tiny cores are here, and they change programming (futurechips.org) 3

An anonymous reader writes: Intel is returning to in-order cores after two decades with Atom and Knights. ARM is already building in-order cores for iPhones, iPads, and Androids. IBM has switched to in-order cores after building generations of out-of-order cores. This indicates a clear trend that in-order cores are back in the mainstream. Highlighting the performance characteristics of in-order and out-of-order cores, Dr. Aater Suleman's article explains why programming for in-order cores is very different from programming for the now-traditional out-of-order cores. Thus, this new trend requires a change in compilers, tools, and programming techniques. Compilers need to get better at removing useless code and instruction scheduling. Programmers need to weigh new trade-offs and perform classic optimizations that have been forgotten. I liked this article particularly for the very simple code examples and a simple explanation of in-order and out-of-order differences. The message is clear: programmers and compilers need to understand in-order cores and target their code better.
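One scheduling trick in the spirit of the submission (my own illustration; I haven't verified it matches the article's examples): an in-order core stalls on a long dependency chain, so splitting a reduction into independent accumulators gives the compiler and the core independent work to overlap.

```c
/* One long dependency chain: each add waits for the previous one,
   which an in-order core cannot hide. */
long sum_chain(const int *a, int n) {
    long s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Two independent accumulators: the adds to s0 and s1 have no
   dependency on each other, so they can be scheduled back to back. */
long sum_two_acc(const int *a, int n) {
    long s0 = 0, s1 = 0;
    int i;
    for (i = 0; i + 1 < n; i += 2) {
        s0 += a[i];
        s1 += a[i + 1];
    }
    if (i < n)
        s0 += a[i];      /* odd leftover element */
    return s0 + s1;
}
```

An out-of-order core finds this parallelism in `sum_chain` by itself; on an in-order core the programmer (or compiler) has to expose it, which is the article's "classic optimizations are back" point.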
Apple

Submission + - Amazon Challenges Apple With Mac App Store (computerworld.com) 1

CWmike writes: "Amazon launched a Mac-specific application download store on Thursday that will compete with Apple's nearly five-month-old Mac App Store. The new subsection of Amazon's massive online store, dubbed 'Mac Software Downloads,' kicked off quietly Thursday. Amazon has long offered software downloads for both Windows and Mac customers, but this was the first time that the company called out its Mac-centric 'store.' The retailer, however, apparently did not want to goad Apple into another legal battle by mimicking its rival's 'App Store' moniker: The two companies are already in court over Amazon's 'Appstore for Android,' which Apple claims violates its trademark. Unlike the Mac App Store, which Apple opened in early January, Amazon's includes the popular Office for Mac line from Microsoft."
Programming

Submission + - What makes parallel programming hard? (futurechips.org)

An anonymous reader writes: Intel's Aater Suleman writes about why parallel programming is difficult. He uses real-life code examples to show why finding parallelism is difficult and specifying it is a daunting task. I was unaware that a major challenge in multi-threaded programming lies in optimizing parallel programs, not just getting them to run. Aater Suleman presents a full case study (http://www.futurechips.org/tips-for-power-coders/writing-optimizing-parallel-programs-complete.html) of how code is parallelized and the kinds of issues parallel programmers must tackle to get high performance. His analysis is insightful and the case study is very enlightening if you are unfamiliar with parallel code debugging. His article has already been featured on sites like insidehpc.com and multicore.info.
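A tiny taste of the "specifying the parallelism" step the submission describes (a generic pthreads sketch of my own, not the article's case study): split an array sum into chunks, give each thread a private partial sum so no lock is needed, then combine the partials after joining.

```c
#include <pthread.h>

#define NTHREADS 4
#define LEN 1000

static int data[LEN];

void fill_data(void) {
    for (int i = 0; i < LEN; i++)
        data[i] = 1;
}

struct chunk {
    int begin, end;   /* half-open range [begin, end) this thread sums */
    long partial;     /* private result: no sharing, no lock */
};

static void *sum_chunk(void *arg) {
    struct chunk *c = arg;
    long s = 0;
    for (int i = c->begin; i < c->end; i++)
        s += data[i];
    c->partial = s;
    return NULL;
}

long parallel_sum(void) {
    pthread_t tid[NTHREADS];
    struct chunk ch[NTHREADS];
    int per = LEN / NTHREADS;
    for (int t = 0; t < NTHREADS; t++) {
        ch[t].begin = t * per;
        ch[t].end = (t == NTHREADS - 1) ? LEN : (t + 1) * per;
        pthread_create(&tid[t], NULL, sum_chunk, &ch[t]);
    }
    long total = 0;
    for (int t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += ch[t].partial;   /* combine after joining */
    }
    return total;
}
```

Even this toy already hides the performance questions the article is about: chunk granularity, load balance, and keeping per-thread data apart so threads don't fight over the same cache lines.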
Idle

Submission + - Nazis taught some dogs to speak, others to talk. (nzherald.co.nz)

An anonymous reader writes: During WW2, some dogs were taught to communicate in primitive ways, such as tapping their paws to answer questions or barking different responses for the letters of an alphabet, but many German Shepherds were reportedly taught to talk in complete sentences! Talking dogs trained in Germany were part of the war effort, and Adolf Hitler and Hermann Göring were strong proponents of animal rights. Animal psychologists appeared in Germany in the 1920s, and their reports judged some dogs capable of almost human-like intelligence, even participating in thought-provoking activities.
