
Submission Summary: 0 pending, 121 declined, 51 accepted (172 total, 29.65% accepted)

Android

Submission + - Petition to make Patent Trolls PAY (whitehouse.gov)

jd writes: "The makers of X-Plane, Laminar Research, are unhappy. Very unhappy. They are being sued by a patent troll (Uniloc) over using an industry-standard Android library for copy protection. Essentially, if the troll wins, it will shut down Android (and, by implication, the Kindle), because existing app developers can't pay the sort of money being demanded. Open Source may survive, but most Android apps are not Open Source.

Copy protection brings its own issues, but setting those aside, this is a serious effort to bring patent trolling (and software patents) under some sort of control. This is one of those times where the Slashdot Effect could really be useful. If enough people sign, given the increasing hatred in industry towards trolls, we might see something done about it for a change."

Programming

Submission + - What modern paradigms are worth pursuing? (aosd.net)

jd writes: "There seem to be a number of new paradigms emerging in the programming world. Templates and other similar features don't seem to have solved the problems of complexity in modern software, with the inevitable result of people inventing other forms of abstraction.

Of these, the two that seem the most significant are Aspect-Oriented Programming (an example of a compiler can be found here) and Feature-Oriented Programming (again, a sample compiler: http://wwwiti.cs.uni-magdeburg.de/iti_db/forschung/fop/featurec/).
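For anyone who hasn't met the idea, aspect-oriented programming factors cross-cutting concerns (logging, timing, access control) out of the main code path and "weaves" them back in at defined join points. A minimal sketch in Python, using a decorator as a stand-in for a real AOP weaver (the names here are illustrative and not taken from either compiler linked above):

```python
import functools

def logging_aspect(func):
    """Advice woven around a join point (here: any call to func)."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"entering {func.__name__} with {args}")
        result = func(*args, **kwargs)
        print(f"leaving {func.__name__} -> {result}")
        return result
    return wrapper

@logging_aspect
def transfer(amount, fee=0.01):
    """Core business logic, free of any logging code."""
    return amount * (1 - fee)

net = transfer(100)
print(net)
```

The point of the real thing is that `transfer` never mentions logging at all; the aspect can be applied to, or removed from, whole families of functions without touching them.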

Intel, on the other hand, is disregarding the complexity of the problems and is focusing on the complexity of the solutions. They have bought and fully opened the Cilk++ frontend for G++, which adds task-level parallelism to C++ programs.
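Cilk's model is fork-join parallelism: cilk_spawn forks a subcomputation, cilk_sync waits for it. A rough Python analogue of the classic Cilk fib example, using only the standard library (this is my illustration of the model, not Cilk's actual runtime):

```python
from concurrent.futures import ThreadPoolExecutor

def fib(n):
    """Plain serial fib, used as the work unit."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

def pfib(n):
    """Fork one branch to another thread, compute the other, then join."""
    if n < 2:
        return n
    with ThreadPoolExecutor(max_workers=2) as pool:
        x = pool.submit(fib, n - 1)   # roughly cilk_spawn
        y = fib(n - 2)                # continue in the current "strand"
        return x.result() + y         # roughly cilk_sync

print(pfib(10))  # 55
```

In real Cilk the spawned branches themselves spawn, and a work-stealing scheduler keeps all cores busy; the sketch only forks at the top level to stay simple.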

But are these actually any use? I don't expect to see businesses crying out for coders experienced in FOP, nor do I expect to see complex projects such as KDE exploit such features. And although Cilk and Cilk++ have now been out a while, I can't name a single program that uses them.

Are they underused because nobody's heard of them or because nobody who has heard of them has found anything they're a good solution for? Are additional layers on top of Object Oriented languages the equivalent of Fifth-Generation Languages — a warning flag that the entire approach has hit a brick wall, requiring a rethink rather than a new layer?"

AMD

Submission + - CPU competition heating up in 2012? (eejournal.com)

jd writes: "2012 promises to be a fun year for hardware geeks, with three new "Aptiv-class" MIPS64 cores being circulated in soft form, a quad-core ARM A15, a Samsung ARM A9 variant, a seriously beefed-up 8-core Intel Itanium and AMD's mobile processors. There's a mix here of chips actually out, ready to be put on silicon, and in the last stages of development. Obviously these are for different users (mobile CPUs don't generally fight for market share with Itanium dragsters), but it is still fascinating to see the differences in approach and the different visions of what is important in a modern CPU.

Combine this with the earlier news on DDR4, and this promises to be a fun year, with many new machines likely to appear that are radically different from the last generation.

Which leaves just one question — which Linux architecture will be fully updated first?"

Submission + - Handling large amounts of data with complex relationships? (google.com)

jd writes: "This is a problem I've mentioned in a couple of posts, but I really need the expert advice only Slashdot can offer. I have a lot of old photos (many hundreds) and old negatives (about 7,500 or so) covering 150 years and four different branches of the family.

The first challenge is to find a way to index every scan (date, geography, people) to be able to relate the images. Google+/Picasa doesn't even come close to what is needed — its capacity to relate information is very limited.
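For what it's worth, the relational part of this is a modest database problem: one table per entity and a many-to-many link table covers "relate the images". A hedged sketch with Python's built-in sqlite3 (the table and column names, and the sample data, are mine, purely for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for a real archive
conn.executescript("""
CREATE TABLE scan   (id INTEGER PRIMARY KEY, path TEXT, taken TEXT, place TEXT);
CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT, branch TEXT);
CREATE TABLE appears(scan_id  INTEGER REFERENCES scan(id),
                     person_id INTEGER REFERENCES person(id));
""")
conn.execute("INSERT INTO person VALUES (1, 'Great-aunt Maud', 'maternal')")
conn.execute("INSERT INTO scan VALUES (1, 'neg_0001.tif', '1923-06-01', 'Leeds')")
conn.execute("INSERT INTO appears VALUES (1, 1)")

# All scans showing a given person, in date order:
rows = conn.execute("""
    SELECT s.path, s.taken FROM scan s
    JOIN appears a ON a.scan_id = s.id
    JOIN person p  ON p.id = a.person_id
    WHERE p.name = 'Great-aunt Maud' ORDER BY s.taken
""").fetchall()
print(rows)
```

The hard part isn't the schema, it's the data entry for 8,000 scans; but any tool that can't express "every photo of this person, in this place, in this decade" as a query isn't going to cope with four family branches.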

The second challenge is to identify major landmarks. Few of the pictures have any information and whilst I can identify some places I cannot identify everything. Not even close. Searching the web for similar images using the image as the "keyword" — that is an interesting challenge.

The third challenge is to store the images. Each of the scans is around 3.5 gigabytes in size using CCITT Group 4 compressed TIFF files. That gives me a storage requirement of around 28 (SI) terabytes (about 25.5 tebibytes), which is more than I really want. Since I am producing a digital backup of the negatives, I don't want to lose resolution or detail where I can avoid it. Clearly, I can't avoid it completely, as I can't afford a personal data silo, but keeping loss to a minimum is important.
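The arithmetic behind that figure, for anyone who wants to plug in their own scan sizes (the negative and photo counts are from the submission; the half-resolution line is my own illustration of the trade-off):

```python
negatives = 7_500
photos = 500             # "many hundreds" taken as a round figure
per_scan_gb = 3.5        # CCITT G4 TIFF per scan

total_tb = (negatives + photos) * per_scan_gb / 1000  # SI terabytes
print(f"{total_tb:.0f} TB at full resolution")

# Halving linear resolution quarters the pixel count, so roughly:
print(f"{total_tb / 4:.0f} TB at half resolution")
```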

What would people suggest as the best solution to these various technical problems? Besides getting a brain transplant and a new hobby."

Science

Submission + - You really are what you know (bbc.co.uk)

jd writes: "There has been research for some time showing that London cab drivers' brains differ from other people's, with considerable enlargement of the areas dealing with spatial relationships and navigation, and follow-up work showing that this wasn't simply a product of driving a lot.

However, up until now it has been disputed whether the brain structure led people to become London cabbies or whether the brain structure changed as a result of their intensive training (which requires rote memorisation of essentially the entire street map of one of the largest and least-organised cities in the world). Well, this latest study answers that. MRI scans before and after the training show that the relevant regions of the brain grow substantially as a result of the training, and that they're quite normal beforehand.

The practical upshot of this research is that — even for adult brains, which aren't supposed to change much — what you learn structurally changes your brain. Significantly."

Submission + - When and How to deal with GPL violations? (nicta.com.au)

jd writes: "There are many pieces of software out there, such as seL4 (kindly brought to my attention by another reader), where the vendor has indeed written something they're entitled to close-source, but where their closed-source licence covers their modifications to GPLed software such as the Linux kernel.

Then there's a second type of behaviour. CodeSourcery produced two versions of their VSIPL++ image-processing library, one closed-source and one GPLed. It was extremely decent of them. When Mentor Graphics bought them, they continued developing the closed-source one and discontinued, then deleted, the GPL variant. It's unclear to me whether that's kosher, as the closed variant must contain code that was GPLed at one point.

Here's the problem: complaining will, at best, get us code that maybe four or five people actually care about, while making corporations leery of any similar work in future, where that work would be of greater value to a greater number of people.

So the question I want to ask is this: When is it a good time to complain? By what rule of thumb might you decide that one violation is worth cracking down on and another should be let go, to help encourage work we're never going to do ourselves?"

Science

Submission + - Open Source Cancer Research? (guardian.co.uk)

jd writes: "Dr Jay Bradner is claiming that it is possible to conduct cancer research using open source methodology. Certainly, his research lab has produced a few papers of interest, though the page describing the research is filled more with buzzwords (post-genomic?) and hype than with actual objectives and serious strategies. I'm certainly not seeing anything that fits either the "open source" or the crowdsourcing model.

Certainly, there are some areas where open source really is exceedingly useful in science.

Then, there are plenty of projects that use volunteers to help solve complex problems.

So, I'm going to ask what is probably a dumb question — is there actually anything new that science can do with open source techniques? Has that path been mapped out, or are there actually new (as opposed to merely buzzword-compliant) approaches that could be followed which would get useful results?"

Science

Submission + - Brain uses Self-Modifying Code (bbc.co.uk)

jd writes: "Each and every brain cell alters its own DNA thousands of times over a person's lifetime, say researchers from the Roslin Institute in Edinburgh, Scotland.

The paper, formally published in Nature (abstract visible, article behind paywall), describes the mechanism by which this happens.

I have written to the lead researcher to get confirmation that this is actually a change in the DNA sequence itself and not a change in the epigenome that alters what the DNA transcribes to. He has kindly written back to confirm the findings. It IS a change in the DNA. Every brain cell in you has a genome unique to itself.

In short, the brain is a cluster in which each node runs self-modifying code, a practice no computer scientist or software engineer would dream of attempting, considering it far too fragile, too unpredictable and too difficult. At the university I went to, you'd have been murdered in the hall for proposing even single-threaded self-modifying algorithms, never mind a few trillion tightly-coupled threads.
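To make the analogy concrete, here is a toy self-modifying "cell" in Python: each call edits its own source template and recompiles itself before running. It is deliberately fragile, which is rather the point; this is my illustration, not anything from the paper:

```python
SOURCE = """
def count():
    return {n}
"""

n = 0

def make_counter():
    """Edit the source, recompile it, then run the new version."""
    global n
    n += 1
    namespace = {}
    exec(SOURCE.format(n=n), namespace)   # rewrite + recompile the "cell"
    return namespace["count"]()

print(make_counter())  # 1
print(make_counter())  # 2
```

Every call leaves behind a function whose behaviour no longer matches its original source, which is exactly why debugging such code (or such a brain) is so hard.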

The hope in genetics is that this will lead to a better understanding of genetic diseases, such as the various forms of dementia. My fear is that it will have the opposite effect: you can't exactly sequence every cell in the brain of a live patient to see what is going on, which may lead geneticists to write the problem off as too hard.

The other consequence of this find is that we are all chimeras. Human DNA can no longer be regarded as a single thing in a single person, with only a few exceptional cases. The terms "chimera" and "gestalt" apply to everyone, on a fantastic scale. Which makes them meaningless, unless they get redefined to work around the problem.

Arguably, that's an academic point at the moment. Nomenclature is nothing too serious, and there's no actual hard evidence that this would cause problems in DNA forensics, though the mere possibility might cause problems in the courtroom, whatever the science itself says about the impact."

Technology

Submission + - Bloodhound SSC partially opens source (bloodhoundssc.com)

jd writes: "I've been monitoring the progress of Bloodhound SSC (the car aiming for the 1,000 MPH record) and it looks like they're opting for some interesting tactics. In April, the car itself went partially open source, with a complete set of schematics and specifications and an invite for engineering bugfixes. According to them, it's the first time a racing team has done this. Sounds likely enough. The latest patches to be released were a tripling in fin size and a switch to steel brakes because carbon fibre would explode."
Politics

Submission + - Wisconsin tried to ban Internet2 (the-scientist.com)

jd writes: "The Wisconsin legislature attempted to pass a budget that would ban any school, college or university from being a member of Internet2 or WiscNet on the grounds that such networks "unfairly competed" against commercial offerings.

Of course, Internet2 is already partly supplied by those very same commercial vendors and last I heard there weren't too many DSL providers offering 100 gigabit pipes running onto 9 terabit backbones. Nor, as we all know, do that many ISPs offer IPv6. So who, precisely, were Wisconsin concerned about?

For now, there has been a reprieve. But the legislature has made it clear that academic networks will not be tolerated in future and it does not seem far-fetched to expect other legislatures to prohibit such systems."

Science

Submission + - Just In: Yellowstone is big (livescience.com)

jd writes: "Really big. By using electrical conductivity tests rather than seismic waves, geologists have remapped the Yellowstone caldera. Whilst seismic waves indicate differences in the reflectivity of different materials, they don't show up everything, and contrast isn't always great. By looking at electrical conductivity instead, different characteristics of molten and semi-molten rock can be measured and observed.

The result: the caldera is far larger than had previously been suspected. This doesn't alter the chances of an eruption, and it's not even clear it would change the scale (prior eruptions are very easy to study, as their deposits are on the surface), but it certainly changes the dynamics and our understanding of this fierce supervolcano."

Games

Submission + - Realtime Worlds Goes Under (bbc.co.uk)

jd writes: "On June 29th, Realtime Worlds released APB in North America. Less than two months later, they have gone bankrupt with the loss of 200 jobs (an additional 60 having been shed last week). Founded by the creator of Lemmings and Grand Theft Auto, the company was probably the largest games outfit in the UK (Scotland, to be exact), so this is more than just a loss to gamers. According to the article, APB had a poor reception, so it is unclear whether this failure was genuinely a result of poor market conditions (as claimed in the article) or of a failure to understand the gamers."
Hardware

Submission + - When mistakes improve performance (bbc.co.uk)

jd writes: "Professor Rakesh Kumar at the University of Illinois has produced research showing that allowing communication errors between microprocessor components, and then making the software more robust, will actually result in chips that are faster and yet require less power. His argument is that at the current scale, errors in transmission occur anyway, and that the efforts of chip manufacturers to hide them, to create the illusion of perfect reliability, simply introduce a lot of unnecessary expense, demand excessive power and deoptimise the design.

He favors a new architecture, which he calls the "stochastic processor", designed to gracefully handle data corruption and error recovery. He believes he has shown such a design would work and that it would permit Moore's Law to continue to operate into the foreseeable future.

However, this is not the first time someone has tried to fundamentally revolutionise the CPU. The Transputer, the AMULET, the FM8501, the iWARP and the Crusoe were all supposed to be game-changers but died a cold, lonely death instead, and those were far closer to design philosophies programmers are currently familiar with. Modern software simply isn't written with the level of reliability the stochastic processor requires in mind (and many software packages are too big and too complex to port), and the volume of available software frequently makes or breaks new designs. Will this be "interesting but dead-end" research, or will the Professor pull off a CPU architectural revolution of a kind not really seen since the microprocessor was invented?"
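The software side of Kumar's proposal amounts to treating every computed value as suspect. A hedged sketch of the kind of defensive pattern involved — vote among repeated computations on unreliable hardware (the "processor" below is simulated; nothing here comes from the actual research):

```python
import random
from collections import Counter

random.seed(42)  # make the simulation repeatable

def flaky_add(a, b):
    """Simulated stochastic ALU: occasionally returns a corrupted result."""
    r = a + b
    if random.random() < 0.2:            # 20% soft-error rate
        r ^= 1 << random.randrange(8)    # flip one random low bit
    return r

def reliable_add(a, b, tries=10):
    """Error-robust software: repeat the computation and take the majority."""
    results = Counter(flaky_add(a, b) for _ in range(tries))
    return results.most_common(1)[0][0]

print(reliable_add(2, 3))
```

The obvious objection, and the one the submission raises, is cost: redundancy like this has to be cheaper than the power and design margin the hardware currently spends on perfect reliability, and existing software does none of it.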

Submission + - Car computers dangerously insecure (bbc.co.uk)

jd writes: "According to the BBC, it is possible to remotely control a car's brakes, dashboard, locks, engine and seatbelts. Researchers have actually done so, with nothing more than a packet sniffer to analyse the messages between the onboard computer systems and a means of injecting packets. There is no packet authentication or channel authentication of any kind, no sanity-checking and no obvious data validation. It appears to need a hardware hack at present, such as a wireless device plugged into the diagnostics port, but it's not at all clear that this will be a limiting factor. There's no shortage of wireless devices that must already make use of the ability to inject packets to turn the engine on/off, lock/unlock the doors, track the car, etc. If it's a simple one-wire link, all you need is a transmitter tuned to that wire."
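What these buses lack is any message authentication at all. A minimal sketch of what that could look like, an HMAC-tagged frame using Python's standard library (the frame layout and key handling are invented for illustration; real automotive buses have tight payload limits that make this harder in practice):

```python
import hmac
import hashlib

KEY = b"shared-secret-provisioned-at-factory"  # illustrative only

def send(command: bytes) -> bytes:
    """Frame = payload + truncated HMAC tag."""
    tag = hmac.new(KEY, command, hashlib.sha256).digest()[:8]
    return command + tag

def receive(frame: bytes) -> bytes:
    """Reject any frame whose tag doesn't verify (i.e. injected packets)."""
    command, tag = frame[:-8], frame[-8:]
    expected = hmac.new(KEY, command, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expected):
        raise ValueError("unauthenticated frame dropped")
    return command

frame = send(b"UNLOCK_DOORS")
print(receive(frame))
# A forged frame such as b"APPLY_BRAKES" + b"\x00" * 8 would be rejected.
```

Even this toy scheme would defeat the blind packet injection described in the article, though key distribution and replay protection are problems in their own right.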
