
Comment: Re:Time for GATT Article XX tariffs (Score 1) 427

by Desty (#47873121) Attached to: UN Study Shows Record-High Increases For Atmospheric CO2 In 2013

Surely, when the UK has a population of around 65 million, and China has a population of around 1400 million it makes a difference. We are talking about influencing government policy. So, we spend a huge effort changing UK policy, and at most we can effect a reduction in an output of:

7.7t * 65m = 500.5 million t ... while at the same time China is outputting:

7t * 1400m = 9800 million t.

The entire UK output is 5% of China's. If the UK can reduce its output by 20% (hugely unlikely, as just holding steady seems impossible to do), while the Chinese increase theirs by just 1% then the two effects cancel out (to some rounding error that I can't be arsed to calculate).

Focusing on those countries who are both raising their output the most and also have the largest populations (hello to you too, India) seems perfectly sensible.

So if it's "hugely unlikely" to reduce the UK's output, when their per capita emissions are higher than China's, how can we expect China to do it?

Surely it's both fairer and more realistic to treat the targets on a per capita basis and not penalise China for having a large population.

Comment: Re:R... (Score 1) 143

"Portable assembly language" is an oxymoron. And I have never heard anyone use that phrase to describe C.

A quick web search will solve that for you. The phrase has been bandied about for quite a few years now, although many people disagree on the topic.

Never mind building abstractions. The C language itself is a significant abstraction from the machine level. Only a small handful of operators and constructs in C have a close analogue to assembler statements (e.g., accumulation, shift and bitwise logical operators.) Therefore I maintain that it is not a low-level language.

While I certainly agree that C is a significant (and useful) abstraction from the machine -- or more specifically, the assembler -- level, I think you've glossed over quite a few things that are much closer in C to how it works at the machine code level, compared to most other languages. Some that spring to mind are:

  1. Pointers and pointer arithmetic -- and not just for fun; you need to use them to get things done.
  2. Array indexing -- this is little more than syntactic sugar atop pointer arithmetic. Indexing element i of an array is equivalent to referencing *(array_base + i), where the addition is implicitly scaled by the element size (at the machine level, the address is base + i * sizeof that type), and accordingly there is no bounds checking etc.
  3. No real strings -- just arrays of characters which suffer from the same problems above.
  4. No garbage collection or even refcounting -- if you want to return a string value from a function, you need to either take a buffer (and, ideally, the size of the buffer since you can't automatically determine it) and store the string there, or you need to dynamically allocate memory and return a pointer. Whoever called the function then needs to "free" that pointer later. That's some pretty low-level stuff you have to do just to pass some strings around!

These are just a handful of many things you need to live with in assembly AND in C, unless you use some non-universal, non-standard library for e.g. strings or GC (Hans Boehm's drop-in one seems pretty good, though). Additionally, C forces you to declare the type of each variable and function, yet cannot properly enforce type safety and has no type inference, which would have made the job easier.

C is undeniably less low-level than assembly, but I consider C to be a low-level language in an arbitrary line-in-the-sand sense. Of course, my view on this is no more valid than yours, since how we define "low-level" as some absolute marker is pretty subjective!

Comment: Re:R... (Score 1) 143

C is most certainly a low-level programming language. There's a reason people call it "portable assembly language". Of course, as with almost all programming languages, people build useful abstractions in C to bridge the gap somewhat. But that doesn't make C itself a high-level language, any more so than does the use of functions and macros to increase the expressive power of an assembly language.

Comment: Re:I've also had this happen with HFS+ (Score 1) 396

by Desty (#47240723) Attached to: One Developer's Experience With Real Life Bitrot Under HFS+
JPEG is quite robust to corruption, and even PNG's lossless compression seems to be tolerant of a few stray bytes. However, encrypted files would probably be badly damaged by this sort of corruption.

This is the kind of situation where some form of transparent, redundant error-recovery system is extremely important. I'm sure that in the medium-term future (after everyone is using SSDs and the cost/capacity ratio falls much further) some kind of RAID setup will be the norm and these kinds of problems will become vanishingly unlikely.

Comment: Re:Simple (Score 1) 92

by Desty (#46914119) Attached to: Why Speed-Reading Apps Don't Work

Actually you're completely wrong on that; the data suggests that no one can demonstrate long-term photographic memory, which is true. If I read a book today, I can probably recall the content for about a week or so; after that it will start to slip, and certainly by a month or two out I'll retain very little of the information I read over. However, that also doesn't matter: usually, if something is important enough to remember longer than a week out, you'll have to read it more than once.

Actually you are completely wrong about that. The data suggests that no one can demonstrate even SHORT term photographic memory.

You stated that you can "look over a page" and "load" all the information into your memory before comprehending it. This is an utterly ludicrous claim. You also stated that you can read at over 1000 WPM with very high comprehension. Also a laughable claim that is not supported by any scientific studies.

Comment: Re:Simple (Score 1) 92

by Desty (#46863999) Attached to: Why Speed-Reading Apps Don't Work
Agreed. There's only been one scientifically controlled study that suggested anything remotely like an eidetic memory, and the subject ("Elizabeth") later married the scientist running the study. Then they refused to participate in any further experiments. So, in effect, nobody in the world has ever successfully demonstrated either an eidetic memory, or (much less) an ability to read significantly faster than average while maintaining decent comprehension. Any anecdotal arguments (for example, Murdoch5's frankly laughable claims) otherwise are almost certainly pure Munchausen-esque nonsense and should be disregarded with prejudice.

Comment: This is an "argument to moderation" fallacy (Score 1) 388

by Desty (#46121729) Attached to: Edward Snowden and the Death of Nuance

Although it's certainly important to consider multiple viewpoints in a debate, it's wrong to assume that the truth must lie between two extreme positions.

Personally, I think the guy risked everything to expose what is clearly (to most people in the world) a bad and pointless system. He's certainly more deserving of a Nobel Peace Prize than Obama, who hasn't even shut down Guantanamo Bay yet. The ironing is delicious.

Comment: Re:See a psychologist. (Score 1) 384

by Desty (#45974953) Attached to: Ask Slashdot: How Can I Improve My Memory For Study?

When you get to a "normal" sleeping state, then you can look at improving memory. Reading is probably one of the best things to do regularly, followed by logic puzzles and memory games. Fish and high protein foods are good for brain function, but also can be a health concern due to mercury and other toxins found in nearly every fish today.

Apparently logic puzzles don't provide much benefit to brain function, even though it seems intuitive that they should. Strangely enough, computer games of various sorts have been shown to improve cognitive abilities. Playing Tetris for a little while every day actually resulted in a measurable increase in cortical thickness in one study, along with improved language skills of all things. Playing first-person shooters helped make certain parts of the brain process things more efficiently.

Then, for working memory and (I think) executive function, adaptive dual n-back (DNB) training has been shown to be quite useful, although the effect size might be smaller than initially reported.

You're absolutely right that proper exercise, diet and sleep are by far the most important things to have. Having poor quality sleep in particular has an enormous negative impact on long-term memory and can lead to atrophy of the hippocampus and parts of the prefrontal cortex. So I'd recommend spending a lot of effort on really fixing your sleep as it's worth proportionally more than probably any other intervention, and it's easy to keep fixed once you've gotten into a good routine. A good routine is at least 7.5 - 8 hours of decent uninterrupted shuteye at consistent times. Dietary and behavioural changes can also help improve sleep and ensure that you get good slow-wave activity (this is the part that usually suffers most as we age, and is linked with memory deficits). Try to measure your activity with a sleep tracker, and see what's going on.
