Comment Re:Damn... (Score 1) 602

I think the reason Asperger's Syndrome is classified as a disease or a disorder is that a lot of people with it have problems with social interactions, just as a person with dyslexia or even myopia has problems with reading. Some people even have problems taking care of themselves, so the spectrum varies. And wherever there is a problem, there is a desire to find a solution to it.

Ultimately I don't think this classification is intended to judge or put any sort of moral value on the condition, but to understand what's going on and to find ways to help these people cope better with their surroundings, and the surroundings to cope better with them. After all, there are quite a few successful people out there who have been diagnosed with, or are widely speculated to have had, this condition: Bill Gates, Albert Einstein, Bob Dylan, Daryl Hannah, Alfred Hitchcock, ... the list is long.

Comment Re:Alternative: XFCE (Score 1) 152

My pet peeve with all of those window managers is that they don't scale well on high-resolution displays, especially Xfce. I know Linus Torvalds wrote a critical post about this a while ago in connection with Apple's Retina displays on their MacBooks. But even on a relatively low-resolution 1080p display, fonts and the user interface don't scale well.

Comment Re:Why? (Score 1) 102

Nowhere did I state that it should be free. If you remove the "F" from the FOSS you are mentioning, then you are talking my language. While I'm a proponent of FOSS, I don't think all open source software should necessarily be free. If people are concerned about this openness, then perhaps some kind of encryption, or some other way to obfuscate the source code so that it is understandable only to the compiler, would be in order. In the end, though, I don't think people would want to obfuscate the code before distribution. After all, even binaries can be reverse engineered and cracked, so going open source shouldn't be that big of a deal.

Comment Re:Why? (Score 1) 102

The problem many CISC CPUs (such as x86-based CPUs) face today is that they are encumbered by legacy instruction sets in order to maintain backwards compatibility. I understand that many x86 CPUs have an abstraction layer that emulates some of these legacy instructions at the hardware level. The downside is that the die space required for this logic could instead be used for something that improves performance, rather than for maintaining backwards compatibility.

As an abstraction layer between hardware and software, CISC cannot be compared to the implementations I mentioned in my prior post. Assume Intel introduces a new instruction set that makes any competing CPU without it pale in comparison; call it SSE6. Precompiled software will not take advantage of this new instruction set; the software has to be recompiled. In the examples I mentioned, hardware support is determined at the driver level while applications take advantage of whatever is available. As we all know, hardware and its drivers/compilers go together like a horse and carriage.
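To make the contrast concrete, here is a minimal sketch of run-time dispatch, roughly the software-side counterpart of the driver-level approach. The "SSE6" above is hypothetical, so AVX2 stands in for it here; the __builtin_cpu_init/__builtin_cpu_supports builtins are GCC/Clang-specific and x86-only.

#include <stdio.h>

/* Sketch only: pick a code path at run time based on CPU features, so one
   binary can use newer instructions when they are present and still run
   on older CPUs. */
static void work_generic(void) { puts("using the generic code path"); }
static void work_avx2(void)    { puts("using the AVX2 code path"); }

int main(void)
{
    __builtin_cpu_init();                    /* GCC/Clang: populate CPU feature flags */
    if (__builtin_cpu_supports("avx2"))
        work_avx2();                         /* newer CPUs get the faster path */
    else
        work_generic();                      /* the same binary still runs elsewhere */
    return 0;
}

A single binary built this way picks up the faster path on newer CPUs without being recompiled, which is exactly what a plain precompiled binary targeting one fixed instruction set cannot do.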

Maybe the ideal CPU is EPIC-based, maybe it is a CISC that is not encumbered by legacy instructions, or maybe it is even a RISC. We will not know until we spend the time and research to find out. Most likely, what is optimal will depend on circumstances, or on the quantum mechanical properties of the materials used, which will likely change as newer and more efficient materials are discovered. Maybe we will eventually see all of these CPUs in one and the same system, as they are all good at specific tasks. So it would mean a great deal if existing software could immediately take advantage of new hardware features and optimizations as they reach the market.

Comment Re:Why? (Score 1) 102

Perhaps breaking compatibility between CPU generations is not a weakness of the VLIW/EPIC architecture per se, but rather a weakness in how people look at software and software distribution. First of all, why should software be distributed as precompiled binaries? A much better way would be to distribute the sources while maintaining a compiler/installation environment that handles the software automatically. This environment would optimize the software for the specific computer system and its particular hardware configuration during installation, and migrating the software to newer-generation systems would be a non-issue.
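As an illustration only: with today's tools the closest approximation is building from source on the target machine with something like cc -O2 -march=native, so the compiler uses whatever the local CPU offers. The __AVX2__ macro below is just one example of a feature flag GCC/Clang define when the build target supports it.

#include <stdio.h>

/* Rough sketch of "optimize at install time": the same source adapts to the
   machine it is built on when compiled with, e.g., cc -O2 -march=native.
   __AVX2__ is defined by GCC/Clang when the target supports AVX2; it is
   only an example of a hardware feature picked up at build time. */
int main(void)
{
#ifdef __AVX2__
    puts("built with an AVX2 code path for this machine");
#else
    puts("built with the generic code path");
#endif
    return 0;
}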

Another approach would be to add an abstraction layer between the hardware and the software, much like what is done with virtualization, Java, ZFS, LVM, DirectX, Crossbow and so on. That would make the software more independent of the underlying hardware...

Comment Re:Why? (Score 1) 102

Who is to judge whether developing and marketing the Itanium is worthwhile other than Intel themselves? Perhaps the development and marketing of these chips will give them valuable information that is useful for the development of future generation processors.

The EPIC architecture (which is viewed as a continuation of the VLIW architecture) is significantly different from other, more widespread architectures, and perhaps the performance issues exist because people have not yet figured out how to utilize such an architecture efficiently. So maybe one day, when the compiler tools mature, we will see EPIC CPUs with competitive price/performance on the market. But that's my two cents.

Btw, damn to the depths whatever muttonhead thought up 'all butt'!

Comment Re:There's always a downside (Score 1) 533

Wind power is not as clean and safe as you seem to imply. Wind turbines have caused mass deaths of endangered bird species that are struck by the blades in flight. Dead birds, especially raptors such as hawks, falcons and eagles, are commonly found piled up near those turbines.

Hydro power is a severe intrusion on the course of nature and the aquatic ecosystem. Many species, especially trout and crayfish, have disappeared from waters where a hydro power plant has been built, but nobody talks about it.

Comment Re:Not the big one (Score 1) 102

But will/does HAMMER2 have end-to-end checksumming like ZFS has? Will it support software RAID similar to ZFS raidz/raidz2/raidz3 (like hardware RAID 5/6 with single, double or triple parity, but safer)? Will it support ditto blocks, or par2-like single-disk/vdev redundancy for both data and metadata?

If it doesn't have the features above, that will be a deal-breaker, at least for me.

Other things I miss in some of these file systems are: defrag (even ZFS has potential fragmentation issues), the possibility to convert a raidz2 pool to raidz3 after adding an extra disk to provide the extra redundancy, and a laid-out "contingency plan" for when corruption occurs.
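For anyone unfamiliar with what "end-to-end checksumming" buys you, here is a toy sketch of the idea, assuming nothing about HAMMER2's or ZFS's actual on-disk formats: the checksum lives next to the block pointer in the metadata rather than inside the block, so a silent bit flip in the data is caught on read. FNV-1a stands in for the real checksums (ZFS uses fletcher4 or sha256).

#include <stdint.h>
#include <stdio.h>

struct block_ptr {
    uint64_t offset;     /* where the block lives on disk */
    uint64_t checksum;   /* checksum of the block's contents, kept in metadata */
};

/* FNV-1a, used here only as a simple stand-in checksum */
static uint64_t fnv1a(const void *buf, size_t len)
{
    const uint8_t *p = buf;
    uint64_t h = 1469598103934665603ULL;
    for (size_t i = 0; i < len; i++) {
        h ^= p[i];
        h *= 1099511628211ULL;
    }
    return h;
}

static void write_block(struct block_ptr *bp, const void *data, size_t len)
{
    bp->checksum = fnv1a(data, len);   /* record the checksum when writing */
    /* ...write data at bp->offset... */
}

static int read_block(const struct block_ptr *bp, const void *data, size_t len)
{
    /* ...read data from bp->offset... then verify before handing it back */
    return fnv1a(data, len) == bp->checksum;   /* 0 signals silent corruption */
}

int main(void)
{
    char buf[] = "some file data";
    struct block_ptr bp = { 0, 0 };

    write_block(&bp, buf, sizeof buf);
    printf("verify ok:  %d\n", read_block(&bp, buf, sizeof buf));

    buf[0] ^= 1;                               /* simulate a flipped bit */
    printf("verify bad: %d\n", read_block(&bp, buf, sizeof buf));
    return 0;
}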

Comment Re:Old Moto Razr II V8 (Score 1) 396

I also still use my old RAZR V3 and I absolutely hate it. The menus are sluggish as hell, I can only store about 60 or so text messages, and the email client is a pain in the rear.

The one thing that is good about it is the reception, which is abso-f:ing-lutely AMAZING compared to other phones, and yet the SAR level is among the lowest you can find on a phone. Where other people's phones fail, it still shows 100% reception. I don't think there is any phone out there, dumb or smart, that can beat this phone in that regard.

The phone is seeing its 6th year in my possession and I'm on my second replacement battery, which I paid about $10 for, and it is still going strong. I'm waiting for a good smartphone, but I have not yet seen one, so I'll stick with this one as long as I can make phone calls and it wakes me up in the morning. If I'm gonna let a phone get into my pants it better be a damn good one ;)

Comment Re:Always a niche (Score 1, Offtopic) 317

A good blackboard, or chalkboard as you call it, that is cleaned properly and often (which the ones at my primary university are) and operated by an experienced lecturer beats any whiteboard when it comes to readability.

A particular nuisance with whiteboards is when the pens are about to run out of ink and dry out; the text becomes barely readable, and the glossy surface doesn't exactly make things better. Chalk, on the other hand, always delivers full color or nothing at all, which makes it more reliable as a writing tool. The exception is when the board is wet, but with proper technique, which most lecturers have, you can overcome that.

My primary university has chalkboards in virtually all of the lecture halls. There were whiteboards in some halls in the past, but eventually they were replaced by chalkboards. Only in some departments do the smaller lecture rooms have a whiteboard; I guess it is more convenient for inexperienced users in smaller rooms.

At the other university, which teaches social sciences (such as economics, business administration and law), it is the other way around, but those boards are barely used in lectures. I have so far only met one lecturer who could use a whiteboard properly; everyone else's attempts at using one have been a total disaster.

Chalkboards come in different colors. We have dark green, dark blue, dark brown, dark red, and black/dark gray at our university, and I cannot say that one color is better or more readable than another. My guess is that factors other than color determine the quality of a chalkboard. I personally like the blue and red ones best, as I somehow find the color soothing. Our chalkboards are made of some kind of frosted glass, I think, and they are probably of the best quality available.

I know that some people suffer from the sound the chalk makes as it moves over the board. I'm not one of them, though, and I guess most people get used to it after a while. That's the only downside to the chalkboard I can think of. I take it that you are in high school; kids horsing around is not common at universities, so I cannot really relate to what you're saying about that.

Comment Re:Always a niche (Score 2) 317

While I agree that the past 20 years have brought technical advances providing previously unseen tools that could be useful for teaching, I don't think we should dismiss the old ways altogether.

Some subjects, or fields of research if you will, are actually best taught using chalk and a blackboard. One such example is mathematics. A well-trained lecturer who proves theorems and solves problems on a blackboard beats any PowerPoint any day of the week. In fact, math is one of the oldest subjects taught, and the so-called didactics (the science of teaching and instruction, or pedagogics) behind it have evolved over several hundred years and are well understood, at least at the university level. High school math is a joke as far as I can tell, at least the math that has been taught there during the past 3-4 decades or so.

I've also been to business schools/universities where the blackboard has been replaced with a whiteboard and the lecturer uses PowerPoint, and I can tell you right away that more efficient ways to ruin teaching are hard to come by! To put it simply, a subject such as math should never, ever be taught with PowerPoint slides and a whiteboard!

So the bottom line is: don't dismiss the good old ways that have been developed and refined for centuries!

But note that there are a lot of "new" subjects that have not yet found a good (consensus) way to be taught; examples include computer science, economics, finance, operations management, logistics, etc. It will be interesting to follow how the didactics behind them evolve over time.

Also note that average skills in math and language have steadily declined during the past 3-4 decades, at least in the Western world, so I would say that, by and large, the educational system has devolved rather than evolved in spite of computers and whatnot. I would even dare say that computers and all the gadgets around us have dumbed people down quite a bit. We don't need to be able to read and interpret maps anymore; we don't even need to be able to spell properly, since spell checkers take care of that. The fact is that computers and the technical means available do more and more of the thinking for us, and we should be careful about it, as these means can do more harm than good if we grow too dependent on them.

Comment Re:What are you smoking... (Score 1) 367

Maybe it is you who is confused. You are talking about the 2^N interpretation of these prefixes as if everyone in the world uses it. Well, I've got a news flash for you: they don't. In the SI system, one kilogram is 10^3 grams, one megawatt is 10^6 watts, one gigapascal is 10^9 pascals, and so on, and that is what is widely accepted. The 2^N convention is not. I understand it is reasonable to use it for bytes, because the unit itself holds 8 bits (which is 2^3 bits) rather than 10 bits, but the bit in and of itself does not have that oddity. One bit is one, just like one atom or one line of code (LOC). As far as I can recall, one kLOC is 1000 lines of code, not 1024.
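Just to spell out the difference in numbers, a minimal illustration (the kLOC figure is simply the example from the paragraph above):

#include <stdio.h>

/* SI prefixes are powers of ten; the binary ("kibi") convention is powers of two. */
int main(void)
{
    printf("1 kbit (SI)      = %d bits\n", 1000);      /* 10^3 */
    printf("1 Kibit (binary) = %d bits\n", 1 << 10);   /* 2^10 = 1024 */
    printf("1 kLOC           = %d lines of code\n", 1000);
    return 0;
}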

Comment Re:bits or bytes (Score 2) 367

Since we're talking 'bits' and not 'bytes', isn't it a little misleading to use 512 kbit/s to denote half a megabit per second? The convention of using the power of two closest to each decimal prefix usually applies to bytes, for memory-addressing reasons, but not to bits, as far as I know.

The memory address space, i.e. the set (in the mathematical sense) of all possible memory addresses, is always a power of 2. On a 16-bit memory bus the address space is 2^16 bytes = 64 kB. More can be used via memory banks, switching between them with some kind of binary multiplexer. On a 32-bit system the address space is 2^32 bytes = 4 GB, and on a 64-bit system it is 2^64 bytes = 16 EB (exabytes). On a 16-bit memory bus, for example, the address of a given byte is expressed as a sequence of 16 binary digits, e.g. 1101001010100101 (0xD2A5 in hexadecimal), which gives 2^16 possible combinations, and that is what defines the memory address space of the computer. This is why the power-of-two convention is used and widely accepted by the majority of computer users as a measure of the amount of memory in RAM and of hard drive storage.
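A tiny illustration of how the address space grows with the width of the address bus (printed in KiB so that the 2^64 case does not overflow a 64-bit integer):

#include <stdio.h>

/* Address space grows as 2^N with the number of address bits, which is why
   memory sizes naturally come out as powers of two. */
int main(void)
{
    for (int bits = 16; bits <= 64; bits += 16) {
        unsigned long long kib = 1ULL << (bits - 10);   /* 2^bits bytes, in KiB */
        printf("%2d-bit addresses: 2^%d bytes = %llu KiB\n", bits, bits, kib);
    }
    return 0;
}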
