Comment Re: Distrust of scientists and not science... (Score 1) 173

Science itself does not have a bias, but scientists have biases, and they have vested interests: who is going to bite the hand that feeds it? That does not mean the bias is conscious, but as humans we all have biases, so the "expected" result has to be hidden behind some sort of double-blind methodology. For hair analysis: give the examiner the prime sample plus 10 or 20 samples with a similar mix of DNA origins (i.e. if it is Asian hair, mix 20 samples of Asian hair in with the suspect sample to be matched). For reporters: after they write a story, it should be vetted by a knowledgeable scientist to make sure the summary of the work is in line with what the scientific paper is actually indicating (often it is 90 to 180 degrees off of the publication).

Comment Re:Lifting a *PDP*?! (Score 1) 85

It was such a long time ago that I can't remember what it said on it.

It was probably about the size of a 4U rack (though it was not in a rack), and the bootstrap, I believe, was entered via a series of switches (it was not long, maybe 16 or so words; I think there were 16 switches). It would have been around 1978 (I believe I was 14, doing some free work in the datacentre, though the big computers were around 100 miles away), and it was probably not leading edge :p. I remember the tail end of an era: the programs were still on punch-cards that were loaded in and transmitted to the big computer (not the small one) for the real work. It probably disappeared a year or two after that work experience.

Comment Re: It is about balance and sufficient resources.. (Score 1) 85

HDD is many orders of magnitude slower than SSD, with much higher latency; SSD is orders of magnitude slower than memory; memory is orders of magnitude slower than CPU cache; and so on. Getting a much faster CPU (GHz-wise) does not increase the performance of the machine as much as many seem to be brainwashed into thinking. If you just upgrade to a CPU that is 30% faster, you will only see a fraction of that overall. It is all about having sufficient resources when you need them for the task at hand.
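
To put a rough number on that, here is a back-of-the-envelope sketch using Amdahl's law (the 40% CPU-bound fraction is an assumed workload for illustration, not a measurement):

    // Sketch: Amdahl's law applied to a CPU upgrade.
    // Assumption (illustrative): 40% of the task is CPU-bound; the rest
    // waits on disk, memory, and other slower components.
    let cpuBoundFraction = 0.4
    let cpuSpeedup = 1.3      // a CPU that is 30% faster

    // Overall speedup = 1 / ((1 - p) + p / s)
    let overall = 1.0 / ((1.0 - cpuBoundFraction) + cpuBoundFraction / cpuSpeedup)
    print(overall)            // ~1.10 -- only about a 10% real-world gain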

Performance for user computers (as opposed to servers) is very much about perception. If your application opens snappily to begin with, the user will feel the computer is faster. If there is sufficient CPU power when you need it, it has the same effect. It is all about balancing, and making sure that your slower components are not needed as much as your faster components. Most users have CPUs sitting 85% idle most of the time; getting a faster CPU will only increase the amount of idle time, not give the user a better experience.

The greatest performance boost in recent times is the advent and rapid improvement of the SSD. The stock CPU in many computers is improving in performance by single digits per generation, and the majority of applications don't tax even a Core M CPU (for the majority of users).

Comment Re:History repeating (Score 1) 85

I suspect the MacBook makes better use of its aluminum shell as a heat sink of sorts. One review said the laptop would become warm (but not hot) under load, and was not throttling. Lenovo had power-management issues early on with this line, something Apple has worked on over the last few major releases of OS X, which gives me some confidence that Apple is ahead of the curve there. The first few MacBook benchmarks seem to match what I would have expected (above the others), which seems to bear this out. BTW, are you sure you are not remembering a PDP? That was what we did with the old PDP (an 11, I think): if it did not boot, lift it, drop it, then boot again.

Comment Re:New Macbook (Score 1) 85

It may be as simple as how much the CPU/GPU is used in decoding the video stream (resolution, compression, etc.). In addition, the built-in player is going to be more efficient than MPlayerX and VLC: I have noticed the internal one can decode higher-resolution videos on a given machine than both of the others, I am guessing because it makes use of the GPU or something.

Comment Re:Automatic Reference Counting better... (Score 1) 211

It is functional at its core but not purely functional. I can write purely functional code in Scala; it is more difficult in Swift. Simple things like flow control in Swift are just that: flow control, not functions. In Scala, "if" is an expression that returns a value, similar to Java's ternary operator but more verbose, and "match" works the same way. With optional values I do not get to map them to something else; I have to unwrap them using the "if let" flow-control structure. Swift could easily add most of these and effectively make Scala redundant, but as the designer of the language said, it is not a priority at this time (aka don't count on it).
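
A minimal sketch of that flow-control point in Swift (illustrative values; in Scala the equivalent "if" would itself be an expression you could assign from):

    // Sketch: unwrapping an optional in Swift is a flow-control
    // statement ("if let"), not an expression that yields a value.
    let maybeName: String? = "world"

    if let name = maybeName {
        print("hello \(name)")   // runs only when the optional holds a value
    } else {
        print("hello nobody")
    }

    // The ternary operator is the expression form Swift does offer:
    let greeting = maybeName != nil ? "hello" : "goodbye"
    print(greeting)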

Yes, Scala is every bit as object-oriented as Swift (actually more so, since everything in Scala is an object, while in Swift the same cannot be said). I have NEVER heard Odersky admit Scala is a disaster. He has said that it needs a fundamental rethink, as in making it simpler and moving things from core to non-core, etc. All languages need that, though; Java could really, really use a rethink at its core as well... any aging language can. There are areas (edge cases) in Scala that are problematic, and compiling the language is really slow. And of course the maker of the JVM has decided they want to become a malware distributor...

What I am looking for is something along the lines of Scala's object-oriented / functional balance in an LLVM-based compiler that is cross-platform and open. Swift could easily get close enough if they wanted to. Scala can get there as long as they don't require exact language compatibility between the JVM version and an LLVM offshoot.

Comment Re:Automatic Reference Counting better... (Score 1) 211

There is a vast difference between an interpreted language such as Python and a low-level compiled language when you are talking about "reference counting". With a language such as Python it is really still garbage collection: the reference counts are managed by the interpreter at runtime, with a cycle collector on top.

With a compiled language on LLVM (a high-level, portable assembly language which in turn is compiled into machine code), there is no thread running garbage collection in the background; the Clang/LLVM compiler inserts the "free" calls as memory goes out of scope. This is of course a little simplified, but there is a difference.
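
A minimal sketch of that determinism in Swift (the wrapper class and names are illustrative, not any particular API):

    // Sketch: ARC cleanup is inserted by the compiler at scope exit;
    // deinit runs immediately, with no background collector involved.
    class FileHandleWrapper {
        let name: String
        init(name: String) {
            self.name = name
            print("opened \(name)")
        }
        deinit {
            print("closed \(name)")   // deterministic, not "eventually"
        }
    }

    func work() {
        let file = FileHandleWrapper(name: "log.txt")
        print("using \(file.name)")
    }   // compiler-inserted release here; "closed log.txt" prints now

    work()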

Apple had (optional) garbage collection for Objective-C right up until they started releasing iDevices, where its inefficiency becomes a losing game. When you are talking about a computer plugged into the wall, you won't notice whether an application is consuming n% more energy. When you have a finite amount of battery power, it becomes noticeable (especially on iDevices, but it has had a knock-on effect on laptops). Apple has worked hard both in the core operating system, scheduling CPU cycles to save energy on laptops, and in the application development platform itself, to squeeze just a little more out of the battery, which allows the battery to be smaller or extends the time the system can run on it.

When Java and Smalltalk were created, a garbage collector solved more problems than it created, since no one worried about energy efficiency... but platforms have changed since that time, and efficiency has to enter the equation.

Comment Re:Automatic Reference Counting better... (Score 1) 211

Swift is a nice language, and I get the feeling the creator/driving force behind it wants it open-source and cross-platform, but there is no commitment yet (and likely will not be for years, until it has completely matured). The only issue I have with Swift is that it does not fully embrace functional programming; it is really an object-oriented language at heart (with some functional niceties)... but then the language is designed for UI development, and the UIs themselves are object-oriented.
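
Those niceties are real, though; a small sketch with illustrative values:

    // Sketch: Swift's functional niceties, i.e. higher-order functions
    // over collections, inside an otherwise object-oriented language.
    let scores = [42, 7, 88, 15]
    let passing = scores.filter { $0 >= 40 }   // [42, 88]
    let doubled = passing.map { $0 * 2 }       // [84, 176]
    let total = doubled.reduce(0, +)           // 260
    print(total)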

I would really like some entity to get behind making an LLVM version of Scala -- or close to it (which would dump the JVM and of course the expectation of JVM garbage-collection).

Comment Rust looks like an exciting addition... (Score 1) 211

I spent the last 45 minutes watching an introductory video on Rust, and I believe it is an exciting new language that is badly needed. Yes, there are a lot of competing languages out there, but most of them are higher level than C / C++. C / C++ have had very little competition at their end of the language spectrum, and they are old and not very safe. Rust, on the other hand, is type- and memory-safe by default, although you can run "unsafe" code within it by explicitly saying so, at which point the compiler relaxes the checking a bit (a reasonable compromise). Most people will not need much, if any, "unsafe" code to be performant.
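
For comparison, the same safe-by-default idea with an explicitly marked escape hatch can be sketched in Swift (the language used for the other examples here, not Rust itself) via its Unsafe* pointer APIs:

    // Sketch: safe-by-default code with an explicit unsafe escape hatch,
    // shown via Swift's Unsafe* APIs as a parallel to Rust's `unsafe`
    // blocks (a rough analogy, not a claim the two are equivalent).
    var value = 41
    withUnsafeMutablePointer(to: &value) { ptr in
        ptr.pointee += 1     // raw-pointer access, explicitly opted into
    }
    print(value)             // 42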

The fact that there is another LLVM, cross platform compiler is also a plus for me.

That said, I don't write much C / C++ code because nothing I am writing needs it, but those languages are still necessary (and very popular), so the arrival of a newer competitor (not a 40+ year old language) is great... and exciting.

Comment Automatic Reference Counting better... (Score 1) 211

The end result is not going to "fragment" memory; it is about when and how memory management is done. Unfortunately, for the sake of allowing programmers not to know what they are doing or what is going on, garbage collection was implemented in object-oriented languages like Java, which is generally fine for server applications that run in a VM... but it has a cost. A better compromise (IMHO) is "Automatic Reference Counting", which is done at compile time and does not have to be hand-written, cluttering up your application... BUT you have to be aware of memory management (which is a good thing for programmers to have to know) and not program circular references where both references are strong. Best of both worlds (for the most part), without the unpredictable and resource-hungry garbage collection running in the background. The side benefit is that you get C/C++-efficient code that can be compiled down to the machine level instead of running in a VM.

There are generally three options available:

Manual Memory Management:
- Cons:
1. Painful to code: you have to write the memory management yourself, which adds extra bulk to a program
2. Easy to make simple mistakes that cause the system to crash or become unstable
- Benefits:
1. Instantly freed objects
2. Predictable behaviour
3. Smooth performance

Garbage Collection:
- Cons:
1. Garbage builds up
2. Nondeterministic
3. Performance stutters
4. Extra CPU cycles spent on a background process kicking in (drains the battery on portable devices)
- Benefits:
1. Development ease
2. Don't have to understand memory management
3. Reduces crashes

Automatic Reference Counting:
- Cons:
1. You have to be aware of memory management and not program circular references where both references are strong (i.e. you actually have to understand what you are doing) -- see the sketch after this list
- Benefits:
1. Most of the benefits of the other two memory-management schemes, with few of their cons
2. The "garbage collection" / memory management is inserted at compile time and is not a background thread running on your device

Comment Selected the wrong datatype = poor programmer (Score 1) 486

So they basically selected a bad datatype and wrote a very inefficient program to manipulate the data, and they use that as the basis for saying memory was the issue. The real issue was programming without thought to what the computer was actually doing. Is this what these two universities are teaching their students? Or were they purposely writing bad code to prove a point?

God help the world if these people ever have to program efficiently....

Comment Sounds mostly like sour grapes.... (Score 2) 269

I find a number of facts in the report to be in basic conflict. Most developers can't make a living through the App Store, yet they are afraid of Apple for some reason, even though they cannot make a living. The App Store makes it fairly simple for every Tom, Dick, and Harry to write an app and put it on the store shelves: they don't need to package it, and they don't need to set up their own web-sales site. The problem is that you have a bunch of app developers who think that if they write some small app, a trail of customers will beat a path to them and buy it; they think any stupid app will make money. A lot of small apps will drive down prices for those apps, and the smaller and easier the app is to make, the more competition there is.

I remember 30 years ago there were many substantive applications for some basic functionality... word processing. I don't know how many different ones were created, but there were quite a lot. My father (head of an institution) had 9 installed on his Windows computer just to compare them himself and see which ones were any good. Most of those companies went bankrupt quickly, even though substantive development (much, much more than most apps in the store) was put into them.

Unfortunately, the current generation seems to think they are somehow privileged, and that if they write something they should be able to make a living at it. That is not the way the world works. You have to compete, you have to invest time developing an app that you are passionate about, and you have to risk losing time and money on the venture. You have to market your own app outside of the store, and you have to differentiate your product from all the others. If you are really lucky and you do all those things correctly, then maybe you can be one of the few who turn it into a viable business.

What strikes me is that there are a lot of crybabies out there who either have not invested enough or do not have the skills to make a go of it. Apple does not owe you anything; it is up to you to market your app. You have to approach it like Apple would, which means differentiating your product and making it worth more to people than the other products, even if the other products are lower priced. All the App Store did was give you a place where someone can enter a credit card and buy it.

As far as developers being afraid... guess what, it is not that much different from normal business. When I do business I don't go out of my way to stab the companies I am working with; it is just not good business. I usually approach it with two faces: one for when I am dealing directly, where I am more honest, and one that is a public face, where I don't air any dirty laundry, because that is not good business either.
