Comment: Re:Who's chips do they use? (Score 3, Insightful) 53

Given that the SIM is supplied by the carrier, and we don't know where our carriers get their SIMs (they probably all get them from the same place), we are all fucked.

If you have a secret, I do not recommend using a mobile phone to discuss it.

Or indeed, telling anyone about it at all.

Comment: Re:Fragmentation is terrible for hardware owners (Score 1) 136

by Anne Thwacks (#49145191) Attached to: Who's Afraid of Android Fragmentation?
How many people kept on using 8086s once there was a 486? Kept on using DOS 3.1 once DOS 5 was available, if they had a hard disk?

The early Android phones had very limited processing power, ROM and RAM - the equivalent of an 8086. Now that we are getting phones with 4GB of RAM (I remember mainframes with less than 1MB), upgrading is not going to face the restrictions it did when there was only 86MB of RAM.

As for "waiting for your carrier to upgrade" - you must live in the USA or Australia. Carriers elsewhere DO upgrade. However, I predict that locked boot loaders will go the way of the "not quite IBM compatible" PCs, and the video cameras that were not Super 8.

I have two Samsungs, one with a third-party ROM, the other with TouchWiz. The way I hear it, Lollipop is not yet ready for prime time. If you want a bug-infested phone with no upgrade path, there is always Windows.

Comment: Intrusive (Score 2) 188

by Anne Thwacks (#49132153) Attached to: Google Now Automatically Converts Flash Ads To HTML5
Google ads on Slashdot have become so intrusive on Android mobiles that it is not actually possible to use the web site any more!

Nothing to do with Flash: the popup covers most of the screen on my Note 3, and there is no obvious way to get rid of it other than to leave the site. I thought it was a virus, until I found I did not have the problem on other sites.

This is a major achievement in the foot shooting league.

Posted from my PDP8 using an ASR33.

Comment: Address the cause (Score 4, Interesting) 243

by Anne Thwacks (#49131953) Attached to: The Peculiar Economics of Developing New Antibiotics
The main reason why it's insanely expensive is the approval process. Of course big pharma does not want the cost reduced, as it prevents new entrants to their cosy cartel, and America effectively enforces this worldwide.

Once an alternative approval process with sufficient credibility gets going, the story will change very fast.

Comment: Re:To answer your question (Score 1) 279

by Anne Thwacks (#49123109) Attached to: Intel Moving Forward With 10nm, Will Switch Away From Silicon For 7nm
You speak like it's 1995, before anyone fully understood OoO, or started decoupling the micro ISA from the actual ISA.

That is a fair comment - I stopped designing processors around that time, and never actually implemented OoO. I have written assembler for x86, MIPS and SPARC, and many others besides, but I have never written a serious compiler for anything. I certainly would not want to have to debug modern OoO execution hardware!

My point is that OoO is an evil brought on us by poor mapping of high-level concepts onto the hardware. I would prefer to have more threads and program in Algol 68. Without OoO, the discussion is different. As it is, I have retired.

There is a small charge for the use of my lawn. Bitcoins not accepted.

Comment: Re:To answer your question (Score 1) 279

by Anne Thwacks (#49122975) Attached to: Intel Moving Forward With 10nm, Will Switch Away From Silicon For 7nm
x86 was by far the best when it came to performance per dollar, the critical aspect when it comes to corporate purchasing.

Well, we here all know that in silicon, volume is king. As Jack Cohen (who started Tesco) said, "pile them high and sell them cheap". However, in the 1970s a lot of PHBs did not know that. They argued that you should go for margin, rather than volume. And VisiCalc had not been invented (well, it had, but it was only used by small businesses).

Bean counters control large organisations. IBM corporately may only have expected to sell 10,000 PCs, but even then a lot of us were expecting sales to run into millions. We were nerds, so what did we know about business? No one was going to listen to us. (Am I sore? Hell, yes!)

Comment: Re:To answer your question (Score 1) 279

by Anne Thwacks (#49122911) Attached to: Intel Moving Forward With 10nm, Will Switch Away From Silicon For 7nm
there was that whole RISC/CISC thing going on and, you know what? RISC sort of won the technical war - it may be papered over with an ugly CISC instruction set on the inside, but internally, it's all condensed onto execution on a mostly RISC core.

The RISC vs CISC choice can be made on a sound mathematical basis, and depends on instruction decode speed vs the bandwidth of the memory interface, taking the available caching into consideration.
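As a toy illustration (my own back-of-an-envelope sketch, not a real pipeline model): assume fetch and decode overlap, so whichever stream is slower dominates. Dense CISC code wins when memory is slow; fixed-size, one-cycle-decode RISC code wins when memory is fast. All the numbers below are made up for the example.

```python
def run_cycles(n_insns, bytes_per_insn, decode_cycles_per_insn, fetch_bw):
    """Toy model: fetch and decode overlap in a pipeline, so total
    time is whichever stream is slower. fetch_bw is bytes per cycle."""
    fetch_cycles = n_insns * bytes_per_insn / fetch_bw
    decode_cycles = n_insns * decode_cycles_per_insn
    return max(fetch_cycles, decode_cycles)

# Same workload twice: dense CISC (fewer, fatter, slow-to-decode
# instructions) vs RISC (more instructions, fixed size, 1-cycle decode).
cisc = dict(n_insns=100, bytes_per_insn=5, decode_cycles_per_insn=4)
risc = dict(n_insns=180, bytes_per_insn=4, decode_cycles_per_insn=1)

slow_mem, fast_mem = 0.5, 8.0   # bytes per cycle
print(run_cycles(**cisc, fetch_bw=slow_mem))   # 1000.0 - CISC wins on slow memory
print(run_cycles(**risc, fetch_bw=slow_mem))   # 1440.0
print(run_cycles(**cisc, fetch_bw=fast_mem))   # 400.0
print(run_cycles(**risc, fetch_bw=fast_mem))   # 180.0 - RISC wins on fast memory
```

The crossover moves as the fetch bandwidth (and cache hit rate) changes, which is exactly why the "right" answer is not stable over time.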

The PDP11 was designed at a time when instruction decode was fast relative to memory bandwidth, and caching did not exist. The ISA was designed to allow a single instruction fetch to manage multiple data movements - that was the reason for being able to specify two complex, multi-byte address computations within one instruction. Of course, later PDP11s had very powerful caching, but not the original 11/10 and 11/20.
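To make that concrete, here is a rough sketch (my own illustration in Python, obviously not microcode) of what the single PDP11 instruction MOV (R1)+,-(R2) does: one fetched word drives two address computations and two memory accesses.

```python
def mov_autoinc(mem, regs):
    """Simulate the single PDP-11 instruction MOV (R1)+,-(R2):
    the source register is post-incremented after use, the destination
    register pre-decremented before use - two address computations and
    two memory accesses from one instruction fetch."""
    src = regs["R1"]
    regs["R1"] += 2             # (R1)+ : post-increment by word size
    regs["R2"] -= 2             # -(R2) : pre-decrement by word size
    mem[regs["R2"]] = mem[src]  # the actual data movement

regs = {"R1": 0o1000, "R2": 0o2000}
mem = {0o1000: 0o12345}
mov_autoinc(mem, regs)
# One instruction copied a word and stepped both pointers:
# regs["R1"] is now 0o1002, regs["R2"] is 0o1776, mem[0o1776] == 0o12345
```

Doing the same on a load/store RISC takes three or four instructions, each of which costs a fetch - fine once you have a cache, painful in 1970.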

The earlier PDP8, which had only 8 instructions, was as RISC as you can get. It was designed when instruction decode was really slow compared to the memory (even with an asynchronous Omnibus). The 8/S was insanely slow! However, during the life of the PDP8 architecture, clocks sped up enormously (100x?).
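Decode really could not get much simpler: the whole opcode is the top three bits of the 12-bit word. A minimal sketch (mnemonics are the standard PDP-8 ones; the helper name is mine):

```python
# The PDP-8's eight basic instructions: the opcode is just the top
# 3 bits of the 12-bit word, in order 0..7.
PDP8_OPS = ["AND", "TAD", "ISZ", "DCA", "JMS", "JMP", "IOT", "OPR"]

def decode(word):
    """Return the mnemonic for a 12-bit PDP-8 instruction word."""
    if not 0 <= word < 4096:
        raise ValueError("PDP-8 words are 12 bits")
    return PDP8_OPS[word >> 9]

decode(0o1234)   # "TAD" - two's-complement add to the accumulator
decode(0o7200)   # "OPR" - 7200 is CLA, one of the OPR microinstructions
```

A three-bit shift is about as cheap as instruction decode can be, which is why the PDP8 could afford to be "RISC" even with 1960s logic.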

The RISC vs CISC advantages are not stable, and change with hardware developments. The problem with RISC is that the compilers are extremely difficult to design and maintain, and that is always true.

I am not saying x86 is bad because it is CISC, I am saying it is bad because it is a POOR CISC.

Good CISC is the VAX or the NS32032. Unfortunately Ken Olsen refused to compete in the PC market in any serious way. (And he said Unix is snake oil, but would not sell VMS for x86.) NS was far too late with the 32032.

Comment: Re:amazing (Score 1) 279

by Anne Thwacks (#49117801) Attached to: Intel Moving Forward With 10nm, Will Switch Away From Silicon For 7nm
Personally, I am quite OK with 80MPH cars running into cockroaches.

Unfortunately, neural computing, as demonstrated by animals, is guesswork. People mostly buy computers cos they want the right answer, not a good guess. Babbage's original intention was to build a machine that gave the right answer, or no answer at all. (Not even "Error at or near line 1, column 1").

If people want a good guess, they ask Uncle Eric, or the postman, or the nice lady in the house opposite (or the Office of National Statistics, but since she works there, it is usually easier to ask her). But if the answer is floating point, you could try a Pentium.

Comment: Re:To answer your question (Score 5, Interesting) 279

by Anne Thwacks (#49117777) Attached to: Intel Moving Forward With 10nm, Will Switch Away From Silicon For 7nm
Just translate them on the fly, as they've been doing for years.

You can, and people do. However, the issue is not translating one x86 instruction to one [insert ISA here] instruction. That has been done since x86 was invented, and was common with previous ISAs before that. The real requirement is to translate source code that maps to a bunch of x86 instructions into ONE [trendy ISA] instruction. This will obviously be easier if x86 is thrown out the window.
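A rough sketch of the idea (a toy peephole pass of my own, nothing like a real binary translator): merge an x86 cmp and the conditional jump that follows it into ONE fused compare-and-branch operation, the way hardware decoders do with macro-op fusion.

```python
COND_JUMPS = {"je", "jne", "jl", "jle", "jg", "jge"}

def fuse_cmp_jcc(insns):
    """Toy peephole pass over (mnemonic, operand...) tuples: a cmp
    immediately followed by a conditional jump is rewritten as a
    single fused compare-and-branch operation."""
    out, i = [], 0
    while i < len(insns):
        cur = insns[i]
        nxt = insns[i + 1] if i + 1 < len(insns) else None
        if cur[0] == "cmp" and nxt is not None and nxt[0] in COND_JUMPS:
            out.append(("cmp." + nxt[0],) + cur[1:] + nxt[1:])
            i += 2  # consumed two x86 instructions, emitted one op
        else:
            out.append(cur)
            i += 1
    return out

code = [("mov", "rax", "[rsi]"),
        ("cmp", "rax", "0"),
        ("je", "done")]
fused = fuse_cmp_jcc(code)
# fused == [("mov", "rax", "[rsi]"), ("cmp.je", "rax", "0", "done")]
```

The hard part in real life is that the translator must prove the fused pair is never jumped into the middle of - exactly the kind of guarantee that is easier to give if you design the ISA for it, rather than retrofitting it onto x86.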

Historical note: x86 is a bastardised rip-off of the PDP11 instruction set. The PDP11 was built as a "hardware Fortran machine", i.e. one instruction represents one Fortran statement, as far as was achievable in 1970. C is (just one) PDP11 assembly language! The VAX instruction set was an attempt to achieve a higher-level machine code, which worked quite well - most VAX assembly instructions are actually function calls to application-specific microcode.

x86 was a poor ISA when the first 8086 chips were made (but good, given hardware capabilities at the time). That was about 40 years ago. MIPS and SPARC (and ARM) are all better than x86.

The moral of this story is that it is "first past the post" in this game, cos people hate it when their favourite app stops working. (See the Great Western Railway, Brunel, and the 7 ft gauge.)

Byte your tongue.