
Comment Re:C++ is probably a little bit better (Score 1) 407

Well basically no. Given your statement:

  So flame away, fanboys. I'm used to it. The truth hurts, and the more squealing I get, the more I know that I am saying the truth.

I don't expect this reply will get through to you or get a meaningful response. Others might benefit, so here goes:

Even with the standard libraries, there were rarely systems without a lot of custom storage code. By its own claimed abilities for code reuse, C++ was a failure before C++11.

Well, depending on how you meant that, it's either wrong or a triumph of C++. I have used systems where there has been extensive use of custom containers. The main reason for that was that they targeted Android, and Android didn't really ship with C++. It shipped with a language that looked much like C++, except with several key features and the standard library nuked. It's hardly C++'s fault that languages which are almost-but-not-quite C++ aren't C++.

Other than that one case, people mostly seem to make do with the standard containers for the things they work for.

In my own code I also make use of two non-STL containers, one for images and one for linear algebra (vectors and matrices). It's a triumph that these work, look and feel just like native or standard library things. They also don't use custom storage code: it's either C++ arrays for fixed-sized objects, and std::vector as the backing store otherwise.
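To illustrate the idea, here's a minimal sketch along those lines. The names (Vec, Matrix) and the details are hypothetical, not the actual library, but they show how a fixed-size array-backed type and a std::vector-backed type can present a standard-library look and feel:

// Hypothetical sketch: a tiny fixed-size vector and a dynamic matrix.
// Fixed-size storage is a plain array; dynamic storage is a std::vector.
#include <array>
#include <cstddef>
#include <iostream>
#include <vector>

template <typename T, std::size_t N>
struct Vec {
    std::array<T, N> data{};              // fixed size: plain array storage

    T&       operator[](std::size_t i)       { return data[i]; }
    const T& operator[](std::size_t i) const { return data[i]; }

    // Iterators so range-for and <algorithm> work like any std container.
    auto begin()       { return data.begin(); }
    auto end()         { return data.end(); }
    auto begin() const { return data.begin(); }
    auto end()   const { return data.end(); }
};

template <typename T>
class Matrix {
public:
    Matrix(std::size_t rows, std::size_t cols)
        : rows_(rows), cols_(cols), elems_(rows * cols) {}   // std::vector does the storage

    T&       operator()(std::size_t r, std::size_t c)       { return elems_[r * cols_ + c]; }
    const T& operator()(std::size_t r, std::size_t c) const { return elems_[r * cols_ + c]; }

    std::size_t rows() const { return rows_; }
    std::size_t cols() const { return cols_; }

private:
    std::size_t rows_, cols_;
    std::vector<T> elems_;                // dynamically sized: backed by std::vector
};

int main() {
    Vec<double, 3> v{};
    v[0] = 1.0; v[1] = 2.0; v[2] = 3.0;
    for (double x : v) std::cout << x << ' ';   // range-for, just like a std container
    std::cout << '\n';

    Matrix<double> m(2, 2);
    m(0, 0) = 1.0; m(1, 1) = 1.0;
    std::cout << m(0, 0) + m(1, 1) << '\n';
}

No custom allocation or node juggling anywhere; the standard pieces do all the storage work.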

So now I'm going to make the same mistake again. If we take Stroustrup's publication of The C++ Programming Language in 1985 as the start of the ongoing C++ era, then it took over 25 years for the language to become somewhat OK.

No one will deny that C++11 was late; hell, it was meant to be C++0x. However, you're comparing one of the better languages of today (C++14) with languages from 30 years ago, which is disingenuous. It took C++ 25 years to become a good language relative to languages 25 years on from its inception, but it was good compared to its contemporaries in the meantime too.

I wouldn't want to use '98-era C++ now, but I wouldn't want to use Java 1.1, VB6, ancient JavaScript, bash 2.06 or a host of others now either.

In my estimation, C++ was never a good idea.

Well, the world more or less disagrees with you. It's the only language out there that provides high-level abstractions with low runtime penalties. The only serious competitors have come around recently and aren't really production-ready. What else scales as well?

Changing the internal workings of an object is very likely to propagate outside the object.

Not if you code worth a damn. Yes, we all know objects are sized things and changing an object requires a recompile. However, my code compiles on LLVM, GCC and Visual Studio, and has at various times compiled on MIPSpro and whatever that Sun compiler was. I think someone got it to compile with STLport on Android too, back when that was a thing.

That's six complete, from-scratch reimplementations of the same objects (the standard library). Yet despite those complete changes, no fooling around was required in that regard.

Comment Re:Java (Score 1) 407

Well, I've got one from about 16 years ago but I'm pretty sure they still make them.

It has a large knife, a small knife, a corkscrew, a hook, some scissors, a small flathead blade/can opener combo, a larger flathead screwdriver/can opener combo, an awl, a toothpick and some tweezers.

The handle is a good width and I've used all of the tools repeatedly. The main disadvantage is that the blade is quite soft, so it's now worn down from multiple re-sharpenings.

Comment Re:C++ is the only logically option (Score 1) 407

I still prefer my homegrown lib for lists and trees and such

Uh, so how is your library less "bloated" and more readable than the STL?

But when I have to choose some publicly available software, I pick Boost over STL

So how on earth do you square that with Boost mostly being a proving ground for the standard library?

Comment Re:The real morale of the story (Score 1) 217

What's wrong with wireless interfaces? There are oodles of Bluetooth, XBee and WiFi modules out there, so unless the requirements are far out (e.g. very small, or VERY long range), the problem is easily solved.

Not that I'm defensive, but I say this as someone who is currently developing some hardware that's close to production-ready. I'm also mulling a Kickstarter campaign to get over the last bump. It uses a BLE module and it works fine; in fact, that was the easiest bit to get working.

Comment Re:Is it finally happening? (Score 1) 112

Well, I'm not really sure about that. Ultrabooks are generally quite high-end as laptops go. Sure, they're no match for a 17" luggable, but unlike the cheap small laptops they tend to be pretty fast, with a decent i5 or i7 on board and a decent amount of RAM and drive space. So a little pocket-sized computer would perform awfully compared to an ultrabook.

Comment Re:Closed source GPUs (Score 1) 112

Just open source users? Those blasted things are the bane of ALL users.

I had the misfortune to be on a project using a Toughbook CF-U1. It's a tablety thing and the toughest of the Toughbooks. One project user dunked it in a sink of water to see if it really was up to spec. It was. Ours was dropped onto rocks, fell into snow, and I think someone actually started to slip down a snowbank with it and dug the machine in to stop himself. By the end it didn't even have a scratch on it.

Great hardware!

But oh god. The thing was a single-core 900MHz Atom and didn't exactly run XP well. The 3D graphics would hard-lock the machine if you sneezed wrong. On Linux, well, it ran Linux better, except the graphics would lock it up even if you didn't sneeze at it wrong. It was a bloody nightmare.

In the end, the group needing 3D ran XP, and the group needing only 2D ran Linux in VESA framebuffer mode, because that seemed the most crash-free solution.

But in general, the result on either platform was shockingly terrible. Panasonic must have been seriously pissed off: they put in a lot of effort and built a class-leading platform, only to have Intel shit out some awful drivers that ran nothing well at all and were so crash-prone it felt like running Windows ME all over again.

Comment Re:Deja vu all over again (Score 1) 112

I have trouble seeing room for the ARM chips at the moment.

I mention this in many such threads, but you can already buy quad-socket, 64-thread, 512GB (1TB if you're very rich) RAM 1U boxes. There's very little spare room in those; it's mostly CPUs and RAM. There's a small gap above the rest of the motherboard where an expansion PCIe card can sit, room for a few drives at the front, and sometimes a second power supply.

Very dense, very fast and excellent VM performance: that's one area where AMD do particularly well. They're also pretty cheap, especially given that level of performance.

I don't know what the use case for ARM would be. They have less performance per thread, a smaller maximum system image and a few other downsides. Given that these servers already perform well with VMs, having more, slower cores won't help much. You can always partition up the existing machines, but you can't unpartition the ARM ones. I'd be surprised if you could fit more CPU grunt into a much smaller space; many have tried, but currently the cheap 1U/4S boxes are some of the densest things you can buy, unless GPUs do your workload well.

If you really don't need a large system image, then you're paying for the silicon and power of a fast HT bus (or QPI if you go with Intel). However, that's solved by using four 1S desktop processors in the same box (the AMD ones do ECC RAM), and they still partition up well. Some vendors already make 2x 2S in 1U; I've not personally seen 4x 1S, but I'd not be surprised if they exist.

Comment Re:Deja vu all over again (Score 2) 112

There wasn't really a legacy software advantage for x86 in the Mac arena either.

Indirectly there was, and that's all that matters: the x86 legacy advantage was unassailably strong in the Wintel world, and Wintel had the lion's share of the sales. As a result, Intel had more money than its competitors to invest in both processor design and process technology, and eventually overran them.

The processors for Macs just couldn't keep up, because Motorola and then IBM didn't have the volume and margins in their chip business to be able to compete with Intel.

The world is a bit different now, but is it different enough to matter?

The whole expensive x86 front-end decoder used to matter on the desktop, but eventually the large number of parallel functional units and the out-of-order logic needed to keep them filled started to dominate massively. Then it used to matter on phones, but it's now pretty much reaching the stage where phone processors are so large and powerful that similar things are happening there.

But the low end still exists (below phones), so ARM will never be squeezed out by Intel. There will always be a market for some noddy core with 2K of RAM, and at that scale the decoder matters.

So, ARM is there. While nothing like as rich as Intel, they put most of their development into the CPU tech, not the process. The world has also hit diminishing returns in CPU design. In the past there were "easy" developments: caches, the MHz wars, the transition to superscalar, out-of-order execution and vector instruction units. Once those topped out, the next step was tweaking the cores for more IPC, e.g. the Core 2 to i7 transition, but returns have really diminished in that regard. Now it's got to the stage where it takes massive effort to gain a few percent in IPC.

In recent times Intel have dominated IPC. However, while competitors may never catch up completely, it's easier for them to close the gap than it is for Intel to keep it open, because Intel have already taken their low-hanging fruit.

Still, Intel have one of the best CPU design teams out there, which is always going to be an advantage.

Then there's the process tech. This is another area where Intel lead, but the world has been losing fabs at a shocking rate. Previously, Intel was the 800lb gorilla up against a lot of smaller chipmakers with smaller markets. Everyone else has been consolidating, so Intel now has fewer, but much larger, competitors. This will make life harder for Intel relative to the past.

Intel is not a gun for hire. This has positives and negatives. On the plus side, they bend all their resources to fabbing the top-end PC chips. On the minus side, the major phone manufacturers can't get custom chips like they can with ARM, which means that unless they are very lucky, they're either paying for things they don't want or accepting lower integration if they go with Intel.

It also means that other fabs can test out new processes on smaller chips. A large die area makes the probability of a damaging defect go way up; other fabs can debut new processes on smaller chips, which keeps the proportion of defective units lower.
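To put rough numbers on that, here's a small sketch using the simple Poisson yield model Y = exp(-A * D0). The model choice and the figures (defect density, die areas) are my own assumptions for illustration, not anything from the post, but they show how quickly yield falls as die area grows:

// Hedged illustration of the die-area vs. defect-rate claim,
// using the simple Poisson yield model Y = exp(-A * D0).
// Defect density and die areas below are assumed, not measured.
#include <cmath>
#include <cstdio>

int main() {
    const double defect_density = 0.2;   // assumed defects per cm^2 on a new process
    const double small_die = 1.0;        // cm^2, e.g. a small mobile-class chip
    const double large_die = 4.0;        // cm^2, e.g. a big desktop/server chip

    auto yield = [=](double area_cm2) { return std::exp(-area_cm2 * defect_density); };

    std::printf("small die yield: %.0f%%\n", 100.0 * yield(small_die));  // ~82%
    std::printf("large die yield: %.0f%%\n", 100.0 * yield(large_die));  // ~45%
}

Quadrupling the die area roughly halves the yield under these assumptions, which is why debuting a new process on small chips is so attractive.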

Comment Re:A different take on this (Score 2) 234

That's bullshit, because the ISPs sold "all you can use" plans, then failed to deliver. The only reason the so-called "cost shifting" went on is that the ISPs outright lied about what they were selling to consumers. To imply that Netflix letting customers use what they've paid for is somehow wrong is just plain wrong-headed.

You're basically blaming Netflix for the ISPs mis-selling a service.

Comment Re:Can someone please answer (Score 2) 420

I said elsewhere that this is a scam for the following reasons.

Except a good chunk of Slashdot, absolutewrite and a few other, also completely unrelated, forums and IRC channels are in on the "scam" and have a bunch of people who have joined the conspiracy to pretend it's blue and black. Or white and gold, in which case I'm in on the scam and hereby declare I got my note from a shady black vehicle with blacked-out windows this morning at precisely 5:50am at the dedicated drop point.

It's not a scam, because it frankly doesn't matter what the original colour of the dress is.

The interesting thing is that it's sufficiently close to some average threshold of human perception that nearly half the population perceives it completely differently from the other, slightly-more-than half.

At that point it wouldn't actually matter if it was a 'shopped image of a dress covered in purple unicorns.
