I started using UNIX with SunOS 4.1.4 on a SPARCstation 1+. Still have my SS2 somewhere in the attic. Still have my Ultra 1 Creator. Can't throw them away.
I like the speed of my W3680 - but it's just not the same.
Guess I'm old
- How many hardware engineers does it take to change a light bulb?
- None, we'll fix it in software.
Doing stuff in software to make hardware easier has been tried before (and before this kid was born, perhaps why he thinks this is new). It failed. Transputer, i960, i432, Itanium, MTA, Cell, a slew of others I don't remember...
As for the grid, nice, but not exactly new: Tilera, Adapteva, KalRay...
1) block outgoing port 25 to everything but their own mail servers;
2a) add an optional feature in each customer account to reopen outgoing port 25;
2b) add an optional feature in each customer account to pick the reverse DNS entry;
3) tell every other ISP/mail-server operator what they have just done,
so they get un-blacklisted, since they won't be sending much spam any more.
This should block most of the outgoing spam without any side-effects,
since power users will still be able to operate their own mail servers,
complete with reverse FQDN. Non-power users won't notice a thing.
Also, they will save money on bandwidth to the outside world.
That's what my (strictly residential) ISP has been doing for almost a decade.
Works perfectly well for everyone involved.
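The policy above boils down to a simple per-account check on outgoing port 25. A minimal sketch of that decision logic (the account flags and relay names are hypothetical, not any real ISP's API):

```python
def allow_outgoing_smtp(dest_host, account, isp_mail_servers):
    """Decide whether an outgoing connection to port 25 is allowed.

    account is a hypothetical dict holding the opt-in flag from step 2a;
    isp_mail_servers is the ISP's own relay list from step 1.
    """
    # Step 1: port 25 stays open to the ISP's own mail servers...
    if dest_host in isp_mail_servers:
        return True
    # Step 2a: ...or to anywhere, if the customer flipped the opt-in switch.
    return account.get("port25_open", False)


relays = {"smtp1.isp.example", "smtp2.isp.example"}

# Power user running their own mail server: opted in, can reach any MX.
power_user = {"port25_open": True, "reverse_dns": "mail.example.org"}
# Default account: can only reach the ISP's relays, so a spambot goes nowhere.
default_user = {}

print(allow_outgoing_smtp("mx.gmail.com", power_user, relays))         # True
print(allow_outgoing_smtp("mx.gmail.com", default_user, relays))       # False
print(allow_outgoing_smtp("smtp1.isp.example", default_user, relays))  # True
```

In practice this lives in the ISP's edge-router ACLs rather than application code, but the decision table is exactly this small.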
64 cores, mesh network that extends off the chip, in production. Try harder, MIT.
They already tried harder: http://www.tilera.com/. And as another post mentioned, Intel Knights Corner is cache coherent across 61 cores (62 in the architecture).
The summary doesn't get the point of the article: what's novel is not the presence of cache coherency, it's the new way of implementing snoop-based cache coherency over their network. Cache coherency for a large number of cores can be very expensive time-wise, so any idea to improve it is more than welcome.
Why is everybody thinking this is big news?
The previous compiler, based upon Open64, has been available in source form since CUDA 1.0. They (partially) switched to LLVM in 4.1, and they also released that source code. They didn't have to, because unlike Open64, LLVM is not GPL. So it's nice of them, but it's not exactly earth-shattering news...
And because a picture straight from the horse's mouth is worth a thousand words, here's what NVidia has to say about it:
Go to section 36.5, figures 36-11 & 36-13.
The Library of Congress used to have a goal of including complete hard copies, at least for items of US origin and 'good grade' (that is, they aimed to keep copies of things such as hardback books that were intended to last, rather than, say, ephemera such as pulp magazines). However, that goal became an obvious impossibility due to sheer volume. After about 1960, the library began being more selective.
And the situation is infinitely worse for other media. Not only is nobody trying to preserve them, in many cases they have been actively destroyed, television broadcasts in particular.
Two examples of the casualties:
- http://en.wikipedia.org/wiki/List_of_surviving_DuMont_Television_Network_broadcasts lists what survived from a decade of broadcasting on the DuMont network. Everything else was destroyed for various reasons.
- http://en.wikipedia.org/wiki/List_of_The_Avengers_episodes labels most of the first season of the famous TV show as "missing", because the tapes were re-used in the 60s or 70s to save money.
The relevant Wikipedia category is http://en.wikipedia.org/wiki/Category:Lost_television_programs. It's hard to believe so much television history has been lost forever.
If I buy a Chablis or a Burgundy I want a particular type of wine. So what that these wines originated in certain regions in France?
They didn't "originate". If it's a Burgundy, then it has to come from the region of Burgundy. It's that simple. Also, for the record: if you buy a Chablis, you also buy a Burgundy; Chablis is a sub-region of Burgundy.
I don't give a damn where it was made. I would say most people who drink them don't know or care either.
Some of us haven't ruined our taste buds with bad beer and ketchup, so we do care. Where the wine was produced makes a lot of difference to the taste. If you can't tell the difference, please go back to drinking Budweiser.
I'm told by a French friend who is a wine buff that the Aussie wines he can buy are superior to French wines (seriously), so this makes the whole thing sound like a ploy to recapture an ailing market.
There is no such thing as "superior", either way. There is such a thing as "different"; after that it's a matter of taste. Australia, California, Chile, and Algeria all make very good wines. They just aren't Burgundy, or Champagne. Would you expect a "Scotch Whisky" to come from Poland? Obviously not. That doesn't prevent the Japanese from making great single-malt whiskies; they just don't make Scotch whisky. Think of it as a trademark shared by all the producers of one geographic region. You can't buy a Macintosh from Hewlett-Packard, can you? So why should you be able to buy a Burgundy from someone who isn't located in the region of Burgundy, and therefore doesn't share in the trademark?
And how many MB can you address with a 32-bit pointer under the IEEE recommendations?
1) I don't know...
2) It doesn't [censored] matter!
3) It's exactly 4 GiB, or 4096 MiB, how hard is that?
4) I don't use 32-bit pointers anymore anyway.
See, it's very easy: just add a little 'i' in there, and it works exactly like before, just unambiguously.
Over 300 posts and counting, and all because people can't type 'i' to make sure there's no possible mistake... The world is doomed, I tell you, doomed!
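For anyone still arguing, the arithmetic behind answer 3 takes three lines; with the binary prefixes there is nothing ambiguous left:

```python
# A 32-bit pointer addresses 2**32 distinct bytes.
addressable = 2 ** 32

GiB = 1024 ** 3   # binary prefix: gibi
MiB = 1024 ** 2   # binary prefix: mebi
GB = 1000 ** 3    # SI prefix: giga

print(addressable // GiB)  # 4     -> exactly 4 GiB
print(addressable // MiB)  # 4096  -> exactly 4096 MiB
print(addressable / GB)    # 4.294967296 -> not a round number of SI gigabytes
```

That last line is the whole dispute in one number: 2^32 bytes is a round figure in binary units and an ugly one in SI units.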
(sigh) No. The maximum available memory was measured in Megabytes, then Gigabytes. It was always base 2.
Well, not at first, 'cause a megabyte (of whatever size) of memory didn't exist yet.
Second, it was wrong but easy. It's still wrong, and for memory it's still easy. That's why it's still in use for memory, and is being ditched everywhere else.
The main problem was that the binary prefixes came too late. Old habits die hard. But as AA proves, it's possible to ditch a bad habit if you really want to. It's a matter of willpower. Yes we can, if I may be so bold as to say so.
(for the record, I have been involved in computing since before Bill Gates informed us that 640 kilobytes should be more than anyone will ever need, so I actually was there as all this unfolded)
And the fact that you're the new kid on the block is important because...?
Because our computers, almost since their earliest inception, work in base-2 arithmetic.
And this matters because...? SI prefixes are used to denote quantities. Except for the total size of semiconductor memory (and sub-elements thereof), those quantities are usually completely unrelated to powers of 2.
Disk sizes are not powers of 2. Files stored on them are even more arbitrary in size. Same for the memory requirements of various codes. Heck, once upon a time, HPC programs would over-allocate arrays to avoid power-of-2 allocations (multiples of the page size wreaked havoc on direct-mapped caches)!
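To see why that over-allocation trick helped, here's a toy model of a direct-mapped cache (the 64-byte-line, 512-set geometry is a made-up example, not any specific CPU): with a power-of-2 row stride, the first element of every row of a 2-D array lands in the same few cache sets and they constantly evict each other, while padding the stride by one line spreads the rows across all sets.

```python
# Toy direct-mapped cache: 64-byte lines, 512 sets (hypothetical geometry).
LINE = 64
SETS = 512

def sets_touched(stride_bytes, rows):
    """Count distinct cache sets hit by the first byte of each row."""
    return len({(row * stride_bytes // LINE) % SETS for row in range(rows)})

rows = 1024
power_of_two_stride = 4096        # e.g. a row of 1024 4-byte floats
padded_stride = 4096 + LINE       # over-allocate each row by one cache line

print(sets_touched(power_of_two_stride, rows))  # 8   -> everything collides
print(sets_touched(padded_stride, rows))        # 512 -> spread over all sets
```

Walking down a column of the unpadded array thrashes those 8 sets on every step; the padded layout wastes 64 bytes per row and avoids the conflict misses entirely.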
Frequencies are not powers of 2. Your 3 GHz processor runs at 3*1000^3 Hz (well, probably not very precisely).
Heck, these days even memory buses are moving away from strict powers of 2: the GTX 275 has a 448-bit-wide memory bus, and Core i7s can be described as 192-bit.
And for the few cases where it matters (usually not end-user visible), that's what the freaking/frelling/rutting/smegging/[pick your favorite SciFi show euphemism] binary units are for!
(I'm not going to try to make sense of the rest of your post, because I can't see how the number of bits in a byte is related to the discussion in any way.)