- plastic explosive
- any other high tech device
You will note an oversupply of:
- fanatics prepared to use them
I used the unpaid example to draw a sharper contrast. A large block of time off is generally unavailable under any terms, except at companies like FB (or apparently everywhere in Europe) that explicitly call out child-rearing.
Since you seem to know of the system: If European democracies have a state system for paying for the leave, did the debate include proposals to allow payments for other avocations?
Europe is a big place and it probably varies from jurisdiction to jurisdiction, but I'd say generally no. Here in Norway there are a few other exceptions where the government may step in and pay, such as if you're giving care to someone seriously ill and acting as a de facto replacement for public healthcare, but for personal projects you're on your own. It has been suggested, though, that those who want to be slackers can get employed for a relatively short while, then go on unemployment benefits while making crap applications for jobs they won't get, flunking interviews, and generally being unemployable while formally meeting the requirements. As for more serious people, I know some who have gotten 6-12 months of unpaid leave to pursue a personal dream in the private sector; in the public sector it's even easier.
I think this relates closely to overtime rules and wage politics. In Europe you generally have to pay for every hour, and to be honest you're usually overpaying unskilled/untrained people and underpaying your best people. Which means that if good employees don't get their leave and instead quit, figuring their CV is good enough to get re-employed a year later, you as an employer lose. Employers already deal with similar leaves quite often for the 50% of the workforce that's female, and we also have a shorter paternity leave, so really there's no reason to be a dick about it. I guess it depends on why, though; if you're starting a competing business, then no. The one I know who got a 12-month leave sailed around the world. Pretty bold move, but it was also fairly certain he'd be coming back. I think it takes a large company though; the smaller the company, the harder it'll be.
It would be hard to argue that Apple's decision to leave out the floppy drive didn't cause the situation we had 5 years later.
It wouldn't be hard to argue that at all! (#)
As I said, CD writers were already getting cheaper by the late 90s, and Apple can hardly claim credit for hastening their adoption since they didn't even include one.
Yes, the 1.44MB floppy format's capacity was already outdated and starting to look badly out of sync with the file sizes and uses common by the late 90s (cf. the rapidly-growing capacity of hard drives, and the amount of data already-widespread CD-ROMs could hold). The pressure for a replacement was already there in the PC market; the only problem was that no realistic alternative at a practical price had received universal adoption by then. Apple's abolition of the floppy didn't provide a solution at all, it only forced their users to buy external floppy drives.
At best, as the other guy suggested, Apple provided a marginal level of forward pressure on something that would have happened anyway.
If anything, what Apple *do* deserve some credit for is encouraging the adoption of USB, whose time had arrived (or should have) by then. And even that was available in PCs at the time: the one I bought 3 or 4 months before the iMac came out included USB; the problem was that it wasn't that well-supported, and there seemed to be no hurry to change that. So maybe they helped there, and it could be argued they indirectly helped the adoption of USB pen drives several years later, but even that was by forcing the issue (i.e. abolishing legacy ports), and I suspect that USB would have taken off eventually anyway. At least in that case they included a realistic alternative, unlike with the abolition of the floppy.
(#) I think your nickname gives away your slightly partisan nature
Apple has always been doing stuff like this. I remember when they removed 3.5" floppy drive [..] Cue a lot of companies having to buy external floppy drives at ridiculous prices.
Don't know if you were thinking of the original late-90s iMac, which Apple made a big hurrah about not including a floppy drive. Except that, for all its archaicness, there was still no universal, affordable alternative to the floppy (#), which is why almost every bondi blue iMac you saw had an external floppy drive (in matching colours) hanging off it anyway! (Ironically far less tidy and aesthetically pleasing than having it built in, like the CD reader, would have been.)
Had they done that five years later, yeah, it'd have been more sensible. Circa 1998, it was just a contrived anti-feature that gave Apple a "we're so futuristic" selling point anyway, one that fanboys still trumpet today.
(#) CD writers were starting to come down in price quite fast by the late-90s, but they still weren't cheap enough at that point to be included as a default option, which would explain why Apple didn't even include one! The Internet (which IIRC was one of their suggestions for transferring files) was still 56kbps dial-up even for most people that *did* have it, and far from everyone did back then (remember that the other person you wanted to exchange files with would *also* need Internet access). Pen drives weren't even around then- Wikipedia claims that the first ones came out in 2000- and would take quite a bit longer to reach dirt-cheap floppy-replacing affordability.
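To put the "just use the Internet" suggestion in perspective, here's a back-of-envelope calculation of how long one floppy's worth of data would take over dial-up. The numbers are my own illustrative assumptions, not from the thread: the marketed "1.44 MB" capacity (1,474,560 bytes) and an optimistic 48 kbit/s of real throughput on a "56k" line.

```python
# Back-of-envelope: moving one floppy's worth of data over dial-up.
# Assumptions (illustrative, not from the thread): marketed "1.44 MB"
# capacity and ~48 kbit/s of real throughput on a 56k modem.
floppy_bytes = 1.44 * 1000 * 1024   # "1.44 MB" as marketed = 1,474,560 bytes
throughput_bps = 48_000             # bits per second, optimistic for dial-up

seconds = floppy_bytes * 8 / throughput_bps
print(f"~{seconds / 60:.0f} minutes per floppy-load")  # roughly 4 minutes
```

Workable for one file, maybe, but both ends needed a connection, and anything CD-sized was out of the question at those speeds.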
Win32 != WinNT. Win9x (Windows 95, 98, ME) implemented Win32, but did not use the NT kernel. Hell, Windows 3.11 (which was 16-bit, while all NT versions have been at least 32-bit) had a partial Win32 implementation called "Win32s" that could be used to run some Win32 programs even though the kernel was still 16-bit. Windows CE (including Windows Mobile, aside from the confusingly-named "Windows 10 Mobile") implements Win32, but is completely unlike the NT kernel. A modified version of the CE kernel was used on Windows Phone 7 (I know, because I wrote code that parsed WP7 kernel data structures, which scarcely resemble NT ones), but CE was scrapped in favor of NT for WP8 and later.
CE has been ported to more platforms than NT and has much lower minimum requirements, but it cuts many corners. It uses a far simpler memory manager. It does not support SMP (multiple hardware cores). It does not support NTFS; it uses a weird variant of FAT that allows files to be "modules" loadable as executable images but not openable as files (NT has no such concept). It does not support any kind of access control (a rudimentary access control system was grafted onto the CE kernel for WP7, but it bore no resemblance to the NT access control system that WP8-and-later use). It does not support NT drivers, does not use the NT bootloader, and is missing many security features (of which user accounts and access controls are only the most visible) that NT has. It is generally unsuited for anything except embedded systems, though it has some features targeting those, such as some real-time support, that NT lacks.
Just basic literacy will help a lot. Most conflicts in the world involve illiterate soldiers on one or both sides. Modern war is very expensive and very destructive; it almost never makes economic sense. Most countries have market economies, so if your neighbor has resources that you want, you don't need to take them by force; you can just buy them.
Bad for you, worse for the other guy. Don't underestimate how much the stronger player can abuse their position until they go one step too far.
I posted a link to the actual thread on Reddit (on the SpaceX subreddit) where the discussion took place, and Slashdot stripped out the link. Is this now normal?
Some of the component shops around here have PC-builders: basically you pick (from their approved selection) case, PSU, mobo, CPU, RAM, graphics card(s), disks, etc., and they'll assemble and test it for you. If you want to start fresh and not use any parts from your existing setup, that's quite a practical way to get the parts you want without fiddling with screws and cables and DOA components (well, unless they fail during shipping). Personally I rarely start over from scratch though; it's rare that everything is so outdated it's best to start over.
That really is the big issue with a self-build: if something goes wrong, you have to track it down and handle all the support yourself. If you get a pre-built from a good vendor, they'll handle it all. Say what you want about Dell, but all you have to do is run their diags (baked into the UEFI) and call them with the code, and they'll send a dude with the parts needed.
So that should be the major thing you think about. If you don't want to do support, then buy it from a vendor that will provide you with support to the level you require. I tend to recommend Dell because their hardware is reasonable and they have support available everywhere. They subcontract it, but it all works well. We use it at work all the time.
If you are willing to do support yourself, then building it gets you precisely what you want. I build my system at home because I have very exacting requirements for what I'm after and nobody has that kind of thing for sale. Like I don't want a "good large power supply", I want a Seasonic Platinum 1000, nothing else.
Also, you'll find that at the higher end of things you generally save money building a system. For more consumer/office-range stuff it's usually a wash: they build the mass-market systems about as cheaply as you could yourself. However, when you start talking higher-end gaming stuff, you can pay a large premium.
As an example I just built a system for a good friend of mine. He wanted some very, very high end hardware and pretty specific requirements. Origin PC would get him what he wanted... for about $9,000. I put it together for around $6,000. The gamer stuff often commands a hefty premium.
Crap brands who bought the name of a previously respected company, e.g. Polaroid
Interestingly, the "Polaroid" camera listed there (the Polaroid 300 / Polaroid PIC-300) is actually just a *Fujifilm* Instax Mini 7 camera. That's right- the only camera Polaroid now sell that uses anything like the traditional Polaroid film technology is actually one made by Fujifilm (who licensed the patents from Polaroid)!
The current owners of the Polaroid brand *do* appear to be treating the instant photography line with a little more respect than the previous owners (who cancelled the original Polaroid film cameras- the only non-licensed thing they did, as even "their" digital cameras had simply been licensed-out rebrands). But outside that, they're still continuing the habit of whoring out the Polaroid name to random third parties to rebrand cheap no-name tat, such as LCD televisions. In fact, in the UK, they're actually letting the supermarket chain Asda use it (in effect) as an own-brand for audiovisual products.
So, yeah, the Polaroid name *is* being used for random tat, but in this case, the "Polaroid 300" is actually just a rebranded Fujifilm model that gets decent reviews. Though I'd probably just go for the Fujifilm one myself anyway.
It would help to read the actual thread where the process of elimination took place, rather than the NSF forum thread, which came after it was already hashed out. The discussion also took place on IRC, and it involved evidence that went well beyond the photos: automatically eliminating all launches from Vandenberg, for instance, and then performing some ocean-current analysis to figure out how long it would take currents to push rocket debris from between Florida and Bermuda to the British Isles.
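The drift-time part of that analysis is the sort of thing you can sanity-check on the back of an envelope. The numbers below are my own illustrative assumptions, not the thread's actual data: a rough distance from the Florida/Bermuda area to the Scilly Isles, and a typical mean surface-drift speed for the Gulf Stream / North Atlantic Drift.

```python
# Rough drift-time sanity check. All numbers are illustrative assumptions:
# distance from the Florida/Bermuda area to the Scilly Isles, and a mean
# surface-current speed (0.1-0.3 m/s is a typical range for the region).
distance_km = 6000          # assumed drift path length
drift_speed_m_s = 0.2       # assumed mean surface-current speed

seconds = distance_km * 1000 / drift_speed_m_s
months = seconds / (30 * 24 * 3600)
print(f"~{months:.0f} months adrift")  # on the order of a year
```

An answer on the order of a year is at least consistent with the heavy barnacle growth, which is the kind of cross-check the thread was doing.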
It is explosion debris.
No, it is not necessarily explosion debris. This particular section appears to be from the interstage between the upper and lower stages of the Falcon 9: an extra shroud that keeps the area around the upper-stage engine aerodynamically clean while the lower stage is firing. After the two stages separate, it is jettisoned and left to fall on its own to wherever it might land.
Another interstage section was recovered earlier from a different flight. The only really remarkable thing here is that it was covered in barnacles, something that even people who run ships across the Atlantic hardly find remarkable, other than that it indicates roughly how long the piece spent in the water, which can also help determine which flight it likely came from.
While I suppose it is possible that it came from the CRS-7 flight and the explosion on that mission, as you seem to suggest, there are many reasons to think this panel was not from that flight.
So is it Britain or England? It's not 'rocket science', guys.
As the other guy said, Britain or England are both correct, since England is a part of Britain and despite their position quite some distance from the mainland, the Scilly Isles are still considered part of England.
As a nationalistic Scot, I dislike it when "England" and "Britain" are used interchangeably, and the headline/summary discrepancy does smack of that being the reason. However, since it was still technically correct, I wasn't going to make an issue of it until you made that comment.
(You can stop reading here if you don't want a confusingly-detailed breakdown of the various terms. Just at least do me a favour as long as I have to remain technically British and don't assume "English" and "British" are synonyms! )
FWIW, if one wants to start nitpicking, the term "Britain" on its own isn't really well-enough defined in modern usage to argue over- beyond the fact it definitely *isn't* synonymous with "England". Generally "Britain" tends to be used even by people here as synonymous with the political state of the United Kingdom (i.e. the "United Kingdom of Great Britain and Northern Ireland"). "Great Britain" is the geographic term for the main island including Scotland, England and Wales, but not Northern Ireland, hence the full name of the UK. Meanwhile, the "British Isles"- a geographic term- includes the island of Ireland (part of which is of course an entirely independent country), along with some others such as the Isle of Man and the aforementioned Scilly Isles. (Some people in the Irish Republic dislike the term "British Isles", which is understandable given the use of "British" above).
What's really going to bake your noodle is that whereas the Scilly Isles are considered part of England, the Isle of Man, despite being a British crown dependency roughly the same distance from the mainland, isn't even technically a part of the United Kingdom itself...
Actually, now that I've looked into it, the Channel Islands (i.e. Guernsey and Jersey) are also considered a part of the "British Isles"- a nominally geographic term- despite the fact they're far closer to- and more obviously associated with- France. One might suspect they were only counted as part of the "British Isles" for political reasons, since they're British crown dependencies, albeit not a part of the UK itself (like the Isle of Man).
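If it helps, the breakdown above can be sketched as nested sets. This is just a toy model following the comment's own definitions (the membership of "British Isles" uses the broad geographic sense discussed above):

```python
# Toy model of the terminology, following the comment's definitions.
england        = {"England"}                                # includes the Scilly Isles
great_britain  = england | {"Scotland", "Wales"}            # the main island (geographic)
united_kingdom = great_britain | {"Northern Ireland"}       # the political state
crown_deps     = {"Isle of Man", "Guernsey", "Jersey"}      # British, but not in the UK
british_isles  = united_kingdom | crown_deps | {"Republic of Ireland"}

assert england < great_britain < united_kingdom < british_isles
assert "Isle of Man" not in united_kingdom   # crown dependency only
assert "England" != "Britain"                # the one rule that matters here
```

The key point the model captures: each term is a strict subset of the next, so "England" and "Britain" are never interchangeable.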