Well, we sure didn't get into it to write boring business applications, except for a few in the dotcom years who quickly moved on when it went bust. As I remember it, though, there were many who just wanted to play games and only a few who wanted to work with code, and I don't think pushing them to play more would have brought them over. Of course you need the opportunity, but there are a lot of games that are mod-friendly if you're so inclined. I'd certainly encourage and test whether tweaking a game piques their interest, but if it doesn't I wouldn't try with more game time.
It means it's closed source!
Missing codecs: AAC, H.264, MP3
Missing plug-in: flash
So it's either patents or not their code; if you've got a good solution for that, I'm sure Google would like to hear it.
I don't think you have to come up with that many conspiracy theories; Mozilla's "problem" is that they won. They broke Microsoft's monopoly, got HTML/CSS properly standardized, and together with KHTML/WebKit/Blink some 80% of users now run an open source renderer, though many use it in a closed source binary. Microsoft would be laughed at if they tried any new proprietary extensions, and for the rest the implementation details are all in the open.
I'm talking about the unwanted UI changes. Then there were the release frequency changes that broke extensions every release for a long time. Then there were more unwanted UI changes, culminating in the despised Australis UI. Then there was the switch to Yahoo for searches. Then there were the grid advertisements. Then there was the mandatory HTTPS proposal. Now there's this nonsense. All of this is being done while there are still many bugs to fix, some of them years old.
Their problem can be summed up in two words: "Now what?" It turns out they didn't really have any goal in common other than slaying the dragon, and now the dragon's dead. Some UX designers get to make an art project. Some cowboy coders think more releases are better. Some will do anything to get away from the reliance on their biggest competitor. Some security nuts get to go overboard. Some want to go after Android/Chrome OS with Firefox OS, but this time they're not competing against proprietary, neglected shovelware, and they're barking up a tree Ubuntu has made essentially no progress on.
Let's face it: Mozilla mainly won because Microsoft was trying to keep the web from competing with local applications so they could sell Windows licenses. Microsoft got to the head of the pack and ground it to a halt; they didn't want to compete, they wanted to put a spanner in the works for as long as possible. That annoyed many and gave Firefox enormous amounts of goodwill even when it didn't work properly; out of spite for Microsoft, people kept using it and pushing for sites to support it. Mozilla doesn't have a clue how to compete with someone who puts up a fight, which is their second biggest problem.
An asteroid may kill a lot of people, but it will not cause global extinction. No asteroid strike has ever completely wiped out life on earth.
Isn't that argument a bit like "I plan to live forever; so far, so good"? After all, if an asteroid had wiped out all life, we'd be dead, so obviously it hasn't happened yet. Large extinction events seem to happen once every 50-100 million years; what does a once-in-a-billion-years event look like? Ceres, the biggest object in the asteroid belt, is about a million times more massive (10^20 kg vs 10^14 kg) than the dino killer. That one isn't going anywhere, but there are clearly quite a few potential total-extinction candidates if they came to intersect Earth's orbit.
Excessive hyperbole is silly, yes...
Each year that passes sees roughly a 0.0000005% chance of a species-threatening asteroid coming our way, while real threats (environmental, medical and political, i.e., war) could literally wipe us off the face of the Earth in the blink of an eye.
Global warming is a sloooooooooooooooooow process, and even if you burned every bit of coal and oil you wouldn't turn Canada into the Sahara; it's hardly an extinction-level event. A modern-day pandemic could presumably kill millions, but it's hardly an existential threat to the human race. The same goes for total thermonuclear war: there'd be a lot of direct deaths and many more indirect deaths from nuclear winter and starvation, but not enough to wipe us out.
Tsar Bomba (most powerful nuke): 50 MT
Chicxulub asteroid (dino killer): 100,000,000 MT
We're not even remotely in the same league. The odds that it happens tomorrow are small, but in terms of "worst case", asteroids beat everything we humans can come up with by far.
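To put the scale difference in numbers, here's a quick back-of-the-envelope calculation using the two megaton figures quoted above:

```python
# Rough comparison of the two yields quoted above, in megatons of TNT.
tsar_bomba_mt = 50            # most powerful nuclear device ever detonated
chicxulub_mt = 100_000_000    # commonly cited impact-energy estimate

ratio = chicxulub_mt / tsar_bomba_mt
print(f"Chicxulub released about {ratio:,.0f}x the energy of Tsar Bomba")
# i.e. roughly two million Tsar Bombas going off at once
```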
I can only come up with the obvious client-side encryption, but will the network as a whole still be able to use the data as it's supposed to (in this case; find adult friends)?
This. It seems sexual preferences, age and location are rather essential for the service they provide, and email, well, how else are they going to notify you that someone has taken an interest in you or that you got a reply? You can't ask a doctor not to work with medical data; there's of course good and poor security, but at the end of the day, if there's a total system compromise, you're screwed.
How could you protect against this?
Best practice seems to be as follows:
1. Public facing server makes web service call to locked down proxy server.
2. Proxy server validates every request thoroughly, everything that looks even remotely funny is rejected.
3. Proxy server queries stored procedure in locked down database, no SELECT * for you.
4. The results are serialized back to XML and sent to the public facing server for display.
A lot of work if you want to do it right, but you get a fairly good barrier against a total breach from the outside. Of course they could compromise your web server and start harvesting data, but you should have some sort of tripwire system for that, with audits and logs checking for abnormal activity.
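Steps 2 and 3 above can be sketched roughly like this in Python. This is only a sketch: sqlite3 and an in-memory table stand in for the real stored-procedure call, which depends on your database, and all the table, column and function names are hypothetical:

```python
import re
import sqlite3

# Step 2: whitelist validation; anything that doesn't match the
# expected shape is rejected outright.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,32}$")

def validate_username(raw: str) -> str:
    """Reject everything that looks even remotely funny."""
    if not USERNAME_RE.fullmatch(raw):
        raise ValueError(f"rejected suspicious input: {raw!r}")
    return raw

def lookup_profile(conn: sqlite3.Connection, raw_username: str):
    """Step 3: parameterized query on named columns; no SELECT * for you."""
    username = validate_username(raw_username)
    cur = conn.execute(
        "SELECT display_name, city FROM profiles WHERE username = ?",
        (username,),
    )
    return cur.fetchone()

# Demo with an in-memory database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE profiles (username TEXT, display_name TEXT, city TEXT)")
conn.execute("INSERT INTO profiles VALUES ('alice_99', 'Alice', 'Oslo')")

print(lookup_profile(conn, "alice_99"))  # ('Alice', 'Oslo')
```

An injection attempt like `x'; DROP TABLE profiles;--` never even reaches the database layer; it fails the whitelist check and raises a ValueError.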
The other way in is of course from your own network: if they can compromise someone on the inside with database access, or compromise developers to plant vulnerabilities that go into the production system. But that's usually a much tougher route, and really no different from breaking into any other secure network.
I could go on and on about the differences between an Engineer, a Tech, a Manager, and a Team Lead. It sounds like what you are looking for in a manager is really a team lead position.
Formally, you could be right. Informally, both the team leader and manager hats usually end up on the same person, even if he lacks one title or the other. If you haven't got a team lead, it's pretty obvious; if you do have a team lead, then in my experience the manager does the HR/administrative bits and leaves the actual work management to the team lead, or to the project manager if you work on a project.
For example, with no formal title I basically had the responsibility to:
1) Execute the actual project
2) Delegate as possible to the two juniors
3) Support the two juniors
4) Train the two juniors
Sure, there was a project manager dealing with the contract and formal contact with the client. There was a manager dealing with formal HR bits. But I felt I was a bit project manager, a bit team lead, a bit manager and a bit mentor all at once. It was a constant prioritization between:
1) What must I do to get the project done?
2) What can I delegate to free up my time?
3) What should I delegate to teach them?
4) What should we walk through together?
When you're in practice managing 100% of their time, you get all the hats whether you want to or not.
"More useful" by whose definition? Money is like water - it can only generate power if it's moving. That 'useful stuff' you speak of often looks like putting the money behind a dam, where it does nothing to stimulate the economy. Consumption, on the other hand, drives the economy.
That's a fairly flawed interpretation: globally, all that is consumed must be produced, so the only way to raise total consumption is to increase total production. If we consume more than we produce, we run a deficit and are either living off means we already have or incurring a debt, which means we've spent less in the past or will spend less in the future. Increasing consumption won't grow the economy as such.
However, circulation helps the economy find the most efficient means to produce what we want, so I pay McDonald's because it's cheaper than growing, harvesting and cooking my own meal. They in turn buy from their suppliers, and so on, stretching your money so you get more for less, but all of this comes from advantages of scale. The value-add comes from us being more efficient when we're specialized, not from playing musical chairs with our money.
One has to give it to AMD. Despite their stock and sales taking a battering, they have consistently refused to let go of cutting edge innovation. If anything, their CPU team should learn something from their GPU team.
Well, unlike their CPU division, the GPU division hasn't been the one bleeding massive amounts of cash, at least not until the GTX 970/980 generation from nVidia. Though with the R300 OEM series being an R200 rebrand, they seem to be running out of steam; one limited-quantity HBM card won't fix their lineup.
This kind of power savings combined with increased bandwidth can be a potential game changer. You could finally have a lightweight, thin gaming laptop that can still do 1080p resolution at high detail levels in modern games.
You still need power for the shaders, which account for about 80-90% of a GPU's power consumption. In fact, AMD's problem is that even if they could swap out the GDDR5 for HBM today, they would still lose on performance/watt to nVidia's Maxwell 2. And the interposer is basically a giant semi-processed silicon die; it might be good for performance, but it's probably not good for cost.
Anyway, the slides are kinda impressive just like with their Zen processor and whatnot. But AMD has a rather poor track record of delivering products on schedule that live up to their hype. It's now been eight months since nVidia launched the GTX970/980 and we're still waiting for something new from AMD. You can't win if you don't get your tech into products and ship them.
Smart phones killed dead time: if you have five minutes riding the bus or whatever, you can rather instantly find/read/check anything you might need, which is rather convenient. It's rather limited how entertained you can get while driving a car, since your attention is legally required to be on the road. And if there are only two of you, you're usually socially required to be in the front seat making conversation, not zoning out in the entertainment system. Really, it's mostly kids in the back seat who get to do that, and then why not on their cell phone or tablet or 3DS or whatever? You need a significant value-add to make up for the fact that it's stuck in the car. And as long as you're driving, the car's handling is going to be a big deal.
Now if we're talking a self-driving car where it's really my en-route entertainment center that's an entirely different matter. You just tell the car where to go and it goes, how it is to drive doesn't matter. It probably doesn't even matter if it takes a few minutes longer because you got to play another round of Candy Crush. In this case, yes having an Android/iPhone dock so it could integrate with the rest of my entertainment world makes sense. Until then, I'll be busy limping along bumper-to-bumper listening to the radio....
Unlike what Hollywood thinks, not all problems are solved simply by running across a border. Have you ever: tried to get a passport for a minor without the other parent's signature? tried to travel as the sole parent of a minor? tried to enter a country as the sole parent of a minor?
Went on a two-week vacation to the US from Norway with my cousin when I was 16; zero parents seemed to work just fine. Of course I already had a passport; in Europe that's like travelling to another state. I'm assuming my cousin had some kind of permission slip, though I never saw it, but that seems easy to forge. And with all sorts of long-distance relationships and immigrants travelling back and forth, I really can't imagine crossing the border with one parent raising any flags unless the child's name is already on a kidnapping victim list.
I call bullshit. This is simply another step down a slippery slope that removes more personal responsibility. This is the very definition of the nanny State.
Well, the article is just a fluff piece saying that how you build the interface affects the results, and that this can have consequences. Which is actually not such an unreasonable thing to say, as long as you don't take the concept too far. For one concrete example I know of from a hospital system, the software said pretty much "Nothing more to register, so closing healthcare contact" when it actually meant to say "Warning: hospital visit registered but no further patient follow-ups scheduled. If you proceed, the patient's treatment will end and the case will be closed."
This was in production code, found in a review trying to determine how the hospital could "lose" patients. The message was technically correct, but it was also extremely misleading when the nurse had forgotten to register a follow-up. One seemingly harmless confirmation, and the patient could end up not getting chemo for their cancer unless the doctor noticed the patient was missing or the patient followed up himself. So the developers of the system should absolutely take some responsibility for making sure the system makes it easy to do the right thing and very hard to do the wrong thing, not just be technically correct.
Another hotly debated topic is defaults, because people have a tendency to overuse them. The problem is when 99.9% of cases are the default but the 0.1% is actually important to register. Did you skip past the allergies question when the patient actually is hyperallergic to peanuts? Ouch. People are not machines; they hate doing things that are 99.9% unnecessary, even if you tell them that checking that box is their proof that they remembered to ask the patient, and a default won't give you that. Like security, completeness and correctness often come at a cost in usability too. It all depends on what matters more.
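As a minimal sketch of the alternative to a silent default (all names here are hypothetical, not from any real hospital system): the allergy field starts in an explicit "not asked" state that the system refuses to save, so recording an answer doubles as proof the question was asked.

```python
from enum import Enum

class AllergyStatus(Enum):
    NOT_ASKED = "not_asked"          # starting state; deliberately unsaveable
    NONE_REPORTED = "none_reported"  # must be actively selected, never assumed
    HAS_ALLERGIES = "has_allergies"

def save_intake_form(allergy_status: AllergyStatus) -> str:
    # No default to "none": the clinician has to actively record the answer.
    if allergy_status is AllergyStatus.NOT_ASKED:
        raise ValueError("Allergy question not answered; cannot save the form.")
    return f"saved (allergies: {allergy_status.value})"

print(save_intake_form(AllergyStatus.NONE_REPORTED))
```

The usability cost is exactly the one described above: 99.9% of the time the extra click is unnecessary, and that's the trade-off you're making deliberately.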
I don't doubt that there are some exceptions, possibly even some motivated enough to be slightly dangerous; but those people I've met who actively want religion to die out (as opposed to merely being atheists personally, or apathetic toward metaphysics) specifically want it to die out by persuasion rather than persecution.
Weeeeeeeeeeeeell... not by persecution, but I'm not sure by persuasion either. Most of those fed up with religion consider it just a silly superstition and don't really see the point of discussing it any more than the Loch Ness monster, or whether a black cat crossing the road brings you bad luck. They realize they can't talk you out of an irrational belief; they just want it to go away, and will probably act more with derision than persuasion.
I'm fairly certain humanity would find plenty of reasons to wage war if religions were not around to blame it on.
For sure, there are plenty of examples of people of the same religion going to war for various reasons: land, resources, geopolitics, wars of oppression, wars of liberation, power and control, wars of succession and so on. But for most of the genuine atrocities you need more: you need such a burning hate for the opposition that you're willing to slaughter women and children and burn their cities to the ground. Where victory alone isn't enough, only total submission or even extermination. Religion is a very common fuel for such hate.
When fighting for resources you're also looking for a "return on investment": you have to gain enough to make going to war worth it. That you can usually defend against by rational investments in defense, making it too costly to be worthwhile. Irrational wars fueled by hatred often don't care; a civil war might tear the whole country apart and leave it in ruins, as long as the infidels die. The Germans fought the Allies all the way back to Berlin, the Japanese until they were nuked. Twice. Neither made sense; it was death before surrender.
Of course you can say that was mostly racism, not religion, though I'm pretty sure the Holocaust was a good dose of both, even if Nazi Germany certainly fought a lot of other nominally Christian nations. Religion is also a very lasting divide. Germany fought most of Europe; now they're a key member of the EU. The US was at war with Mexico, the North was at war with the South, but those wounds mostly die with the generation that experienced them. In the Middle East they've been fighting for 2000 years, and every conflict reopens a wound that never heals.
Of course religion has its good sides; I think a lot of people behave better than they otherwise might because they think God/Allah is watching, or that it affects their karma. So it's not just irrational evil, it's also irrational good. Mostly just irrational, and mostly harmless. Really, I don't care if you want to bend the knee and pray to the FSM, or keep your own diet because the FSM said eating something is unclean, or refuse to work on a Sunday because the FSM told you to, or whatever. As long as you've got it dialed down to mostly quaint and charming.
The age of the square, visible pixel was actually a pretty short period between blurry CRTs and retina LCDs. Pixel art was originally created for CRT, which blurs the pixels. Artists developed techniques to take advantage of this.
Going by your UID, I'm guessing you were too young to have been there. The glory days of pixel art were the 80s and early 90s, with resolutions of 320x240 and less, and 4-256 colors, simply because you had no other choice. Those pixels were very visible, even on a CRT. A good example is comparing TES II: Daggerfall, released in 1996 as the last of the "pixelated" generation at 320x200 with 256 colors, with TES III: Morrowind in 2002, which was a damn beauty at up to 1600x1200 with 16.7 million colors, and all of this while CRTs were still dominant.
Of course you could still see here and there that it was pixels and not retina-class, anti-aliased, super-smooth, ultra-realistic graphics; it's not hard to see it's a computer game and not real life, but clearly they were going for being as realistic as possible, not a stylized "pixel art" form. LCDs might have raised the bar a bit on the level of detail you need, but they aren't very relevant at all to why pixel art exists in the first place. It's more like a game board piece: enough to see what it is and make it an interactive part of the game, but not even trying to be realistic.