To the best of my knowledge "head starts" in letters and numbers make no difference in long-term outcomes. I'm disappointed to hear Gates pushing for this sort of thing. Solid educational resources at the right developmental stages are critical for long-term success, not some sort of fast-track to the ABCs.
Investors don't care about 20% revenue growth y-o-y if EPS has tanked.
GOOG Earnings Per Share:
This got a lot of publicity but it doesn't really add all that much security. Suppose you choose 4 words from a dictionary of 200k (roughly the order of magnitude of the OED): you arrive at about 70 bits of entropy. By comparison, choosing a 10-character password from a 62-character alphabet (a-zA-Z0-9) yields about 59 bits of entropy, so the difference is only about 11 bits, a factor of roughly 2000. Attackers aren't so dumb as to just try random characters: they have very good priors on how common any particular character sequence is in the typical password, and will mix and match entire words, with or without leetspeak substitutions, etc.
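The arithmetic above is just log-base-2 of the size of each search space; a quick sketch (the 200k dictionary and 62-character alphabet are the figures from the comment, not measured values):

```python
import math

# 4 words drawn uniformly from a ~200k-word dictionary
passphrase_bits = 4 * math.log2(200_000)   # ~70.4 bits

# 10 characters drawn uniformly from a-zA-Z0-9 (62 symbols)
password_bits = 10 * math.log2(62)         # ~59.5 bits

# The gap is ~11 bits, i.e. the passphrase space is only ~2000x larger
factor = 2 ** (passphrase_bits - password_bits)
print(round(passphrase_bits, 1), round(password_bits, 1), round(factor))
```

Note that both numbers assume uniformly random selection; real users pick common words and common character patterns, which is exactly the prior attackers exploit.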
Of course no matter how rigorous your policy, it all goes out the window once your users type the same password into some other random site.
Complexity matters mainly if your attacker gains offline access to your hashes. Far and away the main source of password compromise is non-uniqueness (reusing the same password elsewhere). Limiting the damage from reuse is actually the main benefit of forcing a periodic password change: a credential leaked from some other site eventually stops working. Graphical and gesture passwords are horribly insecure against shoulder surfers.
If you can, support as many factors as possible. Multiple factors give your users flexibility: they may not always be able to receive an SMS or have a card reader handy. TPM-based virtual smart cards are super handy for remote auth from a domain-joined device, since no physical cards or readers are required.
Don't apply for a dev job. Assuming there was sufficient math in your PhD, apply for a data science or data analyst role, which will include a fair share of programming but also mentally engaging work. Hiring managers for these roles look for people who have strong analytical skills and the ability to learn new things (proof: you have a PhD). What languages you know is secondary in these roles to how well you dig into a problem and deliver insights.
Gotchas more than quirks:
- the day you realize you put a side effect in an assert() call.
- the day you realize GCC (v2-era, maybe; not sure this is still an issue) exploits the extra bits of precision in the Intel x87 FPU, *only if* optimizations are enabled, which causes certain iterative floating-point algorithms (e.g. SVD) to fail to converge.
In both cases everything works great in debug builds but goes to hell in release builds and it's incredibly painful to get to root cause.
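The assert gotcha isn't unique to C's NDEBUG; Python strips `assert` statements under `python -O` the same way. A minimal sketch of the failure mode (the `pop_next` helper is made up for illustration):

```python
def pop_next(queue):
    """Remove and return the first element. Note the side effect."""
    return queue.pop(0)

queue = [1, 2, 3]

# BAD: the pop happens only when assertions are enabled. Under
# `python -O` (or C's -DNDEBUG) this whole statement vanishes,
# the queue is never drained, and release builds misbehave.
assert pop_next(queue) == 1

# GOOD: perform the side effect unconditionally, assert on the result.
value = pop_next(queue)
assert value == 2
```

Run both ways (`python demo.py` vs `python -O demo.py`) and the queue ends up with different contents, which is exactly the debug-vs-release divergence described above.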
For sure the first site I'd attack is the obscure registrar Namecheap...
Some details here: http://en.wikipedia.org/wiki/P...
NCMEC uses PhotoDNA which is a fuzzy hash that can detect altered images.
Yes, most likely GOOG is using the same thing everyone else uses; the NCMEC standard is PhotoDNA:
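PhotoDNA itself is proprietary, but the core idea behind any perceptual ("fuzzy") hash can be sketched with a toy average hash: derive bits from coarse image structure so that small alterations (re-encoding, brightness shifts, minor crops) produce a nearby hash rather than a completely different one. This is NOT the PhotoDNA algorithm, just an illustration of the concept:

```python
def average_hash(pixels):
    """Toy perceptual hash: pixels is a flat list of grayscale values
    (0-255) from a fixed-size thumbnail. Each bit records whether a
    pixel is brighter than the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Count differing bits; a small distance means visually similar."""
    return sum(a != b for a, b in zip(h1, h2))

original = [10, 200, 30, 220, 15, 210, 25, 230]
# A slightly altered copy (re-encoded, brightness-jittered) of the same image
altered = [12, 195, 33, 225, 14, 205, 28, 228]

print(hamming(average_hash(original), average_hash(altered)))  # small distance
```

Unlike a cryptographic hash, where one flipped pixel changes every output bit, a match here is a threshold on Hamming distance, which is what lets the system detect altered copies of known images.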
+1000. The OP has embedded hardware skills, which is a relatively rare skill set; the barrier to entry is for sure a lot higher than for basic software programming. My advice would be to leverage the hardware skill set into some new embedded programming domain (learn new hardware-specific tricks). There's little-to-no value in reinventing yourself as a generic programmer.
If you're using Outlook I assume you've got OneNote too. Create a daily recurring meeting in Outlook titled "diary" or whatever, and when you want to take notes, open the meeting for today and use the meeting-notes feature. The only issue I see is that it might not organize the daily notes by date in OneNote, but there are decent features for moving pages around and reorganizing them. Plus everything is searchable, and if you want you can save the whole notebook in SkyDrive and open it from your phone. Say what you will about MS, but in my day-to-day work OneNote is the best thing since sliced bread.
Actually, although I lean towards agreeing with the article's thesis, I think the article itself sucks.
Here is a far better article about private schools and why maybe they are not good for society:
Mostly agree that geography/demographics matter a lot. The article is terrible, but she has an important point to make, which is summed up much better here:
Public school in America has declined as an institution because the wealthy have abandoned it and everyone thinks that's ok. But it's not. This is in part because the people who set public school policy happen to be wealthy, and therefore have no skin in the game. It's also because egalitarianism is all but dead as an American ethos. Level playing fields are for suckers.
If you're wealthy you look at the public system and decide you can do better for your kids. So you make a locally optimal choice which is perfectly reasonable in isolation. It's sort of an inverted tragedy of the commons.