developers treating the systems as an infinite resource pool, with no real rules or constraints past "does my code run?"
While I agree that some developers are cavalier with rules, consideration of resources is fundamental to writing software. I would say developers who ignore that aren't doing a very good job...
It is still unclear whether there are dangerous levels of use for cannabis, she added.
Fixed that for ya.
I have the same problem. There are at least two dozen distinct individuals who have had emails erroneously addressed to my inbox.
For automated emails that offer an easy link to unsubscribe or dissociate my email address from that account, I use the provided link. Those are pretty easy.
Sometimes people register for paid services that send a monthly bill, and it comes to my email address. These emails may or may not even be in English. For these, I just add a filter or rule to my email provider or client to delete or move them. Communicating with someone, possibly in another language, possibly through lots of bureaucratic red tape, isn't really worth it. If they care about it enough, it's their responsibility to fix it.
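Most providers let you set up such a rule from a settings page, but it can also be scripted. Here's a minimal Python sketch over IMAP; the host, credentials, and sender address are hypothetical placeholders, not any real service:

```python
import imaplib

# Hypothetical placeholders -- substitute your own account details.
HOST = "imap.example.com"
USER = "me@example.com"
PASSWORD = "app-specific-password"
SENDER = "billing@example-service.com"  # the misdirected sender to silence

with imaplib.IMAP4_SSL(HOST) as imap:
    imap.login(USER, PASSWORD)
    imap.select("INBOX")

    # Find every message from the offending sender...
    status, data = imap.search(None, "FROM", f'"{SENDER}"')
    for num in data[0].split():
        # ...mark it deleted; expunge() below actually removes them.
        imap.store(num, "+FLAGS", "\\Deleted")
    imap.expunge()
```

Run on a schedule (cron or similar), this does roughly what a server-side filter does, with the obvious caveat that a rule in the provider's own settings is simpler and doesn't need your password in a script.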
The most annoying case is when a large group of friends starts an email thread with a whole bunch of different people in the "to" or "cc" fields. Asking them to correct the email address is pretty much an exercise in futility, since all it takes is one person hitting 'reply to all' and your address is back on the thread. For these, I just block every recipient on the thread.
I've never had the problem of someone already having registered my email. One way around it would be to set up another email address that just forwards to your actual email address.
what best practices do
Should be easily revertible.
Had my hopes up, only to have them dashi'd. Ah well.
Around the age of five, my dad brought home a 486 DX with 8 MB of RAM, and I quickly became its primary user. There were computers at school, even as early as second grade, but they were primarily toys for learning math, playing with art programs, using Microsoft Works, and practicing typing. In second grade I had a reputation in class for being extremely proficient with the keyboard. I think I hit maybe 40-50 WPM, which was impressive for my age back then. Nothing really interesting happened with computers throughout elementary school.
Then in middle school, I was at a school with something of a reputation for technology. We played with Flash, a lot of MS Office, and a lot of CorelDRAW, which was kind of like Adobe Illustrator. There was a 'web team' extracurricular activity, which consisted of maybe the top ten to fifteen computer geeks of the middle school. That was mainly a little bit of HTML and Macromedia Dreamweaver, and a lot of Unreal Tournament in our off time. We got to stay out of the cold winters in the computer lab playing with computers. Around this time I was experimenting with Linux at home, so I would often PuTTY into my home machine and go on IRC, which led most classmates to think I was some sort of computer hacker.
In high school, the computer classes were actually a step back compared to middle school. I don't think the mandatory classes ever went beyond MS Office, plus some research for science classes and such using computers. Grade 11 was when you could actually take a course called "Computer Science." My teacher taught us Visual Basic, and the focus most of the time was making a usable UI. Rarely was there any math or theoretical CS involved. It seemed like the provincial curriculum didn't really specify what the course was meant to teach, because a friend at another school was learning basic AI concepts and programmed a tic-tac-toe game.
By the end of high school, the closest thing to real computer science we had done was a VB6 program that computed steps in the Goldbach conjecture. Anyone who was truly interested in computer science had self-taught skills that far outstripped the curriculum. When I entered university as a computer science student, the difference was staggering. I had probably been among the top three most respected computer geeks in high school, but I was absolutely average at university. I had thought I was a real ace at computer science, but there I realized I had only been a kid experimenting with programming in utterly haphazard ways...
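For flavor, here's roughly what that exercise looked like, reconstructed from memory as a Python sketch rather than the long-gone VB6: for each even number greater than 2, search for a pair of primes that sum to it.

```python
def is_prime(n: int) -> bool:
    """Trial division; plenty for a classroom exercise like this."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def goldbach_pair(n: int) -> tuple[int, int]:
    """Return one prime pair (p, q) with p + q == n, for even n > 2."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return p, n - p
    raise ValueError(f"no pair found for {n}")  # would disprove the conjecture

for n in range(4, 31, 2):
    p, q = goldbach_pair(n)
    print(f"{n} = {p} + {q}")
```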
Around 2006 I got a laptop that was just a nightmare to get Slackware working on, so I mainly used Windows. It had become too painful to try to make Linux work on it, but I had access to Ubuntu and Solaris on my university's machines, so it was not all bad. In 2009 I got a MacBook, and OS X does everything I had once wanted from Linux, so I've been sticking with that since then. And it's graphically pretty. So in this 'era' of my OS choices, I was mainly driven by picking something that worked for my needs without being a pain to set up.
For work at an enterprise and as a research assistant I've also been using RHEL 9 and Ubuntu, but that's not really by choice. If I threw away my MacBook and got a PC laptop today, I might go with Arch Linux, since its orientation toward simple design appeals to me.