Copyright infringement is theft because it denies a copyright owner the ability to sell the product for which they have the copyright and thus they lose money.
Thanks for the nostalgia! I remember when people tried to claim that with a straight face back in the 80s, but no one believed it even then. Can you imagine that someone actually said that ridiculous crap in seriousness once? I'm glad we've moved past those ludicrously mind-bending contortions and can laugh about them now, knowing full well that no one actually thinks that way anymore.
Sharing: Willingly giving a portion of your possessions
Bzzt. I can share hugs, music, friendship, laughter, pain, and joy with others, but I wouldn't call any of those "possessions".
to another, denying you use or benefit thereof.
That presumes scarcity. If I share your post on Twitter, you are not deprived of it. Neither would I be.
Whatever the environment, there are jobs that require someone just to be there waiting for something unusual to happen. Even in the nuclear missile bunkers, I bet they spend about 95% of their time sitting around waiting for an alarm they hope never comes. You can only clean so much before it's time to lean. So what if OP works in a clean room? I bet there are plenty of "I'm paid to sit here" jobs in there, too.
Why not? Why should the creator not be able to impose any restrictions they damn please?
Largely because of the first-sale doctrine, which codifies property rights sanity: if you sell me something, it is now mine, not yours. I can do whatever I want with it. Use my spatula as a screwdriver? Use a thermos bottle as a hammer? Watch scenes in a movie out of order? It's none of your business. I bought it. It is now my property, and I'm free to do with it as I please.
(Averting pedantry: of course that doesn't involve violating copyright. Straw men will be ignored.)
Hell yeah, I'll admit that I am King of the Geeks. Talk nerdy to me.
OK, OK. I'll double-check with a calculator that's not "bc" before publishing. I've done enough physics work, though, to trust that 1) calculations showing explicit conversions are almost always correct, and 2) calculations that don't almost never are.
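To make that concrete, here's the sort of explicit-conversion arithmetic I'm talking about, sketched in Python (the distance is just an illustrative example):

```python
# Writing each conversion factor out explicitly makes the chain
# self-documenting, so errors are easy to spot by inspection.
miles = 26.2
meters = miles * 1609.344   # 1 mile = 1609.344 m, by definition
km = meters / 1000          # 1 km = 1000 m
print(km)                   # about 42.16 km
```

A bare `26.2 * 1609.344 / 1000` in bc gives the same number, but the commented factors are what make it checkable at a glance.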
Like when Bell Labs developed C to write Unix? There's a long tradition of major companies coming up with new languages to scratch an itch. Thank God it hasn't died. How boring would it be to live in a time when we'd decided there was nothing left to innovate?
For nontrivial math, I don't always trust Google's interpretation of the question to be the same as mine. That page is a little short on details of what it's actually doing. On the other hand, WolframAlpha is really good about showing its work. I just always forget that it's there.
In either case, yeah, I like doing it the hard way. Or as I call it, "learning" or "practicing".
Moore's law describes the number of transistors in a package, not their linear size. Doubling the transistor count means each one's linear dimensions have to shrink to 1/sqrt(2) of the old version, and that to the 11th power is about 1/45th as large, not 1/2048th. That's still pretty dinky.
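A quick back-of-the-envelope check of that 1/45th figure:

```python
import math

# Halving feature *area* each generation shrinks each linear
# dimension by a factor of 1/sqrt(2); after 11 doublings:
shrink = (1 / math.sqrt(2)) ** 11
print(1 / shrink)  # ~45.25, i.e. each transistor is about 1/45th as large
```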
They think it's great because, in a tragic case of hilarity, jumping into code with minimal design is what Python is great at.
We think it's great because, among other things, it has first-class functions and a very high code:boilerplate ratio. This lets us write very concise, readable, and maintainable code.
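For a taste of what first-class functions buy you (a trivial, made-up example):

```python
# Pass behavior around like any other value: sorted() takes a
# key function instead of making you write a comparator class.
words = ["pear", "fig", "banana"]
print(sorted(words, key=len))  # ['fig', 'pear', 'banana']
```

One line of intent, zero lines of boilerplate.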
Eww, no. I've never seen good Python code that asserts types because it's not the idiom for you to care. For instance, suppose you write a function like:
def get_contents_of(obj):
    return obj.read()
In this case, obj might be a file, or a web request, or a string (via the StringIO interface). Who knows? Who cares? As long as obj.read() returns something, it works. BTW, this is supremely nice for unit testing when you don't really want to hit an SQL server 1,000 times in a tight loop.
Now, you could write something like assert isinstance(obj, file) to guarantee that you're only dealing with file objects. Of course, that lays waste to the whole concept of duck typing and people will laugh at you for doing it. So dropping that bad idea, you could write assert hasattr(obj, 'read') to ensure that the object has the needed methods. But why? Python gives you that check for free when you try to call the method. Let it do the heavy lifting and concentrate on the parts of the problem you actually care about.
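To show what I mean about unit testing, here's a sketch using StringIO as a stand-in (same toy function as above; the fake payload is invented):

```python
from io import StringIO

def get_contents_of(obj):
    # Duck typing: anything with a read() method will do.
    return obj.read()

# In a test, a StringIO impersonates a file, a web response, or a
# database handle, so nothing touches the disk or a real server.
fake = StringIO("SELECT 1;")
assert get_contents_of(fake) == "SELECT 1;"
```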
Exceptions are one of the worst things to have become common - an "error" is almost always caught only outside the scope where it occurred, so the stack has already been unwound and there is no sane way to fix the error and retry the operation that caused the exception.
Yeah, that would be terrible. You almost never use them in Python like that, partially because Python tends to have a vastly shallower call stack than, say, Java (largely because you don't need 10 layers of abstraction between bits of code thanks to the duck typing we just talked about).
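For the record, the idiomatic shape is to catch the exception right next to the operation, so you actually can recover (read_config and its fallback are made-up names for illustration):

```python
# EAFP: try the operation, handle the specific failure in-scope,
# where recovery (here, a fallback value) is trivial.
def read_config(path, fallback="{}"):
    try:
        with open(path) as f:
            return f.read()
    except FileNotFoundError:
        return fallback  # fix it here instead of unwinding ten frames
```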
I think it boils down to you not knowing idiomatic Python. That's OK. I'm ignorant about lots of things, too. But I think you'd find that you enjoy it more if you stop trying to write C or Java in Python, because that almost never works out well.
I agree, but remember that Python is interpreted in much the same way that Java is: both compile high-level code to bytecode and run it on virtual machines. PyPy's tracing JIT selectively compiles hot bytecode paths into machine code for some enormous performance boosts, much as the Java JIT compiler does.
Whatevs. I co-built a web service on Python that handled 250,000 requests per second with a horizontally scalable design. We could bump that up to 1,000,000 requests per second by deploying 4 times the servers (which isn't as easy as it sounds because most things don't scale out well like that). I left that company and went to another employer that handled "only" 80,000 requests per second, averaged over a month. If you can ditch the chattiness of HTTP, well, I've written single-threaded UDP servers in Python that could handle 200,000 requests per second per server. How fast do you want it to be?
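As a rough sketch of the single-threaded UDP shape (the port, payloads, and upper-casing "protocol" are all invented for illustration, not what the real servers did):

```python
import socket

def serve(sock, max_requests):
    # One thread, one loop: recvfrom blocks until a datagram arrives,
    # we do the "work", and the reply goes straight back.
    for _ in range(max_requests):
        data, addr = sock.recvfrom(4096)
        sock.sendto(data.upper(), addr)  # stand-in for real request handling

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick one

# Exercise it in-process: queue a datagram, handle it, read the reply.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"ping", server.getsockname())
serve(server, max_requests=1)
reply, _ = client.recvfrom(4096)
print(reply)  # b'PING'
```

No threads, no connection state, no HTTP parsing - which is most of why the numbers get silly.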
Unless you're seeing extremely low numbers or your design requires vertical scaling because it was architected in 1965, choice of language isn't all that important. Ruby is slow, too, but Heroku manages to shovel the data pretty well.
I love Python because it maps very neatly onto how I model problems in my head. I'm not averse to using other languages, but Python is my comfort zone because Guido and I apparently think about algorithms in the same ways. As it turns out, I make a decent living with it.
So, do I have a good job because I know Python, or is it because the thought patterns of the people who are drawn to Python are the same ones that companies want to pay for, regardless of language? If the former and you want a good job, then by all means learn Python. But if knowing Python is just a side effect of the properties that employers are actually looking for, then it's probably not going to help you all that much.