Currently I only have one active project on SourceForge - however it's a Java-based one (I distribute both precompiled builds and the source).
I'll keep watch in the meantime in case SF attempt to insert adware, though I doubt they'll try. But any future project of mine, especially one that ships native MS Windows binaries, will be hosted elsewhere.
In hindsight this is one of the things that I wish Kernighan & Ritchie (the original authors of C) had considered.
Pascal and Ada both use ":=" as the assignment operator and "=" for testing equality, so this type of error is a non-issue in those languages. Furthermore, Pascal (1970) actually predates C (1972) by two years, so it's worth asking why K&R overlooked this possibility.
That said, nearly all modern compilers (including GCC) do print a warning if you use "=" in an if() or while() condition without explicitly surrounding the expression in an extra set of parentheses. But then you have to be willing to examine the warning output (you'll still get your executable in the end), unless you're disciplined enough to use compiler flags like -pedantic -Werror as a means of extra quality assurance.
I think the ISO C standards body should consider introducing ":=" as an alternative assignment operator in a future C standard; all compilers could then offer a switch that forbids plain "=" for assignment when you're writing new C code from scratch for new projects.
You'd still have the problem of existing codebases under maintenance being at risk of "=" misuse, but if such a newer C standard eventually enjoyed widespread support, people could do a search-and-replace of assignment "=" with ":=" in existing code. (I say this with a bit of pessimism, since Microsoft's C/C++ compiler still doesn't fully support C99.)
*e.g. tell your browser to restrict the resource usage of JS code from untrusted sites, so they can only animate at, say, 1 Hz, or use at most 5% CPU, or only post back x kB of AJAX data per minute (if you're trying to conserve bandwidth).
That way you don't get lumbered with the perils of bloated JS from incompetent programmers running amok with useless animations and poorly coded algorithms (ones that should be, say, O(N log N) but end up O(N^2)). Meanwhile, sites that need minimal JS for usability reasons (e.g. form auto-complete/correction/validation) can still run, and you don't have to constantly update your NoScript-style blacklist/whitelist.
After all, it's my electricity they're consuming (I'm trying to keep my PC's power consumption low), and given the rate at which local electricity prices are rising here (Australia), maybe I should ask the site owners to chip in on my power bill if they continue to insist on me executing their frivolous JS?
That is assuming it doesn't become profitable for spammers and other evil-doers to focus IPv4-based botnets at the problem. The 'IPv6 lets me hide!' fallacy only works for as long as people aren't actively looking.
If you launch multiple instances of an executable, the OS only loads the executable image once. It's the same mechanism. Windows PE executables and Unix executables are alike in this respect (the PE format is a lot like a Unix binary). Segments are page-aligned so they can be mapped straight into memory. When you run an executable, the OS does not copy the entire thing from disk into RAM; it just reads the headers and configures the virtual memory system to demand-page the file from disk.
There is almost no difference in what the OS can do with a DLL versus an executable. Both can be cached; both can be shared between multiple instances. The main difference with a DLL lies in the userspace runtime, where one DLL can be linked into multiple different applications (very useful).
Yes. You can find all my work on arXiv.org. When you submit there, you have to choose the license under which arXiv can distribute it, and two of the options are Creative Commons licenses. We submit our articles to the arXiv first, and then to journals. This means that when the journals receive a paper, they know the content is already out, and they're not going to get an exclusive distribution deal in any case. There used to be a "preprint" system in which major labs would physically mail recent articles around. The WWW started at CERN and is an outgrowth of this idea.
Everyone should use the arXiv. There are sections for many scientific disciplines, though far from all of them (all it would take to start a new one is a request). In many other fields the journals exercise draconian control over the scientists (Medicine, Computer Science), and that needs to stop. They work for us, not the other way around.
But I'm talking about journal articles -- I haven't written any books, and don't really intend to. For scientific journals I agree copyright is a useless hindrance. It was a nice tool when distribution was expensive, but now that the marginal cost of distribution has dropped to zero, the job is better done with central government grants and open access. Now we're just missing a peer-review/referee system attached to the arXiv. When that happens, the journals will die for good.
I'll let you google for a US model.
While the content AW produced was really good a few years back (when they still had Vid Dude, who was a creative force behind a lot of the shoot ideas), they're in a gradual downhill slump now, heading mainstream and becoming just like the other sites out there, because the guy* in charge of the business is a power-hungry, sleazy megalomaniac who won't listen to anyone else's advice.
*That's right - a guy, Garion Hall. Just so you know, there never was an Abby Winters; it's a fictional identity he assumed for marketing appeal, just to get an edge over genuinely female-run sites.
I know all this because I used to work for them (in a software development role on their CMS). Save your time and effort and don't apply for any IT or non-production positions at that company; you'll just get screwed over with lousy pay and no raises in the long run, as that's his management tactic for all staff. Any sense of it being a joyous place to work is just an illusion.