We'll soon treat the phrase 'pirates of Silicon Valley' in a whole new way. Gotta go spray-paint the Jolly Roger onto my DJI.
Time to repurpose the interferometer and try all the old experiments, especially the Michelson-Morley one.
I keep telling my colleagues to at least include copyright notices when using other people's work. I know so many developers who use nothing but open source products yet never acknowledge it. The acknowledgement alone is enough for us, and it ensures that the open source message is passed on to users.
I might be rocking the boat by wondering this, but in most developed nations, if an unarmed person were gunned down the way Michael Brown was (regardless of what crime he had committed), the police would not be rewarded with over USD 200 million in gifts and training. Officer Wilson could have shot to disable rather than to kill - but of course, someone will say he didn't get enough training. Officer Wilson could have toughed it out and faced a charge by Michael Brown (if that is indeed what Brown was doing) without firing his gun - but someone would point out that police training dictates another set of actions, and that the training was at fault. Officer Wilson could have simply waited for backup instead of taking on two individuals by himself; had he lost the suspects, he wouldn't have been losing mass murderers, since he would have been issued the code for the offence the young man was wanted for.
By not doing what was needed, which was to make an example of Officer Wilson's incorrect decision-making, the US is setting a law-enforcement precedent that allows bad decisions to be blamed on training. IMO, the Obama administration is further entrenching this precedent by approving this funding for equipment and training. Does this not conflict with the legally established precedent of Graham v. Connor, where it was determined that use of force must be judged by what any other reasonable officer would have done in the same circumstances? Is the US now saying that the majority of officers would have opted to shoot Michael Brown under those circumstances? Is that not what Obama is acceding to with his generous gift: that the current training dictates the same reaction across the board?
Will body cameras truly improve the reactions of officers in the field? Had Officer Wilson been wearing a body camera, he may well have acted no differently. He did not take into account the audience on the street and in the buildings around him, nor did he consider that there might be surveillance cameras on buildings, at ATMs, etc., or smartphones in people's hands. He was so caught up in the moment that none of that registered. For an individual like that, would a body camera be a deterrent?
In the latest TIOBE index (of course you'll tell me it's not an accurate representation of developer demographics), only one of the top 10 languages is not a derivative of C, and even that one sits in 10th spot. According to the index, about 60% of all developers use a language derived from C. Many of the others have lexical similarities, if not syntactic ones, simply because their creators wanted C developers to feel at home.
I'm not saying C was the original, or that it will remain the most relevant root forever. But at present it is the dominant syntactic contributor to the most popular languages, and newcomers to coding should be made aware of that when deciding which language to adopt. Sure, they could learn Haskell and start writing some highly optimised software, but the likelihood of getting hired at a presently relevant software firm could be remote.
Speaking of parents, if you have a child, what programming language would you teach him/her? C or something else?
Plenty of successful languages? (Even Microsoft adopted a flavour of C in the end, with their C#)
Technically, copy-on-write improves efficiency in space management rather than speed, because there has to be an indexing (or reference-counting - perhaps I'm not using the right vernacular) mechanism to determine when an instance of a shared variable changes.
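A minimal sketch of the bookkeeping being described: a reference count on shared storage, with the actual copy deferred until someone writes. The class and method names here (`CowBuffer`, `copy`, `set`) are made up for illustration; real implementations (a string class, an OS page table) do the same accounting at a lower level.

```python
class CowBuffer:
    """Wraps a list and shares its storage between copies until a write occurs."""

    def __init__(self, data):
        # Shared storage plus a reference count, so a writer can tell
        # whether anyone else still points at the same storage.
        self._shared = [list(data), 1]

    def copy(self):
        # Copying is O(1): bump the refcount and share the storage.
        clone = object.__new__(CowBuffer)
        clone._shared = self._shared
        self._shared[1] += 1
        return clone

    def get(self, i):
        return self._shared[0][i]

    def set(self, i, value):
        # Write path: if the storage is shared, detach first. The actual
        # data copy happens here - hence "copy on write".
        if self._shared[1] > 1:
            self._shared[1] -= 1
            self._shared = [list(self._shared[0]), 1]
        self._shared[0][i] = value


a = CowBuffer([1, 2, 3])
b = a.copy()               # cheap: no data copied yet
b.set(0, 99)               # b detaches and copies here
print(a.get(0), b.get(0))  # -> 1 99
```

Note the space/speed trade-off the comment above alludes to: copies are free until the first mutation, at which point that one write pays the full copy cost plus the refcount check on every subsequent write.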
Meant, "Swift is Apple's attempt at preventing future generations of iOS developers from moving away to any other platforms by introducing a 'syntactically' different language to C."
"...move away TO any other platforms..."
Forgot to mention why Objective-C and not Swift. Swift is Apple's attempt at forcing future generations to move away from any other platforms by introducing a 'syntactically' different language to C. Both Java (Android's primary language) and Objective-C have basic syntactic (and to some extent, lexical) similarities, which allow a developer on one platform to transition easily to another. Yes, Swift provides all sorts of functionality that makes it easier to develop software, but Apple could have chosen to build that into Objective-C.
So, will you go with a language that empowers Apple and cripples your options (IMO), or one that gives you the widest array of options?
That thread was started by an 'experienced C' developer. RegularDave is a regular guy without any real coding skills.
Since Xcode is a sandboxed software development environment (meaning you don't have to know anything outside of Xcode, except how to publish apps, in order to develop them), all you need to learn is how the Xcode environment works and the ins and outs of Objective-C. Once you learn Interface Builder, the basics of programming in Objective-C (variables/data structures, flow control, math/logic, and selections) and the essential classes that get things done in an iOS environment, you can start coding apps.
Fact is, it won't be an overnight affair. Learning the basics of coding in Objective-C will take you a few days. Try Hillegass & Fenoglio's Objective-C Programming book.
One question though. Since you're starting from a clean slate, why aren't you considering developing for Android? It's a much bigger marketplace, which means you'll have a bigger audience for whatever you develop.
Perhaps to the Maldives, if y'all want to know what propaganda really means.
That's what Balotelli said...
I really don't think it should be too hard to find talented young people who can become security experts with the right push. And it shouldn't take an army of people to provide that push, given all the cheap means of information propagation that the Internet has afforded us.
As for the cost of security systems, how expensive would it be to set up six layers of proxies or multiple firewalls using Linux? My mobile phone could probably handle the screening and NAT tasks for an agency with thousands of employees.
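For a rough sense of scale: the NAT-plus-screening job described above fits in a handful of iptables rules on any stock Linux box. A minimal sketch, assuming an uplink on eth0 and an internal 10.0.0.0/16 network behind it (interface names and subnet are assumptions for illustration, not a hardened config):

```shell
# Enable packet forwarding so the box acts as a gateway
sysctl -w net.ipv4.ip_forward=1

# Masquerade outbound traffic from the internal subnet (NAT)
iptables -t nat -A POSTROUTING -s 10.0.0.0/16 -o eth0 -j MASQUERADE

# Screening: allow replies to established connections back in,
# drop everything else arriving unsolicited on the uplink
iptables -A FORWARD -i eth0 -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
iptables -A FORWARD -i eth0 -j DROP
```

The point being that the software side costs nothing; the real expense in any agency-scale deployment is the people who configure, audit, and monitor it.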
I just don't see how these intrusions are so easy to accomplish when lesser systems, managed by organisations with smaller budgets, manage to keep themselves better protected. It's not like we don't already know 99+% of the possible attack vectors, and it's not like the US government doesn't have enough bandwidth to fend off any sort of DoS attack.
Perhaps we'll soon get wind of an appropriation bill floated by the meteorological agencies...