Noooot really sure Americans ought to be lecturing anyone else about the quality of democracy.
In fact, more than virtually any other company in the world, excepting perhaps Microsoft.
> What exactly do they produce? What particular problems do they address? How is mankind's lot significantly improved by the presence of Facebook?
They produce a social network. There are many problems that it addresses. Sure, from your point of view, people post cat and baby photos. Around the world, Facebook is virtually the Internet, because it provides the means to connect with a social graph of people in similar circumstances. The breadth of communities, NGOs and government agencies worldwide that use Facebook groups to communicate with small communities, where there exists no infrastructure to build and publicise traditional web sites, is beyond large. Today I fly out to the rural mountains of Taiwan, where I will be working with the Taiwanese indigenous people to document their own language. They face a great many difficulties, but the various tribes have been able to pull together to share information, organise events and publicise their political struggles, all via Facebook and mobile networks (another great engineering achievement, which presumably you actually rate?)
Also an engineer (one who develops solutions for people outside of the rich white-anglosphere).
There are heaps of them on eBay. Just get a NodeMCU 'dev kit'. There are a couple of vendors, with nothing between them really. They cost about $5-8 from China. Then use NodeMCU and not this silly BASIC thing.
Since it hasn't been mentioned here: the ESP8266 is no stranger to interpreted languages. The NodeMCU firmware offers a Lua interpreter. It's been around longer than this BASIC project and is now fairly robust. I have built a couple of projects with it and been pleasantly surprised, particularly with the support for the u8glib library, which is just outstanding.
There are lots of reasons to like an interpreted language on a device like this. That said, the hardware/library integration and maturity matter far more than exactly which interpreted language it is. I feel a tad nostalgic for BASIC, but I don't really see the utility over the excellent NodeMCU firmware. There's even an online firmware builder that lets you select which features, mostly hardware protocols and the like, to bake in, so you can maximise how much free heap is left: http://nodemcu-build.com/
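To give a flavour of what Lua on the device looks like, here is a minimal blink sketch against the NodeMCU API. This is a sketch under assumptions: pin 4 (NodeMCU's index for GPIO2, the on-board LED on many ESP-12 modules) and the `tmr.alarm` calling convention vary between firmware versions, so check your board and build before trusting it.

```lua
-- Toggle the on-board LED every 500 ms using NodeMCU's
-- gpio and tmr modules. Pin 4 = GPIO2 on common ESP-12 boards.
local pin = 4
gpio.mode(pin, gpio.OUTPUT)

local state = gpio.LOW
tmr.alarm(0, 500, tmr.ALARM_AUTO, function()
  -- flip the pin state on each timer tick
  state = (state == gpio.LOW) and gpio.HIGH or gpio.LOW
  gpio.write(pin, state)
end)
```

You can paste this straight into the interactive Lua prompt over serial, which is exactly the appeal of an interpreted firmware: no compile/flash cycle just to wiggle a pin.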
Interesting. Are you using the software USB serial library (CDC or something?). Uses up half of the flash and most of the RAM but it does work. It's just really really tight. Presumably you're using the arduino library for PS2 controllers? Looks like it's bit-banging, might well expect the AVR to run at a particular speed. Might want to look at that?
It's difficult to overstate the importance of the ESP8266. It was marred a bit early on by the $5 modules that only had a serial interface and a firmware that spoke AT commands. That, paradoxically, is how these stupid AVR+ESP8266 boards still work. They're worthless: the AT/serial firmware is buggy and unreliable and a complete waste of time.
The thing that made the ESP8266 lift off is the breakout boards with USB serial onboard allowing direct programming of the device. For about $10 you can get one of these from China and then flash the NodeMCU firmware and write Lua on the thing. NodeMCU isn't universally loved, largely because it kind of sucked until more recent versions, but just imagine a scripting language with low-level hardware GPIO, I2C, SPI and of course WiFi. Proper IoT shit for $10 and they draw about 30mA when running flat out and sleep at much less. It's amazing.
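As a sketch of the "scripting language with GPIO and WiFi" point above: the classic NodeMCU demo joins an access point and answers TCP connections in a dozen lines. The SSID and password are placeholders, and note the `wifi.sta.config` signature differs between firmware versions (older builds take two strings, newer ones take a table), so treat this as illustrative rather than copy-paste ready.

```lua
-- Join a WiFi network as a station, then serve a plain-text
-- response to anything that connects on port 80.
wifi.setmode(wifi.STATION)
wifi.sta.config("my-ssid", "my-password")  -- placeholders

local srv = net.createServer(net.TCP)
srv:listen(80, function(conn)
  conn:on("receive", function(sock, payload)
    sock:send("HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\n" ..
              "hello from ESP8266\n")
  end)
  conn:on("sent", function(sock) sock:close() end)
end)
```

That really is the whole program: event callbacks instead of a main loop, which suits a single-core chip juggling a WiFi stack rather well.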
If Lua doesn't cut it, there's a really nice port of Arduino, so you can upload C code directly to the board with shitloads (but not all) of the Arduino support libraries you might want. For me, there's really only one use case left for 8-bit AVRs: the ATtiny85s, which run about $3 on a board with USB (Digisparks).
This really is a load of crap. Extract a bunch of fairly obvious stratagems from a received text, an English translation of generally dubious worth, and apply them to cyber warfare... unsurprisingly, it fails to stack up particularly well. Sunzi was almost exclusively fixated on the ideas that armies were controlled by single entities and that virtually all actions undertaken by them had a cost, and thus could be factored into a set of trade-offs: expert application of game theory before game theory was a thing. It was insightful at the time, to say the least, and it can still be useful for stating the more obvious stratagems of any conflict, but to claim relevance today, where the agents exist in dramatically different contexts, is weak sauce indeed. If the Sunzi Bingfa (Art of War) was indeed written by one person, he would be horrified that any engagement could essentially exist in perpetuity, horrified by the layout of modern cyber warfare, and certainly quite unable to add anything to the idea that one may have to defend against any number of actors, each potentially using different strategies at virtually no cost.
I would have expected this to be already extensively studied. C'mon, humanities, there must already be some linguistic research on this?
Holy cow, Batman. I guess I wouldn't expect Slashdot to be up on anything to do with the filthy humanities, but this is really quite something. There is a vast amount of research on this. The general idea is called linguistic relativism and has been a hotly debated topic since Whorf first started pondering the issues in the 1930s. http://en.wikipedia.org/wiki/L...
It's easily tested, and it has often been demonstrated, that speakers of languages with certain obligatory features, like say tense and plural in English, will be more observant about those facts than speakers of languages where such features are optional. Then there's a vast array of work on perception, particularly colours, where languages vary in how many colour names there are, and hence in the distinctions that need to be made when observing colours. I was always rather more partial to Dan Slobin's description of 'thinking for speaking', where our cognition shapes what we observe, and which cognitive paradigm we use, based upon the demands of the language we intend to speak in.
The whole Sapir-Whorf hypothesis and linguistic relativity has been flogged to death. The paper in TFA builds on that with an interesting experiment designed to discover something rather more nuanced than suggested by the headline here. They used an interesting experimental technique that induced interference from another language by making participants perform a task using that language. They seem to have demonstrated that this interference does indeed shift how the participants viewed the task, based on the differences between the languages. It's certainly not a surprising finding for those linguists, like me, who hold to a usage-based theory of language (functionalism) grounded in general cognition. It is, however, a great example of the fascinating things you can discover with clever experiment design.
"Because it costs $1500"
This should not have been downvoted. It's an honest and informative post about the reality of publishing with Elsevier.
From what I've heard the Chinese have been using Roman letters to help their students learn their own language for years now, and especially use roman letters to make it easier to enter Chinese text into a computer.
Indeed! Other input systems based on radicals or even handwriting have fallen out of favour compared to pinyin (that's what Mandarin romanisation is called) schemes. That's helpful because English words, and acronyms (often acronyms of Chinese words!), are becoming increasingly popular among young Chinese.
If you want to see pinyin, use Google Translate and hit the phonetic symbol underneath the Chinese characters. It's the A with the two dots above it. The diacritics are the tone marks. Many of the roman letters sound somewhat like they do in English, but several don't, like x, c and q. The x sound is particularly amusing given that the current Chinese leader's surname is Xi. I get a kick out of hearing newsreaders mispronounce it. Childish, I know, but when you've spent this many years learning a language and are often still not understood by native Chinese speakers, it's nice to feel superior once in a while.
Romanization systems don't work because there are too many homophones to worry about.
Bzzt! Aside from anything else, there is a standard romanisation system called Pinyin, and it is perfectly adequate to represent tones. It's used to teach Chinese both to kids and to foreign learners. It appears in dictionaries to tell Chinese people how to pronounce new words (since Chinese orthography only gives you clues to pronunciation, and of course no information about tones). Other tonal languages with larger tonal inventories than Mandarin, such as Vietnamese, have adopted similar schemes as their official orthography. There was even a substantial movement in the PRC to shift towards a roman alphabet at one point. It stemmed from the same political movement that simplified China's orthography from the traditional full-form characters. Most of the arguments about losing information by dumping Chinese characters can also be made about what has already occurred in the shift to simplified.
Even the premise of this argument betrays a fundamental misunderstanding about language. If you jump on a massively multiplayer game, you'll find Chinese players happily chatting away in pinyin without even writing the tones (you can do it in ASCII by using numbers, e.g. ni3 hao3). That's because the act of parsing language is deeply rooted in context: only certain words make sense in a given context or in a given syntactic position.
What most speakers of Western languages don't appreciate is quite how far along the explicitness spectrum European languages sit. An example is the English fetish for specifying a subject, leading to bizarre constructions like "It is raining". Speakers of Chinese are much happier with, and more skilled at, the art of disambiguating not just lexical words but pragmatic intention from utterances whose semantics alone don't convey the full meaning.
The high frequency of homophones is no barrier to a romanisation. I also fail to understand why anyone would think radicals are essential. They're very useful in reducing the task of memorising the character set, particularly since they offer pronunciation and semantic clues that make it easier to remember how to read (and, more importantly, write) various words. They were actually quite a lot better at this in the original full-form orthography, where the full radicals often remain, whereas in China's current simplified orthography much has been reduced to arbitrary squiggles, discarding semantic and pronunciation information in the process.
That's a circular argument though. If a phonemic orthography was used, you wouldn't be relying on clues any more. It would be enough to hear a word to be able to write it down. You cannot currently do that in Chinese except by using pinyin. I do this all the time. I write down the pinyin and then later check in a dictionary for the hanzi.
I'm a little loath to reply to this on the basis that the vast majority of posts from the Slashdot crowd on anything to do with university tend to view education as all about money. I suspect that's a heavy cultural bias from the US... anyway.
As someone who is a 40-something about to finish a degree this year, I have some experience of this, but for me, at least, your question loads the dice. I was earning plenty of money doing what I was doing before; I just didn't like it. I'd be happy to earn a living doing something I love, and that is what, in my experience, most mature students back at university are doing.
Granted that might be a little skewed because useless public services like healthcare and universities cost more in the US than anywhere else in the world, and maybe you do feel some pressure to get a career result to pay back the debt. That said, there are cheap or even free ways to get educated if you're willing to move beyond the top-tier universities.
Finally, I'd add this: it's easy to make the decision to go to university to study something based on some sort of future goal. What universally happens is that by the end of the degree, you have a different idea about what that goal is. It's also quite hard to motivate yourself, do well, and even get much benefit from a degree if you aren't really interested in the subject.
So my advice is this: do a degree in something you're really interested in and when faced with choices, go for the flexible choices. There is every chance that you'll run into some niche off of something you're interested in which will turn out to be a gold mine. It happened to me. I found a field that blended my previous skills with what I was learning and it's the best thing that ever happened to me.
Maybe because I got into it near the end of last millennium, or because I was well enough grounded in fundamentals from assembler days, or because I was often dealing with dirty data, Perl plus the also less than perfect MySQL have enabled me to play in several unconnected spaces.
In 2012 I'd add the disclaimer: as long as you don't take CPAN too seriously. While there are indispensable gems there, way too often a module either doesn't quite do what you need or, in attempting to, drags in ridiculous dependency trees where quality control collapses.
Still waiting for Perl 6.
David Zindell's Neverness and its follow up Requiem for Homo Sapiens trilogy, the first of which The Broken God is my all time favourite.
I've long argued that the main thing propping up the artificially overvalued US dollar is its preferential use by extralegal entities, ever since the normalisation of white-collar "work" drove most of the American economy out of inherently tradeable production and into devices that must be propped up by legal fictions to acquire monetary value.
Backed up the system lately?