The article is on Hackaday: http://hackaday.com/2014/10/22...
Kids really don't need to learn to "code". Only trained monkeys working for a few bucks an hour "code". Of course, the Facebooks and Microsofts need such people too, but that really isn't what we should be teaching kids.
Have them learn mathematics, abstract and analytical thinking, let them do actual science and experiments, let them tinker (and fail!), expose them to computers and computer science too. That is much more important.
Whether little Johnny or Susan can write a program for adding up a few numbers or make a web page when they can barely read and write doesn't matter - perhaps they will become excellent physicists or chemists instead. Or perhaps get a Nobel for curing cancer, who knows. We will need all kinds of engineers and scientists, not only cubicle monkeys slaving for the Microsofts of the future. Schools shouldn't serve only one industry - if the kids are prepared and interested, they will go into computer science themselves, without having to be "spoon-fed" it.
I simply wonder why these behemoths of companies sitting on so much cash don't run their own re-qualification/education programs. That would be a win-win situation for everyone. And it is not some silly commie invention - Tomas Bata (the shoe tycoon from before WWII) was doing exactly that: taking kids from the street and offering them an education - and gaining qualified and loyal workers in the process. Of course, it is cheaper to whine about the lack of visas for foreign labour and poor school systems and demand that someone else solve your problems.
Unfortunately, the whole thing is again more BS.
arXiv is *not* a peer-reviewed publication - anyone can submit anything there. So having a paper on arXiv doesn't mean that it is any good.
Sorry, even $2300 isn't enough for a device of this complexity. And anyway, they don't have that money - the backers paid only $500, so they have to fund the work from that, not from the $2300 that they may hope to get at the end.
Also check out how much commercially produced camera components cost (including all economy-of-scale discounts!): http://www.red.com/store/camer... Believe me, there isn't a 100-200% margin in there.
And the team lacking any engineers or anyone with verifiable experience in building projects of similar size?
Oh, and check out their team - "new media artists", "filmmaker", "3D artist", "software developer".
This looks very much like CLANG (https://www.kickstarter.com/projects/260688528/clang) 2.0
The problem is that this device will never get built. $100k is a ridiculously low budget for the production of a device of this complexity. To get an idea of what is involved for a much simpler device with the same budget (a silly 3D printer): https://www.kickstarter.com/pr... Basically, those guys also asked for $100k, got it, spent a year on it - and went bust. At least they had the balls to admit it and are going to refund the backers. Going to an assembly house with less than a million in budget? Forget it, they won't even speak to you.
That leaves assembling these cameras in a garage, by hand. Which means soldering those nasty BGAs by hand - goodbye to any reasonable yield, not to mention that those chips aren't exactly cheap.
Which leads to the second point - I have serious doubts about their BOM costs. If they are planning to sell the camera for $500, with the FPGA/SoC alone costing about $100, that can't work out. The 4k camera sensor is likely in a similar range, probably more - a 300fps 4k sensor? Those things cost hundreds of dollars for the bare sensor alone.
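To make the arithmetic concrete, here is a back-of-the-envelope BOM sanity check. All the part prices below are my own illustrative guesses (the project publishes no BOM), and the 3x-BOM retail rule of thumb is a common hardware-industry heuristic, not their actual cost model:

```python
# Hypothetical BOM sketch - every figure here is an assumption for
# illustration, not the project's real costs.
bom = {
    "4k_sensor": 300,        # high-frame-rate 4k sensors: hundreds of dollars
    "fpga_soc": 100,         # the ~$100 FPGA/SoC mentioned above
    "pcb_and_assembly": 60,
    "optics_and_mount": 50,
    "enclosure_connectors": 40,
}
parts_total = sum(bom.values())

retail_price = 500
# Rule of thumb: hardware needs to retail at roughly 3-4x BOM to cover
# assembly, yield loss, support, distribution and margin.
required_price = parts_total * 3

print(f"parts total: ${parts_total}")
print(f"needed retail at 3x BOM: ${required_price} vs planned ${retail_price}")
```

Even with these charitable numbers, the parts alone eat the entire $500, before a single hour of assembly or support is paid for.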
In short, unless they have an order of magnitude more in external funding as well, this isn't happening. Period. They may have a prototype which perhaps works (who knows - the videos could be fake, all pictures are labeled "concept drawings/renderings", the testimonials about open source are irrelevant, etc.), but they have no idea how much the manufacturing is going to cost. And I doubt that this is going to be a charitable undertaking with the team paying for it out of their own pocket.
That's because it is based on the gambler's fallacy - the idea that past outcomes of something somehow determine the future ones. The same voodoo is used for things like stock price prediction (look up "technical analysis"). It is mathematically provable bullshit, but that doesn't mean people are going to stop using it.
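You can demonstrate the fallacy in a few lines: simulate a fair coin and check what happens right after a streak of five tails. If past outcomes mattered, heads should be "due"; they aren't. (A toy simulation of independent flips, nothing to do with any specific betting or trading system.)

```python
import random

# For a fair coin, the flip after five tails in a row is still 50/50.
random.seed(1)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_five_tails = []
tail_run = 0
for heads in flips:
    if tail_run >= 5:                 # previous five flips were all tails
        after_five_tails.append(heads)
    tail_run = 0 if heads else tail_run + 1

frac_heads = sum(after_five_tails) / len(after_five_tails)
print(f"P(heads | five tails just happened) is about {frac_heads:.3f}")
```

The fraction comes out at roughly 0.5, streak or no streak - which is exactly what independence of the flips means.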
Honestly, the minimum required configuration is there more to appease the marketing department and industrial partners than to provide any practically useful information.
Anyone who has attempted to use Windows Vista/7/8.x on anything with less than 4GB of RAM knows how painful it is. It might run in 1GB, but there is nothing left for any applications. Even 4GB is barely enough for some basic work. For any serious use one needs at least 8GB and a modern CPU - likely an i3 or i5 at least.
The other reason is likely pressure from Intel, because they want to keep selling their Atom CPUs - which are both slow (as far as clock speed is concerned) and, for the most part, 32-bit only due to various issues (some CPUs not supporting 64 bits, mobos/BIOS/drivers not working or not available for 64 bits, etc.). The moment Windows went 64-bit only, Atom would be dead. It is the same story as downgrading the requirements for Vista back in the day so that it could run on machines with integrated Intel graphics. It was practically unusable, but it allowed Intel to claim compatibility.
That isn't actually true. You *will* get sick even with positional tracking, as many people found out when the DK2 Rift was released. Just look at this thread, for example:
https://www.reddit.com/r/oculu... Positional tracking enhances immersion and potentially presence, but it is not really a fix for motion sickness. Unfortunately many people don't understand this.
The problem is deeper - you are correct that the sensory mismatch between what you see and what your sense of balance (inner ear) and proprioception (nerve endings in your muscles relaying the position of your limbs) are telling you is what causes the problem. However, that is not really tied to positional tracking. It is fairly easy to demonstrate - many people get sick even with full 6DOF tracking using a very expensive tracking system, walking around in a CAVE, not using an HMD at all (and CAVEs are usually far less motion-sickness-inducing than HMDs).
Most of the nausea problems are caused by poor application design - sudden accelerations are bad, because you don't expect them (it is akin to someone pulling the rug from under you!), motions not initiated by the user are bad (again, unexpected movement!), inappropriate navigation schemes - strafing, head bobbing, "aiming with your head" (not being able to look and change direction of movement independently - as in all FPS games that use mouselook), etc. All these things cause motion sickness. No amount of tracking wizardry is going to help you there unless the design of the application is fixed - and these problems are unfortunately in almost every single demo that was released for the Rift so far, despite there being 30+ years of published research on VR available.
Then there are problems that are often ascribed to motion sickness but are not really - headaches, dizziness, eye strain. Those are often caused by a poorly adjusted HMD. This is where the Rift suffers a lot, because unless you have perfect vision and your eyes are spaced exactly the same as the Rift lenses, you will get eye strain and headache after a while due to a blurry, out-of-focus image. This is why commercial HMDs have both dioptric adjustment (the two pairs of replaceable lenses really aren't a solution) and interpupillary distance adjustment (the lenses or even the displays themselves can be moved closer together or farther apart). Another issue with Rift-like HMDs is scenes where textures and jaggy, non-antialiased lines cause visible "beating" (moire) against the raster of the relatively low-res display, provoking a lot of visual discomfort - this was really bad in the DK1; the DK2 reduced it a bit thanks to the higher resolution and pentile display. That's why dark scenes work best with the Rift: the pixel raster is not that visible in the dark.
If it is a SmartBoard, these things are usually connected to a Windows PC that runs the software and feeds the image to the projector. Most people run PowerPoint slides on them.
Otherwise it is a resistive touch sensor plus a projector, with a few sensors (RFID or even magnets) for the markers. Newer boards use a camera instead of the resistive layer.
Well, that's another issue. Unfortunately, most teachers don't really know how to teach and keep the students engaged. Putting the same crap they perform every day in the classroom on video doesn't really help anything. Very often it is not even their fault - they weren't actually shown how to teach in the first place!
That may sound surprising, but university teachers rarely get any pedagogical education or training - if you have a degree, you are simply assumed to somehow know how to teach. So you do what you have seen your teachers do. And it sucks - perhaps your teachers sucked already, and even if they didn't, you are certainly not them, only parroting what you think are their methods. Contrast this with highschool/elementary school teachers, for whom pedagogical training is a mandatory part of the qualification (at least in most of Europe).
I was lucky to have been offered such training and it helped me a lot - intuition and flying by the seat of your pants can get you only so far. It isn't fair to the students either. However, we were pretty much the exception, not the rule - most of my colleagues never had that training and some didn't even consider it useful ("I have been teaching for 20 years, so I know how to teach. Waste of time!"). Guess who got the most complaints. And some of these were the most ardent proponents of video lecturing and MOOCs, thinking it would free them from teaching.
On the subject of these e-learning and MOOC systems - I think these are more a fad to sell software to universities and training institutions than anything actually useful. There is a lack of hard data and statistics showing that they are actually effective. Unfortunately, as is often the case, the concept was designed by a businessman or a programmer somewhere, not an actual teacher. Teachers are usually the last ones to be asked - the system gets bought and installed, and then the university powers tell you that your classes will be videotaped and put into it. Gee, thanks. Even a lecture over a video conferencing system requires special preparation; a fully non-interactive class must be organized and run completely differently from a normal one if it is to have at least some chance of working. Right now it is more a money grab by the vendors than anything actually useful, apart from making the content accessible to more people.
There are a few fundamental issues here and people from both sides of the classroom tend to ignore them. I have some education as a teacher and did actually teach undergraduate and graduate classes at a Uni.
Students are surprised that these courses are often demanding, that there is homework, etc. Hello, these are university level courses, what did you expect? This ain't vacation or World of Warcraft, only with a free diploma at the end.
Teachers are surprised that their classroom-oriented methods don't work when put online. Surprise: recording a lecture on video, slapping it online and expecting the students not to get bored by the droning and give up is silly. Especially when the various extrinsic motivations that keep students staying put in the auditoriums (like having paid expensive tuition, or actually being able to obtain a proper, full degree) are missing. Lectures are boring as hell even in person; they are probably the worst way to teach/learn. Recording the lecture, removing the personal contact and slapping the thing online only makes it worse. No fancy "e-learning" platform can fix that fundamentally broken model.
Unfortunately, many unis see the "e-learning", online courses and what not as a great way to save money - no need to pay for so many classes, so many teachers, teachers can spend time doing research instead of teaching, etc. Win-win, right? Wrong!
The technology alone won't make the students learn - the role of the teacher as a facilitator and guide to learning is indispensable. Give students Minecraft (or a tablet or some other technical gimmick) and they will spend 99% of the time fooling around because of the distractions. They need someone to actually show them the relevant bits, explain what is not clear and guide them through the classwork - that is what the teacher is for. Non-interactive video cannot really replace that. While the classic lecture is also horrible from this point of view, the drone at the blackboard can be at least interrupted and asked extra questions. With video this is difficult or outright impossible.
Another crucially important thing for both the student and the teacher is feedback - "Am I doing OK?" "What needs to be improved?" "How to improve it?" If the only "feedback" for the student are automatically marked quizzes or the final mark/score for the course/module, as is often the rule, that really doesn't help them at all - they have perhaps failed the course or received a poor mark already. They need the (formative) feedback while still working!
Feedback for the lecturer is important too - very often the students don't get anything from the class because the lecturer mumbles incomprehensibly, is disorganized or overloads the students. However, the typical way to collect feedback is a satisfaction questionnaire at the end of the term/module - way too late to fix anything. Now add yet another layer of insulation between the lecturer and the students - the non-interactive videos - and the realistic amount of feedback both sides can expect becomes exactly zero.
During my teaching I tried to get away from lecturing as much as I could - which can be surprisingly difficult, with the university administration explicitly expecting you to lecture. Where I could, the classes were focused on discussion, group work and projects. I even turned some classes completely inside out - had the students read the material from the textbook and do the exercises at home, and then the class was spent explaining whatever wasn't clear or needed more guidance. There is little point in spending hours of class time lecturing on stuff that the students can read faster and more comfortably in a book. It worked, for the most part - even though the classes I was teaching were "hard" stuff, like programming, basics of computer graphics, introduction to artificial intelligence, image processing. However, try doing this with an e-learning system that is explicitly structured around lecturing!
I find these online course systems a nice way to brush up on some topics, but not really much more. As it is, they require an extremely strong will and commitment from the student, for little gain. And they don't really help the teachers much either - someone still needs to record those lectures, give out assignments (which cannot be reused from term to term - students are not stupid and will "re-use" the solutions too!), mark all those things, etc. All that on top of their normal teaching load.
If someone puts together R, Haskell, Cobol and Fortran and declares them unpopular my bullshit detector goes off-scale. That person obviously has no clue.
I don't really get the point of this type of article. A good programmer must learn to adapt; anyone who thinks they will learn *THE LANGUAGE* and then live off it until retirement is either delusional or extremely stupid. Learn the underpinnings of the field instead - logic, theory of computation, language theory, data structures and algorithms, structured/object-oriented, functional and declarative programming (to at least know that there are other approaches than the usual imperative code!). Those things are going to be way more useful for any programmer than learning one or two particular languages. Picking up a new language (not becoming an expert!) is something you can typically do in a few days or weeks when you actually need it, provided you know the basics and have some programming experience under your belt already.
While I agree with your statement about the removal of the video, the part about antisemitism in France is BS.
The recent uptick of antisemitism in France has nothing whatsoever to do with the ban on the sale of Nazi memorabilia (which is, btw, banned in Germany and many other countries as well), but with the war in Gaza. The people who attacked Jewish stores and places of worship in the recent riots were mostly young Arabs (and there are plenty of them here in France due to past French involvement in North Africa, Lebanon, etc.) and various militant pro-Palestine groups.
I suggest that you practice your own advice - if you are not exposed to it (or too ignorant to actually know when to check the facts), shut the hell up.