
Comment: Re:Fundamental issues (Score 1) 182

by janoc (#47897937) Attached to: The MOOC Revolution That Wasn't

Well, that's another issue. Unfortunately, most teachers don't really know how to teach and keep students engaged. Putting the same crap they perform every day in the classroom on video doesn't really help anything. Very often it is not even their fault - they were never actually shown how to teach in the first place!

That may sound surprising, but university teachers rarely get any pedagogical education or training - mostly, if you have a degree, you are assumed to somehow know how to teach. So you do what you have seen your own teachers do. And it sucks - perhaps your teachers sucked already, and even if they didn't, you are certainly not them, only parroting what you think their methods were. Contrast this with high-school and elementary-school teachers, for whom pedagogical training is a mandatory part of the qualification (at least in most of Europe).

I was lucky to have been offered such training, and it helped me a lot - intuition and flying by the seat of your pants can get you only so far. It isn't fair to the students either. However, we were very much the exception, not the rule - most of my colleagues never had that training, and some didn't even consider it useful ("I've been teaching for 20 years, so I know how to teach. Waste of time!"). Guess who got the most complaints. And some of these people were the most ardent proponents of video lecturing and MOOCs, thinking it would free them from teaching.

On the subject of these e-learning and MOOC systems - I think they are more a fad to sell software to universities and training institutions than anything actually useful. There is a lack of hard data and statistics showing they are actually effective. Unfortunately, as is often the case, the concept was designed by a businessman or a programmer somewhere, not an actual teacher. Teachers are usually the last to be asked - the system gets bought and installed, and then the university powers tell you that your classes will be videotaped and put into it. Gee, thanks. Even a lecture over a video-conference system requires special preparation, and a fully non-interactive class must be organized and delivered completely differently from a normal one if it is to have at least some chance of working. Right now it is more a money grab by the vendors than anything actually useful, apart from making the content accessible to more people.

Comment: Fundamental issues (Score 3, Interesting) 182

by janoc (#47897039) Attached to: The MOOC Revolution That Wasn't

There are a few fundamental issues here and people from both sides of the classroom tend to ignore them. I have some education as a teacher and did actually teach undergraduate and graduate classes at a Uni.

Students are surprised that these courses are often demanding, that there is homework, etc. Hello, these are university-level courses - what did you expect? This ain't vacation or World of Warcraft, only with a free diploma at the end.

Teachers are surprised that their classroom-oriented methods don't work when put online. Surprise: recording a lecture on video, slapping it online and expecting the students not to get bored by the droning and give up is silly. Especially when the extrinsic motivation that keeps students in their seats in the auditorium (like having paid expensive tuition, or actually being able to obtain a proper, full degree) is missing. Lectures are boring as hell even in person - it is probably the worst way to teach or learn. Recording the lecture, removing the personal contact and slapping the thing online only makes it worse. No fancy "e-learning" platform can fix that fundamentally broken model.

Unfortunately, many unis see "e-learning", online courses and whatnot as a great way to save money - no need to pay for so many classes or so many teachers, teachers can spend their time doing research instead of teaching, etc. Win-win, right? Wrong!

The technology alone won't make the students learn - the role of the teacher as a facilitator and guide is indispensable. Give students Minecraft (or a tablet or some other technical gimmick) and they will spend 99% of the time fooling around because of the distractions. They need someone to actually show them the relevant bits, explain what is not clear and guide them through the classwork - that is what the teacher is for. Non-interactive video cannot really replace that. While the classic lecture is also horrible from this point of view, the drone at the blackboard can at least be interrupted and asked extra questions. With video this is difficult or outright impossible.

Another crucially important thing for both the student and the teacher is feedback - "Am I doing OK?" "What needs to be improved?" "How do I improve it?" If the only "feedback" a student gets is automatically marked quizzes or the final mark for the course, as is often the rule, that really doesn't help them at all - by then they have already failed the course or received a poor mark. They need (formative) feedback while they are still working!

Feedback for the lecturer is just as important - very often the students get nothing from a class because the lecturer mumbles incomprehensibly, is disorganized or overloads them. However, the typical way to collect feedback is a satisfaction questionnaire at the end of the term or module - way too late to fix anything. Now add yet another layer of insulation between the lecturer and the students - the non-interactive videos - and the realistic amount of feedback either side can expect becomes exactly zero ...

In my own teaching I tried to get away from lecturing as much as I could - which can be surprisingly difficult when the university administration explicitly expects you to lecture. Where I could, the classes were focused on discussion, group work and projects. I even turned classes completely inside-out: the students read the material from the textbook and did the exercises at home, and class time was spent explaining whatever wasn't clear or needed more guidance. There is little point in spending hours of class time lecturing on material the students can read faster and more comfortably in a book. It worked, for the most part - even though the classes I was teaching were "hard" stuff like programming, basics of computer graphics, introduction to artificial intelligence, and image processing. Now try doing that with an e-learning system that is explicitly structured around lecturing!

I find these online course systems a nice way to brush up on some topics, but not really much more. As it stands, they require an extremely strong will and commitment from the student, for little gain. And they don't really help the teachers much either - someone still needs to record those lectures, give out assignments (which cannot be reused from term to term - students are not stupid and will "re-use" the solutions too!), mark all those things, etc. All that on top of their normal teaching load.

J.

Comment: Poorly worded mess ... (Score 1) 385

by janoc (#47863345) Attached to: Unpopular Programming Languages That Are Still Lucrative

If someone puts R, Haskell, Cobol and Fortran together and declares them unpopular, my bullshit detector goes off the scale. That person obviously has no clue.

What exactly makes a language "unpopular"? That it isn't used to build whiz-bang websites or smartphone apps? Do they realize that languages like Cobol, Fortran and R are fairly specialized tools (data processing, math, statistics) and will therefore always live inside their little (or big) niche? Comparing them with something like C#, JavaScript, PHP or Swift is absurd - it isn't even an apples-to-oranges comparison. It is like declaring Matlab "unpopular" because there are no apps in Apple's App Store written in it. About as relevant as yesterday's news :(

I don't really get the point of this type of article. A good programmer must learn to adapt; anyone who thinks they will learn *THE LANGUAGE* and then live off it until retirement is either delusional or extremely stupid. Learn the underpinnings of the field instead - logic, theory of computation, language theory, data structures and algorithms, structured/object-oriented, functional and declarative programming (at least to know that there are other approaches than the usual imperative code!). Those things will be far more useful to any programmer than learning one or two particular languages. Picking up a new language (not becoming an expert!) is something you typically do in a few days or weeks when you actually need it - easy, if you know the basics and have some programming experience under your belt.

Comment: Re:I forced myself to watch it (Score 2) 300

by janoc (#47748519) Attached to: Put A Red Cross PSA In Front Of the ISIS Beheading Video

While I agree with your statement about removal of the video, the part on antisemitism in France is BS.

The recent uptick of antisemitism in France has nothing whatsoever to do with the ban on the sale of Nazi memorabilia (which is, btw, banned in Germany and many other countries as well), but with the war in Gaza. The people who attacked Jewish stores and places of worship in the recent riots were mostly young Arabs (and there are plenty of them here in France due to past French involvement in North Africa, Lebanon, etc.) and various militant pro-Palestine groups.

I suggest you practice your own advice - if you are not exposed to it (or are too ignorant to know when to check the facts), shut the hell up.

Comment: Re:False Premise (Score 2) 116

by janoc (#47642997) Attached to: Wiring Programmers To Prevent Buggy Code

Mod parent up, please, this is spot on. You do this sort of "research" when you need to justify that the expensive toys you bought are actually used for something.

When I saw the list of sensors they are sticking on the user, it was clear this has nothing to do with anything even remotely practical (have you seen a typical EEG sensor cap or eye tracker?). All the researchers are doing is running test subjects through a battery of experiments and classifying a few measured values based on some correlations - in an artificial setting.

This completely ignores the complexity of the problem - the biggest issues being constant interruptions from managers and colleagues, distractions in a noisy cubicle, bad specs, poor or inadequate tools, and many others. What they are proposing is basically Clippy on steroids with a ton of expensive sensors. Such papers are a dime a dozen (google "assistive agents", for example); I'm not sure why exactly this one got picked out as somehow interesting.

Comment: Re:Completely ignores bad specs... (Score 3, Informative) 116

by janoc (#47642985) Attached to: Wiring Programmers To Prevent Buggy Code

Mod parent up, please, this is spot on. You do this sort of "research" when you need to justify that the expensive toys you bought are actually used for something.

When I saw the list of sensors they are sticking on the user, it was clear this has nothing to do with anything even remotely practical (have you seen a typical EEG sensor cap or eye tracker?). All the researchers are doing is running test subjects through a battery of experiments and classifying a few measured values based on some correlations - in an artificial setting.

This completely ignores the complexity of the problem - the biggest issues being constant interruptions from managers and colleagues, distractions in a noisy cubicle, bad specs, poor or inadequate tools, etc. What they are proposing is basically Clippy on steroids with a ton of expensive sensors. Such papers are a dime a dozen (google "assistive agents", for example); I'm not sure why exactly this one got picked out as somehow interesting.

Comment: Re:ftdi, Atmel are VERY common in devices. I did i (Score 1) 205

by janoc (#47579883) Attached to: "BadUSB" Exploit Makes Devices Turn "Evil"

Nope. While these chips are common, both are way too expensive for mass-produced hardware. Practically every microcontroller has a version with a USB interface today, and most mass-produced gear doesn't use these bridges - an FTDI bridge runs around $1 a pop at quantity, which is crazy for a $20-40 end-user item.

Anyhow, FTDI chips cannot be reprogrammed - you can modify their settings, but they are only a UART/I2C/SPI-to-USB bridge; they don't do anything by themselves. And the fact that something uses e.g. an Atmel AVR chip (actually quite rare - they are expensive for the capabilities they offer) doesn't mean that the programming pins are *actually hooked up* to something that is USB-accessible. Some may have the DFU bootloader, but typically they would have the firmware locked. You are far more likely to find various ARM micros and cheap Chinese clones of the MCS-51 series these days, but again, the fact that the chip is programmable doesn't mean it can be reprogrammed by the host system!
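To make the "reprogrammable over USB" distinction concrete: a host can only use the standard DFU path if the device actually enumerates a DFU interface. A minimal sketch (the helper name is mine, purely illustrative), assuming the standard USB DFU 1.1 class codes:

```python
# Hypothetical helper (my naming, not from the comment above): decide from
# the class/subclass bytes of a USB interface descriptor whether the device
# advertises a standard DFU (Device Firmware Upgrade) interface at all.
# Per the USB DFU 1.1 spec: bInterfaceClass 0xFE (Application Specific),
# bInterfaceSubClass 0x01 (DFU).
USB_CLASS_APP_SPECIFIC = 0xFE
USB_SUBCLASS_DFU = 0x01

def advertises_dfu(interface_class: int, interface_subclass: int) -> bool:
    """True only if the interface descriptor claims DFU capability."""
    return (interface_class == USB_CLASS_APP_SPECIFIC
            and interface_subclass == USB_SUBCLASS_DFU)

# A plain HID keyboard (class 0x03) exposes no standard update path:
print(advertises_dfu(0x03, 0x01))   # False
# An AVR left in Atmel's DFU bootloader would enumerate as:
print(advertises_dfu(0xFE, 0x01))   # True
```

Of course, a vendor can still hide a proprietary update protocol behind ordinary endpoints, so this check only covers the standard DFU route - but a locked, mask-programmed part won't offer either.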

Comment: Re:and this is news why? (Score 4, Insightful) 205

by janoc (#47574883) Attached to: "BadUSB" Exploit Makes Devices Turn "Evil"

I would love to see malware that can reprogram a mask-programmed blob in common throwaway hardware. Or a microcontroller in a webcam that doesn't even have its programming pins (typically some sort of ISP or JTAG) connected to anything USB-accessible - or not connected at all, at best to some test pads.

A typical USB stick or webcam doesn't have the hardware to permit firmware upgrades, even if the silicon inside is theoretically upgradable. Not to mention that the exploit would have to be written specifically for the target hardware - different processors, memory layouts, USB interfaces, etc. - all of which makes producing generic malware really hard. If you want to see what is involved in something like that, look at the article on hacking HDD controllers:
http://spritesmods.com/?art=hd... And that is a hard drive - hard drives are produced by only a few manufacturers and have relatively standardized interfaces and controllers. Now imagine having to do that sort of reverse engineering on every type of hard drive in common use if you wanted to write a reasonably effective piece of malware (e.g. a data-stealing worm). It is much easier to exploit some Windows bug or use a phishing scam than this.

So yes, this is potentially a threat, but panicking over your USB sticks or webcams going rogue on you is vastly overblown. It could be an issue in a very targeted attack, where the benefit of compromising e.g. the keyboard of a high-value target outweighs the effort required, but not much else. And that assumes the keyboard's firmware can actually be updated! It would probably be simpler to just send an operative in and install a keylogger ...

Oh, and they mention the "BadBIOS" story ... nobody apart from the original, very confused researcher was ever able to confirm that one.

Comment: Load of ignorant crap (Score 5, Insightful) 150

The entire article harps on 3rd-party ad-network libraries stealing personal data and phoning tracking info home. Because these are libraries, and developers reuse open source libraries, it supposedly follows that "open source is no free lunch" and is stealing your data. What a majestic leap in logic!

They conflate open source libraries with ad-network code that steals personal data, basically trying to portray open source code as responsible for it. Never mind that the ad-network code is almost never open source.

Granted, OSS is certainly not bug-free, but the spyware has little to do with it.

What a load of ...

Comment: Not surprising ... (Score 2) 123

by janoc (#47448195) Attached to: Elite Group of Researchers Rule Scientific Publishing

This is "news" only to people who have no clue how research works - and those are usually the ones setting the publication criteria, like "you must publish 2 journal papers per year" for an assistant professor (a fresh post-doc or PhD student), on top of the full teaching load, of course. I was teaching 10 different courses (!) one semester and was still expected to spend half my time doing actual research and to publish those 2 journal papers. Never mind that shepherding a journal paper through the review process and publication alone takes a year or two on average, and you have to actually have something to publish to begin with. Even conference papers can take 6 months to publish, and you must attend the conferences as well (but nobody wants to pay for that!).

The prolific "publishers" are mostly professors who head labs. They are not actually doing any of the work themselves. It is the young PhD students and post-docs slaving away in the lab who write the papers and then put the prof's name on them as a coauthor. It is a very common practice - basically a nod to the prof for paying their salary and letting them graduate. If you have a large lab with 20 PhD students who each write 1-2 papers a year, that alone is 20-40 papers for the prof's CV annually. Then you get invited to contribute to various book chapters (again, the PhD students write those), you get invited lectures and whatnot - and all of that counts as publications.

Young researchers have absolutely no chance to break through in a competition where the number of publications is the criterion. You can have two very good papers, but when you apply for an academic job, you have no chance against a guy with 40+ (never mind that most of them are the same work published under different titles, or not really his work at all). Unfortunately, that often leads to BS publications - making a few minor changes and publishing the same work several times in different venues, publishing obvious, uninteresting "results" in minor, often in-house workshops or conferences, and in the worst cases even outright scientific fraud and misconduct - all for the sake of getting that publication count up. It is only your job and your chance at tenure that is at stake.

I left academia pretty much because of this - with no (or not enough) publications there is no chance to get a permanent position, but there is no chance to get those papers published when all you are doing is teaching, teaching and more teaching (even though I love teaching). And when you are not teaching, you are doing paperwork and justifying your own existence to various clueless bureaucrats every few months so that they don't cut your funding again. That is not exactly a situation in which you can do research.

Comment: More security theater? (Score 3, Insightful) 702

by janoc (#47398571) Attached to: TSA Prohibits Taking Discharged Electronic Devices Onto Planes

I do wonder how this is going to stop anyone from smuggling an explosive on board. It is vastly easier to conceal a nasty payload inside a bulky laptop than inside a battery. And the laptop could even still work - a brick of plastic explosive the size of a disk drive or a secondary battery would be enough to cause a huge problem on board without preventing the laptop from booting up and working.

And that still assumes someone would actually want to bother with this - the guy with the explosive underpants certainly didn't need a working battery ...

Mind boggling stupidity.

Comment: Cyveillance (Score 5, Interesting) 349

by janoc (#47384477) Attached to: Qualcomm Takes Down 100+ GitHub Repositories With DMCA Notice

Oh, that DMCA notice was issued by Cyveillance - the incompetent company Hollywood and the music labels hired to police P2P by string-matching filenames and then carpet-bombing service providers with DMCA requests, even when the content was not infringing at all. I bet they simply crawled GitHub for Qualcomm copyright notices - something that is often left in source code even though the code was relicensed long ago. Unfortunately, their bot is not that smart.
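To illustrate why that kind of bot misfires: matching on a copyright string says nothing about the license the code actually carries. A hypothetical sketch (the regex, function name and sample header are mine, purely illustrative) of the kind of naive matching involved:

```python
import re

# Purely illustrative: the kind of naive matching a takedown bot might do -
# flag any file whose text contains the rights holder's name on a copyright
# line, without ever looking at the license terms.
COPYRIGHT_RE = re.compile(r"copyright\s+(\(c\)\s+)?\d{4}.*qualcomm",
                          re.IGNORECASE)

def naive_dmca_flag(file_text: str) -> bool:
    """Return True if the file 'looks infringing' to a string matcher."""
    return any(COPYRIGHT_RE.search(line) for line in file_text.splitlines())

# A permissively licensed header that the rights holder itself released
# trips the filter just the same:
bsd_header = (
    "/* Copyright (c) 2013, Qualcomm Technologies, Inc.\n"
    " * Redistribution permitted under the BSD 3-clause license. */\n"
)
print(naive_dmca_flag(bsd_header))  # True - a false positive
```

A BSD-licensed file that Qualcomm itself published looks exactly like "infringing" code to such a matcher - which is how you end up carpet-bombing GitHub with bogus takedowns.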

Some references:
https://www.techdirt.com/artic...
http://arstechnica.com/tech-po...

etc.

These bozos are well known, and someone at Qualcomm should get fired for hiring them. This is going to backfire on Qualcomm in a spectacular way, IMO.
