Quitting the Graphics Field Over SIGGRAPH 71
An anonymous reader writes "A professor at Stony Brook University has quit the field of computer graphics. He claims too much importance is given to one particular conference (SIGGRAPH), and that acceptance of papers at this conference carries too much weight in the careers (tenure, grants, etc.) of researchers. Furthermore, he claims the paper reviewing for SIGGRAPH is not fair, and that bright, novel papers are summarily rejected because they are either not from a 'hot' field or because the reviewer does not understand the concept and is not willing to spend time understanding it. He has started a discussion forum which has comments from several big names in the field, including the papers chair of SIGGRAPH 2007."
And? (Score:5, Interesting)
Re:And? (Score:4, Funny)
Re: (Score:2, Informative)
There is a branch of anthropology devoted entirely to basketweaving.
KFG
Re: (Score:2)
Instead I became a dive rescue volunteer in my spare time. We don't weave baskets, but we do sell them to raise money.
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
This guy is complaining about a problem that plagues all of academia. FYI, academia includes a slightly broader range of subjects than just "Artificial Intelligence, Security, Compilers, Theory, Distributed Systems, Formal Methods, Programming Languages, [and] Databases." You know, like physics, biology, geology, English literature, ethnomusicology, and a few others...
Re: (Score:2)
A switch of fields within computer science would be tough enough in itself, but it seems much more feasible than dropping into a discipline where he would start completely from zero. In fact, transferring the CGI people's deep understanding of mind-bogglingly optimized data structures into other fields of computer science could even prove to be a very
Re: (Score:2)
Not really. In other fields as large as computer graphics, there are typically several conferences worldwide. I don't have much experience in the matter, but I'd imagine that having an industry-driven show like SIGGRAPH as your sole venue for academic advancement would be a huge pain in the ass to live with, and I can see how it would easily stifle all kinds of interesting but fringe research.
Re: (Score:1)
Re: (Score:2)
The top conferences of other fields might be the most important thing in their field, but they are still not known to the general public. Most of the people who know of those conferences will also have heard of the smaller ones. With SIGGRAPH things are very different; it's so famous that the number of people who roughly know what SIGGRAPH is
Re: (Score:2)
There are TONS of AI conferences. IJCAI, ECAI, AAAI, then in specialties, SAT, FLOC... and so forth. My understanding is that in graphics, you're in SIGGRAPH or you're not published, and that because of a shortage of conferences, only 1 or 2 papers are good enough for faculty positions. Top positions in AI will command many more than that, and we even have our own journals, even for subfields, such as the Journal of Machine Learn
Re:Crybaby Sally (Score:5, Insightful)
The boy labored hard and was proud to have moved the pile of stone in record time. Surely this would show his usefulness and move him up in the crew hierarchy in time.
The boy went to the foreman and asked what task he should perform next.
"Throw 'em back over the wall," said the foreman.
"What?" yelled the boy. "Why did you have me throw them over the wall in the first place if you were just going to have me throw them back?"
"Well," said the foreman. "You seemed a fine lad to me, and I was proud to be able to offer you something to do in order that you could learn to earn a wage. Perhaps someday I'll actually have something useful for you to do."
"To hell with this," the boy muttered under his breath and wandered off to find something useful he could do right now, whether it earned him a wage or not.
The moral of the story is: Fuck 'em. Fuck 'em all. Sideways.
KFG
Reminds me of a story I read some time ago (Score:2)
A guy was employed in a factory. He'd sit there with a screwdriver, and two cups on a chain would come down from a hole in the ceiling, one of them containing two weirdly shaped pieces of metal and a screw. So he'd take them, fasten them together with the supplied screw, place them in the other cup, and both would go back up to the next floor. Presumably to the next step in the assembly line.
So the guy does his job well for years, and fa
Re: (Score:1)
I have, in fact, brought up this very point in other posts in relation to biofuels (and food is just a common name for animal biofuel). In a fossil-fuel-free world, much of the output of producing biofuels will be consumed to produce and refine the biofuel, consumed by machines and/or human workers.
Conversely, if you buy biofuel today you can rest assured that a considerable amount of fossil fuel went into producing it. That's why the cost is a
Re: (Score:1)
Have you never heard of government patronage? And many IT jobs are no better, with people spending their working lives chasing problems that could be solved in minutes if attacked at the root, rather than at the symptom.
Have you never seen the posts here joking about how the screwed up nature of Windows provides lots of people with a "good" living? What they mean is a good wage, but their lives are essentially being wasted throwing rocks
Academic Review (Score:5, Interesting)
In replying to this comment, I know that I'm going to sound like a bitter grad student; but, for some reason, I feel inclined to burn karma and make this statement:
I sympathize with this professor, and the trouble that he has faced. Although I work in the field of computer security (instead of computer graphics), I have seen many novel and ingenious papers rejected from conferences precisely because they are not from the current 'fad' field. Usually, I require large amounts of caffeine (and alcohol) just to make it through the conferences I attend, because they are filled with uninteresting papers written by hack academics attempting to ride the latest trend.
Perhaps it is this experience that has influenced the way in which I do academic reviews for conferences, when I am called upon to do so. I have no patience for papers that have nothing meaningful to say. Whenever I give an 'accept' rating to a paper, it is because I feel that the authors have something genuinely interesting to say. Whenever I give a 'reject' rating to a paper, I do my best to give as many constructive comments as I can -- I try to point out what insightful or meaningful things the author has done, as well as things that are genuine technical flaws and should be addressed. But the one thing I am never scared to do? I have never backed down from stating in a review, blatantly, that the author's work seems novel and useful, and that some of the details are way over my head and should be subject to further review.
Given all the (meaningless) talk about reforming the academic review process, I often wonder: how much of the problem described by this professor would be solved if more reviewers had the balls to admit that some of the most novel ideas were over their heads?
Re:Academic Review (Score:5, Insightful)
Re: (Score:1)
Both Tesla and Reich are great examples of this.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1, Funny)
Caffeine is a stimulant because it binds to adenosine receptors more readily than adenosine, increasing the levels of dopamine and epinephrine. The latter will increase heart rate and blood glucose levels. It also has other effects, like improving mood through increases in serotonin. In chocolate, drinks that use guarana, and teas theobromine also accompanies caffeine and provides an addit
Re: (Score:3, Interesting)
They may well have admitted that, but it doesn't matter: the problem is that if the reviewers don't understand it, the audience doesn't either. While "this isn't hot" is an invalid reason to reject a paper, "the reviewer didn't understand it after 20 m
Re: (Score:2)
Kuro5hin [kuro5hin.org] is a great example of how peer review can work very well - most of the time. Articles only make it through the voting queue if they get 70 more "accept" votes than "reject" votes. A typical confe
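The queue rule described above fits in a few lines. A hypothetical sketch, not Kuro5hin's actual code; the +70 net-vote threshold comes from the comment, while the symmetric "drop" threshold and all names here are assumptions:

```python
# Sketch of a Kuro5hin-style voting queue (hypothetical; only the
# +70 net-approval threshold is taken from the comment above).

def queue_decision(accept_votes: int, reject_votes: int,
                   threshold: int = 70) -> str:
    """Post an article once net approval clears the threshold;
    drop it once it falls equally far below; otherwise keep voting."""
    net = accept_votes - reject_votes
    if net >= threshold:
        return "post"
    if net <= -threshold:
        return "drop"
    return "pending"

print(queue_decision(210, 130))  # net +80 -> "post"
print(queue_decision(40, 25))    # net +15 -> "pending"
print(queue_decision(10, 90))    # net -80 -> "drop"
```

The point of the design is that a simple net-vote margin aggregates many weak reviews instead of relying on two or three strong ones.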
Re: (Score:2)
They may well have admitted that, but it doesn't matter: the problem is that if the reviewers don't understand it, the audience doesn't either. While "this isn't hot" is an invalid reason to reject a paper, "the reviewer didn't understand it after 20 minutes" is a valid reason for rejection.
Not necessarily. It may be that the paper targets a different sub-specialty that will be well represented in the audience. It's perfectly reasonable to say so and pass it to someone else for review.
It's equally val
Academic Review -- general malaise (Score:5, Insightful)
Academia hinges almost entirely on your research karma, your success at obtaining grants, and the funds you can bring into your department. In computing, it has very little to do with how effectively your work extends understanding in your area, even less to do with using honest scientific methods, and absolutely nothing to do with teaching.
And since your research karma is in the hands of the high priests in the field and has relatively little to do with your own technical abilities, I can fully understand the frustrations of other research academics. It's a dead-man's-shoes area, and not a good field to be in unless you're good at cultivating your profile through social engineering.
Fortunately I left early because of the compelling attraction of fat paycheques in freelance contracting, an order of magnitude better than academic payscales. But even without that, I think the social problems within academia might have made me leave in disgust at some point too.
I don't know anything specific to SIGGRAPH, but that kind of malaise is quite widespread in the academic sector.
PS. The current publication/conference-based approach in peer review needs change. The author of TFA actually gave one possible avenue, arXiv [arxiv.org], which fits in well with today's greater interest in open systems. I support that.
same thing in other fields (Score:2)
The reviewers have their own careers at stake. If you don't support their little club, you're the enemy.
How like the "real world" (Score:2)
I am glad that it isn't just management types that get stuck on faddy subjects. This reminds me of all of the sales meetings, training courses, emails, and presentations that I've had to attend/read/view about the latest "breakthrough" in business philosophy. It seems like every year I have to sit through some presentation about some repackaged, unoriginal bollocks that some arse came up with. Meanwhile, the guy with the real idea is on the phone arranging for some venture capital.
Glad to hear the acade
Known problem. Known solution, but you'll hate it. (Score:5, Interesting)
This has been recognized for years. See "How to Get Your SIGGRAPH Paper Rejected" [siggraph.org], from 1993.
Some years ago, I stopped submitting papers to SIGGRAPH and started filing patents. It's been much more profitable.
Anyway, SIGGRAPH seems to have shrunk. I think the show floor peaked in size around 1997. Today, the Game Developers Conference is where the real technical action is.
SIGGRAPH is mostly a rendering convention now; there's a little animation, a little behavior, and a tiny bit of physics in the papers this year, but other than that, it's rendering and compression. Which are relatively mature technologies.
Exactly. (Score:3, Interesting)
Re:Known problem. Known solution, but you'll hate (Score:5, Insightful)
I don't know if SIGGRAPH has shrunk or not (I wasn't in graphics in '97), but I wouldn't say that the GDC has taken its place. I sympathize with Ashikhmin's frustration at the conference (but not his reaction), having been on the receiving end of a few cryptic SIGGRAPH rejections.
First of all, I don't agree that it's "mostly a rendering convention now". I'd say there were about 20 papers on rendering and compression out of 80 or 90 papers (unofficial page of papers [brown.edu]). I also think that there's lots of "technical action" going on there.
The real problem is that SIGGRAPH hasn't grown with its field. One major conference was fine for the first 20 years or so, but graphics has grown in size and diversity so much in the last 15 years that it's ridiculous that there's still only one "top-shelf" conference. Look at the proceedings for this year's conference; there are papers on rendering, compression, ray-tracing, image processing, vision, data-driven modelling, GPGPU, procedural modelling, HDR, graphics APIs, fluid simulation, photography, mocap, light fields, pcrt, computational geometry, crowd sim, animation, and npr.
EACH of the things being lumped into "GRAPHICS" is enough of a field in its own right that it deserves several journals and conferences of its own.
That's not even the meat of the problem; there ARE conferences for each of these topics, but people generally only submit SIGGRAPH rejects to them! The problem is that everyone wants the prestige that goes with a SIGGRAPH publication, and it's a vicious cycle; there are reviewers who shoot down every paper they feel is a threat to their own work and get away with it, and this forces anyone else who wants to survive there to do the same.
What needs to happen, in my bull-headed opinion, is for all of those people who write good papers that never make it to SIGGRAPH start submitting the first time around to the other conferences - I3D, Pacific Graphics, SCA, IEEE VIS, Eurographics, et cetera. These are all perfectly viable venues that will become as prestigious as people would like, if only people would take them seriously.
I say, let the small-minded dweebs have SIGGRAPH; we shouldn't gauge the quality of our work solely based on SIGGRAPH's rejection policy - even if it were a totally fair process, not every good paper can make it in. Submit your awesome paper to the other conferences, and once these other conferences are packed with impressive work, it'll mean as much as SIGGRAPH.
Just wishful (and a little bitter) thinking.
I don't think "hardware" was the right category for this...
Proofs? (Score:2)
There is one example of unfair editor behavior in the article - surely not enough to condemn the whole conference.
The author of the article doesn't like the preferred treatment of the "hot subjects". But that is quite natural - "hot subjects" are what most people are interested in at the moment. If other researchers/practitioners in the field are not interested in what the author is doing,
Simple Solution (Score:1, Insightful)
amounted to something important and were rejected by the peer review processes (not only by SIGGRAPH but also by other important conferences), to register somewhere, and basically have the person or persons who peer-reviewed the paper and then rejected it noted, and to also carry out an examination of their past rejections and acceptances and attempt to establish a pattern of behavior with regards to them.
If said behavior is deemed unacceptable then t
Re: (Score:2, Interesting)
A good editor will know if the reviewer(s) are fair.
It is a flawed system, but I don't
Re: (Score:2)
I certainly don't agree that it is ok to be more worried about the harm to a reviewer's career from feedback on the fairness of their reviewing, and less worried about the harm to career development for a person whose papers are unreasonably rejected by an unfair reviewer. We are talking about a process that is meant to be peer review; authors and reviewers should be peers. Reviewers curre
A system for double-blind meta review (Score:2)
Such an absolute exclusion seems objectively very unlikely. One's academic reputation ordinarily comprises the totality of all the usual professional academic activities, including reviewed publications, editorships, chairmanships, admin. duties, peer reviewing, scholarly society memberships, prizes, awards, grants, etc. Every selection committee in my experience considers
Re: (Score:2, Insightful)
Salon des Refusés (Score:5, Interesting)
In the 1860s, artists of the nascent realist and impressionist movements submitted works to the selection committee of the Salon de Paris, the official exhibition sponsored by the Académie des beaux-arts, only to be rejected. The resultant complaints of bias led French emperor Napoleon III to allow the rejected works to be displayed in a separate exhibition.
The first Salon des Refusés, in 1863, exhibited artworks rejected for display at the Salon de Paris.
Most were of poor quality, leading to ridicule in the press. However, the exhibition included several important paintings, including Édouard Manet's Le déjeuner sur l'herbe (The Luncheon on the Grass) and James McNeill Whistler's The White Girl. Other artists who showed at the Salon des Refusés included Henri Fantin-Latour, Paul Cézanne, Armand Guillaumin, Johan Jongkind, and Camille Pissarro.
it's not quite that simple... (Score:5, Informative)
If the reviewer doesn't understand the importance of the claims or conclusion of the paper, then that's the author's problem. It's the responsibility of the author to make those clear and accessible to everybody.
If the reviewer doesn't understand the methods of the paper, that's the reviewer's problem. Methods sections need to be detailed, accurate, and take as little room as possible, which makes them intrinsically hard to understand. But that's not a problem because they are meant for reproducing the work, not for understanding it.
Re: (Score:2)
I've had reviewers give me guff while reviewing population genetics papers because the reviewer didn't understand the basics of statistics. That's a field I shouldn't have to explain. If you can't grok a Chi-square test, then you should get out of the field, or at least not review papers.
You have to write for your audience, and assume some level of knowledge. You also often have to deal with word limits, so you can't write as much as you'd want to.
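For what it's worth, the test the parent comment mentions is simple enough to compute by hand. A minimal goodness-of-fit sketch, stdlib only; the genotype counts and the 1:2:1 Mendelian ratio are illustrative assumptions, not data from any paper under discussion:

```python
# Chi-square goodness-of-fit test, computed by hand (illustrative data).
# H0: observed genotype counts follow a 1:2:1 ratio (AA : Aa : aa).

observed = [28, 56, 16]                            # made-up counts
total = sum(observed)
expected = [total * r for r in (0.25, 0.50, 0.25)]  # [25, 50, 25]

# Test statistic: sum of (O - E)^2 / E over all categories.
chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Critical value for df = 2 at alpha = 0.05 is 5.991.
df = len(observed) - 1
reject_null = chi_sq > 5.991

print(f"chi-square = {chi_sq:.3f}, df = {df}, reject H0: {reject_null}")
# prints: chi-square = 4.320, df = 2, reject H0: False
```

Here 4.32 < 5.991, so the counts are consistent with the 1:2:1 ratio at the 5% level; this is exactly the level of statistics the grandparent expects a population-genetics reviewer to grok.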
Re:Write better papers, dammit (Score:5, Interesting)
Bingo. That's exactly the problem. When I first started in grad school (mechanical engineering), I found the papers very difficult to understand, and I thought it was a problem in my knowledge. But then when I had someone else explain it to me, I was like, "uh, couldn't they have just said [simpler version]" and my adviser politely explained how something that looks too easy won't look novel and notable enough to publish.
In a lecture from a math professor (Erdos number 1), I heard exactly the same thing. He said it takes him a long time to review a submission, because he has to say, "er, okay, how did he get from here to here
You really have to wonder what this is supposed to accomplish. Are you less smart because you got more people to understand your idea? (I've always thought that if you can't explain what you did to a reasonably intelligent layman, given enough time, you don't understand it yourself.)
Re: (Score:2)
And these people are, sadly, correct. If others can understand their work, then the Indians can likely do it cheaper. Consequently they get fired and their job outsourced.
The lesson here is that people want and need job security, and will take steps to create it themselves if it won't be provided to them, no matter how much harm it causes to the field or firm they work in.
I'm
So, fine... he should leave. (Score:5, Interesting)
You know, up until a couple of years ago, I worked my entire adult life (about 20 years - or so :-)) in IT. Call it mid-life crisis, whatever. I needed a change. I was disgusted with corporate idiocy, among other career-specific reasons.
I completely changed careers; although I had some studies in my new field (translation), I got another degree to "re-establish" myself, and set out to work for myself. I can honestly say I've never been happier. Is it because I changed careers, or because I now work for myself? I don't know. All I know is I'm a much happier person, and (I'm told) more pleasant to be around.
I'm one of those people that firmly believes that humans are not meant to do just one thing in life.
I'm quite certain he'll find something that gives him more satisfaction, if he hasn't already.
Makes me wonder (Score:2)
No... (Score:2)
No. That's just how you interpreted it based on your own belief system.
Re: (Score:2)
And you know this how ?
Sure, but that doesn't conflict with claiming that we were meant for something specific. For example, a needle is meant for sewing, but it works fine for poking people's eyes out too.
You're making it sound like you thought there was something wrong with that. Which, combined with your previous unproven claims being stated as truth makes you sound like an atheis
Welcome to academia (Score:2, Funny)
Re: (Score:2)
Re: (Score:2)
> (restricted to a maximum of two pages) rejected
> because it was too short - at exactly two pages.
> I kid you not.
How could this surprise anyone?
I don't think anyone would ever claim that the current system was any good at letting the good papers in - but its job is less to identify all the good papers than it is to identify all the bad ones.
It's like the exact opposite of an email spam filter: with email, a few "nigerians" in th
Start a new conference (Score:2, Informative)
Sebastian Thrun and a few others were fed up with the quality of ICRA and IROS, so they started a wholly new conference last year, Robotics: Science and Systems [roboticsconference.org]. It was successful, and IEEE is now even helping to organize future sessions.
Also, this kind of competition works. ICRA was noticeably better this year, as conferences will make c
Complex papers! (Score:2, Interesting)
Computer imagery is a very large and wide-ranging subject, and becau
Hasn't been said yet (Score:3, Funny)