Education

Hacking as Scholarship 67

FatherBusa writes "I am a professor of English who specializes in what is usually called "humanities computing"--a discipline concerned with creating and theorizing about the use of computers in humanities research (the homepage for the Association for Computers in the Humanities has some info). I was recently asked to join a working group charged with the task of establishing a peer review process for scholarly software projects in the humanities and stumbled across the Guidelines for Evaluating Work with Digital Media in the Modern Languages put out by the Modern Language Association (the main professional organization for language and literature studies in North America). Hackers working in humanities departments may want to give it a read. It's an interesting statement that speaks to the (sometimes difficult) process of getting "tools" and other sorts of digital work evaluated as academic scholarship in promotion and tenure processes."
  • by SL33Z3 ( 104748 )
    I wonder how the DMCA will weigh in on this.
  • Bumper sticker in a history prof's office down the hallway from mine:

    "Yes, I majored in humanities. Would you like fries with that?"
  • Read it (have a copy), but I don't see the relevance.

    Still, Humanities Computing sounds like an interesting aspect of the field. It could be very useful in most corporations (like my Engineering company, for example).

    • Re:Read it... (Score:3, Interesting)

      by Coplan ( 13643 )
      Actually, the more I think about it, I guess I do see the relevance. The problem is, while it might be useful to get your hacking and other "tools" (as they are called) evaluated for credit, is it a worthwhile battle? I, for one, tried to get academia to accept my digital work for project credit, and didn't have much luck. I am, however, in Landscape Architecture, which is an old-school field -- perhaps a bit behind the technology of today. Maybe the case is different in other fields (such as humanities) where such digital media might be more acceptable as a form of research?
    • Re:Read it... (Score:2, Insightful)

      by PlanetJIM ( 212710 )

      I'm not a computer scientist, but even if I were I don't think I'd dismiss the computing done in the humanities as fluffy or trivial.

      When I was at Michigan State University I worked with their Humanities Computing unit Matrix [umn.edu]. A lot of the work they did humanities-wise was preservation of spoken and visual texts and making those texts available to scholars digitally. The interest here is obvious for linguists working in oral histories. Some of those tapes have barely been played, beyond what was needed to transcribe them. It's great that they're getting digitized and made available to people online.

      Computing wise, the thing that interested me most in their work is how complicated it is to come up with accurate and helpful metadata to describe the stuff that's getting digitized and cataloged. They work pretty hard to make sure that these texts will be easily searchable and usefully listed for the people that will be using them to write dissertations.
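
      To make the metadata point concrete, here is a hypothetical sketch of Dublin Core-style records for digitized oral-history recordings, with a naive subject search. The field names follow the Dublin Core element set, but the record contents and the helper function are invented for illustration and are not Matrix's actual schema:

```python
# Hypothetical records; field names follow the Dublin Core element set,
# but the contents and structure here are invented for illustration.
records = [
    {
        "title": "Interview with a retired autoworker",
        "date": "1978-05-02",
        "format": "audio/wav",
        "subject": ["labor history", "oral history", "Michigan"],
    },
    {
        "title": "Lecture on Great Lakes shipping",
        "date": "1981-11-19",
        "format": "audio/wav",
        "subject": ["economic history", "transportation"],
    },
]

def find_by_subject(records, term):
    """Naive subject search: return records whose subject list mentions term."""
    term = term.lower()
    return [r for r in records if any(term in s.lower() for s in r["subject"])]
```

      Real catalogs layer controlled vocabularies and authority files on top of this, which is exactly where the hard metadata work the comment describes comes in.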

  • by imperator_mundi ( 527413 ) on Thursday August 15, 2002 @09:12AM (#4076111)
    If you're in the tech things/scientific and don't know who Strauss was, you are automatically banned in the limb of the mind numbed tech freak, while if you studied history of arts and you don't know what a square root is, that's simply normal.


    I remember time spent fixing pc for other students that had to finish their thesis about don't know the role of granary in the middle of dark ages and that trated the "machine" as the root of evil, the stuff that was wasting their precious time.

    They were of course thankful but at their eyes you were just a cleric of the satanic cult of technology that behave in a very gentle manner, and fixing the pc was a sort of dark ritual.

    So if someone in the Humanities starts looking at the technology in another way maybe is the begining of a better world.
    • I'm sorry to say this if English is not your native language, but your post indicates that you could use some lessons on certain humanities subjects, notably basic usage, pluralization, and grammar.
    • That's interesting... because I'm a classics major, sitting next to an english major at the computer lab we both work for. We constantly get questions about basic computer stuff from CIS* majors (such as "how do I use the internet?" and "how do I restart my computer?"). Computers are a hobby for us, not a career. (Which makes summer here fun, because we can play with the unused computers to install different Linux distros, BSD flavors, and even Solaris/x86 on)

      I think any true 'geek' or 'nerd' should be well-rounded.

      *CIS = Computer and Information Science
  • by hugesmile ( 587771 ) on Thursday August 15, 2002 @09:19AM (#4076138)
    Many readers apparently found this to be boring and irrelevant reading. I think they may be missing the point.

    In the "old school", professors would get recognized (and tenure) for their contributions through publications (appearing in critical journals, for example).

    Now though, you can make MAJOR contributions by writing "software" (not just programs, but anything published in a digital medium). Using the old rules, you wouldn't be recognized.

    The article referenced implies that such digital contributions are equally relevant for recognition, PROVIDED that they follow the same sort of review process - peer review, unique contribution, etc.

    This seems to be a good approach and good news for "hackers" - our value is being recognized in fields beyond software development.
  • by peter303 ( 12292 ) on Thursday August 15, 2002 @09:23AM (#4076157)
    The conventional evaluation method is for a professional society in a subject to think the creation is worth saving in perpetuity. Then they provide archival and serving of this material. The official stamp of approval is the "citation reference", which could be electronic. This is normally done with peer-reviewed papers in journals. However some of these journals are now entirely electronic.
  • Humanists are not only adopting new technologies but are also actively collaborating with technical experts in fields like image processing, document encoding, and information science. The whole point of the article is its stress on cross-disciplinary usage of technology. The motto is "do the work, and don't be so obsessed about getting credit for it."

    The principle underlying these guidelines is that when institutions seek work with digital media and faculty members express interest in it, the institution must give full regard to this work when faculty members are hired or considered for reappointment, tenure, and promotion. This gives some "rigor" to any "proof" of the theoretical underpinnings of projects -- you don't need ML or TWELF (www.twelf.org) to provide the level of mathematical rigor previously needed to publish.

    That nifty speech-recognition software that you wrote for the linguistics department is now worth a whole lot more to you. We can probably expect a whole new outpouring in professor/grad student productivity... I mean, heck, I'm probably going to use this in the coming two years to justify some of *my* work. (Dual MS/BS... so my senior project has to be a doozie.) -b
    • I don't think the MLA would take a position which encouraged "do the work and don't worry about getting credit for it." None of the position statements the MLA has published regarding part-time instructors, working conditions for graduate students, or more general tenure and promotion issues reflect that sort of thinking.

      If anything, it seems to me that the gist of the MLA's position is nearly the opposite of what you suggest: rather than stressing cross-disciplinary work, the MLA advocates consideration of work in digital media within the disciplinary structures of English. Thus, the organization encourages scholars and administrators to address the subject of digital media in tenure and promotion guidelines and in hiring and contract negotiation, so there won't be ugly surprises come review time.

      I agree with your optimism about inter/cross-disciplinary work. But institutions are slow to change.

      thanks, cbd.
  • by Cloudmark ( 309003 ) on Thursday August 15, 2002 @09:38AM (#4076286) Homepage
    As a social scientist (somewhat removed from humanities, but close enough), I've come to really value the role that computing can play in non-traditional fields. While some institutions have come to respect it as a component of any research, it's nice to see articles actively promoting computation, and in particular, elite and elegant computation, in soft research. Despite the stereotype that the only tools humanities students need are books, the volume of information available now has reached a point where computer-based data mining and high-level analysis algorithms are necessary in order to provide anything resembling a thorough presentation.
    Up to this point, that sort of work, no matter how exceptionally coded, has been seen as just another research tool. I'm very supportive of any effort to arrange for scholarly recognition of code written in support of research. Just as in the sciences, a tool, once written, can be used again and again to further study. Furthermore, a well-crafted program or script can be of more value to the field than the initial data it returns, if only because it makes one more avenue of investigation available to future researchers.

    In summary, I'm glad to see that some of the fields that have traditionally relegated computation to the sidelines are beginning to recognize that there is academic and scholarly value in more than just the data that comes from computers. The development of research tools in the soft sciences may in time come to be almost as important as it is to the hard sciences.

    ~Cloudmark
  • Hackers working in humanities departments...

    Sorry, by definition, there are no hackers working in any humanities departments, anywhere. One does not "hack" part-time. Does one?
    • Sorry, by definition, there are no hackers working in any humanities departments, anywhere. One does not "hack" part-time. Does one?
      That's a rather narrow view of what "hacking" is, isn't it? Hacking isn't limited to finding the latest security hole in IE.

      Try the following as an extension to your definition:

      • Taking existing technologies and making them useful in a field in which they aren't currently used.
        Example: Hacking SGML and DTDs (used for many years in library sciences before being 'hacked' into HTML and its ilk).

      Those technologically inclined folks in humanities departments "hack" existing technologies in order to make them more useful in their field of study.
      Some examples:

      • Speech Recognition: You need someone to write new and better software for this.
      • Translation: same as speech recognition
      • Storage of data: as someone mentioned elsewhere in this thread: Someone has to put together the collection of metadata to allow the most efficient and accurate storage of data.
      No pure computer hacker would be able to do any of those tasks. In all cases, you would have to have some background in some area of the humanities. And in all likelihood, you would be working in a humanities department (if you work in a university), as well as (or instead of) working in a computer department.
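
      The SGML/DTD point above can be made concrete with a sketch. This is an invented, minimal DTD for a transcribed interview -- the element names are hypothetical and not drawn from any real project, but the mechanism (declaring a document grammar, then validating texts against it) is exactly the kind of markup work the comment describes:

```xml
<!-- Illustrative only: a minimal DTD for a transcribed interview.
     Element and attribute names here are invented for this sketch. -->
<!ELEMENT interview (speaker+, utterance+)>
<!ELEMENT speaker   (#PCDATA)>
<!ATTLIST speaker   id  ID    #REQUIRED>
<!ELEMENT utterance (#PCDATA)>
<!ATTLIST utterance who IDREF #REQUIRED>
```

      A validating parser can then reject any transcript whose utterances point at an undeclared speaker, which is the sort of consistency guarantee plain word-processor files never give you.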

    • The programmer who works for me learned all his skills while working on a PhD dissertation on Ovid; he's a classicist by training. During that time he was a programmer for a CD-ROM-based database of classical bibliography. He quit and got a job with me when he couldn't convince the stuffy directors of the database to go web.

      He did look for a while for a job as a professor of classics where his technical skills would be valuable, and he couldn't really find one. Of course, it's hard enough to find any job as a professor of classics, and he wasn't willing to search over a wide geographical area.
  • by Creosote ( 33182 ) on Thursday August 15, 2002 @09:51AM (#4076374) Homepage
    People, you are demonstrating a pitiful lack of awareness about your own history. Father Busa has been coding [kcl.ac.uk] since your grandparents were in diapers. If you've used the online Oxford English Dictionary or any other dictionary or concordance software, you owe what you're doing at least in part to Father Busa's interest in text processing 50 years ago. For that matter, if you use XML, since much of the work in SGML/XML and the theory of markup languages has been done by people coming out of humanities computing.
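
    As a rough illustration of the concordance processing this kind of work pioneered, here is a minimal keyword-in-context (KWIC) routine. The function name and output format are my own invention, not Father Busa's actual method:

```python
def concordance(text, keyword, width=30):
    """Return keyword-in-context (KWIC) lines: each occurrence of keyword
    with up to `width` characters of context on either side, aligned so
    the keyword sits in a fixed column."""
    results = []
    lower, kw = text.lower(), keyword.lower()
    i = lower.find(kw)
    while i != -1:
        left = text[max(0, i - width):i]
        right = text[i + len(kw):i + len(kw) + width]
        # Right-justify the left context so every hit lines up in a column.
        results.append(f"{left:>{width}}[{text[i:i + len(kw)]}]{right}")
        i = lower.find(kw, i + len(kw))
    return results
```

    Real concordance tools add tokenization, lemmatization, and sorting by left or right context, but the sliding-window idea is the same.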

    Repeat in chorus with me: We are not worthy...

  • Scholars Wanted (Score:5, Insightful)

    by MisterSquid ( 231834 ) on Thursday August 15, 2002 @10:02AM (#4076460)

    Many /.'ers thumb their noses at the academy. Who needs a degree if you have the skills? Why pay money for a piece of paper when one can get right to coding? But the acceptance of digital media within the ivy-covered walls can help establish digital media as more than "playing" video games, surfing pr0n, and "stealing" copyrighted content (not that any of these are not worthy endeavors in themselves ;) ). One of the best ways to ensure the evaluation and production of digital media is to have them studied in an academic context, and only a tenured professoriate can make that happen in ways that matter academically.

    At present, digital media are often marginalized as low-brow. Video games are blamed for encouraging mindless violence, the web is blamed for shortening attention spans, security checking is vilified as terrorism, email is the fount of spam, and reverse engineering is called breaking copyright. This is the public understanding of digital media.

    Specialized software and digital research [virginia.edu] being done at the Institute for Advanced Technology in the Humanities is at present abstract and does little to affect the thinking of the unwashed masses of undergraduates, let alone the public at large. But this kind of work is important because it influences the scholars who drive the field, even though their work goes largely unnoticed by the established disciplines that might most benefit from it. On the other hand, publicly-accessible texts [virginia.edu] are in many ways the "content" the web has been looking for (as demonstrated by usage [virginia.edu]), but recognition of such projects is still limited to circles of elite users. This must change, and a cohort of professors teaching students can help bring about that change.

    Creating an established body of scholars able to use computers in ways that help normal people understand how Art and Architecture, Modern Languages and Film shape the world in which we live--this will further the widescale acceptance of digital media as worthwhile and noble.

    It is important that people see digital media as more than video-games and surfing the web. Devising a body of standards by which digital media can be evaluated in the context of tenure review (limited though that context might be) will help.

    A set of standards by which to review and assign value to the digital work of humanities scholars is crucial to the culture of computing.

  • I have sat on committees at my university talking about this very subject. I teach math and CS, so they bring me in for the "technical perspective" (whatever that means). The debates have been very heated.

    In some cases I support this. Scholarship is a very broad concept at small liberal arts colleges (unlike tier-one research schools). If you write a textbook that has no new ideas of your own, but which can help students learn the material better, that counts as scholarship. If that is acceptable, then why shouldn't some computer tool that you created (e.g., a language-teaching tool or a chip simulator for a CS class) count?

    In my opinion, it should, provided that it goes through the same rigid requirements other stuff does. It is not enough to write a textbook and force your students to use it -- you must demonstrate that it is good enough that other people use it as well (otherwise you are just pawning crap off on your students). That is the difference between scholarship and class preparation.

    And this is also where the debate gets nasty. Many (but not all!) of the people who are trying to get credit for their hacking as scholarship are trying to get it with just their class prep and not subjecting it to higher standards. Sure, they want a review process, but the people often reviewing are other people who want credit for hacking, not the academic community at large. This is very bad; I am reminded of a card from the game "Survival of the Witless" (a satire of tenure politics) called "New York Times reviews of each others books".

    As a result many of them have hurt their credibility badly. And therefore, even though I would like to support them, it has been very difficult.

    • Sure, they want a review process, but the people often reviewing are other people who want credit for hacking, not the academic community at large. This is very bad; I am reminded of a card from the game "Survival of the Witless" (a satire of tenure politics) called "New York Times reviews of each others books".

      This is a serious problem that troubles some of the members of the working group that I'm on. We are -- and we realize it fully -- the choir. It seems to me that any review process has to include people who are not part of the culture that is producing digital work in the humanities. Credibility is key.

    • I think you make excellent points, Walker, and the guidelines really seem quite well thought out. The cynical side of me knows, however, that reviews are always going to be a crapshoot: the real challenge isn't going to be establishing guidelines per se but ensuring that a tenure committee is balanced enough in its understanding of the work, and that the proper outside reviewers for the job are brought in each time.

      It seems like one of the biggest problems is academics/administrators who have zero tech understanding but don't want to admit it. They get completely starry-eyed over anything with words like "digital" or "cyber" attached and don't wish to admit that they can't tell the wheat from the chaff. If they're too frightened to put a tool through its paces -- they might have to admit they don't get it -- they prefer to just ooh and ahh at a demo and then escape as quickly as possible. Stellar work is heaped in the same pile with junk.

      To be honest, I find this breed of academic to be just as much of an impediment as those who dismiss all IT out of hand as the instrument of death for modern culture. I say this as one who has benefitted greatly from the sudden academic cachet attached to all things cyberculture, and who is being allowed to run loose and do what I wish simply because no one else understands -- nor tries to understand -- what I'm doing. I take my job very seriously, but it's a situation that could easily be abused.
  • Mod This Up. (Score:1, Insightful)

    by Anonymous Coward
    I have a masters degree. I have defended my thesis against a panel of Doctors who will later take my best ideas and trademark them as their own.

    Here is how academic book publishing works. Write a book, make your undergrads use it in your course and cash in. Every year revise it by adding a useless chapter. Then require your students to buy the newest revision saying it's 'important to be up to date'(tm).

    IMHO the best thing about the digital age is that there is a preference for working like the IETF, which ratifies standards based on "rough consensus and running code". Many people, myself included, consider this "keeping it real".

    I should not have to beg an academic institution for brownie points for a software implementation that I researched, developed, and implemented. I shouldn't be taxed for being different and taking the lead. If my software is of use to the world then others can pick it up and go from there, assuming it's public source. Otherwise, the project can die on its own without a lengthy debate by noble men.
  • In addition to the MLA guidelines mentioned by FatherBusa, there is a "Call for Action on Problems in Scholarly Book Publishing" [mla.org] by the president of the MLA. The president's letter looks at the problem of scholarship from the opposite angle: it's becoming more difficult for young PhDs to publish the traditional monograph (books are expensive and academic presses are selling fewer of them), this problem is systemic and threatens to damage the careers of otherwise promising, tenurable scholars.

    Not only should hacking be considered a form of scholarship, it (along with online publishing) may be the only option available for disseminating one's ideas.
