Slashdot is powered by your submissions, so send in your scoop

 




Submission + - Other People's Data

ChelleChelle writes: An effective enterprise information management strategy needs to take into account both internal and external data. Dealing with the latter, however, often proves to be a challenge. Today, companies have access to more types of external data than ever before. The question of how to integrate it most effectively looms large. While this article doesn’t provide a hard and fast answer, it does clearly assess the situation, providing valuable information on the pros and cons of integrating at different layers in the architecture.

Submission + - What DNS Is Not

ChelleChelle writes: According to Paul Vixie of Internet Systems Consortium, “DNS [Domain Name System] is many things to many people—perhaps too many things to too many people.” The main aim of his article, then, is to sharpen our understanding of what DNS is by distinguishing it from what it is not. Vixie maintains that many misuses of DNS are common today -- what he calls “stupid DNS tricks” -- such as using it as a mapping service or as a mechanism for delivering policy-based information. His point of view is well argued, yet the article raises the question of whether DNS should be used only for what it was originally intended or whether such innovation is the way forward.
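For readers who want to see the “mapping service” point concretely, here is a minimal Python sketch (not from Vixie's article; the helper name is illustrative). A DNS lookup is just a name-to-address query; any geographic steering a CDN performs is policy smuggled into the answer, not part of the protocol's design:

```python
import socket

def resolve_ipv4(hostname):
    """Return the sorted set of IPv4 addresses a name resolves to right now."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET,
                               type=socket.SOCK_STREAM)
    return sorted({info[4][0] for info in infos})

# For a CDN-fronted name, the answer can vary with the querying resolver's
# location and with time -- the "mapping service" behavior Vixie counts
# among the stupid DNS tricks, since DNS was designed to return stable
# name-to-address bindings, not per-client routing decisions.
print(resolve_ipv4("localhost"))
```

Running the same lookup against a popular CDN-fronted name from resolvers in different regions will typically return different address sets, which is exactly the policy-delivery behavior the article questions.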

Submission + - You Don't Know Jack about Software Maintenance

ChelleChelle writes: “Everyone knows that maintenance is hard and boring, and avoids doing it.” So begins a recent article in acmqueue. The authors of the piece, Peter Stachour and David Collier-Brown, set out to prove the opposite: according to them, maintenance is not only easy but can also save you and your management time on even the most frantic of projects. So what is the best way to go about it? Most of the article is devoted to answering this question. Stachour and Collier-Brown discuss the different approaches to software maintenance, pointing out the downsides of each before finally advocating an approach in which maintenance is built into the system from the ground up.

Submission + - Biomolecular Machines and Graphics Processors

ChelleChelle writes: For quite some time computer simulation has played an integral role in the study of the structure and function of biological molecules, with parallel computers used to conduct computationally demanding simulations and to analyze their results. Recently, advances in GPUs and their programming tools have opened up tremendous opportunities to employ simulation and analysis techniques that were previously too computationally demanding to use. In many cases, analyses that once required HPC (high-performance computing) clusters can now be performed on desktop computers, making them accessible to application scientists without experience in clustering, queuing, and the like. This article examines the evolution of GPUs and their use in biomedicine.

Submission + - A Threat Analysis of RFID Passports

ChelleChelle writes: Since the U.S. government first announced its plan to put RFID (radio-frequency identification) chips in all passports issued from 2007 onwards, concerns have sprung up regarding privacy and security. Because the chips transmit identifying information such as your full name, nationality, and even a digitized photo, the fear is that anyone with a high-powered antenna could gain access to this information and use it with malicious intent (for example, to create a new passport with which to gain access to your Social Security number and finances). Today the question remains: do RFID passports really make us more vulnerable to identity theft?
In this article five Harvard University students, led by Jim Waldo (of Harvard and Sun Microsystems), provide a threat analysis of the new passports, examining how plausible such attacks really are as well as what is being done to prevent them.

Submission + - C.S. at the Service of Biochemistry: An Interview

ChelleChelle writes: "David Shaw might be best known as the billionaire computing genius behind the hedge fund D.E. Shaw and Company, but his self-funded research institute, D.E. Shaw Research, is making waves in a different area: biochemistry research. D.E. Shaw Research's largest project so far has been Anton, a 512-processor special-purpose supercomputer designed to run molecular dynamics simulations a couple of orders of magnitude faster than the world's fastest supercomputers. Shaw's hope is that his 'molecular microscopes' will help unravel biochemical mysteries that could lead to the development of more effective drugs for cancer and other diseases. In this interview in Queue, conducted by Stanford CS professor Pat Hanrahan, Shaw explains some of the key innovations that make Anton so fast."

Submission + - Communications Surveillance: Privacy and Security

ChelleChelle writes: "Wiretapping technology has grown increasingly sophisticated since police first began to use it as a surveillance tool in the 1890s. What once entailed simply putting clips on wires has evolved into building wiretapping capabilities directly into communications infrastructures (at the government's behest). In a post-9/11 society, where surveillance is often touted as a way of ensuring our safety, it is important to consider the risks to our privacy and security that electronic eavesdropping presents. In this article, Whitfield Diffie and Susan Landau examine these issues, attempting to answer an important question: does wiretapping actually make us more secure?"
Programming

Submission + - Privacy, Mobile Phones, Ubiquitous Data Collection

ChelleChelle writes: Participatory sensing technologies are greatly expanding the possible uses of mobile phones in ways that could improve our lives and our communities (for example, by helping us to understand our exposure to air pollution or our daily carbon footprint). However, with these potential gains comes great risk, particularly to our privacy. With their built-in microphones, cameras and location awareness, mobile phones could, at the extreme, become the most widespread embedded surveillance tools in history. Whether phones engaged in sensing data are tools for self and community research, coercion or surveillance depends on who collects the data, how it is handled, and what privacy protections users are given. This article gives a number of opinions about what programmers might do to make this sort of data collection work without slipping into surveillance and control.
Programming

Submission + - Making Sense of Revision-control Systems

ChelleChelle writes: During the past half-decade there has been an explosion of creativity in revision-control software, complicating the task of choosing a tool to track and manage a project as it evolves. Team leaders today face a bewildering array of choices, ranging from Subversion to the increasingly popular Git and Mercurial. It is important to keep in mind that, whether distributed or centralized, every revision-control system comes with a complicated set of tradeoffs. Each tool emphasizes a distinct approach to working and collaboration, which in turn influences how the team works. So how do you find the best match between tool and team? This article sets out to answer that question.
Programming

Submission + - The Pathologies of Big Data

ChelleChelle writes: Although it remains difficult to define precisely what counts as "big data," one thing is clear -- if you scale up your datasets enough, all of your applications will come undone. Big data often causes big problems, especially, according to Adam Jacobs of 1010data Inc., when it comes to analysis. In this article Jacobs surveys the problems that can arise when analyzing big data: the inability of many off-the-shelf packages to scale to large problems; the paramount importance of avoiding suboptimal access patterns as the bulk of processing moves down the storage hierarchy; and the replication of data for storage and efficiency in distributed processing.
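As a rough, hypothetical illustration of the access-pattern point (the file and record sizes here are arbitrary, and absolute timings will vary by machine), the sketch below reads the same file once sequentially and once in random order. At this toy scale the operating system's cache hides most of the cost; once the working set spills out of RAM and down the storage hierarchy, the gap grows to orders of magnitude:

```python
import os
import random
import tempfile
import time

RECORD = 4096   # bytes per record
COUNT = 2000    # number of records (~8 MB total; real big data is far larger)

# Build a throwaway test file of COUNT fixed-size records.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(RECORD * COUNT))
    path = f.name

def read_sequential(path):
    """Stream the file front to back -- the access pattern disks reward."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(RECORD):
            total += len(chunk)
    return total

def read_random(path):
    """Touch every record exactly once, but in shuffled order via seek()."""
    total = 0
    offsets = list(range(COUNT))
    random.shuffle(offsets)
    with open(path, "rb") as f:
        for i in offsets:
            f.seek(i * RECORD)
            total += len(f.read(RECORD))
    return total

t0 = time.perf_counter(); n_seq = read_sequential(path); t_seq = time.perf_counter() - t0
t0 = time.perf_counter(); n_rnd = read_random(path); t_rnd = time.perf_counter() - t0
os.unlink(path)
print(f"sequential: {t_seq:.4f}s, random: {t_rnd:.4f}s, bytes read each way: {n_seq}")
```

Both passes read identical data; only the order differs, which is why Jacobs treats access pattern, not volume alone, as the thing that breaks naive analysis code.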
Programming

Submission + - Cloud Computing (acm.org)

ChelleChelle writes: "Many people reading about cloud computing in the trade journals will think it's a panacea for all their IT problems--it is not." Before moving your computing from in-house to the cloud, it is helpful to first be acquainted with the basic principles behind cloud computing as well as to know some of the key issues and opportunities it provides. In this roundtable discussion, five current thought leaders in this field give useful advice on how to evaluate cloud computing for your organization.
Programming

Submission + - How Do I Model State? Let Me Count the Ways...

ChelleChelle writes: In recent years, the question of how to define interactions among Web services to support operations on state has spurred vigorous debate in the technical community. In particular, software architects and developers have argued over the specifics of how to model and implement service state and the associated interactions on that state. This article sheds light on "possible approaches to modeling state" by examining four of them: WS-RF, WS-Transfer, plain HTTP, and no conventions at all.
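A tiny sketch of the HTTP approach (hypothetical URIs, with an in-memory dict standing in for a real server) helps show what "modeling state" means there: state lives behind URIs and is manipulated only through the uniform interface, in contrast with the explicit resource-property conventions defined by specifications such as WS-RF and WS-Transfer:

```python
# Hypothetical in-memory store standing in for a service's resources.
resources = {}

def http_put(uri, representation):
    """PUT: replace the state of the resource identified by uri."""
    created = uri not in resources
    resources[uri] = representation
    return 201 if created else 200   # 201 Created vs. 200 OK

def http_get(uri):
    """GET: retrieve the current state of the resource, or 404."""
    if uri in resources:
        return 200, resources[uri]
    return 404, None

def http_delete(uri):
    """DELETE: destroy the resource and its state."""
    if resources.pop(uri, None) is not None:
        return 204   # 204 No Content
    return 404

# Clients never see "service state" directly -- only resource
# representations addressed by URI and the four uniform verbs.
print(http_put("/jobs/42", {"status": "queued"}))   # 201
print(http_get("/jobs/42"))                          # (200, {'status': 'queued'})
```

The design choice the article weighs is exactly this: whether such implicit, URI-centric conventions are enough, or whether services need the explicit state-access contracts that WS-RF and WS-Transfer spell out.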
Programming

Submission + - Have Sockets Run Their Course?

ChelleChelle writes: As the title suggests, this article examines the limitations of the sockets API. The Internet and the networking world in general have changed in significant ways since the sockets API was first developed, but in many ways the API has narrowed how developers think about and write networked applications. The article discusses the history as well as the future of the sockets API, focusing on how "high bandwidth, low latency and multihoming are driving the development of new alternatives."
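To make the critique concrete, here is a minimal, self-contained sketch (illustrative only, not code from the article) of the classic sockets workflow in Python: one listener, one connection, one address per endpoint. That single-path assumption, baked into the API, is what makes multihoming so awkward to retrofit:

```python
import socket
import threading

def echo_server(listener):
    """Accept one connection and echo back whatever arrives."""
    conn, _addr = listener.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# One socket is bound to exactly one (address, port) pair -- the API
# offers no way to express "this endpoint is reachable over several
# interfaces," which is the multihoming gap the article highlights.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))          # ephemeral port on loopback
listener.listen(1)
t = threading.Thread(target=echo_server, args=(listener,))
t.start()

client = socket.create_connection(listener.getsockname())
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
listener.close()
print(reply)
```

Every step -- socket, bind, listen, accept, connect, send, recv -- dates from the original BSD design, which is both why the API is so portable and why it constrains how applications can use modern networks.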
Programming

Submission + - Network Front-end Processors

ChelleChelle writes: This latest article from acmqueue traces the history of NFE (network front-end) processors, starting from a pure host-based implementation of the network stack and following the stack as it moves progressively farther from the host, observing the issues that arise at each step. Examining this history sheds light on some of the trade-offs involved in designing network stack software.
