


Comment New content == new copyright (Score 1) 420

Copyright is based on authorship. So, while the main text of Mein Kampf is in the public domain, this new two-volume work will legitimately claim a new copyright. The new copyright will come from all those notes, which were created through acts of authorship. In contrast to the other contemporary story, about Otto Frank, editorship is sweat of the brow, while authorship is creativity (see https://www.gutenberg.org/wiki... , for example). Translation does qualify as authorship, since it is an intellectual creative act. (Dunno about machine translation...)

In the US, public domain items often have a copyright claim due to a new preface, or introduction or even cover art. Check some of the "Penguin Classics" in your local bookstore for examples. For Mein Kampf, all that added content will create a mixed item, where extracting just the public domain content (minus notes and whatever else is new) will take some effort. Luckily, the public domain text is already widely available without the new notes.

As mentioned, copyright expiry based on life+70 has no bearing in the US for items published prior to 1978 (https://www.gutenberg.org/wiki/Gutenberg:Copyright_How-To). Due to GATT, no copyright renewal was required after 28 years for items published outside of the US to get the full copyright term. The translations into English were not made until after the original German, of course. In the US, copyright will expire 95 years after publication based on the current USC Title 17 (which, as we know, TPP might force to change). But that's another story...
    - gbn

Comment Re:So AMD called their Hyperthreading a CPU core? (Score 4, Interesting) 311

Concur. The design of the Bulldozer modules was abundantly clear. The fact that the "cores" share an FPU was clearly disclosed and part of any diagram of the parts. The shared FPUs played a huge role in assessing Bulldozer-based CPUs for high performance computing workloads. For HPC, the usual benchmark is HPL (a.k.a. high performance LINPACK), which measures double precision floating point performance and is dominated by a matrix-multiply operation called DGEMM. The fact that the FPU was doing double duty for two "cores" on a module meant that the peak theoretical performance was limited by the number of FPUs in a CPU, not the number of cores or modules or anything else.
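To put numbers on that last point, here's a back-of-the-envelope sketch. The clock speed and FLOPs-per-cycle figures below are purely illustrative, not vendor specifications; the point is only that peak DP throughput is counted per FPU, not per "core":

```python
# Back-of-the-envelope peak double-precision throughput.
# All figures are illustrative, not vendor-published specs.

def peak_dp_gflops(fpus, clock_ghz, flops_per_cycle):
    """Peak GFLOPS = number of FPUs x clock (GHz) x DP FLOPs per FPU per cycle."""
    return fpus * clock_ghz * flops_per_cycle

# Hypothetical 8-"core" part built from 4 modules, one shared FPU each:
shared = peak_dp_gflops(fpus=4, clock_ghz=3.0, flops_per_cycle=8)

# Versus a hypothetical design with one FPU per core:
per_core = peak_dp_gflops(fpus=8, clock_ghz=3.0, flops_per_cycle=8)

print(shared, per_core)  # 96.0 vs 192.0 GFLOPS peak
```

Halve the FPUs and, all else equal, you halve the HPL ceiling, no matter how many "cores" the spec sheet claims.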

As others have noted, Intel's hyperthreading can have exactly the same impact: the threads share various components, including the FPU.

Another aspect that can have a major impact on performance is the number of memory channels, and how things like cache coherency are handled. Among other things, AMD's HyperTransport exhibits different scalability characteristics depending on the number of sockets. In a four- or eight-socket configuration, latency due to cache coherency operations can have a big impact on performance.
    - gbn

Submission + - Dyson swarm around distant star?

gbnewby writes: A Dyson Swarm is a smaller but still colossal version of a Dyson Sphere, which is theoretically used by Type II civilizations to harvest energy from a star. According to an online article in The Atlantic, Tabetha Boyajian postulates in a forthcoming paper that a star, KIC 8462852, is surrounded by debris that does not yet have a natural explanation. In a previous paper with fellow Planet Hunters co-authors, Boyajian described the anomaly. It's 1500 light years away, and the analysis was of data from the Kepler telescope. As her colleague told The Atlantic, "Aliens should always be the very last hypothesis you consider, but this looked like something you would expect an alien civilization to build."

Submission + - The Most Disruptive Technology Of The Last 100 Years Isn't What You Think

HughPickens.com writes: Ana Swanson writes in the Washington Post that when people talk about "disruptive technologies," they're usually thinking of the latest thing out of Silicon Valley. But some of the most historically disruptive technologies aren't exactly what you would expect, and arguably the most disruptive technology of the last century is the refrigerator. In the 1920s, only about a third of households reported having a washer or a vacuum, and refrigerators were even rarer. But just 20 years later, refrigerator ownership was common, with more than two-thirds of Americans owning an icebox. According to Helen Veit, the surge in refrigerator ownership totally changed the way that Americans cooked. "Before reliable refrigeration, cooking and food preservation were barely distinguishable tasks," and techniques like pickling, smoking and canning were common in nearly every American kitchen. With the arrival of the icebox and then the electric refrigerator, foods could now be kept and consumed in the same form for days. Americans no longer had to make and consume great quantities of cheese, whiskey and hard cider — some of the only ways to keep foods edible through the winter. "A whole arsenal of home preservation techniques, from cheese-making to meat-smoking to egg-pickling to ketchup-making, receded from daily use within a single generation," writes Veit.

Technologies like the smartphone, the computer and the Internet have, of course, dramatically changed the ways we live and work. But consider the spread of electricity, running water, the flush toilet popularized by Thomas Crapper, and central heating, and the changes these have wrought. "These technologies were so disruptive because they massively reduced the time spent on housework," concludes Swanson. "The number of hours that people spent per week preparing meals, doing laundry and cleaning fell from 58 in 1900 to only 18 hours in 1970, and it has declined further since then."

Submission + - Jeb Bush believes that Newt Gingrich's moon base idea was 'pretty cool' (examiner.com)

MarkWhittington writes: During the 2012 Republican primaries, then-candidate Newt Gingrich proposed building a moon base by the year 2020. Unfortunately, his main rival for the GOP nomination, Mitt Romney, descended upon the idea like a wolf on the fold, heaping ridicule upon it. The proposal was even the subject of a Saturday Night Live skit. The moon base, and soon after that Gingrich's candidacy, sank out of sight. But, according to a story on CNN, former Florida governor and current presidential candidate Jeb Bush was not laughing. In fact, he still thinks that the idea of a moon base is "pretty cool."

Comment Project Gutenberg procedures might help (Score 5, Informative) 213

This might help: https://www.gutenberg.org/wiki...
And, the updated "Rule 6 How-To" at https://copy.pglaf.org/

For something published in the US from 1923 through 1963, renewal of copyright was necessary to get a further 28-year extension. (The 1992 Copyright Renewal Act made renewal automatic for items published in 1964 onward, and later term extensions lengthened those terms.)

The Rule 6 how-to has a template for non-renewal research that might satisfy YouTube, if you do the research and send it in.

Only around 10% of items published from 1923 onwards were renewed. (It's no longer required, but you can still renew today.) The US Library of Congress has records of copyright registrations and renewals, and the Rule 6 How-To describes where to get them. For items from 1923-1963, the renewal records for printed items are comprehensive.

Serialization is sometimes a problem. Items might have been published, then published in another form (say, a magazine article that was published as a book), and if the timing is close enough one renewal might cover both items.

Proving something is in the public domain in the US, for printed items, is not that hard for items published from 1923-1963. It takes some time and expertise, and there is always a chance there is a renewal that you didn't find. Proving it is still copyrighted is also easy: show me the renewal.
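The decision procedure described above can be sketched roughly in code. This is a deliberate simplification for works first published in the US (it ignores serialization, foreign publication, GATT restoration, and the exact year-end expiry dates), and the function and field names are my own, not anyone's official terminology:

```python
# Rough sketch of US public-domain status for a work first published
# in the US, per the rules described above. Simplified; not legal advice.

def us_copyright_status(pub_year, renewed, current_year=2015):
    if pub_year < 1923:
        return "public domain"
    if pub_year <= 1963:
        if not renewed:
            # The initial 28-year term expired without renewal.
            return "public domain"
        # Renewed: 95 years from publication under current law.
        return "public domain" if pub_year + 95 < current_year else "copyrighted"
    # 1964 onward: renewal is automatic; pre-1978 publications
    # still get 95 years from publication.
    return "public domain" if pub_year + 95 < current_year else "copyrighted"

print(us_copyright_status(1950, renewed=False))  # public domain
print(us_copyright_status(1950, renewed=True))   # copyrighted
```

The asymmetry in the comment above falls out directly: "copyrighted" needs only one fact (a renewal exists), while "public domain" for 1923-1963 needs a thorough search that finds nothing.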

    - Greg

Submission + - Another big Cray supercomputer, this time in the Middle East

gbnewby writes: In 2008 when King Abdullah University of Science and Technology was getting started, they bought the 14th fastest supercomputer in the world, Shaheen. Today, Cray was selected to provide KAUST's next supercomputer, with a price tag of more than $80 million. In a press release, KAUST writes about using this supercomputer for clean combustion, materials science for solar cells, and attracting new talent. The new Cray XC40 supercomputer and other systems will be installed in 2015.

Comment Data centers (Score 4, Informative) 168

The data centers in Quincy are quite large. Microsoft has a major facility, which is undergoing expansion. So do Yahoo, Intuit, Dell, and Sabey. These are major components of the tax base for the town of Quincy and elsewhere in Grant County. The data centers are highly resilient to power loss, with on-site diesel generators, 24x7 staffing, and all the other protections you'd expect. But prolonged use of the generators, if it becomes necessary, could exceed the run time and emissions the facilities' permits allow. Most likely, power from the other dams the Grant County PUD operates (or from elsewhere on the regional power grid) could be routed to the data centers.

There are some other huge electricity consumers in the county. It hosts the world headquarters of a company that makes photovoltaic components, as well as several food processors (all those potatoes from eastern Washington gotta be processed and cooked!). Industrial users might be able to turn down their power usage if there is a regional shortage, but data centers tend to operate at fairly stable 24x7 consumption levels. Major companies like those listed above have redundant facilities, and can shunt processing to other centers if required.

Site selection for major electricity consumers, including data centers, is a fascinating topic. The State of Washington has had various tax incentives to encourage businesses to build facilities there. Electricity costs are among the lowest in the nation (under 3 cents/kWh for industrial customers). Plus, the region makes extensive use of renewable sources, particularly hydroelectric (i.e., dams) and wind energy. Oregon has a similar story to tell, with its own rivers, dams, tax breaks, etc., which is part of why Amazon elected to put a huge facility there.

Comment Might be a hoax (Score 2) 776

I think this could be a hoax. It's not a scientific paper, nor a letter in a peer-reviewed journal. It appears via a Google circles posting from Kerry Emanuel, who is a well-known, though partially reformed, climate denier. It looks like the Google+ account the letter is published in was just created. Plus, the facts are skimpy or wrong. Saying we cannot ramp up solar & wind power fast enough, but can ramp up nuclear, is directly in opposition to what's happening. Solar installations are growing by double-digit percentages each year, and meanwhile we haven't had a new nuclear power plant in over 40 years. The only pair of reactors underway (pictured in the Yahoo! story) is years from completion. There are only 19 permit applications active for new nukes in the US, and the power industry (which is notoriously risk-averse) has for decades shied away from their huge liability and expense.

Submission + - US National Science Foundation Office of Cyberinfrastructure going to CISE (twitter.com)

gbnewby writes: "This story has not yet hit the news wire or nsf.gov, but comes from two tweets from the official NSF news feed: https://twitter.com/NSF_CISE/status/243758554245390337 and https://twitter.com/NSF_CISE/status/243759816097529859. According to the tweets, the NSF Office of Cyberinfrastructure (OCI) will become part of the Directorate for Computer and Information Science and Engineering (CISE). This is huge news. OCI is the major force behind the US's largest supercomputers, and many other high-tech activities."

Comment End of an era (Score 1) 569

200 or so is the end. The last Raptor just flew to my back yard during this past week: "Last F-22 Raptor fighter jet arrives in Alaska" (http://www.newsminer.com/view/full_story/18475362/article-Last-F-22-Raptor-fighter-jet-arrives-in-Alaska).

The F-35 is another matter that seems more relevant: production for those is still delayed. Keep in mind these are sold to other countries, not just used in the US.

Comment Comments from PG (Score 5, Informative) 721

The Anderson/Bear folks posted this in a couple of places, and it was picked up in a few others. Here is the text of the most recent substantial message I sent them on the topic: http://cand.pglaf.org/bear-response.txt . The group has not provided the author/title (or PG eBook number) of any title they think was wrongly determined to be non-renewed, other than those mentioned in the email. They seem to have some theories about what is eligible for renewal, or who can renew, but these are not contested by Project Gutenberg (in fact, our policy is NOT to question whether renewals were fully compliant with the law, nor whether a person had the correct standing to renew).

The issue is whether a renewal was made. For The Escape, part 1 was republished with a different title, complete with part 2. Part 1 was not individually renewed, but the newly titled complete work was. We were unaware of the subsequent retitled republication, so did not find the renewal. For the purposes of copyright and renewal, a major outcome of the legal advice we received concerning The Escape is that serialized works are treated as single acts of authorship. Thus, renewal of a part may be considered to apply to the whole -- provided it happens within a reasonable timespan (we have been advised to use +/- four years).
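The timing test described above is simple enough to sketch. The four-year window is the figure mentioned; the function name and the idea of comparing publication years (rather than exact dates) are my own simplifications:

```python
# Sketch of the serialized-work renewal test described above: a renewal
# of a republished/retitled whole may be treated as covering a part if
# the two publications fall within +/- `window` years of each other.
# Illustrative only; the real determination needs the actual records.

def renewal_may_cover_part(part_pub_year, whole_pub_year, window=4):
    return abs(whole_pub_year - part_pub_year) <= window

print(renewal_may_cover_part(1950, 1953))  # True: within 4 years
print(renewal_may_cover_part(1950, 1956))  # False: 6 years apart
```

The practical consequence is that a Rule 6 search can't stop at the part's own title: it also has to look for renewals of republications under other titles within that window.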

The Project Gutenberg Copyright How-To has details on our procedures, although the Rule 6 how-to there (for non-renewals) is older than the version we used for the original Anderson/Escape non-renewal determination. We are working on a revision that will include additional research for serials, and a few other variations like republication with different titles. The how-to is here: www.gutenberg.org/wiki/Gutenberg:Copyright_How-To

For those who aren't aware, Project Gutenberg is classified in the US as a 501(c)(3) charity, as a library. With over 35,000 published titles, and well over 50,000 unique instances of copyright research (thousands for our Rule 6), it's not surprising that we make occasional errors. To date (I've been doing this aspect of volunteer work for Project Gutenberg since around 1999), we've changed our stance on fewer than a half dozen public domain determinations. Not perfect, but I believe we're doing a good job overall, and have some very solid procedures developed by copyright experts over the years.

I initiated our Rule 6 nearly 10 years ago. This was because I saw that of all the books and serials published in the US from 1923-1963 (when renewal was required for copyright to still apply), 85-90% were never renewed. The US Library of Congress does annual reports on this. Statistically, that means there are a million or so items from 1923-1963 whose copyright expired after a 28-year term. These items have been in the public domain in the US since 1992 or earlier (the 28-year term for a 1963 item ran out in 1991), and many are out of print. As a policy decision, Project Gutenberg decided it was worth the risk of occasionally missing a renewal, to be able to affirmatively identify the many items for which no renewal occurred. I still believe this decision was the right one.

For those who are paying attention to Project Gutenberg news today, there was a story in the Washington Post that, more or less, accused Amazon of abusing their customers by selling public domain Project Gutenberg works, with DRM added, for a fee. The article is here: http://voices.washingtonpost.com/fasterforward/2010/11/amazon_charges_kindle_users_fo.html . (I exchanged several emails with the author.) It's a weird coincidence that within the same 24 hour period there is another story that basically accuses Project Gutenberg of stealing.

Enough for now. I'm going back to reading Marusek's "Mind over Ship" (sequel to the excellent "Counting Heads"), one of the hundreds of printed books I purchase every year. Maybe before I shut down for the evening I'll post Doctorow's "Makers" to Project Gutenberg. Oh, yeah: I promised I'd buy our Webmaster a Kindle, so he can ensure our ePub files display correctly. Lots of GOOD stuff to do!

Thanks to the many posters for their thoughtful comments. As should be clear, copyright is a difficult issue. One thing I'd like to encourage is to give some thought to preparation for 2018. Sometime before then, we can expect a major extension to the automatic (no renewal necessary) copyright terms of 95 (individual) or 120 (corporate) years. The last extension (Sonny Bono's last act) was published within the same 24-hour period as President Clinton was impeached, and received essentially no contemporary news coverage. We can expect that The Mouse and other self-interested, well-moneyed parties are already planning the next extension, to ensure that nothing published in 1923 or later ever enters the public domain. Rule 6 is one of our only post-1923 sources for public domain materials, but since renewals are no longer required it only takes us to 1963.


Submission + - Radio Telescope is to become a Cinema Screen

somegeekynick writes: "To mark the 50th year of the Jodrell Bank Observatory, the Lovell Telescope is going to be temporarily turned into a cinema screen. From the article, "The event marks the golden jubilee of the Lovell radio Telescope and the dawn of the Space Age, and kicks off two weeks of celebrations called the 'First Flight Festival'. During the show, the huge dish of the Telescope will act as a giant video screen displaying images of early space exploration, astronomy, engineering, the history and future of radio astronomy and the construction of the Lovell telescope itself.""
