Scott_F writes: I recently reviewed several commercial, closed-source slideshow authoring packages for Windows and came across an alarming trend. Several of the packages I installed included GPL and LGPL software without any mention of the GPL, much less source code. For example, DVD Photo Slideshow (www.dvd-photo-slideshow.com) included mkisofs, cdrdao, dvdauthor, spumux, id3lib, lame, mpeg2enc and mplex (all of which are GPL or LGPL). What's worse is that the company tried to hide this by wrapping them all in DLLs! There are other violations in other packages as well. Based on my testing of other software, it seems that use of GPL software in commercial Windows applications is on the rise. My question is: how vigorously are GPL violations in the Windows world being pursued? Does the FSF or EFF follow up on these if the platform is not GPL? How aware is the community of this trend?
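A quick way to spot this kind of bundling is to search the shipped binaries for license text left inside the executables. The sketch below is a minimal, hypothetical check in Python; the marker strings and the assumption that license text survives in the DLLs are just that, assumptions (a clean scan proves nothing, since license text can be stripped).

```python
# Sketch: scan binaries (e.g. bundled DLLs) for embedded GPL/LGPL license
# text, the way one might spot wrapped GPL tools such as mkisofs or lame.
from pathlib import Path

# Illustrative markers; real license text may be truncated or stripped.
GPL_MARKERS = [b"GNU General Public License", b"GNU Lesser General Public"]

def find_gpl_markers(path):
    """Return the marker strings found in the file at `path`."""
    data = Path(path).read_bytes()
    return [m.decode() for m in GPL_MARKERS if m in data]

def scan_directory(root):
    """Map each .dll under `root` to any GPL markers it contains."""
    hits = {}
    for dll in Path(root).rglob("*.dll"):
        found = find_gpl_markers(dll)
        if found:
            hits[str(dll)] = found
    return hits
```

On Unix-like systems, `strings file.dll | grep -i "general public license"` does much the same job interactively.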
rps11 writes: "Internet users are consuming more web content but communicating less, reveals a four-year study by the Online Publishers Association (OPA).
The report, released Aug. 13, states Internet users are spending 47 per cent of their time online reading and watching content, compared with 34 per cent in 2003, representing a 37 per cent increase over four years.
The increase in the time spent on content has been steady, growing 10 per cent from 2003 to 2004, holding even between 2004 and 2005, growing 13 per cent from 2005 to 2006, and growing 13 per cent again from 2006 to 2007.
The organization also found Internet users are spending 33 per cent of their time online communicating, compared with 46 per cent in 2003, marking a 28 per cent decline over four years.
On the e-commerce side, Web users on average spend 16 per cent of their time shopping online versus 15 per cent in 2003.
Meanwhile, the total time being spent on search remains relatively low, accounting for just five per cent, compared to three per cent in 2003.
The OPA attributes the major shift from communications to content to several factors:
The online transition of traditionally offline activities, such as getting news, finding entertainment information or checking the weather.
The popularity of online communities.
A faster and more accessible Internet.
The popularity of online videos.
The improvements in search tools, which are helping online users find relevant content more easily.
The significant increase of content available on the Web.
The rise of instant messaging, which is more efficient than e-mail and has subsequently led to a reduction in time spent communicating.
Whether you agree with or dispute the notion "content is king," the results from this study fortify the importance of content on the Web.
It also serves as a reminder that web types can collectively advance the state of the Web by creating and fostering quality content."
Ep0xi writes: "The Boston University Twin Project is an NIMH-funded investigation of individual differences in child behavior. The project focuses on activity level and related behaviors in twins. The study employs a behavioral genetic perspective, relying on a twin design to elucidate genetic and environmental factors that contribute to these traits and behaviors. In addition, molecular genetic techniques are utilized to augment the results of behavioral investigations.
The investigation could also lead to secondary effects, described as "anti-identificatory personality behavioral disorder," and the effects of that disease are yet to be known.
Full project information can be found at: http://www.bu.edu/psych/labs/butp.html"
Ep0xi writes: Instead, Arak wrote that the disruption "was triggered by a massive restart of our users' computers across the globe within a very short timeframe as they re-booted after receiving a routine set of patches through Windows Update."
tetrahedrassface writes: According to Chron.com, recent advances in the scientific world have pushed researchers to join philosophers and futurists in searching for what defines life. Some thinkers, like futurist Ray Kurzweil, believe that anything with feelings is alive, while most scientists believe that the answers will never be easily found. As technology marches forward and self-replicating machines and artificial intelligence are further developed, the questions may become even more complicated. Religious beliefs may become strained, and the oft-quoted "playing God" may become a reality. As genetic material and indeed completely novel fabricated life forms become the patented assets of corporations, machines gain the ability to replicate, and computers evolve to think and possibly feel, the question may become even more important. From the article: "We are doing things which were thought to be the province, in some quarters, of God, like making new forms of life. Life is very powerful, and if we can get it to do what we want... there are all kinds of good things that can be done. Playing God is a good thing to do as long as you're doing it responsibly."
Ep0xi writes: The agreement announced today is an extension of IBM's existing support for the Solaris OS on select IBM BladeCenter servers, and exemplifies IBM's commitment to offering clients the widest choice of operating systems available in the industry, as well as Sun's commitment to offer customers a wider choice of systems for the Solaris platform. IBM and Sun's support of interoperability via open standards also means that customers will be able to extend their infrastructure by connecting new platforms easily, while preserving their initial investments.
Ep0xi writes: "The correct info about the two scientists is:
a Native American, half British, working for ESA; and a German descendant, half US citizen and half Native American, working for NASA.
The names are withheld because of the huge discovery, which could lead to space-time travel among hundreds of other applications.
They have achieved the acceleration of photons to speeds higher than light, by bending space-time without the use of Einstein's formula.
The experiment came about through a series of failures, and was then funded by NASA. ESA provided the cleaning system used to maintain
the vacuum inside the tube at a maximum level.
Photons were accelerated to speeds of 203.61372934319271i Km/s, at which velocity they could no longer be called photons.
They broke the theory of the speed of light by accelerating and decelerating photons with resonant high-Q magnetic circles around a linear tunnel,
using a non-military underground facility with 14 kilometers of particle accelerator.
The photons were forced to crash into a microwave barrier, which was the point where they began to fly through another, as yet unnamed, physics.
The theoretical speed limit was calculated from the residual energy loss of the photons on reentering the limits of conventional physics, and
from the distance and time they travelled.
The state of particles with mass travelling at speeds higher than light is called the G state, and it is a log() curve in space-time with
a specific vector that can be calculated or defined in advance. The G curve is a difficult formula that does not include standard
space-time formulas, because it is weighted in an n-dimensional space where the dimensions travelled are defined by the exponential of the curve itself.
In the experiment, photons were turned into energy waves back and forth by the accelerator, and the result is proof
that the speed-of-light limit can be modified, and reassembled somewhere else by a precise theoretical formula.
The project included seven levels of oscillating rings used to push the limits of light's behavior.
The experiment cannot be repeated for the moment, and the data collected are brief but precise, stored in one datacenter;
so it could not be called science, because the disintegration of photons while
travelling at speeds higher than light cannot be measured
by conventional scientific methods, and it was calculated with a different kind of physics theory developed
by the young scientist in charge.
The data are now being analyzed in order to provide specific equations for future test development.
The hardware included fewer than two thousand high-speed microprocessors with specific software linked in real time with lasers to provide
precise execution of the heavy machinery and sensors along the way."
Ep0xi writes: Amnesty International researchers, recently returned from Nigeria, have expressed shock at the prison conditions they witnessed and the protracted delays in Nigeria's justice system.
"The circumstances under which the Nigerian government locks up its inmates are appalling. Many inmates are left for years awaiting trial in filthy overcrowded cells with children and adults often held together," said Aster van Kregten, Nigeria researcher for Amnesty International. "Some prisoners are called 'forgotten inmates' as they never go to court and nobody knows how much longer their detention will last, simply because their case files are lost."
devonbowen writes: I'm doing the IT specifications for a project that's being outsourced. The goal is to completely rewrite some software that already exists in-house so that it follows some common standards (modern programming language, framework, browser support, etc). These things are pretty easy to define and it's also reasonably easy to test that the resulting product complies. But how can I specify software quality?
My concern is with maintainability. Is there anything we can write into the specs that will nudge them toward writing code that we'll consider readable and maintainable? Most "coding standards" documents tell you where to put curly braces but leave the bigger issues to be worked out within your corporate culture. And, of course, when outsourcing you are working across two different corporate cultures. One idea that has come up is to have code reviews early to help create some common understanding of code quality between us. Defining the frameworks and libraries should also help to some degree. But I was hoping for something more concrete.
Yes, I know that this problem could be avoided by simply not outsourcing the project. But that simply isn't an option here for political reasons. So given that it's necessary, how would you encourage quality in an outsourcing spec?
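One concrete option is to put objective, machine-checkable limits into the spec and run them as an acceptance gate, alongside the early code reviews. The sketch below is a minimal, illustrative example using Python's standard `ast` module (Python 3.8+); the 40-line budget is an assumption for the sake of the example, not a standard, and a metric like this only complements human review.

```python
# Sketch of a spec-able maintainability gate: flag any function whose body
# exceeds a fixed line budget. Threshold is illustrative; a real spec would
# negotiate the number (and likely add cyclomatic-complexity limits).
import ast

MAX_FUNC_LINES = 40  # assumed budget, agreed in the outsourcing spec

def oversized_functions(source):
    """Return (name, line_count) for each def exceeding MAX_FUNC_LINES."""
    offenders = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            length = node.end_lineno - node.lineno + 1
            if length > MAX_FUNC_LINES:
                offenders.append((node.name, length))
    return offenders
```

Run over the delivered tree in CI, a check like this makes "maintainable" at least partly testable, which is exactly what a spec needs.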
Ep0xi writes: Hard by a great forest dwelt a poor wood-cutter with his wife and his two children. The boy was called Hansel and the girl Gretel. He had little to bite and to break, and once when great dearth fell on the land, he could no longer procure even daily bread.
Now when he thought over this by night in his bed, and tossed about in his anxiety, he groaned and said to his wife, "What is to become of us. How are we to feed our poor children, when we no longer have anything even for ourselves."
"I'll tell you what, husband," answered the woman, "early to-morrow morning we will take the children out into the forest to where it is the thickest. There we will light a fire for them, and give each of them one more piece of bread, and then we will go to our work and leave them alone. They will not find the way home again, and we shall be rid of them."
"No, wife," said the man, "I will not do that. How can I bear to leave my children alone in the forest. The wild animals would soon come and tear them to pieces."
"O, you fool," said she, "then we must all four die of hunger, you may as well plane the planks for our coffins," and she left him no peace until he consented.
"But I feel very sorry for the poor children, all the same," said the man. The two children had also not been able to sleep for hunger, and had heard what their step-mother had said to their father.
Gretel wept bitter tears, and said to Hansel, "Now all is over with us."
Onlyodin writes: The National Aeronautics and Space Administration (NASA) has given the green light to a project that will build the largest ever supercomputer based on Silicon Graphics' (SGI) 512-processor Altix computers.
Called Project Columbia and costing around $160-million, the 10,240-processor system will be used by researchers at the Advanced Supercomputing Facility at NASA's Ames Research Center in Moffett Field, California.
What makes Project Columbia unique is the size of the multiprocessor Linux systems, or nodes, that it clusters together. It is common for supercomputers to be built of thousands of two-processor nodes, but the Ames system uses SGI's NUMAlink switching technology and ProPack Linux operating system enhancements to connect 512-processor nodes, each of which will have more than 1,000GB of memory.