An anonymous reader writes: The first large scientific study of how people respond to poor video quality on the Internet paints a picture of ever-rising user expectations, and of users' willingness to abandon ship when those expectations are not met.
Some nuggets: 1) Users will wait no more than 2 seconds for a video to start playing, and each additional second of delay adds 6% to the abandonment rate. 2) Users with good broadband connectivity expect faster video load times and are even more impatient than users on mobile devices. 3) Users who experience video freezing watch fewer minutes of the video than those who do not; a 1% increase in freezing leads to roughly 5% fewer minutes watched. 4) Users who experience failures when they try to play videos are less likely to return to the same website in the future.
The study analyzed big data (more than 260 million minutes of video) and used some novel data-analysis techniques.
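The reported rates can be turned into back-of-the-envelope numbers. A minimal sketch, assuming the simplest linear reading of the study's two headline figures (6% abandonment per second of startup delay beyond 2 seconds, and 5% fewer minutes watched per 1% of freezing); the function names are our own, not the study's:

```python
def abandonment_rate(startup_delay_s, base_rate=0.0):
    # Viewers tolerate about 2 seconds of startup delay;
    # each extra second adds roughly 6% to the abandonment rate.
    extra_seconds = max(0.0, startup_delay_s - 2.0)
    return min(1.0, base_rate + 0.06 * extra_seconds)

def minutes_watched(baseline_minutes, freeze_fraction):
    # A 1% increase in freezing reduces minutes watched by about 5%,
    # i.e. a 5x multiplier on the freeze fraction.
    return baseline_minutes * max(0.0, 1.0 - 5.0 * freeze_fraction)

print(abandonment_rate(5.0))        # 3 extra seconds -> ~18% abandonment
print(minutes_watched(30.0, 0.01))  # 1% freezing -> ~28.5 of 30 minutes
```

So a 5-second startup alone would, on this reading, cost nearly a fifth of the audience before the video even plays.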
from the new-is-relative dept.
sfraggle writes "Kotaku has an interesting review of Doom (the original!) by Stephen Totilo, a gamer and FPS player who, until a few days ago, had gone through the game's 17-year history without playing it. He describes some of his first impressions, the surprises that he encountered, and how the game compares to modern FPSes. Quoting: 'Virtual shotgun armed, I was finally going to play Doom for real. A second later, I understood the allure the video game weapon has had. In Doom the shotgun feels mighty, at least partially I believe because they make first-timers like me wait for it. The creators make us sweat until we have it in hand. But once we have the shotgun, its big shots and its slow, fetishized reload are the floored-accelerator-pedal stuff of macho fantasy. The shotgun is, in all senses, instant puberty, which is to say, delicately, that to obtain it is to have the assumed added potency that a boy believes a man possesses vis a vis a world on which he'd like to have some impact. The shotgun is the punch in the face the once-scrawny boy on the beach gives the bully when he returns a muscled linebacker.'"
jamesrt writes: Guillermo del Toro has quit as director of the two Hobbit movies, but will still help write the screenplays for the Lord of the Rings prequels. 'In light of ongoing delays in the setting of a start date for filming The Hobbit I am faced with the hardest decision of my life. After nearly two years of living, breathing and designing a world as rich as Tolkien's Middle Earth, I must, with great regret, take leave from helming these wonderful pictures.'
jd writes: "Professor Rakesh Kumar at the University of Illinois has produced research showing that allowing communication errors between microprocessor components, and then making the software more robust, will actually result in chips that are faster yet require less power. His argument is that at the current scale, errors in transmission occur anyway, and that the efforts of chip manufacturers to hide them to create the illusion of perfect reliability simply introduce a lot of unnecessary expense, demand excessive power and deoptimise the design. He favors a new architecture, which he calls the 'stochastic processor,' designed to gracefully handle data corruption and error recovery. He believes he has shown such a design would work and that it will permit Moore's Law to continue to operate into the foreseeable future. However, this is not the first time someone has tried to fundamentally revolutionize the CPU. The Transputer, the AMULET, the FM8501, the iWARP and the Crusoe were all supposed to be game-changers but died a cold, lonely death instead, and those were far closer to design philosophies programmers are currently familiar with. Modern software simply isn't written with the level of reliability the stochastic processor requires in mind (and many software packages are too big and too complex to port), and the volume of available software frequently makes or breaks new designs. Will this be 'interesting but dead-end' research, or will the Professor pull off a CPU architectural revolution of a kind not really seen since the microprocessor was designed?"
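To make the idea concrete: if the hardware occasionally returns a corrupted value, software can absorb the unreliability itself. A very loose illustrative sketch (our own hypothetical example of one classic software-side defence, majority voting over repeated reads; it is not Professor Kumar's actual scheme):

```python
import random
from collections import Counter

def robust_read(read_fn, votes=3):
    # Repeat an unreliable read an odd number of times and take the
    # majority value; raise if no value wins a strict majority.
    results = [read_fn() for _ in range(votes)]
    value, count = Counter(results).most_common(1)[0]
    if count <= votes // 2:
        raise RuntimeError("no majority; retry or escalate")
    return value

# Simulate a read that is corrupted roughly 10% of the time.
random.seed(0)
def flaky_read():
    return 42 if random.random() > 0.1 else 41

print(robust_read(flaky_read))  # majority of three reads
```

The obvious catch, and part of why such designs are a hard sell, is that every ordinary read now costs three, so the software must be restructured to apply this only where corruption actually matters.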
An anonymous reader writes: What do hungry stars eat? Planets, of course! NASA's Hubble Space Telescope, with its newly installed Cosmic Origins Spectrograph, has captured data about a planet 600 light years away which may "soon" be consumed by its parent star. While the artist's rendering gives a nice idea of what may happen over the next 10 million years, the planet has already been warped into an elongated shape by the star's strong gravitational forces. The planet in question, WASP-12b, has a mass 40 percent greater than Jupiter's and the highest surface temperature of any known planet, at around 1500C. One reason it has begun to warp and will be consumed by the star is that it is so incredibly close to it — so close, in fact, that it orbits the star in only 24 hours. These observations confirm theoretical predictions by astronomer Shu-lin Li of Peking University, who predicted that the core of the planet would become so hot that it would greatly expand the planet's atmosphere. This matches the observations made by researchers, who "see a huge cloud of material around the planet, which is escaping and will be captured by the star."
gyrogeerloose writes: John Gruber of Daring Fireball has suggested the possibility that Apple will announce a new extension architecture for its Web browser. In his latest blog post, Gruber made this sly comment: '[one] big thing that's missing is a proper extension API. If only Apple had an imminent developer conference where they could unveil such a thing.' If this is true, it will be a great boon to Mac users who like Safari's page rendering performance and compliance with Web standards, but would like to take advantage of the types of plug-ins available for Chrome and Firefox.