Here is a non-paywalled version of the actual scientific article, including a graph showing the measurements as a function of time on page 7. It's a significant but still faint detection, with the highest value being about 4 standard deviations away from zero.
Doesn't the USA have a concept of jury nullification, where the jury does much more than just determine facts, and actually takes a position on what's right and wrong?
Not all websites are for profit; in fact, the majority probably aren't. This approach would only be a moderate help for for-profit websites, but it would help popular noncommercial websites like Wikipedia, discussion forums, open source software pages, etc. It could also be used to build a noncommercial YouTube alternative. Just because something takes effort to produce doesn't mean that somebody is looking to get paid for it. Some people are just looking for an audience, or for others to collaborate with, or are simply trying to make the world a better place.
Just a few stories back here on Slashdot we heard examples of people who had their webpages grow so popular that they were forced to put ads on them, even though they didn't wish to. That's the sort of case that would benefit the most from a distributed system.
One of the reasons why the World Wide Web is buried in a sea of advertising is that the costs of hosting a website increase as the site becomes more popular, so you can be ruined by your site becoming too popular. Advertising fixes that problem by providing income proportional to the popularity, but it comes with the undesirable side effect of the ads themselves.
A peer-to-peer alternative to HTTP is a very different way of solving the same problem. If people who visit a page help upload it to other visitors, then the available resources will scale with the number of visitors without the server's bandwidth needing to increase. BitTorrent does this very successfully for large files and demonstrates that this mechanism can work. But BitTorrent's latency is too high for it to serve as a replacement for HTTP. If this new protocol fixes that, and manages to get supported in many browsers, then things could get interesting. If the protocol is to have any hope of gaining acceptance, it must not only have low latency, it must also be open and well-documented. So let's hope they don't pull another "BitTorrent Sync" here and keep the protocol closed.
I also get loads of these. I can't imagine anybody being stupid enough to fall for it, but like other spam, I expect it continues because they do get enough replies to be profitable. I'd like to believe that it's funded entirely by people submitting sting articles, but that's probably too optimistic.
I think the second greatest harm these fake journals do (after the harm they do to science's reputation due to how the media report on this) is to make it much harder to establish new journals. I think arXiv overlay journals that just provide peer review while letting arXiv handle distribution and archiving are a good idea, but I fear such a journal would promptly be ignored nowadays because people are being conditioned to think that unknown journals are fake journals.
It is possible to die on turn 0, before you have even had a chance to act.
Actually, you'll have to watch part two as well to see the whole argument.
The actual scientific article was published on arXiv in September. Gravity does not appear to be central to the problem; it is just used as an example here. They basically look at a toy problem where a large set of particles with simple interactions gives rise to solutions where they can identify variables that increase monotonically away from a minimum, and hence can be used as a time variable. It is basically an entropic argument worked out in detail for a simple system.
My experience has been mostly positive. When I reported a crash in gfortran, it was fixed in the next version. The same happened when I reported a code generation error in the closed source competitor Ifort. When I reported a memory leak in the WCS library in Astropy, the bug was fixed within a few hours. When I requested support for a new site in youtube-dl, it was added the same evening. But I've had less luck with projects like Firefox, though I don't remember exactly what the issue was there.
The thing that worries me the most about advertising is its psychological effects. The goal of advertising is to change your behavior so that you buy more products. And it is really quite effective, or 140 billion dollars wouldn't be spent on the digital part of it each year (for comparison, the entire Apollo program cost about 100 billion dollars in today's dollars).
But the main way they affect your behavior is not by giving you the information you need to make an informed decision. It is by using standard propaganda techniques to bypass your rational decision-making process as much as possible. They associate positive feelings with the product, and indicate that you will be popular and cool if you buy it, and lame if you don't. It is not uncommon to see advertising which is so uninformative that it is almost impossible to guess what product is being advertised until the logo appears, but because informing you about the product is quite secondary, these ads are still very effective.
Most people think they aren't influenced by advertising (perhaps other people, but not oneself) because we tend to think that our decisions are rational, or at least that we are aware of what processes drive them. But psychological studies have shown that we basically have two decision-making modes: a fast and easy mode and a slow and tiring mode. The latter is quite rational but requires concentration and tires people out. So we usually use the other mode, which is very susceptible to manipulation.
I'm affected by advertising even when I think I'm ignoring it, and so are you. That's why using ad-blocking is a bit like wearing a condom - it protects you from both "mind viruses" and computer viruses (advertising networks are a major vector for malware) that the page you're interacting with might be spreading.
And while it is true that one can opt out from the so-called "acceptable ads" (an oxymoron in my opinion, like "acceptable propaganda", "acceptable brainwashing"), I do not trust somebody who would take money from advertisers to maintain an advertisement-blocking extension. That's why I switched to Adblock Edge, a fork of Adblock Plus that hasn't sold out. Currently the only difference is that the "acceptable ads" "feature" is taken out, but I expect them to gradually diverge as Adblock Plus prostitutes itself further for the advertising industry.
If I understand correctly, higher gravity makes time pass more slowly
Correct. Or to be more general, time passes more slowly deep down in a gravitational well
so a clock in lower gravity will register more time from its perspective than the clock in higher gravity.
Yes. Every time a clock on Earth ticks once, a (stationary) clock near a black hole with Schwarzschild radius R ticks sqrt(1-R/r) times. For example, a clock at a distance r = (100/99)*R will only tick 0.1 times every time a clock on Earth ticks once. The same applies to all other physical processes, not just clocks. So a person on Earth could wave their arms 10 times in the time a person that close to the black hole could wave them once. Or a person on Earth could think 10 thoughts in the time a person close to the black hole could think 1.
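To make the numbers concrete, here is a minimal sketch of that factor, assuming the standard Schwarzschild stationary-observer formula sqrt(1 - R/r) (working in units where the Schwarzschild radius R = 1):

```python
import math

def dilation_factor(r, R):
    """Ratio of proper time at radius r to far-away time, for an
    observer hovering outside a black hole of Schwarzschild radius R."""
    if r <= R:
        raise ValueError("r must be outside the horizon (r > R)")
    return math.sqrt(1 - R / r)

# A clock hovering at r = (100/99)*R ticks about 0.1 times
# for every tick of a far-away clock:
print(dilation_factor(100 / 99, 1))  # close to 0.1
```

Note how slowly the factor falls off until you get very close to the horizon: even at r = 2R it is still sqrt(0.5), about 0.71.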
Therefore, from the reference frame of an object falling into a black hole it would seem that it takes forever.
No. If time goes slowly for somebody, they don't perceive themselves to be going slowly; they see everything else going very quickly. From the reference frame of an infalling object, their clocks are ticking at normal speed (by definition - a local clock measures how fast physical processes happen locally). But the far-away world seems to be sped up.
But from our frame of reference, time passes normally,
Time passes normally to everybody in their own frame of reference.
so we would observe the object falling into the event horizon just as we would an object falling into a star, minus the red shift and vanishing?
No. In our frame of reference, all physical processes near the black hole, be they clocks, falling people or light, are moving in slow motion. In the absence of redshifts and other optical phenomena, we would observe the object to inch closer and closer to the horizon ever more slowly until it's hardly moving. A photon next to it would similarly move extremely slowly (the speed of light as measured locally is constant, but not as measured from other parts of spacetime).
I guess what you thought was something along the lines of "people in areas where time moves quickly will see everything move quickly, and people in areas where time moves slowly will see everything move slowly". To see why that doesn't make any sense, imagine if I had a slow-ray and shot it at you. It would slow you down, but not me. But the fact that I'm not slowed down doesn't mean that I will see you moving at normal speed. Your time passes more slowly than mine, so I will see you move more slowly than me. And since your mind is also affected by the speed of time, you won't perceive yourself to be moving any slower than normal. But to you the rest of the world would look like a movie in fast-forward mode.
So I have a hard time understanding how it would take an infinite time from the perspective of an outside observer
I hope this helps.
The two EHT telescopes that I work on are in Arizona, although I build some of the hardware that's being taken to the South Pole Telescope. It's getting improved to be a part of the EHT. One of the Arizona telescopes is a prototype ALMA antenna that we just moved here from New Mexico last year, and got working a month ago.
That's interesting. I didn't know that the EHT worked at SPT-relevant frequencies. I work on data analysis for the Atacama Cosmology Telescope, a very similar telescope to SPT, and a neighbor of ALMA. So I've seen the ALMA telescopes up close several times.
Observations are typically done in March/April. This gives good weather at the many sites involved.
Isn't the weather often bad at the ALMA site in that period? In ACT we've used that period for maintenance.
The typical run is a week, and they try to get several 10-minute recordings during that time period. The data is recorded at 1 Gbyte/sec onto banks of hard drives, then shipped by FedEx to MIT for correlation. (I don't know if a FedEx truck makes it to the South Pole every day.)
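As a rough sanity check, the quoted rate implies sizeable but shippable data volumes. A minimal sketch (the 1 Gbyte/s rate and 10-minute recordings are from the description above; the number of recordings per run is a made-up assumption for illustration):

```python
rate_gb_per_s = 1.0           # quoted recording rate
scan_seconds = 10 * 60        # one 10-minute recording

per_scan_gb = rate_gb_per_s * scan_seconds
print(per_scan_gb)            # 600 Gbyte per recording

scans_per_run = 20            # hypothetical count for "several" over a week
print(per_scan_gb * scans_per_run / 1000)  # 12 Tbyte for the run
```

At those volumes, shipping hard drives by courier really is faster than any plausible internet link from a remote site.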
That's a lot of hard drives! Is that the aggregate data rate for all the telescopes, or just for SPT?
That's a fascinating article, but you won't find many astrophysicists who would bet on it being correct (this probably includes the authors). Traversable wormholes are unstable without large amounts of negative mass, and we have no reason to believe that such exotic matter even exists. And if one is willing to assume that these wormholes have been there since the beginning of the universe in order to explain the presence of compact massive bodies in the center of every galaxy, then you might as well assume that black holes have been there since the beginning of the universe - that requires much less speculative new physics while solving the same problems. And those problems aren't that convincing in the first place - properly including baryons in cosmological simulations may well turn out to produce the right amount of large black holes when we get good enough computers to run them.
This is a typical case of somebody doing a fun "what if" or "devil's advocate" calculation, and the media turning it into something much more definite than it is.
When will observations start? How long will they last? When can we expect to see results on arXiv? How well will the Fourier plane be covered (will you still need telescopes in the middle of nowhere to join/be built)? What will the spatial and temporal resolution be? Are there any important astrophysical foregrounds that could mess things up (blurring by plasma sheaths is something I think I've heard mentioned)? How are they handled? Did you know from the beginning that ALMA would join? Can you expect any other big boosts? How wide a frequency range do you have? How large an area in the neighborhood of the black holes you target will you be able to see? Could you expect to discover any nearby stellar black holes or neutron stars (I think one would expect a population of these in the general area)?