
Rain Drops Signal Cell Phones

An anonymous reader writes "Signals from mobile phone masts have been used to measure rainfall patterns in Israel, scientists report. From the BBC article: 'Scientists at Tel Aviv University analyzed information routinely collected by mobile networks and say their technique is more accurate than current methods used by meteorological services. The data is a by-product of mobile network operators' need to monitor signal strength. If bad weather causes a signal to drop, an automatic system analyzing the data boosts the signal to make sure that people can still use their mobile phones. The amount of reduction in signal strength gave the researchers an indication of how much rain had fallen.'"
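The underlying relationship is a power law: specific attenuation grows roughly as gamma = k * R**alpha (dB/km), where R is the rain rate in mm/h and k and alpha depend on frequency. With the link length known, the operators' power logs can be inverted for R. A toy sketch in Python (the coefficients are illustrative, not the study's fitted values):

    # Toy inversion of rain-induced attenuation to rain rate.
    # Model: A = k * R**alpha * L  (A in dB, L in km, R in mm/h).
    # k and alpha are illustrative values for a ~20 GHz link,
    # not the coefficients fitted in the actual study.

    def rain_rate(attenuation_db, link_km, k=0.09, alpha=1.1):
        """Invert the power-law attenuation model for rain rate (mm/h)."""
        specific = attenuation_db / link_km        # dB/km along the path
        return (specific / k) ** (1.0 / alpha)

    # A 5 km backhaul hop fading 4 dB below its dry-weather baseline:
    print(round(rain_rate(4.0, 5.0), 1), "mm/h")   # ~7.3 mm/h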
This discussion has been archived. No new comments can be posted.

  • by Gordonjcp ( 186804 ) on Saturday May 06, 2006 @06:24AM (#15276148) Homepage
    Well, there's an upper legal limit on effective transmitted power, but often a sector is run at lower power to reduce interference with neighbouring sectors.

    If you find you're getting a drop in signal due to rain fade, you can bump it up a bit. Most stuff uses ATPC (automatic transmit power control) so does it by itself, but you can get graphs off it with SNMP.
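    A rough sketch of pulling those readings with Python's pysnmp (the received-level OID is vendor-specific; the OID and address below are made-up placeholders):

        # Poll one link's received signal level over SNMP (v2c).
        # OID and address are placeholders; real radios expose a
        # vendor-specific OID for received level / transmit power.
        from pysnmp.hlapi import (
            getCmd, SnmpEngine, CommunityData, UdpTransportTarget,
            ContextData, ObjectType, ObjectIdentity,
        )

        RSL_OID = '1.3.6.1.4.1.99999.1.2.3'   # hypothetical vendor OID

        errorIndication, errorStatus, errorIndex, varBinds = next(
            getCmd(SnmpEngine(),
                   CommunityData('public', mpModel=1),         # SNMPv2c
                   UdpTransportTarget(('198.51.100.7', 161)),  # example IP
                   ContextData(),
                   ObjectType(ObjectIdentity(RSL_OID))))

        if errorIndication or errorStatus:
            print('poll failed:', errorIndication or errorStatus)
        else:
            for oid, value in varBinds:
                print(oid, '=', value)   # e.g. received level in dBm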

  • by Anonymous Coward on Saturday May 06, 2006 @07:38AM (#15276282)
In the late 1930s Robert Watson-Watt was investigating the interference of thunderstorms with radio signals in order to warn of approaching bad weather. As we all know, this led ultimately to the discovery of radar. This story is just a modification of that technique.
    1. Duplicate prior art with a slight modification in frequency
    2. ****
    3. Profit!
  • by dtmos ( 447842 ) on Saturday May 06, 2006 @07:47AM (#15276294)
    The key point not brought out in TFA is that the rainfall prediction scheme is not based on the link from the handset to the cell tower, but on the wireless backhaul links of the cellular system. The backhaul link is the link from the cell tower to the rest of the world (or at least the phone system of the rest of the world)--in many places in the world it is fiber or some other line, but increasingly often it, too, is wireless, using something called digital fixed radio systems (DFRS; check out standard EN 301 751 at ETSI [etsi.org]).

The wireless backhaul links are much better for the meteorological application than the handset link, because:
    (a) It's a fixed link; since the cell towers, unlike the handsets, don't move, the location of the link, and therefore of the rain, is known, and
    (b) It's at a much higher frequency. The DFRS links used in this paper are at 8-23 GHz, much higher than the 0.8-1.9 GHz (depending on your local regulatory environment) of the handset link. This is important because rain attenuation increases [telesat.ca] as the signal frequency increases; it would be quite difficult to reliably detect rain fades at the handset frequencies (although in a bad enough storm--a cyclone comes to mind--it's probably possible; TFA notes the anecdotal evidence of fading television signals in bad weather). See the rough numbers sketched below.

I note in passing that the web-based supplemental material to the article references a US patent application, # 60/698,491.
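    To put rough numbers on point (b), here's a toy comparison using the power-law specific attenuation gamma = k * R**alpha; the coefficients only illustrate the trend found in tables like ITU-R P.838 and are not the real table values:

        # Specific attenuation gamma = k * R**alpha (dB/km) at rain rate R.
        # Coefficient pairs illustrate the trend; not ITU-R table values.
        COEFFS = {              # frequency (GHz): (k, alpha)
            1.9:  (1e-4, 1.0),  # handset band: nearly transparent to rain
            23.0: (0.13, 1.05), # backhaul band: strongly rain-sensitive
        }

        R = 25.0  # mm/h, a heavy shower
        for f, (k, alpha) in sorted(COEFFS.items()):
            print(f"{f:5.1f} GHz: {k * R ** alpha:7.4f} dB/km")

    Over a 5 km hop, the 23 GHz link loses a clearly measurable ~19 dB in that shower, while the handset band loses about a hundredth of a dB.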
  • by AlecC ( 512609 ) <aleccawley@gmail.com> on Saturday May 06, 2006 @08:24AM (#15276369)
Correct. The size of the cells used by telephones varies enormously, and hence the power needed to cover the cell properly also varies. In crowded areas with heavy cellphone use, such as city financial centres, the cells may be only 100 yards across. The power is turned down so as to avoid invading nearby cells. On the other hand, in isolated regions, operators want a few masts to cover as much area as possible, so they turn up the power and the cells may be tens of miles across. But whatever power you are using, you don't want to be heard across an adjacent cell - the ideal is a small overlap between adjacent cells but no crosstalk to cells beyond. So both masts and phones continuously adjust their power to be "just right". The rain signal discussed in this article is basically the level of this adjustment.

You can bet that when a phone is advertised as having "up to 240 minutes talk time", that means you get that talk time when standing very close to the mast and therefore using minimum power. In real use, you will be further away, need more power, and get less talk time.
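    For a rough feel for why power has to scale with cell size, free-space path loss alone grows as 20*log10(distance); a quick sketch (real links add terrain, clutter, and fade margins on top):

        from math import log10

        def fspl_db(d_km, f_mhz):
            """Free-space path loss in dB (d in km, f in MHz)."""
            return 32.44 + 20 * log10(d_km) + 20 * log10(f_mhz)

        # A ~100-yard city microcell vs. a ~10-mile rural cell at 900 MHz:
        for d in (0.09, 16.0):  # km
            print(f"{d:5.2f} km: {fspl_db(d, 900.0):5.1f} dB")

    The rural link needs roughly 45 dB (about 30,000 times) more path-loss budget, which is why mast powers differ so much.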
  • by Ancient_Hacker ( 751168 ) on Saturday May 06, 2006 @08:56AM (#15276434)
    IIRC when Bell Labs was experimenting with microwaves, circa 1939, they noticed their signals were a LOT weaker when the weather was humid.

So much so that when they rolled out microwave telephone relay towers, circa 1950, they intentionally boosted the transmitted signal by some 20 dB (that's 100 times) more than necessary on a dry day, just to allow the signals to still get through in damp, fog, or rain.

So this isn't even news; it's going on 68 years old!
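    For reference, that "100 times" is just the decibel definition, power ratio = 10**(dB/10):

        margin_db = 20
        print(10 ** (margin_db / 10))  # -> 100.0, i.e. 100x dry-day power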

  • by acidblood ( 247709 ) <decio@@@decpp...net> on Saturday May 06, 2006 @09:05AM (#15276454) Homepage
    A post this badly written doesn't really deserve a response, but here goes:

• CDMA (I don't know about GSM) has dynamic power control built in, so that transmission power is kept at the bare minimum -- why use more power than is really required?
    • Extra power drains batteries faster.
    • May interfere with neighboring cells.
• In a spread spectrum system (both 3G standards use spread spectrum, so this will apply to most networks in the near future), every transmission occurs on the same frequency band, so someone raising their power level is seen as noise by the other communications, which in turn requires everyone to raise their power level (a toy sketch of this feedback follows below).

Oh, and the turbo button actually slowed the processor down to the speed of the original 4.77 MHz 8088-based PC. In turbo mode the computer ran at its nominal speed.
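    On that last spread-spectrum point: one user's power is everyone else's interference, so power control has to be solved jointly. A toy sketch of the standard distributed iteration (each user scales its power by target-SINR / measured-SINR; the gains, noise, and target are made up):

        # Distributed uplink power control on a shared spread-spectrum band.
        # Every signal is interference to the others; each user iterates
        # p <- p * (target_sinr / sinr). All numbers here are made up.
        gains = [1.0, 0.5, 0.25]   # path gains, user -> base station
        power = [1.0, 1.0, 1.0]    # transmit powers (arbitrary units)
        noise = 0.1
        target_sinr = 0.2          # feasible for 3 users sharing the band

        for _ in range(50):
            total = sum(g * p for g, p in zip(gains, power))
            sinr = [g * p / (total - g * p + noise)
                    for g, p in zip(gains, power)]
            power = [p * target_sinr / s for p, s in zip(power, sinr)]

        print([round(p, 4) for p in power])  # weaker users end up louder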
  • by Anonymous Coward on Saturday May 06, 2006 @11:52AM (#15277149)
Here's a curious fact about the shape of raindrops and its effect on radio waves.

Many people think that raindrops have the typical teardrop shape; others, from watching rain itself, think the drops are vertical lines of water. The first impression comes from pictures and literature; the second comes from the fact that raindrops fall at high speed and so appear vertically blurred.

In fact, the drops start out roughly spherical and end up flattened by air resistance.
    http://www.suite101.com/article.cfm/science_sky/91232/1 [suite101.com]

When these water drops sit inside an electromagnetic field, electric currents are induced on their surfaces, which attenuates the field. Because of the flattened shape, the horizontal component of the electric field is the one that gets attenuated most.
    This means that to minimize attenuation on a radio link during rain, it is convenient to use "vertical polarization" (meaning the electric field vector at every point lies along the vertical axis), since that is the component of the field least attenuated by rain. A rough numerical illustration follows.
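    Here's a toy comparison with power-law specific attenuation gamma = k * R**alpha; the horizontal/vertical coefficient pairs only illustrate the split found in tables like ITU-R P.838 and are not the real values:

        # Flattened drops attenuate horizontal polarization more than
        # vertical. Coefficients are illustrative, not ITU-R table values.
        R = 25.0  # mm/h
        for pol, (k, alpha) in {"horizontal": (0.075, 1.10),
                                "vertical":   (0.068, 1.07)}.items():
            print(f"{pol:10s}: {k * R ** alpha:4.2f} dB/km")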
  • by tylernt ( 581794 ) on Saturday May 06, 2006 @02:28PM (#15277845)
Cell phones operate at 900 MHz and 1.8-1.9 GHz, frequencies which do not skip off the ionosphere (as CB does at 27 MHz). Skip depends more on radio frequency and the 11-year sunspot cycle than on modulation (i.e., CB's AM vs. SSB [Single Side Band]). Additionally, water droplets tend to reduce signal strength, which is why satellite dish owners sometimes experience "rain fade".

The only explanation I can think of for increased signal strength would be the temperature of the tower's antenna or radio in a poor-quality installation. Hotter temperatures (as when the sun is shining on the equipment) can degrade radio performance. I must admit that's a stretch, though.

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...