Comment Re:Where the choke point really is (Score 5, Informative) 273

It doesn't quite work this way. This is going to be a bit technical, but you asked a technical question, so bear with me. Yes, I am a ham (since you asked for one), and I've also done some commercial RF data systems.

As others have pointed out, cellular telephone systems aren't like broadcast systems. You really can "put up more towers" to increase the amount of "service" (available data transfer per unit time, number of simultaneous voice calls, etc.) in a given geographic area without using more RF bandwidth. The reason is that you can turn down the power on the base and handset to shrink the cell's coverage, allowing the same RF bandwidth to be reused more often within a given geographical area. This is already done: cells on rural highways are much larger than cells within a city. In fact, a rural highway cell would often be capable of covering an entire city geographically, but it wouldn't have enough capacity to handle all that traffic, so smaller (lower power, lower antenna angle, etc.) cells are placed in cities, allowing reuse of that RF bandwidth. Broadcast services can be thought of as "cellular" with very large cells (depending on the service, up to and including the entire planet for HF "shortwave" radio) if you want, but that's not a traditional interpretation.
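To make the "more towers, same spectrum" point concrete, here's a back-of-the-envelope sketch (all numbers are illustrative assumptions, not real deployment figures): since every cell reuses the same RF bandwidth, aggregate capacity in a region scales roughly with how many cells you can pack into it.

```python
import math

def cells_covering(region_area_km2: float, cell_radius_km: float) -> int:
    """Approximate number of circular cells needed to cover a region."""
    cell_area = math.pi * cell_radius_km ** 2
    return max(1, round(region_area_km2 / cell_area))

def total_capacity(region_area_km2, cell_radius_km, per_cell_mbps):
    """Aggregate capacity: every cell reuses the same spectrum."""
    return cells_covering(region_area_km2, cell_radius_km) * per_cell_mbps

city = 100.0  # km^2 of coverage area (assumed)
# One big rural-style cell vs. many small urban cells, same spectrum:
big = total_capacity(city, cell_radius_km=6.0, per_cell_mbps=50)
small = total_capacity(city, cell_radius_km=0.5, per_cell_mbps=50)
print(big, small)  # the small-cell layout offers far more total capacity
```

The tradeoff, of course, is that each of those small cells needs its own base station, backhaul, and site lease, which is why the splitting doesn't continue indefinitely.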

As for how much bandwidth it takes to attain a certain information rate, that varies with a number of factors. Assuming a uniform RF environment (noise, propagation, etc., which of course isn't true but is handy for discussion), the key tradeoff is how "aggressive" your modulation scheme is. A more aggressive modulation scheme packs more data into a given amount of RF bandwidth, but it requires a higher signal-to-noise ratio at the receiver to demodulate and recover the data. The exact relationship between how much data you can chuck into a given amount of RF bandwidth and the required receiver SNR varies with your chosen modulation scheme and receiver design. The reason data rates have been increasing with time is that newer modulation schemes (easier to demodulate) and better receivers (mostly less noisy, but also more cost-effective for a given complexity) are being developed. More cells are also being added (see above) to lessen "competition" for the channel's bandwidth, but we're also seeing a lot more users and demand, so that probably averages out. The amount of RF bandwidth allocated to the cellular telephone services has remained roughly constant since the late 90s (the 800MHz cellular band plus the 1900MHz PCS band, though other bands are also used regionally, and some of these are new).
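The theoretical ceiling on this tradeoff is the Shannon-Hartley limit, C = B * log2(1 + SNR). Real modulation schemes fall short of this bound, but a quick sketch (with illustrative SNR values, not figures from any real deployment) shows the shape of the relationship the paragraph describes: squeezing more bits per hertz out of a fixed channel demands an ever-better SNR at the receiver.

```python
import math

def capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon capacity limit for a channel with the given receiver SNR."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Same 5 MHz channel, different receiver SNRs (illustrative values):
for snr_db in (0, 10, 20, 30):
    mbps = capacity_bps(5e6, snr_db) / 1e6
    print(f"{snr_db:>2} dB SNR -> {mbps:.1f} Mbit/s max")
```

Note the diminishing returns: each extra 10 dB of SNR buys a shrinking fraction of additional capacity, which is part of why adding cells beats shouting louder.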

In a two-way scenario like a cellular telephone, you also get to exploit the fact that the two directions don't behave equally. The base-to-handset link (downlink) has the advantages of no access contention (there's just one base, and it knows everything it's doing), expensive equipment (there's only one base, so the company can pump some money into it), and plenty of power (it's plugged into the wall). The handset-to-base link (uplink) is messier: it has access contention (multiple handsets coordinated remotely by the base), cost-sensitive equipment (consumers don't like to pay thousands of dollars for their handsets), and limited power (batteries). Antennas are something of a wash, since they work about equally well in both directions. What all this means is that it's easier to use a more aggressive modulation scheme (and hence cram more bits per second into a given chunk of RF spectrum) on the downlink than on the uplink. Fortunately, this is roughly in line with consumer demand: most consumers want to transfer large stuff to their phones, not from them. FWIW, cable modems have similar concerns, and a similar situation results.
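The power asymmetry alone illustrates why the downlink can afford denser modulation. Here's a minimal one-way link-budget sketch; all the numbers (transmit powers, path loss, noise floor) are assumptions for illustration, not measurements from any real network.

```python
def rx_snr_db(tx_power_dbm: float, path_loss_db: float, noise_dbm: float) -> float:
    """Received SNR in dB for a simple one-way link budget."""
    return tx_power_dbm - path_loss_db - noise_dbm

PATH_LOSS = 120.0   # dB, assumed the same in both directions
NOISE = -100.0      # dBm receiver noise floor (assumed)

# Base station (~20 W) vs. battery-powered handset (~200 mW):
downlink = rx_snr_db(tx_power_dbm=43.0, path_loss_db=PATH_LOSS, noise_dbm=NOISE)
uplink = rx_snr_db(tx_power_dbm=23.0, path_loss_db=PATH_LOSS, noise_dbm=NOISE)
print(downlink, uplink)  # the 20 dB gap is headroom for denser downlink modulation
```

In reality the base also has a better (lower-noise) receiver, which claws back some uplink SNR, but the asymmetry survives.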

You also seem to assume a TDMA based uplink channel. Modern standards are all CDMA based. While the theory of operation is totally different, the effect is the same: multiple people contend for the same resource. Various standards allow for various amounts of "resource concatenation" to allow one user to use more than one "unit" of the uplink when it's not full, but most of them do not dynamically adjust the size of the quantum. (Though, somewhat amusingly, the downlink of CDMA2000 EVDO is time-division multiplexed).
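A toy sketch of the code-division idea (not modeled on any specific cellular standard): each user spreads its bits with an orthogonal code, the air interface simply sums everyone's chips, and each receiver correlates against its own code to pull its bit back out. Real systems use much longer codes plus power control; length-4 Walsh codes are used here only to keep the arithmetic visible.

```python
# Length-4 Walsh codes: every pair of rows is orthogonal (dot product 0).
WALSH4 = [
    [1,  1,  1,  1],
    [1, -1,  1, -1],
    [1,  1, -1, -1],
    [1, -1, -1,  1],
]

def spread(bit: int, code):
    """Spread one data bit (+1/-1) across the code's chips."""
    return [bit * c for c in code]

def despread(chips, code):
    """Correlate received chips with a user's code to recover that user's bit."""
    acc = sum(ch * c for ch, c in zip(chips, code))
    return 1 if acc > 0 else -1

# Two users transmit simultaneously; the channel just adds their chips:
a_chips = spread(+1, WALSH4[1])
b_chips = spread(-1, WALSH4[2])
rx = [a + b for a, b in zip(a_chips, b_chips)]

print(despread(rx, WALSH4[1]), despread(rx, WALSH4[2]))  # each user's bit recovered
```

The "resource concatenation" mentioned above corresponds, loosely, to handing one user several codes at once when the channel isn't full.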

Cell providers also have to ensure that old handsets are supported. For non-compatible upgrades (e.g. GSM to UMTS - done by AT&T and T-Mobile), this means they basically have to run both standards side-by-side on different channels. This results in some interesting scenarios where congestion will force some users onto a slower, less efficient standard even if the handset and base both support something better. I've seen this happen at amusement parks a lot.

Also, some on-air standards don't support voice calls at all (e.g. EVDO or HSDPA), and the carriers always want to be able to handle voice calls, so generally at least one channel on a cell will operate in a voice capable mode, though I can imagine some might have the capability to drop every channel into a data-only mode if there really are no voice calls. A true "4G" service would eliminate this as voice calls are actually just data, as far as the infrastructure is concerned, with QoS used to ensure availability. Current data standards (including the LTE being deployed by Verizon) do not have this QoS capability, so voice calls are still routed on a separate channel (probably RTT, but I'm not familiar enough with these deployments to say this authoritatively). LTE Advanced should mash all this back together.

Of course, all this stuff has limits. It's not practical to set up a cell (with the required base station, antennas, tower, etc.) every 100 feet, and current RF technology and transmitter/receiver complexity are only so advanced, so there is a limited amount of service available to an end user. While I can't say I agree with Verizon's pricing (I use Sprint, and while I don't have a 4G handset, if I did, I'd have unlimited transfer while on 4G WiMAX and a 5GB soft cap while on 3G EVDO/RTT), it's not like they could easily offer unlimited everything. I just think the overage charges and base limits are a little off.

(I apologize if I've gotten any technical details wrong, e.g. which standard supports what. I'm not a cellular engineer, so if anyone reading this is, please correct me. The basics should be reasonable, though.)

The Internet

Activists Use Wikipedia To Test Aussie Net Censors 330

pnorth writes "Editors at Wikipedia have removed a link to a blacklisted web site that sat uncontested for over 24 hours in the main body of the Australian regulator's own Wikipedia entry. The link, which directs readers to a site containing graphic imagery of aborted foetuses, was inserted into ACMA's Wikipedia entry by a campaigner against Internet filtering to determine whether Australia's communications regulator had a double-standard when it came to censoring web content. The very same link motivated the regulator to serve Aussie broadband forum Whirlpool's hosting company with a 'link deletion notice' and the threat of an $11,000 fine. Last night, the link became the subject of 'warring' between several Wikipedia administrators in the lead up to its removal, with administrators saying they didn't want to be used to prove a point."
Hardware Hacking

Bunnie Huang on China's "Shanzai" Mash-Up Design Shops 181

saccade.com writes "Bunnie (of XBox hacking and Chumby fame) has written an insightful post about how a new phenomenon emerging out of China called 'Shanzai' has impacted the electronics business there. This new class of innovators is going beyond merely copying western designs to producing electronic 'mash-ups' that create new products. Bootstrapped on small amounts of capital, they range from shops of just a few people to a few hundred. They rapidly create new products, and use an 'open source' style design community where design ideas and component lists are shared."

Is JavaScript Ready For Creating Quality Games? 165

kumpetan writes "After seeing so many games built with JavaScript, and considering the applications it powers and the use of Ajax, it seems like web developers are now in the game development pot. It is getting easier and more popular with libraries like jQuery, MooTools, Prototype, etc. There are even libraries like Game JS, GameQuery or JavaScript GameLib, specifically for this purpose. So, will we start to see more ambitious game projects arise using these tools?"
Hardware Hacking

An Open Source Coffee Machine 99

An anonymous reader writes "The Open Source Coffee Machine [video link] is a recycled coffee machine, controlled by a PC running Beremiz, and using some MicroMod CANopen I/O nodes from Peak-System. The machine was prepared by Peak-System and Lolitech for the SCS-Paris-08 exhibition. It served free coffee for four days at Peak-System's booth, and has been donated to the IUT of Saint-Dié-des-Vosges, France, so that students can have fun practicing automation."

Oil-Immersion Cooled PC Goes To Retail 210

notthatwillsmith writes "Everyone's seen mods where someone super-cools a PC by submersing it in a non-conductive oil. It's a neat idea, but most components aren't designed to withstand a hot oil bath; after prolonged exposure materials break down and components begin to fail. Maximum PC has an exclusive hands-on, first look at the new Hardcore Computer Reactor, the first oil-cooled PC available for sale. Hardcore engineered the Reactor to withstand the oil, using space-age materials and proprietary oil. The Reactor's custom-manufactured motherboard, videocards, memory, and SSD drives are submersed in the oil, while the dry components sit outside the bulletproof tank. The motherboard lifts out of the oil bath on rails, giving you relatively easy access to components, and the overall design is simply jaw-dropping. Of course, we'd expect nothing less for a machine with a base price of $4000 that goes all the way up to $11k for a fully maxed out config."
Portables (Apple)

Users Rage Over Missing FireWire On New MacBooks 820

CWmike writes "Apple customers, unhappy that the company dropped FireWire from its new MacBook (not the Pro), are venting their frustrations on the company's support forum in hundreds of messages. Within minutes of Apple CEO Steve Jobs wrapping up a launch event in Cupertino, Calif., users started several threads to vent over the omission. 'Apple really screwed up with no FireWire port,' said Russ Tolman, who inaugurated a thread that by Thursday has collected more than 300 messages and been viewed over 8,000 times. 'No MacBook with [FireWire] — no new MacBook for me,' added Simon Meyer in a message posted yesterday. Several mentioned that FireWire's disappearance means that the new MacBooks could not be connected to other Macs using Target Disk Mode, and one noted that iMovie will have no way to connect to new MacBooks. Others pointed out that the previous-generation MacBook, which Apple is still selling at a reduced price of $999, includes a FireWire port. Apple introduced FireWire into its product lines in 1999 and championed the standard."
The Internet

Drop-Catching Domains Is Big Business 197

WebsiteMag brings us news from the Coalition Against Domain Name Abuse (CADNA) about a recent study of drop catching — 'a process whereby a domain that has expired is released into the pool of available names and is instantly re-registered by another party.' The eleven-day study showed that 100% of '.com' and '.net' domain names were immediately registered after they had been released. CADNA has published the results with their own analysis. Quoting: "The results also show that 87% of Dot-COM drop-catchers use the domain names for pay-per-click (PPC) sites. They have no interest in these domain names other than leveraging them to post PPC ads and turn a profit. Interestingly, only 67% of Dot-ORG drop-catchers use the domains they catch to post these sites — most likely because Dot-ORG names are harder to monetize due to the lack of type-in traffic and because they tend to be used for more legitimate purposes."
Linux Business

Where Linux Gained Ground in 2007 203

christian.einfeldt writes "Computer scientist and media maven Roy Schestowitz takes a look at platforms where GNU Linux gained the most ground in 2007. In a thorough review which is the first of a two-part series, Schestowitz looks at trends in supercomputers, mobile phones, desktops, low-end laptops and tablets, consoles, media players and set-top boxes. Schestowitz finds that GNU Linux solidified its dominant grip on supercomputers; made huge gains in low-end laptops and tablets; won major OEM and retail support on the desktop; gained new entries into game consoles; and also spawned new businesses in set-top boxes while holding its ground in pre-existing product lines. He sums it all up by saying that '2007 will be remembered as the year when GNU/Linux became not only available, but also properly preinstalled on desktops and laptops by the world's largest companies.'"
Privacy

Submission + - Japan to Fingerprint, Photograph all Foreigners (theage.com.au)

MochaMan writes: "As of this Tuesday, November 20th, Japan will be requiring mandatory fingerprinting and mug shots of all foreigners entering the country, making it one of only two countries in the world to do so. The program goes further than the US program in that it also applies to visa-holders and permanent residents. The prints will be stored and shared with other governments. The Japanese government has produced an explanatory video, and even a promotional PDF poster. Japanese and international civil rights groups have raised concerns that the practice is both an invasion of privacy and discriminatory. An online petition to abolish the program is available. Is the age of privacy over?"
Biotech

Purpose of Appendix Believed Found 235

CambodiaSam sent in this story, which opens: "Some scientists think they have figured out the real job of the troublesome and seemingly useless appendix: It produces and protects good germs for your gut. That's the theory from surgeons and immunologists at Duke University Medical School, published online in a scientific journal this week. For generations the appendix has been dismissed as superfluous. Doctors figured it had no function. Surgeons removed them routinely. People live fine without them. The function of the appendix seems related to the massive amount of bacteria populating the human digestive system, according to the study in the Journal of Theoretical Biology. There are more bacteria than human cells in the typical body. Most are good and help digest food. But sometimes the flora of bacteria in the intestines die or are purged. Diseases such as cholera or amoebic dysentery would clear the gut of useful bacteria. The appendix's job is to reboot the digestive system in that case."

RealPlayer 11 Is a Real Rip Contender 226

rishimathew writes to tell us TechNewsWorld is reporting that the new RealPlayer 11, not even out of beta yet, has a lot of great new features, including the ability to easily rip streaming videos from sites like YouTube, Revver, and Heavy.com. "With the release of RealPlayer 11, the company is boldly moving into another dicey realm: ripping streaming video. Sure, there are lots of means out there to capture video from sites like YouTube, Revver, Heavy.com and such. There are programs like WM Recorder (US$49.95) and Replay A/V ($49.95), as well as Web sites like Keepvid.com and Firefox add-ons like VideoDownloader. I've tried some of them. Few, though, can match the slick ease of use of RealPlayer 11 -- and it isn't even out of beta yet."
First Person Shooters (Games)

Submission + - UT3 on Linux or Mac Anyone?

Space-Nut writes: It is well known that the developers of the Unreal Tournament series have tried hard in the past to make their games available for Linux and Macs. With Unreal Tournament 3 due for release sometime soon, Mark Rein said in a thread about DX10 back in May: "All this means is that UT3 will support DX10 — it does NOT mean that DX10 is required! We expect the vast majority of our users will be Windows XP / DX9 users. We will also support Mac and Linux as per usual." Is anyone else really excited about UT3, and will you support Epic in providing a Linux and Mac version by buying the game?
Hardware Hacking

Is Your GPS Naive? 291

mi writes "Many GPS devices today will try to scan the FM bands for traffic advisories in the area to display on their screens. The signals, however, are neither authenticated nor encrypted, and one can — with commonly available electronics — construct a device to broadcast bogus advisories. Possible codes range from "bullfight ahead" to "terrorist attack"..."
