
Comment Re:Science (Score 2) 60

Yes, more transceivers are better than fewer, thank you MIT.

But only if they're really tightly synchronized.

MIT got them to be tightly synchronized despite being in different boxes in different rooms, rather than all being in the same box, WITHOUT a lot of extra, extra-special, extra-fancy, extra-cost hardware. This can be built with a bit more off-the-shelf stuff (maybe the SAME amount of the same off-the-shelf stuff but with a bit better firmware) and easily folded into the next generation's chips.

Comment Re:Not handy for the home (Score 1) 60

Since they are talking about many devices connecting to multiple routers it's not going to do much for the average home user then. I may have a couple of devices but only the one router.

  - If you got a second router, put it some distance away from the first, and hooked them together with a network cable, you could use two devices about as fast as you could one with one router.
  - If you had three wired routers you could use three devices close to as fast as you could use one with one router.
And so on.

Note that I'm not talking about using the devices with each near a particular router. I'm talking about the routers spread out around the room or the house and the devices also somewhat spread out - but differently (even just at different spots in the same room) and with no particular relation between the device and the router locations.
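To put rough numbers on the scaling I'm describing (my own illustrative figures, not anything from the article), the idea is that N cooperating routers can carry about N independent spatial streams, so aggregate throughput grows roughly with the number of routers until you run out of devices:

```python
# Back-of-envelope sketch (illustrative numbers, not measured): with
# cooperating routers acting as one big distributed antenna array, each
# extra router adds roughly one more full-rate spatial stream.
def aggregate_throughput(n_routers, n_devices, per_link_mbps=300.0):
    """Idealized distributed-MIMO capacity: one full-rate stream per
    router, but no more streams than there are active devices."""
    streams = min(n_routers, n_devices)
    return streams * per_link_mbps

# One router, two devices: they split a single stream's worth of air time.
# Two wired-together routers, two devices: each device gets ~full rate.
print(aggregate_throughput(1, 2))  # 300.0
print(aggregate_throughput(2, 2))  # 600.0
print(aggregate_throughput(3, 3))  # 900.0
```

The `per_link_mbps` figure is just a placeholder; the point is the near-linear scaling, not the absolute rate.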

Comment Re:We're not in a minimum yet. [Re:Of course. . .] (Score 2) 280

There is some possibility that the sun may, at some time in the future, enter another sunspot minimum similar to the Maunder minimum of 1645 to about 1715. But we're not in one now.

Actually, there was a recent development in modelling the sun, which (if I recall correctly) resulted in a model of the sunspot cycle that has a high-90s percentage match to the historical data. (The key was to model it as TWO dynamos rather than one.)

Also (again, if I recall correctly) the new model predicted that we were going into something that looked like a new Maunder Minimum, with this cycle being weak and the next one nearly nonexistent.

(Sorry I can't dig up the reference right now. Only got a couple minutes left to post.)

Combine that with orbital forcing (which has been gradually, but progressively more steeply, pushing us toward another BIG ice age since about the time humans started using agriculture and settled down to dig up stuff, including coal), and the expected exhaustion of practically-extractable fossil carbon reserves in something like four more centuries, and warming might not be our long-range climate-change issue at all.

A Maunder minimum might only cover a half-century or so. But if it brought on another "little ice age", that (at about three centuries duration) might be about right to cover the period before global freezing is more of a concern than global warming.

Comment Re:Nah (Score 1) 171

On the other hand, an electric motor can easily produce its maximum torque at stall.

Then drop off like a cliff.

Not necessarily. You're thinking of older, more basic motor designs connected directly to a supply (such as a series-wound motor), not modern electrical machines with winding currents controlled by switching regulators.

Torque is proportional to the product of the stator and rotor magnetic fields, which in turn (for wound magnets) are proportional to current.

In a simple motor the current is limited by the fixed voltage applied across the winding resistance, which drops as the machine speeds up due to back-EMF generated by the motor's motion.

In a switching regulator controlled winding the resistance is very low (to reduce I-squared-R losses) and the current is controlled by the switching regulator. The current at stall is potentially astronomical as a result, limited by the regulator's dwell time, not the raw supply voltage. As the motor speeds up the current (and thus the torque) can be maintained at a desired (and high) value despite the rising back-EMF, up to an RPM and back-EMF where the switch would have to be on full-time (or full half-cycle time for AC-excited windings) to push the desired current through the winding resistance.
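Here's a numeric sketch of the contrast (illustrative component values I've assumed, not any particular motor): the direct-connected winding's current is (V − K·rpm)/R, which is huge at stall and collapses with speed, while a current-mode controller simply clamps to a target until the back-EMF eats the headroom:

```python
# Hedged sketch with assumed values: 48 V supply, 0.05 ohm winding
# (low, to cut I-squared-R loss), back-EMF constant 0.04 V per RPM.
V = 48.0   # supply volts
R = 0.05   # winding resistance, ohms
K = 0.04   # back-EMF constant, volts per RPM

def simple_motor_current(rpm):
    """Fixed voltage straight across the winding: current set by the
    headroom left after back-EMF, so it falls off linearly with speed."""
    return max(0.0, (V - K * rpm) / R)

def controlled_current(rpm, i_target=200.0):
    """Switching regulator: hold a chosen current (hence torque) flat
    until the available headroom can no longer push it through R."""
    return min(i_target, simple_motor_current(rpm))

for rpm in (0, 300, 600, 900, 1100):
    print(rpm, round(simple_motor_current(rpm)), round(controlled_current(rpm)))
```

At stall the uncontrolled current is nearly 1000 A (the "potentially astronomical" figure), while the controlled winding holds a steady 200 A out to roughly 950 RPM before torque finally rolls off.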

Comment Re:Nah (Score 1) 171

Porsche 918 Spyder is 0-60 in 2.3s. Elon has a ways to go still.

On the other hand, an electric motor can easily produce its maximum torque at stall.

An electric car, with adequately sized motors, controllers, and batteries (or other power sources) should be able to drive the tires to the traction limit from a standing start to the speed where the available power will no longer sustain that level of acceleration - well over 60 MPH. This means the acceleration is limited solely by the coefficient of friction of the tire/road contact surface - a critical parameter that can be tightly tracked, during acceleration, by drive electronics akin to non-skid brake controllers.
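The friction-limited case reduces to trivial arithmetic (tire grip values below are my assumptions, not any real car's spec): if acceleration is capped at a = μ·g, the 0-60 time is just t = v/(μ·g), regardless of powertrain:

```python
# Quick arithmetic check with assumed grip coefficients:
# traction-limited 0-60 time t = v / (mu * g).
G = 9.81      # m/s^2
V60 = 26.82   # 60 mph in m/s

def traction_limited_0_60(mu):
    """Seconds to 60 mph if the tires, not the motor, are the limit."""
    return V60 / (mu * G)

print(round(traction_limited_0_60(1.0), 2))  # ordinary street-tire grip
print(round(traction_limited_0_60(1.3), 2))  # sticky performance rubber
```

With μ around 1.0 you land near 2.7 s, and sticky rubber at μ ≈ 1.3 gets you near 2.1 s, which is why a traction-limited electric drivetrain is competitive with anything on the same tires.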

So an electric car should be able to get the best possible standing-start rating out of any given tire technology - and be literally unbeatable in such a contest.

IMHO the only reason (pre-Tesla) electric cars had a reputation for being underpowered creampuffs rather than unbeatable sprint sports cars, is that the automobile manufacturers thought the purchasers would all be eco-freaks, more interested in mileage and ideology than performance, and designed lower-manufacturing-cost, underpowered, cars for this market.

Comment Grey Goo Limit (Score 1) 148

I recall a joke scenario from a couple years ago:

Earth is in the throes of a Nanotech Grey Goo scenario. The microscopic self-replicating robots have converted about half the planet to more of themselves. And then they stop. The few surviving humans, observing from space, are puzzled.

Zoom in. Thought balloon from the mass of Grey Goo: "Damn! We shouldn't have stuck with IPv6. We've run out of addresses!"
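The punchline actually checks out arithmetically (using the rough, commonly quoted figure of ~1e50 atoms in the Earth, which is my assumption here): half a planet's worth of atom-scale bots would overrun the 2^128 IPv6 address space by a dozen orders of magnitude:

```python
# Sanity check on the joke, with an assumed order-of-magnitude figure
# for the number of atoms in the Earth (~1e50).
ipv6_addresses = 2 ** 128        # about 3.4e38 addresses
earth_atoms = 10 ** 50           # rough order of magnitude, assumed
goo_bots = earth_atoms // 2      # half the planet, one bot per atom

print(goo_bots > ipv6_addresses)  # True: the goo really would run out
```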

Comment Re:Stealth (Score 1) 117

... airframes still can't match pilots. An aircraft on a mission may need to execute some spectacular maneuvers, and the pilot can often survive quite well, especially with active flight suits. However, the airframe is still damaged by the maneuver, and might not be usable again.

Which just means that they didn't throw extra weight and strength (a constant cost) into ensuring that the meat-sack-carrying vehicle would take no damage in ANY extreme, momentary, corner case that the meat-sack COULD survive.

Remove the meat-sack-guidance-computer, its support systems, emergency ejection systems, and that big space in the middle for it, and the design potential is drastically altered.

So there's no conflict between your point and the predecessor's claim.

Comment There are plenty of job ADS. (Score 5, Insightful) 332

There are plenty of jobs for [this, that, and the other thing]

There are plenty of job ADS.

This is because, in order to hire an H1-B, the employer must first advertise the job to US persons.

But there are whole classes given on how to gimmick the hiring process so that anyone who applies, other than the desired H1-B, can be plausibly turned down as unqualified. The US applicants waste their time, and the H1-Bs get the positions.

Give us a call when there are plenty of HIRES of US citizens for these, or any, positions.

Comment If queueing could be fair this would be no issue. (Score 2) 193

Last time we had unlimited data plans, there were people who would tether hundreds of gigabytes a month ...

If, when the network became congested, the available bandwidth were fairly divided among the competing users, such usage would not be an issue. Everyone asking for less than their share would get all their data through at line rate, everyone asking for more would evenly divide the remainder. At times when the pipes were too clogged to handle it all, the "data hogs" would get the same data rate as everyone else trying to use the "Information Superhighway". They wouldn't degrade the other users' experience any more than any other user's traffic did. (It's just like the way a driver who likes to cruise flat-out at night doesn't end up going any faster than the rest of the traffic at rush hour.)
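What I'm describing is essentially textbook max-min fair allocation; a minimal sketch (my own toy implementation, not router firmware) shows how a "data hog" is automatically capped at everyone else's share:

```python
# Minimal max-min fair-share sketch: users demanding less than an equal
# share get it in full; the leftover capacity is split evenly among the
# rest, repeating until no one left is below the current share.
def max_min_fair(demands, capacity):
    alloc = [0.0] * len(demands)
    pending = list(range(len(demands)))
    while pending and capacity > 1e-12:
        share = capacity / len(pending)
        satisfied = [i for i in pending if demands[i] <= share]
        if not satisfied:
            for i in pending:          # everyone left is a "hog":
                alloc[i] = share       # they split the rest evenly
            return alloc
        for i in satisfied:
            alloc[i] = float(demands[i])
            capacity -= demands[i]
            pending.remove(i)
    return alloc

# The user asking for 100 units gets no more than the other heavy user.
print(max_min_fair([2, 8, 100], capacity=12))  # [2.0, 5.0, 5.0]
```

Note the light user's 2 units go through untouched; only the two users over the fair share get throttled, and they get throttled equally.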

I used to wonder why it wasn't done that way. Then I got a job designing router chips, including the special-purpose coprocessors to handle bandwidth division.

It turns out that actually making fair division happen in real time requires enormous amounts of sideways communication between the states of the (otherwise independent) throttling mechanisms for each user, flow, etc. It's much easier to preset the limits and only adjust them occasionally. But that means the "data hogs" either get throttled or, when rush hour comes and they're still trying to pump lots of data, they clog the pipes. So the ISPs identify customers who use a lot of data in off hours and turn down their limits, to keep them from degrading things for everybody else. It's not good. It's not fair to those who are just trying to use the service that was advertised, or those who carefully do their data-hogging on off hours only. But it's about the best ISPs can do with the available tools.
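The "preset limits" that ISPs settle for are typically per-user token buckets; a sketch (my own toy version, not any vendor's implementation) makes the trade-off concrete — no cross-user state needed, but the heavy user gets throttled even when the pipes are empty:

```python
# Per-user token bucket sketch: the classic "preset limit" mechanism.
# Needs no communication between users' throttles, which is why it's
# cheap in hardware, but it can't tell congestion from idle capacity.
class TokenBucket:
    def __init__(self, rate, burst):
        self.rate = rate      # tokens (bytes) refilled per second
        self.burst = burst    # bucket depth: max saved-up allowance
        self.tokens = burst
        self.last = 0.0

    def allow(self, now, nbytes):
        # Refill for the elapsed time, capped at the bucket depth.
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False

bucket = TokenBucket(rate=1000, burst=2000)
print(bucket.allow(0.0, 1500))  # True: burst allowance covers it
print(bucket.allow(0.1, 1500))  # False: only ~600 tokens left
print(bucket.allow(2.1, 1500))  # True: bucket refilled meanwhile
```

The second request is refused even though the link might be sitting idle; that blind spot is exactly the unfairness I was complaining about.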

I was starting to look into practical ways to "do it right". But the network equipment company downsized me before I'd gotten rolling on it. Now I'm fully employed doing other stuff. So somebody else will have to figure this out, and get it designed and deployed in a future equipment generation, or we'll keep having this problem.

Comment Re:Reading is faster (Score 1) 290

"...such as a key member who doesn't do process text well..."

I read that a dozen times wondering if I was one of those members...

Nah. More like I'm one of those people who doesn't final-edit text well before hitting "submit" when the boss appears over my shoulder, while I'm mid-post during a coffee break, with an emergency that needs immediate handling. B-b
