Comment Except we wouldn't be here now. (Score 1) 768

We could probably survive without #3 though.

#3 was not just about letting the cops use your house for staking out your neighbors or the army using your home for a free bed-and-breakfast.

It was to keep them from planting a spy in your house, to report on all your activities.

IMHO it is even more apropos now than back when the "quartered troops" were redcoats. Now they're spyware or hardware keyloggers planted in your computer, or racks of tapping equipment in server rooms, as with "Study Group 3" or PRISM.

We just need a supreme court decision that these automated agents, located on people's or companies' premises, consuming their space and resources, and spying on their activities, are "quartered troops" within the meaning of the Third Amendment for it to become as important in the electronic legal landscape as the First, Fourth, and Fifth are in meatspace.

Comment Re:Captain Crunch!!! (Score 1) 64

According to Lapsley's account Draper just tagged along with the real hackers.

I knew him in those days. He really was quite innovative.

But also quite talkative. I have amazed others who knew him when I describe how, during a time he was staying at my place, I actually got him to shut up for over a minute in the middle of a technical discussion.

Of course I did it by showing him something with a phone that he didn't think was possible. (He then shut up while he worked out some of the ramifications.)

Comment Re:Millions of dollars of calls? (Score 1) 64

Were hackers really racking up millions of dollars of fraudulent calls, or was AT&T using the same inflated math that the BSA uses to calculate loss of revenue from piracy -- full retail prices, even though there may have been no loss of revenue or cost to the carrier?

To some extent it was the inflated math case. The retail rates on long distance service were set very high, to generate money that subsidized rural phone service (which ran at a loss, due to line length, but had to be provided as part of the deal that gave Bell their monopoly charter). The Phreaks mostly used the lines at off-peak hours, when the trunks they used would otherwise be idle.

Comment Actually, it made them money. (Score 1) 64

if the last slot was used by a hacker, there was one less slot for a paying customer. ... unless AT&T was building more capacity to support the hacked phone calls, then there was really no real cost to them (except maybe termination charges for international calls)

But the network traffic, like power consumption, varied a lot with time-of-day, and the network had to be sized to handle the peaks. The phone phreaks usually did their deeds at off-peak hours.

Even if they DID have to install extra equipment, that just meant they made MORE money. The arrangement that granted their monopoly, in return for providing universal service, let them (in cooperation with the regulatory bodies) set prices so they received a guaranteed rate of return. The more they spent, the more profit they made. So as long as the phreaks weren't disrupting things too badly they weren't a financial drag.
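The cost-plus arithmetic described above can be sketched in a few lines. (The ~6% figure is the one quoted in the Bell Labs anecdote below; the function itself is an illustration, not a model of actual tariff-setting.)

```python
# Sketch of regulated rate-of-return ("cost-plus") pricing.
GUARANTEED_RETURN = 0.06  # the "dollar and six cents" figure, assumed

def revenue_from_spending(dollars_spent):
    # Under the regulatory arrangement, prices are set so each
    # allowable dollar spent comes back with the guaranteed
    # return on top -- so more spending means more profit.
    return dollars_spent * (1 + GUARANTEED_RETURN)

print(revenue_from_spending(1.00))  # 1.06
```

Under that incentive, extra switching equipment installed to carry phreak traffic is just more rate base earning the guaranteed return.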

That, by the way, is apparently the genesis of Bell Labs. As long as they spent money on something plausibly related to improving telephone service, every dollar they spent brought in a dollar and six cents or so. So Bell hired a lot of smart people, gave them equipment, and told them to go to it (and just publish a couple articles a year in the company journal). (Financially, though, it was a "failure": Chartered to lose money, it actually made money, even in its first year, by licensing the technology it developed.)

Comment Sounds like it's still "all pixels" (Score 3, Interesting) 240

Each application does its own rendering? 31-bit pixel counter?

This sounds like it's all pixels, like X, rather than geometry, like NeWS or Display PostScript.

So if I have monitors with high resolution I still have to tell all the applications to change their size, individually, or use a microscope to read the text, right?

If I stretch a window (intending to scale it, rather than just see more of what it shows) it has to go back to the application for re-rendering, right?

And if I have adjacent monitors with different resolutions they won't match up. Heaven help me if I lay a window across the boundary between two, the T between 3, or the + between four. Right?

Or have I missed something?

Comment Honest mapping explains two of the three artifacts (Score 1) 304

[H]is observations might have been the result of standardizing the test scores... i.e. if you have a test that only scores 50 max and you scale it to 100, obviously you aren't going to have many odd numbers in the results.

He points out that in some of the tests all scores of 94-100 inclusive were obtained, so it's not a case of leaving out odds or a regularly-spaced set of numbers based on a simple scaling up/down.

If you have a maximum score of 53 you might choose a mapping function like this:
    (rawscore < 48) ? (rawscore * 2) : (rawscore + 47)
That gives you a non-linear mapping with the slope cut in half for a small interval on the right side. The "can get steps of one and two" at the top means nothing about what you can get below the knee when the mapping is non-linear.

Similar mappings can end up with both ends smooth and only the middle spiky.

Why do that? So you only get ONE discontinuity in the data, near the top, rather than one point of roundoff noising up the spacing and comparisons between students all through it.

A skewed distribution is hardly surprising, especially when the bulk of the measurements are near one end of a finite numbering system. Further, the non-linear mapping above would make the downslope on the right hand side shallower by a 2:1 ratio, exactly what you see. A distribution skewed toward the high end also argues for using a mapping like the one above - to spread out the pile of high-scoring students and make differences in score less divergent from differences in percentile rank.
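A quick sketch of that mapping (the knee at raw score 48 and the offsets are the hypothetical numbers from the example above) shows both artifacts at once:

```python
def scale(rawscore):
    # Hypothetical mapping of a 53-point raw test onto a 100-point
    # scale: slope 2 below the knee at raw score 48, slope 1 above it.
    return rawscore * 2 if rawscore < 48 else rawscore + 47

# Below the knee only even scaled scores occur (steps of 2)...
below = sorted({scale(r) for r in range(48)})       # 0, 2, ..., 94
# ...while above it every value from 95 to 100 is reachable,
# so 94-100 inclusive can all appear in the published scores.
above = sorted({scale(r) for r in range(48, 54)})   # 95, 96, ..., 100
```

So "all of 94-100 appear" is perfectly compatible with a gappy, evens-only pattern lower down.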

The deficits just below passing scores and the spikes at them, however, are just bogus. The only "mapping" that can reasonably explain them is the "courtesy points" shoveling of just-failing students into just-passing. However, this can be explained as mercy being built into the mapping. (It can also be explained as protecting just-passing students from being unfairly pushed into the just-failing region due to a center-spreading, hump-flattening, non-linear mapping applied as a convenience for admissions officers.) The total absence of scores just below the fail point says it's not favoritism or individual corruption, but a systematic benefit given to all just-failing students.

Comment Re:Economic collapse. (Score 1) 209

What makes you think a declining dollar is GOOD for us (except maybe for the government)?

  - It sucks the value out of anything you own that is denominated in dollars. That includes your savings, bonds, and contracted payments (insurance, pensions, wages/salaries).
  - It also sucks the value out of your investments that AREN'T denominated in dollars: When you finally liquidate them, the government includes the inflation component of the sale price as increased profit or decreased loss, and taxes it.
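A toy calculation (all numbers assumed for illustration) of the second point -- how a purely inflationary price rise becomes a taxable "gain":

```python
# Sketch: taxing the inflation component of a nominal gain.
purchase = 1000.0
inflation = 0.03        # assumed 3%/year decline in the dollar
years = 10
cap_gains_rate = 0.20   # assumed tax rate on the nominal gain

# The asset merely keeps pace with inflation -- zero real gain:
sale = purchase * (1 + inflation) ** years
nominal_gain = sale - purchase        # ~344, all of it inflation
tax = nominal_gain * cap_gains_rate   # tax owed despite no real profit
print(round(tax, 2))  # 68.78
```

The seller is poorer in real terms by exactly that tax, even though the investment only held its value.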

In particular, inflation affects your wages or salary, which won't be inflated to compensate (and you'll fight just as hard for every cent of compensation you DO get as you would for a raise). It's the lower real payments to workers that CAUSE the "increased competitiveness" of the country's products.

Sure the country's businesses sell more - for less actual value. They could have done that by lowering the price, instead, if the maximum profit for them and their workers was at the lower price. They don't need central planners second-guessing them, tilting the whole playing field, and using it as an excuse to rip off more value from the private sector - which means YOU - for the government's coffers.

Comment Re:Thanks for the image. Now I'll have nightmares. (Score 1) 123

Change for the better is fine. Change for the worse is bad.
Change for the sake of change is usually the latter.

Some agile methods have their place. So did Smalltalk.
Others have been described, accurately, as "experiential
programming".

I have yet to see an agile methodology that I consider
has a place in designing and/or constructing a nuclear
reactor for use near a populated area - or even within
the Earth's biosphere.

To be fair:
  - I haven't studied them all.
  - Some of the components of agile are techniques I have used
      myself, before "agile" was coined, with considerable success.
  - There are a lot of NON-agile methods that are worse than
      even the worst of pre-agile.
I'm willing to be shown that, like best-effort networking with
flaky hardware, it's possible for agile-style methods to be
better at compensating for human error than other
formal methods.

But IMHO all the agile approaches I've seen cut too many
reliability corners for me to trust them on something as
massively life-critical as a reactor design.

Comment Re:Thanks for the image. Now I'll have nightmares. (Score 1) 123

I can tell you are old school to the max given your 40 column display you typed that out on. 80 columns is for newbs!

Nah. You're seeing the width of the comment box.
I like to hit a newline and control the spacing of the lines,
rather than have them re-wrap when the window is resized.

(Of course the first serious digital hardware device I designed
and built, single-handed, was a terminal. And it DID have less
than 80 columns. There were limits to how much horizontal
resolution you could push past the filters of a TV set if you
went in with RF, and back in the days of mostly 74xx small-scale
integration, the recirculating shift registers I could find for the
line cache came only in powers of 2 bits, not 80 or 40.)

NOW get off my lawn! B-)

Comment Economic collapse. (Score 1) 209

Whatever, it's not like it's going to start WW3... moving on.

If relations sour enough that China stops rolling over and buying more US T-bills and starts selling off its holdings, the collapse of the Dollar will drastically exacerbate the US economic collapse. This could easily lead to a WW III situation.

Comment Thanks for the image. Now I'll have nightmares. (Score 3, Funny) 123

Maybe they should be using the 'Agile' nuclear reactor construction methodology.

I've been programming professionally, as methodology fads
have come and gone. Among those I've encountered were the agile family and its precursors.

Much of that experience was in the auto industry, where
practically any software might end up being life-critical, and
some in telecom, where the reliability requirements are
tighter than mil spec.

My software is noted for robustness,
to the point that a colleague once remarked that I was the
only person he'd trust to program an artificial heart for him.
(Said colleague was one of the evangelists for an agile
precursor.)

The very thought of deploying a nuclear reactor designed
using an agile methodology makes me shiver. I expect to
have nightmares about the possibility for a while now.

Please DON'T mention this bright idea to the pointy-haired
bosses.

Comment I thought this was already solved. (Score 3, Interesting) 51

I was under the impression that the issue of translating LED light into a broad swath of color was an already solved problem (except for some fine-tuning optimization), using appropriately-sized nanoparticles which hand the energy from the photons around, slicing-and-recombining energy from photons into different sized packets and re-emitting the light at a frequency characteristic of the size of the nanoparticle. Cover the LED with a bunch of these in a range of sizes and you get a smooth spectrum.

Works the other way, too: Coat a solar cell with such particles and they take the random-frequency photons from the sun and slice them up into multiple new photons at a frequency good for the solar cell bandgap, and mash the leftovers into more big photons to re-slice to the correct size. (It's not 100%, since some of the photons get away. But it's more than a 2x improvement over a bare cell, which only takes one slice off each photon and throws the rest away.)

If this is correct, this project looks like just a fine-tuning of making the nanoparticles, or finding materials for them that are somewhat more efficient than what was already being used (which was pretty good).

I haven't been following this all THAT closely. Have I misunderstood the current stuff? Or is this just a little incremental tweak along the cutting edge?
