
Comment Re: It's not the Earth's fault (Score 1) 291

This is actually how we do it now, except the chalk line is measured by looking at the angular positions of various celestial bodies. This measurement determines the length of a sidereal year. We have been able to make it fairly accurately for the last ~60 years, and extremely accurately for the last 50 or so, enough to know that our planet's rotation has slightly slowed during that time. But what we don't know is exactly how long a sidereal year was, say, 100 million years ago. Perhaps the Earth used to spin around 366 times during its trip around the sun instead of the current 365.25? Its mass and orbital period also change enough on a geologic timescale to affect this. These are problems we know about, but they are difficult to solve because we just don't have the data.
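As a rough back-of-the-envelope illustration (my numbers, not measured data): if the orbital period stayed about the same in absolute terms, fitting 366 rotations into one orbit instead of 365.25 only requires each day to be a few minutes shorter.

    # Rough arithmetic sketch: how much shorter a day would have to be for
    # the Earth to rotate 366 times per orbit instead of 365.25.
    # Assumes the orbital period itself is unchanged -- illustration only.
    SECONDS_PER_DAY = 86_400
    year_seconds = 365.25 * SECONDS_PER_DAY   # ~31,557,600 s per orbit

    day_now = year_seconds / 365.25           # 86,400 s by construction
    day_then = year_seconds / 366             # hypothetical shorter day

    print(f"hypothetical day length: {day_then:.0f} s")
    print(f"difference: {day_now - day_then:.0f} s "
          f"(~{(day_now - day_then) / 60:.1f} minutes shorter)")
    # -> roughly 177 s, i.e. a day only about 3 minutes shorter than today's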

Comment Re: It's not the Earth's fault (Score 1) 291

This is not necessarily true. It largely depends on how the rotation of the Earth might change over the next hundreds of thousands of years. We have only been running with leap seconds for a bit over 30 years, and we have only had the ability to measure the orbital period accurately enough to worry about seconds for about 100-150 years. Just because we have always "leapt forward" under the current system doesn't mean we can't also leap backward. There is simply not enough collected data to know how far "off" our definition of the second is with respect to the history of the Earth, nor how much "jitter" we are likely to experience with an unadjusted clock. It's entirely possible that the error would never accumulate enough to be a big societal issue. If we are able to determine the average length of a year over a large time span more accurately, it's quite probable that the easiest fix would simply be to redefine the second.
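For a sense of scale (assuming leap seconds keep being needed at roughly one per 18 months, which is my guess based on recent decades, not a prediction), an unadjusted clock drifts very slowly:

    # Rough drift estimate for an unadjusted civil clock.
    # Assumption (mine): leap seconds keep being needed at roughly
    # one per 18 months, as they have been in recent decades.
    leap_seconds_per_year = 1 / 1.5
    years_to_drift_minute = 60 / leap_seconds_per_year     # -> 90 years
    years_to_drift_hour = 3600 / leap_seconds_per_year     # -> 5400 years
    print(years_to_drift_minute, years_to_drift_hour)

At that rate, nobody alive would notice a minute of drift, which is why "just leave the clock alone" is a defensible position.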

Submission + - Congressional Black Caucus Begs Apple for its 'Trade Secret' Racial Data

theodp writes: In Silicon Valley this week with other members of the Congressional Black Caucus to turn up the heat on the tech industry to hire more African Americans, Rep. Barbara Lee called on Apple and other holdouts among the nation's tech companies to release federal data on the diversity of their work forces. "If they believe in inclusion," said Lee, "they have to release the data so the public knows that they are being transparent and that they are committed to doing the right thing." Apple has refused to make public the EEO-1 data that it routinely supplies to the U.S. Dept. of Labor on the demographics of its workers. In the absence of the race and gender data, which Apple and others historically argued were 'trade secrets' not subject to release under Freedom of Information requests, tech companies were free to make unchecked claims about their Black employee ranks (e.g., Google's 2007 Congressional testimony) until recent disclosures revealed otherwise, and the National Science Foundation was even convinced to redirect NSF grant money specifically earmarked for getting African American boys into the computer science pipeline to a PR campaign for high school girls of all colors and economic backgrounds.

Comment Re:Slashdot you are no better (Score 1) 474

Oh, just you wait, it will eventually be subjected to the slow burial. I guess I should not have said 'censored' since, well, the editors weren't actually removing content, right? They were instead being extremely selective about which articles got through and how that would affect the context of the story. What SF does is not right; it's contrary to the entire purpose of the site. The Chinese blocked it ages ago. Maybe their Great Firewall deserves more credit than we give it /sarcasm

Comment Slashdot you are no better (Score 3, Informative) 474

I have been around here a long time.

I can honestly say that I am disappointed to see /. post gloating over a row brewing on another community site while at the same time censoring discussion and posts related to the recent and ongoing Sourceforge controversy. Choosing which subs stay and which go is going to upset a small but vocal set of users. They would be stupid not to know this.

In the case of Sourceforge, I think it's much worse to sell out and betray the trust of an entire community. But let's not talk about it!

Comment JBIG2 (Score 1) 290

I haven't even read this article and I know the culprit exactly: JBIG2.

The compression algorithm operates on binary (two-color) images and has two modes: a lossless mode, which is sort of like the love child of RLE and JPEG, and a higher-compression mode, which runs the lossless blocks through a comparison routine and discards any block that is sufficiently similar to an earlier one, replacing it with a reference to the first copy. It's actually a good algorithm, but you have to understand how it works to implement it properly. When you have a perfect storm of certain fonts (especially small ones where a glyph can fit entirely inside a block), some noise in the bitonal images, and a compression threshold set too aggressively, you can get some real zingers: 9, 6, 0, 3, and 8 can all easily get muddled up, not to mention what happens to letters like e, o, and c. The key to the whole thing is having good algorithms that can produce quality bitonal images from poor originals, and scanning at sufficient resolution (or lowering the compression threshold enough) that a block cannot hold an entire glyph.
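A minimal sketch of that substitution idea (my own toy illustration, not the actual JBIG2 codec): each block is compared against previously stored blocks, and anything within the similarity threshold is replaced by a reference to the earlier copy, which is exactly how a noisy "6" can come back out as an "8".

    # Toy illustration of lossy pattern-matching substitution (not real JBIG2).
    # Each "block" is a small bitonal glyph (a flat sequence of 0/1 pixels);
    # blocks similar enough to an earlier block are replaced by a reference.

    def similarity(a, b):
        """Fraction of pixels that match between two equal-sized bitonal blocks."""
        return sum(1 for x, y in zip(a, b) if x == y) / len(a)

    def encode(blocks, min_similarity=0.95):
        dictionary = []   # unique blocks stored verbatim
        encoded = []      # ("ref", i) reuses dictionary[i]; ("new", i) stores a block
        for block in blocks:
            for i, seen in enumerate(dictionary):
                if similarity(block, seen) >= min_similarity:
                    encoded.append(("ref", i))    # lossy: replaced by a lookalike
                    break
            else:
                dictionary.append(block)
                encoded.append(("new", len(dictionary) - 1))
        return dictionary, encoded

    # If min_similarity is set too permissively (too low), a noisy "6" and an
    # "8" can fall within tolerance, and every later "6" decodes as the stored
    # "8" -- crisp-looking output with silently wrong digits.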

Why the copier is using the lossy mode of JBIG2 internally is a mystery, especially in the "copy" pipeline. I can think of no good reason it should use anything other than the lossless mode or uncompressed data.

Comment Re:Interesting (Score 2) 239

I'm curious why you like that interview so much; reading it is when I realized that that dude is nothing more than a talking head. Why do you think he needed the Internet to do commentary for Iron Chef America? IIRC, he not only got a question about cooking with lava completely wrong, but he insulted the person asking it as a way to avoid answering. When Google failed him, he just bailed.

Comment Re:IPV6 on AT&T Residential DSL (Score 1) 155

AT&T is not issuing you an IPv6 address on your residential DSL; they simply don't offer it. Your computer is generating an IPv6 link-local address on its own. Depending on your router and a couple of other factors, you may well be able to access IPv6 sites through a public 6to4 gateway.
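A quick way to check which case you're in (a sketch using Python's standard ipaddress module; the sample addresses are made up): link-local addresses live in fe80::/10 and never leave your LAN, while 6to4 addresses live in 2002::/16 and embed your public IPv4 address.

    import ipaddress

    def classify(addr_str):
        """Rough classification of an IPv6 address (illustrative only)."""
        addr = ipaddress.IPv6Address(addr_str)
        if addr.is_link_local:
            return "link-local (fe80::/10) - auto-generated, not routable"
        if addr in ipaddress.ip_network("2002::/16"):
            # 6to4: bits 16-47 of the address hold the public IPv4 address
            v4 = ipaddress.IPv4Address((int(addr) >> 80) & 0xFFFFFFFF)
            return f"6to4 (2002::/16), tunneled via IPv4 {v4}"
        if addr.is_global:
            return "native global unicast"
        return "other"

    print(classify("fe80::1c2f:9aff:fe3b:7d01"))   # link-local
    print(classify("2002:c000:0204::1"))           # 6to4 for 192.0.2.4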

The only advantage to you is that you at least have the ability to reach Internet resources that are only available via IPv6, but at the moment I would imagine there aren't any that are particularly relevant to you.

Comment Overblown (Score 1) 205

$1MM of iPads represents about 2,500-3,000 users, depending on the discount they received. First, I'm presuming that these users already had mailboxes and it's just the additional load of ActiveSync that is causing the trouble. If that's the case, with the types of discounts that government and education receive from Microsoft and hardware vendors, this is maybe a $15,000 problem at most. In the scope of a million-dollar project, a 1.5% budget overrun represents poor planning, but I've seen much, much worse.
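Roughly how I'd get to those numbers (the per-unit prices here are my guesses, not anything from TFA):

    # Back-of-the-envelope figures; unit prices are assumptions, not from TFA.
    ipad_budget = 1_000_000
    price_per_ipad = 350              # assumed discounted education price
    users = ipad_budget / price_per_ipad            # ~2,850 users

    extra_cost_per_user = 5           # assumed incremental Exchange/CAL cost
    fix_cost = users * extra_cost_per_user          # ~$14,300

    print(f"users: {users:.0f}, fix: ${fix_cost:,.0f}, "
          f"share of project: {fix_cost / ipad_budget:.1%}")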

Comment Passive cooled GPU (Score 1) 205

Don't use high-end GTX cards; twice as many lower-end, passively cooled GPU cards will provide more than equivalent performance at far lower cost and with a far lower failure rate. If your application really benefits more from additional threads than from single-thread execution speed, this is the way to go. Most GPGPU clusters that aren't built using Tegra take this approach.

Comment Re:But weren't they on anyway? (Score 1) 621

Wait, you really don't believe this? I have a Kill A Watt and can assure you he speaks the truth. I don't have a ridiculously high-end computer, but I can get its power consumption to vary by more than 300W between it sitting idle with the LCDs in DPMS power save and me actively pushing the CPU and GPU with something. Putting it into S3 suspend knocks off another 50W or so.

Now, 10-year-old hardware pushing a 40W delta between unloaded and loaded, not counting a CRT going to sleep or something? Doubtful. Maybe 15-20W tops. But then again, some of that school district's hardware was much newer, and $0.06/kWh is a pretty decent utility rate too. I'd say $1 million is a pretty good round number here, even though it probably represents a modest 10-20% increase over what the bill would have been had the 5,000 machines simply been left on and idle. But consider: if this guy, who had enough control to install software on 5,000 machines, had simply set them to go to S3 after a couple of hours of inactivity, he could have saved the school district millions on power just as easily as he wasted it.
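The arithmetic is easy to sanity-check (the wattage delta and time span below are my assumptions, just to show the shape of the calculation):

    # Sanity check on the ~$1M figure. The load delta and the number of years
    # the software ran are assumed; only the fleet size and rate come from above.
    machines = 5000
    extra_watts = 100            # assumed average idle-vs-loaded delta (newer boxes)
    hours_per_year = 24 * 365
    years = 4                    # assumed period the software was running
    rate = 0.06                  # $/kWh, the utility rate mentioned above

    kwh = machines * extra_watts * hours_per_year * years / 1000
    print(f"~${kwh * rate:,.0f}")    # ~$1,050,000 with these assumptions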

Comment Re:Standard Calculus (Score 2, Informative) 369

The real answer here is that it depends a great deal on the GPS receiver itself, and then on how whatever software is reporting and logging this information post-processes it.

GPS itself is capable of reporting an instantaneous velocity vector calculated by measuring the Doppler shift from each satellite (it comes in as a GPVTG sentence in the NMEA data). So if the receiver is tracking a lot of satellites with a good distribution and there aren't a lot of multipath problems, the accuracy of this vector is ridiculously good. A receiver may not support GPVTG, though.

You can also get velocity data from a GPRMC (i.e., normal position data) sentence. According to the specification, the bearing there is supposed to be calculated from the position track angle (presumably so that you don't have to be moving to have a GPS bearing). The spec seems silent on the origin of the speed reported in this sentence; it could be calculated as track speed (average speed over the interval), but it could just as easily be reported as instantaneous speed.

Of course, I haven't tested any of this, but I imagine that in practice GPS receivers would normally report track/position-averaged data in GPRMC and instantaneous data in GPVTG. Any software that presents this data to a user has to decide how to aggregate and filter it for its intended purpose. If you really intend to beat a speeding ticket with GPS, I would suggest you need data points of either type (instantaneous or averaged) at 1Hz if not 5Hz granularity, along with knowledge of what the data represents and how the raw data is filtered and processed. The 30-second interval in this case is just dumb, and it seems nobody ever bothered to determine anything about the nature of the data.
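If you want to see what your own receiver reports, pulling the speed fields out of the raw NMEA stream is trivial (a quick sketch; the sample sentences are made up and checksums are not validated):

    # Extracts speed-over-ground from raw NMEA sentences. Sample data is
    # illustrative and checksums are ignored -- this just shows the field layout.

    def speed_knots(sentence):
        fields = sentence.split(",")
        if fields[0].endswith("RMC"):   # $GPRMC: field 7 is speed over ground (knots)
            return float(fields[7])
        if fields[0].endswith("VTG"):   # $GPVTG: field 5 is speed over ground (knots)
            return float(fields[5])
        return None

    rmc = "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A"
    vtg = "$GPVTG,084.4,T,077.8,M,022.4,N,041.5,K*43"

    for s in (rmc, vtg):
        print(s.split(",")[0], speed_knots(s), "knots")

Log that at 1-5Hz for the whole drive, note which sentence it came from, and you actually have something to argue with.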
