Comment Re:Some good ideas, overall mediocre. Horrible syn (Score 1) 48

I can't argue with "must have some penalty cost," other than to point out that benchmarks show this particular overhead to be insignificant.

Keep in mind that, back in the day, people argued against Display PostScript because it used floating point calculations rather than fixed point integers. There, too, benchmarks showed that processors without native floating point units paid a severe performance penalty for floating point calculations. Modern processors, however, can perform floating point math in the same number of cycles as integer math, and thus there is no performance difference at all. At least there's no difference in speed: there is certainly a difference in the fact that DPS (and other float-based graphics systems) perform better in terms of output quality.

I can't imagine anyone successfully making the argument today that we should avoid the use of floating point because it must have some penalty cost compared to integer calculations.

Another beauty of ObjC is that you can drop down to plain Standard C whenever performance becomes an issue. Thus, I've been writing applications in Objective C since 1992, and have never had issues with performance.

Comment Re:Some good ideas, overall mediocre. Horrible syn (Score 1) 48

Apple put the myth of poor performance to rest during a WWDC 1997 session.

Due to caching and the differences between PowerPC and Intel, it actually took fewer processor cycles to make a method call in Objective C on PowerPC than to make a C++ function call on Intel. That's because, at the time in the late nineties, calling a library function on an Intel processor required several segment override opcode prefixes, which cost many cycles.

Actual benchmarks and cycle counts show that Objective C does not have a performance hit.

Regarding language features, Java 2 borrowed interfaces from Objective C, long before Objective C 2.0 was dreamed up.

Comment Re:Of course (Score 1) 945

... but government regulating fluoridated water is a serious issue. It is an abuse of democracy for 51% of the voters to force certain choices upon the remainder, at least when they are innocent of any crime. Fluoride has a scientifically proven benefit, but ingestion is not necessary for that benefit. In fact, ingestion is arguably harmful. In other words, not everyone agrees when it comes to cost-benefit analysis.

The question here is whether we need government to protect us from something that isn't actually happening, and that could just as easily be handled by informed consumers fleeing any ISP who alters content negatively.

Personally, I wouldn't mind BitTorrent traffic being discriminated against based on protocol content rather than volume, at least not for low-cost, shared internet services where BitTorrent (and similar) traffic takes away from the available bandwidth. It should be consumer choice, not one-size-fits-all regulation, that decides which rules a particular ISP uses.

Tim Berners-Lee talks about the internet as if everyone gets to enjoy 100% of the bandwidth "volume" that they purchase. That was true when T1 lines and phone modems were the only choices. Now that we have more options, including shared bandwidth technologies such as DSL and Cable, it makes sense that some providers will shape traffic to favor one type of protocol versus another - i.e. based on content instead of volume - because you have to pay a lot more to get exclusive access to bandwidth.

This seems like a tragedy of the commons: as the internet becomes cheaper, people start to believe that it is a right, or that it can be managed by a central authority. Legislation could end up raising costs for everyone, or lowering performance - and the ISPs may never censor anyway.

Comment Re:Not arguing about Net Neutrality, but Reality (Score 1) 945

Agreed, SuperKendall.

My view is this: With all of the real problems we have, can't we just agree to keep an eye on the internet and solve the problems that are killing people for now?

Part two: All older forms of media are highly regulated by the government, and they're almost completely corrupt. Does nobody see the problem with jumping the gun to regulate the internet? Government rules will just accelerate the corruption of the internet, which is certainly the hidden agenda here.

Comment Re: Could be hard (Score 1) 536

Very good point. However, I would say that there is a difference between just "looking" at code versus doing a thorough review. Thorough reviews are expensive and rare, but when done properly they would reveal the purpose of every line of code. The trouble is that you can't really guarantee a good code review unless you do it yourself, or really trust the person or people doing the review.

OpenBSD seems to be one of the few groups that focuses specifically on this sort of thing.

In other words, I agree with the statement in the subject - "Could be hard" - but I would say that the sole distinguishing feature of OpenBSD is that they have always focused on exactly this kind of difficult code review / audit. If anyone is going to do it right, it's probably OpenBSD.

Comment Tim doesn't say how we can stop allowing the trend (Score 1) 108

Why does anyone think the government will do a good job of making our choices for us?

If large social networking sites are really a problem because they wall off data, then why are people on those sites? People should be able to recognize the problem for themselves and join other sites, ask that existing sites be changed, or develop new sites. If the government interferes, then everyone will be stuck with one kind of site. Democracy is not supposed to force everyone to use a site without walls just because 51% of the people don't want walls.

If wireless ISPs are selectively slowing traffic, then why aren't consumers savvy enough to complain, cancel their service, and move to another ISP? Are we really saying that consumers should be able to simultaneously choose the cheapest ISP *and* get the best service? If the government requires full speed for all data types and all possible connections, then everyone will have to pay for more capacity than they might otherwise. What about people who don't care about speed and would rather get cut-rate service? If 51% of the people want to force their definition of "service" on everyone else, is that really why we have a democracy?

Comment Re:Nonsense (Score 1) 283

I talked to my independent ISP about the backhaul, and the word is that it has always been neutral, with no threat of becoming non-neutral. My ISP actually has fiber optic links to at least three different backhaul companies, and that competition is generally what keeps them honest. At that level, the service is expensive but professional.

It's only when you get to the really cheap internet for grandma that you find shaping, and there you get what you pay for. When you say AT&T is your only choice, are you really sure that you looked everywhere? There was once a time when internet cost $100/mo or much more, and only businesses had it. At that level, the traffic is not shaped or capped. But you have to pay for it. Prices will have to go up if the government starts messing with the cheap internet.

Comment Re:Yes, and? (Score 1) 484

Except that such a limitation on donations means that third-party candidates cannot raise enough money to be heard over the incumbents. The Canadian system already has countless established parties, at least 4 of which are currently in federal offices, so C-24 would have a different outcome in a different country with a different party system.

A real improvement would be to stop Federal matching on campaign funds for the two major parties (where the Federal government donates $1 for every $1 raised by the party, and the money comes ultimately from the people whether they support the candidate or not), and also to stop the legal system from treating corporations as if they had individual rights. That might mean capping corporate donations at $0, but it should come with other reforms as well (such as holding board members and executives responsible individually for the consequences of their decisions).

Data Storage

Flash Destroyer Tests Limit of Solid State Storage 229

An anonymous reader writes "We all know that flash and other types of solid state storage can only endure a limited number of write cycles. The open source Flash Destroyer prototype explores that limit by writing and verifying a solid state storage chip until it dies. The total write-verify cycle count is shown on a display — watch a live video feed and guess when the first chip will die. This project was inspired by the inevitable comments about flash longevity on every Slashdot SSD story. Design files and source are available at Google Code."
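The write-verify loop the project uses can be sketched roughly as follows (a simulation only; the `SimulatedFlashCell` class and the 100,000-cycle endurance figure are my own stand-ins based on typical EEPROM ratings, not the actual Flash Destroyer hardware or firmware):

```python
import random

class SimulatedFlashCell:
    """Software stand-in for an EEPROM/flash chip; the real project
    drives a physical chip over a serial bus instead."""

    def __init__(self, expected_endurance=100_000):
        # Real cells wear out after a similar but varying number of
        # program/erase cycles, so randomize the failure point.
        self.limit = random.gauss(expected_endurance, expected_endurance * 0.1)
        self.cycles = 0

    def write_then_verify(self, pattern: int) -> bool:
        """Program a pattern and read it back; returns False once worn out."""
        self.cycles += 1
        return self.cycles < self.limit

def endurance_test(cell) -> int:
    """Count successful write-verify cycles until the first failure."""
    count = 0
    while cell.write_then_verify(0xA5):
        count += 1
    return count
```

The running counter on the real device's display corresponds to `count` here: the number of complete write-verify cycles survived before the first verification failure.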

Comment Re:Monitor gamma? (Score 1) 368

After re-reading your comments, I realize that we're actually talking about slightly different things, and neither of us is being specific.

Professional graphics software is designed to work with linear data. This requires professional operators who maintain a carefully calibrated set of input and output devices. All image files that come from film or digital sources must be converted to pure linear, processed only in linear, and then encoded with the gamma of the output device. In this world, 2.2 is only used when sending linear data to a CRT. A different gamma would be used for printing, but again the assumption is that everything starts with linear data in the files.

The professional world used to be the end of the story. With the explosion of the web, more non-professionals were creating image content. More importantly, the vast majority of viewers were looking at the web on monitors that were never calibrated, using an operating system (Windows) which did not have professional color management abilities. Unfortunately, the result of this is that the standard practice became to author image files with the gamma of the monitor built in, thus minimizing the errors.

So, you actually have two standards. Linear data is expected by professional software, and is used by professionals at all stages of their work flow for source images. Non-linear data is used for image files that are intended for web distribution. Thus, we're both right, because there are really two standards.

The problem is that you cannot expect to grab an image file off the web and be able to scale it in professional imaging software without accounting for the built-in gamma used for web distribution.

One happy advancement in the industry is that now images can carry around a description of their linear or non-linear status. What should happen in these cases is that image editing software should use the supplied color management tags to convert the image to linear format before any scaling is done. Theoretically, this should solve the problem of not knowing whether the image file is a professional source image (linear) or one intended for web distribution (non-linear).
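The convert-to-linear-before-scaling step can be illustrated with a simple power-law gamma (a minimal sketch with my own function names, ignoring the exact piecewise sRGB transfer curve that real color management would apply):

```python
def decode_to_linear(v, gamma=2.2):
    """Undo the encoding gamma: encoded sample (0.0-1.0) -> linear light."""
    return v ** gamma

def encode_from_linear(v, gamma=2.2):
    """Re-apply the output device's gamma to a linear sample."""
    return v ** (1.0 / gamma)

def average_pixels(samples, gamma=2.2):
    """Downscaling averages pixels; the averaging must happen on linear data."""
    linear = [decode_to_linear(s, gamma) for s in samples]
    mean = sum(linear) / len(linear)
    return encode_from_linear(mean, gamma)

# Averaging the encoded values directly gives a visibly darker result:
naive = (0.0 + 1.0) / 2               # 0.5
correct = average_pixels([0.0, 1.0])  # about 0.73
```

This is exactly why scaling a web-distribution file as if it were linear produces the darkened edges and moiré people complain about: the math is being done on gamma-encoded values.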

I fully understand the science that you're trying to explain, as I developed commercial monitor calibration software for NEXTSTEP, the precursor to Mac OS X. The only place we disagree is whether the file "should" be linear. The answer is that it depends upon whether you're talking about files for editing or files for distribution. As I explained above, the graphics software that is being criticized is not designed to edit files authored for distribution - at least not unless they're tagged properly with their embedded gamma.

Comment Re:Monitor gamma? (Score 1) 368

I think you're wrong. The data in the pictures is "supposed" to be linear, but with amateurs creating content on non-compliant systems the reality is that it rarely is.

Our eyes are certainly non-linear, and so is a CRT, but the whole point of having the gamma curve is so that all processing (math) can be done on linear data. If the data in the picture is not linear, then why bother with gamma conversion at all? Another part of the problem is that different systems have different gammas, from 1.0 to 1.4 to 2.2, which means you can't assume the image data was properly captured to a linear curve. I think the biggest culprit here is Windows, which had no color management, so users just adjusted colors to look right on their CRT, and eventually this non-professional workflow became the web standard.

In other words, the assumption of 2.2 gamma that you refer to is only valid when the output device has a gamma of 2.2, while another gamma would need to be assumed when a different output device is used. The only constant is that the data should be linear, but we all know that you can't rely on data to follow standards when every grandma has a camera or scanner.

Comment Re:There's more to this story (Score 1) 691

Point taken. I stand corrected regarding that one sentence.

However, I still think it's invalid to compare a country of 300 million to countries of around 50 million (plus or minus) each. Your suggestion that Europe is bigger than the US is a fair point, but I've not seen any aggregate health care statistics for all of Europe. Compiling them would probably be rather difficult, considering that not all of Europe even belongs to the EU (CH, etc).

Comment Re:There's more to this story (Score 1) 691

I don't see your point, at least not completely. We don't have the statistics for California or the EU, so what does it matter that the UK is larger than CA, or the EU is bigger than the US?

Show me health statistics for another country with 300 million people, and I might agree we're comparing apples to apples. Or, show me the health statistics for the entire continent of Europe, and again it would be a more informative comparison.

I'll just repeat that I'd like to see the results broken down by state, particularly if we could have them in database form where they could be combined based upon different attributes to see what really affects health care in the US. I find it hard to believe that the health care realities are exactly the same for every US citizen in every part of the country. I would expect more variance between Americans than between some of the countries you listed.
