
Comment Re:Well. (Score 1) 195

As for being tougher, my understanding is that it's far more scratch resistant than gorilla glass, but not necessarily as shatter resistant.

FYI, scratch resistance is also a measure of shatter resistance, so if a substance is more scratch resistant than another, it is also more shatter resistant. At least, that's what I just read in an article about sapphire linked from elsewhere in this thread.

Comment Re:Well. (Score 2) 195

Also, there is some speculation on several different sites that Apple may not intend to use sapphire for the screen, but instead for the camera lens. They currently use it on the camera lens and the home button.

That (external) speculation sounds kind of silly... considering there are lots of other teeny tiny parts in iOS devices whose cost is probably more volatile and fluctuates more than the price of synthetic sapphire. So for a billion dollars, it seems like an investment that would take decades to pay for itself.

I wonder if it's something they could use in other things that don't currently use Gorilla Glass, like macbook screens?

That is interesting, and would absolutely justify a billion-dollar investment if that is their intention, because they would need a metric shitton of sapphire to pull that off. My guess is it's just for the iPhone/iTouch screens, because the idea is already out there and being used for mobile device screens, and Apple sells a lot of iPhones, which I think would require enough sapphire to make it worth it to the Apple bean counters.

Comment Re:Well. (Score 4, Interesting) 195

The only thing it's hurting is the other people looking for sapphire display covers like was mentioned a couple months back.

Personally, I'm on the Gorilla Glass bandwagon. It's: Stronger, Cheaper & faster to produce

apple can pretty much do what it wants and they have plenty of money so it's not like it's a gamble at this point. $1bn is not going to dent their bank.

I own a couple of their devices, but I've personally relegated them down to be things I don't even carry around, and the interface always makes me feel like I'm using one of those kid's toy computers that has like 6 buttons with pictures on them (the cow says Mooooo).

I, too, am on the Gorilla Glass bandwagon, and a big, big fan of Corning. But Gorilla Glass is under patent. Synthetic sapphire has been around since 1902, and it was cheap back then. Sapphire is hard... 9 on the Mohs scale, and the only substance harder is natural or synthetic diamond. I find it difficult to believe... so, do you have any references that say Gorilla Glass is cheaper and harder than sapphire?

Comment Re:uh uho. problems.. (Score 1) 50

Did you mean "there are no more novel, original ideas"?

Yes, that is what I meant. Nice catch. I wrote exactly the wrong thing, and yet you were still able to understand what I meant. I should really slow down when I respond to posts. Thanks.

If so my answer would be (A) Bullshit.

Don't be so hard on yourself. I'm sure your answer has value. Heh, just kidding. My answer to that is "prove it." Show me this novel and original idea that is new, not based on what came before, and not standing on the shoulders of giants.

and (B) So? Then there's no more need for patents.

I don't see how you can legitimately draw that conclusion, certainly not from what you wrote subsequently. Why don't you dumb your reasoning down for me a little bit, if you're so inclined? I would appreciate it (if you're serious).

They were always a social contract of dubious benefit to begin with.

...says someone obviously biased against the existence of patents. Certainly there are unnecessary patents hurting innovation, but those are the exception. The purpose of patents is not to prevent innovation, but to protect the IP of the patent holder for an extremely reasonable period, in exchange for detailed public disclosure of an invention. An invention is a solution to a specific technological problem and is a product or a process. Without the patent system, an inventor has no incentive to share their invention with anyone, possibly to the extreme detriment of society.

Copyright is not a good way to describe patents... copyright is out of control (Disney, et al., yada yada yada), and the public domain is suffering for it. Not every patent is unnecessary, certainly not to the inventor or patent holder who, very often, has a lot invested in their innovation. I don't think you would be perfectly happy slaving over a marketable innovation for the better part of your life only to have me copy the idea, and effectively steal your profit, the very wind from your sails, the moment you share the innovation, or a day after you finally achieve that wonderful goal of tagging it and bringing it to market.

Innovation tends to surge in countries that remove their patent system.

I am not aware of this. Obviously, you must have some specific examples. Please share them so I know what you're talking about, as I'd like to examine these innovation "surges" myself. Because, as it stands, it appears that you just made this up out of thin air because you don't like patents for some reason, and for the life of me, I cannot figure out why.

My point is that you can be validly awarded a patent on something that is not original or novel in any way, but that is being applied in a way the original patent did not specifically cover.

*Only* if the invention is also something novel and non-obvious to one skilled in the field. Otherwise it's something that anyone so inclined could be reasonably expected to create given a reason to do so, and the "invention" offers society no benefit to compensate it for granting the inventor government-backed monopoly rights.

I'm nearly certain the US Patent Office is using a broader definition of "novel" and "non-obvious" than you are. And merely creating something new doesn't cut it... one must also understand, and document in the patent application, exactly what it is and the correct and undeniable mechanisms that permit it to function.

Remember, like copyright, patents aren't an expression of some sacred right to exclusivity, they're a limited deprivation of the natural right to mimicry that is granted in order to encourage creators to be more prolific and provide more value to society.

Again, using copyright as an example is not the way to go... because the state of copyright as it stands today is exactly what you describe it as not being, i.e. effectively an "expression of some sacred right to exclusivity" (I like how you phrased that) that goes on and on, long after the author has died and his children have grown old, and long after the author could possibly benefit or suffer from the copyright expiring within a reasonable period, to the detriment of society. And as the rich copyright holder approaches the end of this ridiculously long period, they just pay enough to change the laws and extend their exclusive rights, and this has happened over and over with copyright. Nothing even remotely like that has happened with patents, nor have patent holders been able to extend the quite reasonable period of granted exclusivity. I'm not certain it is fact, but I think that period has actually been reduced on occasion, from 20 years to 14 years, or whatever it is now.

If that equation falls out of balance then they no longer justify their existence.

And yet again you have drawn a specious conclusion without sharing any of the reasoning that supports it.

Look, if you want to argue that software patents (a subject we haven't even broached in this thread) are unnecessary, innovation-killing monsters, then do so, and I will eagerly and gratefully consider your argument for the effort you put into it. But if you're trying to convince me or anyone else that all patents are bad, and that, if I may invoke a poster child of the patent system, the destitute but brilliant Nikola Tesla did not deserve the patent ultimately awarded to him for the invention of none other than radio and everything necessary for it, you're going about it the wrong way. Certainly there are wonderful examples of amazing generosity and philanthropy, of individuals sharing ideas and inventions to society's incalculable benefit without seeking exclusivity or compensation of any kind for their significant work (Archimedes, Leonardo da Vinci, Benjamin Franklin, Jonas Salk, plenty more I'm sure). But these anecdotal examples do not invalidate all patents, nor the patent system (which often necessarily includes disputes settled by an adversarial court system).

Just because Bill Gates gives hundreds of millions of dollars away, and it is a good thing that he does so for the many individuals who benefit from it, and in some cases owe their very lives to it, doesn't mean that you or I must also do so. (I can't afford that!) Nor does the innovation-stifling train wreck that resulted from the Wright brothers' patent on flight control, which directly set US aviation back years, invalidate the whole system and all patents. To say so is a fallacious argument. The litigation stemming from the Wright brothers' flight-control patent ultimately led to the birth of patent pools, which are a good thing, and which help prevent a patent's exclusivity from stifling further development while still compensating the patent holder. And the Wrights' patent expired in 1917, less than 14 years after it was granted to them. That doesn't sound unreasonable to me, even if the US really needed that innovation for WWI, because that, relatively speaking of course (I don't want to offend any veterans), is merely an accidental circumstance of history. Patents, if nothing else, are very limited, almost to the point that an inventor might not think it worth inventing at all if s/he could only benefit exclusively from it for such a short period... it's right on the border there.

Again, patents are not like copyright, which is a complete disgrace and an insult to rationality, when technically you owe fees for singing "Happy Birthday" to your child once a year. That song was awarded its copyright in 1935, and it doesn't expire until 2030! There is nothing in the patent system even remotely as absurd as this.

And if I may be so bold, I need the patent system intact, because I am not rich, but at times I think I am clever, and I intend to personally scoop everyone on the patent for these new VR devices and HMDs, because every single developer is rushing into the end zone without the football. And I won't actually need to physically develop anything, because I have an understanding of these devices (which, by the way, I have never used nor even seen in person) that these foolish manufacturers do not have (and no, I'm not going to share it with you just because you're curious, or just because you believe patents are evil). Think of a hypothetical patent holder for an invention that makes "fire," who could have their patent invalidated because they failed to explain sufficiently, correctly, or comprehensively how it works (magic!), and who would be scooped and lose their patent to a chemist who came along years after the fact and explained what it actually is and how it actually works. In the same way, years after the first patent was awarded for such a device, I will file my patent application, be awarded my patent, and retroactively invalidate all previous patents, because my patent will be comprehensive, complete, and correct, and none of the current manufacturers' patents can be... because, as I have insinuated and stated in posts elsewhere, it is obvious they have no idea what they have, nor any idea what it really is and what it is going to do for humankind... namely, cause us to literally evolve, likely within our lifetimes. It's not my fault they are fools (albeit brilliant fools), nor is it my obligation to help make some rich company richer just so some tweener can get their game on.

Comment What computer science? There is no CS here. (Score 1) 183

they remain a begrudging anomaly in computer science pedagogy

Here we go yet again. We have an OP that can't seem to grasp what computer science is and what it isn't, yet that doesn't stop them from waving the term around like a flag. And we have post after post after post from obviously intelligent and likely very capable programmers who perhaps even studied computer science once, and who still insist on ignorantly equating "programming" or software development with "computer science."

What the Hell is wrong with you people? And don't think my animosity towards you ridiculous usurpers of an entire field that predates "programming" and software by thousands of years, a field that you (all of you) apparently and obviously know absolutely nothing about (including what it is and isn't), is misplaced, any more than yours would be towards me if I continually insisted I worked in medicine because I stand behind the front-counter cash register and sell band-aids and aspirin at your local drug store. (And I don't mean to insinuate that programmers are beneath computer scientists the way a counter clerk is beneath a medical doctor... I'm just giving an example in metaphor, and using hyperbole, so that whatever is in your thick skulls will finally comprehend that you need to stop using the term "computer science" when it is completely irrelevant to the subject... which very often happens to be programming and software development, very noble professions that do not need to be propped up as something they are not, namely, computer science.)

By now, I have a lot of posts like this one complaining that the term "computer science" is being abused and watered down to mean more than it is, which ultimately has the effect of making it mean almost nothing at all. Usually, I focus on the first word in the phrase... "computer" (which isn't your damn Alienware Linux supergaming laptop any more than a ringworm is jewelry). But I'm going to try a different approach so you can see how ignorant you're being and why I'm so fucking pissed off about it.

Science (from the Latin scientia, meaning "knowledge") is a systematic process that builds and organizes knowledge in the form of testable explanations and predictions about the Universe. Science refers to the process of acquiring knowledge based on the scientific method, as well as to the organized body of knowledge humans have gained by such observation, experimentation, and research. If you're not doing this, as defined quite necessarily and ordinarily, then you cannot possibly be doing "computer science," because the second word in that phrase is not a trick, is not a homonym for some other word I am unaware of, and is not incidental, but quite specific. If you're not doing science, i.e. observation, hypothesis, methodical, procedural, and repeatable experimentation, and drawing conclusions from the results of that experimentation, then, again to be clear, you're not doing computer science.

Now that we have that cleared up, allow me to finish this post before returning to my garage for a bit of mechanical engineering, as I think by now the oil has finished draining into the pan, and I can thus complete the oil and filter change (see what I did there, you stubborn morons? I'm mocking you, because you damn well deserve it).

To address the fragment from the OP that I quoted above: it's completely false. Perhaps programming courses do not always include any treatment of ethics... I don't know. I'm not a programmer, I never studied programming, and I wouldn't presume to talk about programming as though I were some kind of expert. But every single legitimate Computer Science curriculum I am aware of, including the one that nearly killed me some 20 years ago, has a course in ethical responsibility that is mandatory for graduation.

Comment Re:As a Change Manager... (Score 1) 294

I was speaking in hyperbole... and of course I figured you and your team and company take the data seriously... but no one was mentioning it, so I thought it worth pointing out a different perspective. Still, compared to the data, the systems, as long as they are caretaking the data properly, are incidental and, from this perspective, irrelevant as far as what the systems actually are. Thus it follows (or it should) that spending more time focusing on incremental patches to incidental systems, at the expense of spending due time or rational contemplation on the persistence of the vastly more important data, seems counter, or skew, to how time should be spent. (FWIW, I think change management is a very "good thing," but I also think that having groups of employees spending time in meetings, any meetings, kills a company. Nothing ever gets done in meetings... it's all "OK, here are the reports, and here's what we want to go and do now," when anyone who has ever taken meetings knows that they rarely have anything to do with how an initiative actually plays out... they're just a way to synchronize the intellectual capital of human resources, which can be done far more quickly and effectively with an email.)

Comment Re:uh uho. problems.. (Score 1) 50

Yes, but... think about it, there are no more non-novel, original ideas. And there are other considerations, such as the purpose of the thing. For example, you just can't patent an electromagnet, but you can (or could, reasonably recently) patent an MRI; you can't patent an automatic ball pitcher (anymore), but you can patent a gun that uses baseballs as ammunition, and you could probably also get a patent on an auto-baseball-throwing "car denter and windshield breaker." My point is that you can be validly awarded a patent on something that is not original or novel in any way, but that is being applied in a way the original patent did not specifically cover.

Comment Re:uh uho. problems.. (Score 1) 50

I'm nearly certain the basic concept has been around a lot longer than a century, probably at least two or three millennia or more, since rudimentary optics (a nice way to say "holes") were used to view, say, a performance of dancing women behind a partition, in one respect, or to view miniature depictions (a nice way to say "carvings" or "sketches") of women in various submissive or compromising positions, in another. And I don't think this, or what you've generously shared, is going to matter to the clerks who approve Apple's patents (that's plural, because there are at least two Apple patent applications which, if either or both are awarded, would likely prevent, hypothetically, or on paper, prior to the litigation that makes it so, anyone from bringing the completed concept to market). But before we condemn the individuals who ultimately, and probably infamously, sign off on the approvals, we should recognize that they are not crusaders or villains but just people doing their job, like all non-elected government employees. We don't really know the rules a patent clerk must follow in the execution of their duties. I'd really like to see a reply to an inaccurate post that begins: "IAAFPC (I am a federal patent clerk)..." or "IAAPA (I am a patent attorney)...."

Comment Re:uh uho. problems.. (Score 1) 50

Dang, I had to go digging for that patent URL, when less than a week ago it was someone else's story. It's only slightly important to note that I'm not the only one who remembered this, and that the idea is currently somewhere in the collective consciousness. I figured when I saw it that it was going to be one of those patents intended to prevent any such thing from ever making it to market, for whatever reason Apple might not want it developed. But now I really hope Apple has a viable product for release, whether they're granted a patent or not.

Comment Re:uh uho. problems.. (Score 1) 50

Well... you're referencing the wrong patent there, and should Apple be granted a patent on their 2008 application for a strikingly similar idea, then yes, quite clearly this DIY smartphone-based HMD would violate their patent (if it were produced and sold by a company that could be sued). And quit your bellyaching; I had this exact idea in July 2007, within days of owning my first smartphone. Should the concept be perfected, so that it was universal to any smartphone, and sent to market, I expect these would sell amazingly well. Whoever holds the patent could make a handsome sum of money just for stealing my idea.

Comment Re:As a Change Manager... (Score 1) 294

There is no way we would allow a sysadmin to patch anything at any time without some level of oversight

Change management is not your enemy

First let me say I really like it when people like what they do, and it sounds like you like what you do, take it seriously, and probably do it well.

However, this being a techy site, the commenters, yourself included, seem to inflate the importance of the one thing that is irrelevant and incidental. And by that I mean, of course, the system, the OS, and whatever state it may be in. Let me repeat: the OS is irrelevant and incidental. Thus, any time, effort, meetings, plans, etc., focused on it at the expense of the state of what is important are an incredible waste of time and resources. I don't mean to belittle your job, btw, but merely to point out what every commenter I have looked at on this story seems to miss. The damn systems don't matter! (I am a systems administrator myself, btw.)

What matters? The data.

Change management is, in its own way, important, but for its own reasons, not for the reasons given by those who have described it positively here. It is important because it tells us: "What the Hell have we done? How did we get here? What the Hell are we doing? Now what?" It is history, and it is intent. But it should only be as important as the systems themselves. And, again, the systems themselves don't matter one whit, especially if you have emergency procedures for when something breaks (i.e. fix it, or, if fixing it is going to take too much time or effort, just reinstall it). So if you have a change-management-heavy operation, you're wasting a lot of time and effort for no great benefit or reason at all.

Tell me, at your company, is there a group whose sole responsibility is to manage the data and provide oversight concerning its reliability and how it is used or managed by the systems and users? If not, then it's crazy that any resources would be sacrificed for the sake of the systems.

Systems administrators and the IT department are like teams of airline mechanics. They're very specialized and very good at what they do because, in a relative way of course, lives depend on it. But what is important is not the damn planes but the cargo... i.e. getting someone or something from point A to point B successfully. Does it matter if the plane is 40 years old? Not really, if that 40-year-old plane does an identical job to a brand new plane. And we could replace the planes with high-speed trains, or (hopefully someday) teleportation. The planes themselves, and the use of them, are incidental and irrelevant; they don't really matter.

In conclusion, I wouldn't say to the OP "get a new job!" as others have. That's not very helpful. I'd say, if you have control over these systems, and you now have to champion each new patch and get every patch approved, then switch everything to a system that requires fewer patches. Running Windows servers? Migrate to Linux or a *nix variant. You'll still get security patches and bug fixes coming down, but far fewer than with a Microsoft-based operation, and if you miss a few months of patches, nothing really bad will happen (unlike on a Windows system). The world will still turn and the work will continue.
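For what it's worth, here's a minimal sketch of the kind of check I'd script to see how far behind a box actually is, assuming a Debian/Ubuntu-style host with the standard apt tooling (the parsing is illustrative, not a drop-in monitoring script):

#!/usr/bin/env python3
# Minimal sketch: count packages with pending updates on a Debian/Ubuntu-style
# host by parsing the output of "apt list --upgradable". Illustrative only.
import subprocess

def pending_upgrades():
    # "apt list --upgradable" prints a "Listing..." header followed by one
    # line per package that has a newer version available.
    out = subprocess.run(
        ["apt", "list", "--upgradable"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line for line in out.splitlines() if "/" in line]

if __name__ == "__main__":
    pkgs = pending_upgrades()
    print(f"{len(pkgs)} package(s) have pending updates")
    for line in pkgs:
        print("  " + line.split()[0])  # "package/suite" token

Point being: on a *nix box, falling a few weeks behind shows up as a boring number in a report, not as an emergency change meeting.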

Comment Re:Resolution is not the hard-to-solve problem.. (Score 0) 135

They're obviously related, but one of these we can measure directly, the other we cannot. Ergo, we get our proxy suitably low until we find a point where the trade-offs are acceptable.

You're making the same mistake they are. They, and you, seem to be focused on the product, on what your prejudices already tell you it is and what it should be. I can't make you see the wrong-headedness of your beliefs. And it is absolutely false that we do not have the ability to measure perceptive capacities. Let me put it this way: everyone believes they are trying to design a head-mounted display... but the reality is they are producing a mind-mounted display and ignoring this! That is why they will continue to come up short, forever, until they realize what they are really trying to do.

Comment Re:Resolution is not the hard-to-solve problem.. (Score -1, Troll) 135

Carmack is closer to the truth, but still misses. Really, the problem is not the latency of the device, but our brain's, or our conscious/unconscious mind's, ability to notice that latency. You may think that is saying the same thing, but it is not. Manufacturers are putting their focus on the wrong thing. The device works exactly as its spec called for, exactly as it was designed; it is insufficient because they did not measure this stuff where it matters, and apparently they are going to continue with this style of trial-and-error, hit-or-miss development, releasing devices that are insufficient for sensory immersion. What the manufacturers are effectively doing is making a shoe for a foot they never see or even try to look at, and never get any feedback from, until the shoe is completely fabricated with all the bells and whistles and they try to shoehorn it onto the foot... then they start to see what's wrong, but in the wrong way, and they go back to making another shoe with the little they learned from the shoe being too tight or too big... instead of first learning and understanding everything they need to about the foot, so they can make a shoe that fits, is comfortable, and does what they intended it to do.
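To put rough numbers on what I mean (and these are purely illustrative assumptions of mine, not measurements of any real HMD, including the ~20 ms perceptual budget): every stage of the pipeline can meet its own spec and the total can still land on the wrong side of the only threshold that matters, the one in the user's head.

# Back-of-the-envelope sketch: compare a motion-to-photon latency budget
# against a perceptual threshold. All figures below are illustrative
# assumptions, not measurements of any actual device.

PERCEPTUAL_THRESHOLD_MS = 20.0  # assumed ballpark for "unnoticeable"

latency_budget_ms = {
    "sensor sampling": 2.0,
    "sensor fusion / prediction": 1.0,
    "game simulation": 8.0,
    "render + GPU": 8.0,
    "display scan-out": 7.0,
}

total = sum(latency_budget_ms.values())
print(f"motion-to-photon: {total:.1f} ms")
if total > PERCEPTUAL_THRESHOLD_MS:
    print("over the perceptual budget: the user notices, "
          "even though every component met its own spec")
else:
    print("under the perceptual budget: latency should go unnoticed")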
