There are times when I have to endure the company of appallingly stupid people, but they all involve either getting paid or public transportation. Slashdot is neither of these, and I am fucking done. When I find myself missing the expert guidance of CmdrTaco, it is well past time to move on.
I think it must be some multiple of a quatloo.
Like a lot of people, I had incredibly shitty math teachers in school who managed to completely turn me off to the subject. Later in life, once I learned what mathematics is actually good for -- which is nearly everything -- I sat down with cheap used textbooks and Schaum's guides and started with algebra and worked through calculus, and then branched out into advanced mathematics. Right now, I'm teaching myself group theory. It is a bit harder to do it on your own without someone to answer questions when you get stuck, but I'm not sure that's actually a disadvantage in the long run: the concept you struggle to understand is remembered better than the one that is handed to you.
So now there are online courses. The difference between a MOOC and a book is what, besides lower information content?
So he cloned Bell Labs down to some pretty specific details. Big deal.
I have no trouble looking at laser printer output and telling the difference between 600dpi and 1200dpi. The problem isn't that it's pointless for human vision, it's that many people have uncorrected or inadequately corrected bad vision, and the rest just don't pay attention. Consider seeing an optometrist.
But marketing to the common person in a way that is useful to them is not "bullshit".
True. But that's not what's happening here. What Apple is doing is using meaningless jargon to take advantage of their customers' technical ignorance. I don't actually have a problem with that in this case, since that ignorance is largely self-inflicted, but it's not customer service in any meaningful sense.
It sounds to me like the music industry is bent on self-destruction, and this is something we should encourage. Let them price themselves out of existence. Once they're gone, maybe we can have some good music again.
Technology must now work for everyone, not just 'computing enthusiasts'.
Not to single out Apple -- all of the major consumer tech vendors do the same thing to one degree or another -- but this is why I'm not terribly interested in their products. I am a technology enthusiast, and I do tweak settings and heavily customize my machines to suit me. As interfaces are dumbed down and the underlying feature sets are minimized, I have less and less use for the associated products. And to the extent that GNOME has followed suit, I find myself doing more and more at the command line, because there's a real limit to what can be accomplished efficiently with a mouse -- or at all by jabbing at a tablet with my fingers like a chimp in a lab.
But for the tendency of some major UI projects in the Open Source world to imitate corporate products, I wouldn't care. That tendency is frankly bizarre, since the average consumer doesn't care about any aspect of FOSS except free-as-in-beer, probably won't install any FOSS software anyway, and certainly won't ever crack open the hood. Meanwhile, skilled users who understand that a shallow learning curve also means a shallow power curve are increasingly out of luck.
...and then be blandly pleasant. Otherwise, just don't do it. What are they going to do, fire you?
I'm always amused at the naive goodwill that people extend to their employers. Most of us live in at-will states, without unions, and without any real workers' rights that can be exercised without spending more on retained counsel than the rights are worth. These are the people who can fire you at any time for any reason, but they want two weeks' warning if you leave on your own. Why give them extra freebies?
Look, forget the employer-employee bullshit. You are a vendor, selling a service. Your employer is a customer. As long as they're buying what you're selling at the best price you can get (which includes work conditions and perceived job security as well as pay and benefits), the customer is always right. As soon as they stop buying, or you find someone willing to pay more, then go attend to your new customer. The old customer wants to take more of your time for free? Politely decline. You're running a business -- you -- and the only point in giving something away free is if it leads to another sale.
Don't bother with work ethic or pride in your job at this point. Those are good concepts and they have their place, but that place is well before anyone starts talking about exit interviews. If you're leaving voluntarily, they treated you well, and you feel like extending the courtesy, sure. But even then, don't say anything that can be used against you later. It's just business, and that's how they see it. Go and do likewise.
Seriously? There was nothing more important or interesting going on than some nebbish mumbling about the importance of packaging? Even for Apple fanboyism, this reaches new depths. "The boxes sit on shelves serving as a constant reminder of the beauty within." I wish there were a more appropriate and genteel response to that than, "Get a life!", but there you are.
I've programmed in every major language and several minor ones from the 1970s to the present day, never mind design methodologies. They all have their relative strengths and weaknesses, but at the end of the day, the only thing that really recommends one over the other is a) what's available, and b) what you're most familiar with. No widely used language is "broken" any more than any natural spoken language is broken. No one ever says, hey, this novel would be much easier to write if we were taking advantage of the greater expressive power of Indonesian instead of kludgy old Lithuanian.
Aside from juvenile cliquishness and fashion obsession, every language flamefest starts with people obsessing on some awkward feature of the dominant language du jour, and then concluding that all of their problems would be solved if we all switched to some other language without that awkward feature. Of course, tomorrow's language (or methodology, editor, coding standard, platform) has its own awkward qualities that will only become apparent once it collides with the real world on a large scale, setting the stage for the day after tomorrow's language. Rarely does anyone pull their head out of their compiler/interpreter long enough to recognize that it's the real world that's awkward, and no amount of switching between fundamentally equivalent machine-parseable languages is going to change that.
Instead, we keep implementing the same stuff over and over in one language after another until the pace of real progress slows so much that we can actually get excited that the document viewer we're trying to port everything over to is receiving "major" new features in HTML5 that will let it get a little closer to matching the desktop GUI functionality of twenty years ago, only not as well and at the cost of several orders of magnitude more hardware power to keep it going.
But by all means, let's get rid of PHP if that makes it easier to imagine that we're doing something besides reinventing the same old wheel and doing it badly.
Shannon's limit is a mathematical principle.
Unfortunately, most people have next to no understanding of mathematics beyond some rote memorization from school. This is just another example of people confusing analog signals with magic. To be fair, the actual researchers involved probably understand this quite well, but the scientifically uneducated class from which science and technology journalists are drawn is another matter.
The non-mathematical version, for those interested, is that yes, analog signals are continuous and so can occupy an infinite number of states. The reason you can't get infinite bandwidth out of that is that both transmitters and receivers have limited precision, and that there is always noise, which is another manifestation of the Second Law. For example, there are an infinite number of real numbers between 0 and 1. If you could actually use all of that space, you could encode any amount of information in an arbitrarily short signal. (Well, there's a limit to that, too, for which see Georg Cantor.) In practice, you can't use all of that space, because your instruments might distinguish quite well between 0.001 and 0.002, but they can't reliably tell the difference between 0.001 and 0.0005. On top of that, there is noise, which is also a big topic, but you can think of it as a random fluctuation in the signal. If the ambient noise varies between 0.0 and 0.0005 in the same example, you can't even reliably tell the difference between 0.001 and 0.002.
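That back-of-the-envelope reasoning can be sketched in a few lines. The numbers below just mirror the toy example above; they aren't from TFA:

```python
import math

# Toy numbers mirroring the example above: a signal range of 0..1,
# with noise able to swing a reading by up to 0.0005 either way.
signal_range = 1.0
noise_swing = 0.0005

# Two levels are reliably distinguishable only if they differ by more
# than the full noise swing, so the usable number of levels is finite.
levels = signal_range / (2 * noise_swing)
bits_per_symbol = math.log2(levels)

print(int(levels), round(bits_per_symbol, 2))  # 1000 levels, ~9.97 bits
```

Infinite precision would mean infinitely many distinguishable levels and unbounded bits per symbol; finite precision and nonzero noise cap it, which is the whole point.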
What the parent is getting at is that laws of physics, being derived from observations of nature with limited precision, might occasionally be overturned by better observations. Fundamental mathematical principles, on the other hand, are much more reliable. There might be a difference between rest mass and inertial mass that we could exploit for thrustless propulsion. It's extremely unlikely, but it can't be ruled out. But there is zero possibility that 2 + 2 will ever equal anything other than four. Shannon's limit and, for that matter, the Nyquist sampling theorem are a little more complex than a simple integer sum, but the actual math for both would fit on an index card, with plenty of room left over to blather on about "infinite" analog signals. We use digital signals most of the time these days because it makes the hardware easier to design, but neither digital nor analog can be used to make an end run around the Second Law.
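For anyone who wants the index card itself: the channel-capacity half is the Shannon-Hartley theorem, C = B log2(1 + S/N). A minimal sketch -- the 3 kHz / 30 dB figures are illustrative, not from TFA:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley theorem: the maximum error-free bit rate of a
    channel with the given bandwidth and signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative: a 3 kHz voice-grade channel at 30 dB SNR.
snr = 10 ** (30 / 10)                      # 30 dB -> linear ratio of 1000
print(round(shannon_capacity(3000, snr)))  # ~29902 bits/s -- finite

# Capacity only goes to infinity as the noise goes to zero, which is
# exactly the idealization that never happens in the real world.
```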
What the researchers in TFA claim to have figured out is another way to stuff data into a part of the signal outside the frequency domain. It's a really ingenious approach that might be quite useful if it pans out in actual practice, but it's not magic, and it's not infinite.
We figure out a way to enhance human mental acuity, and the very first thing we apply it to is training snipers. Interspecies communication? Military dolphins. Never mind nuclear physics.
If we were as good at anything as we are at killing each other and stealing each other's stuff, we might have a chance. Hell, if we were even more interested in something else -- and no, screwing doesn't count.
So our first steps into bridging the biological/electronic divide involve what is essentially an artificial parasite? Somehow, I fear it will not be long before some medical researcher who has read too much Douglas Hofstadter thinks of taking it to the next level of abstraction by monitoring intestinal health using parasitic instruments attached to tapeworms.
It is a lesson we continually fail to learn: Industries built on government subsidy suffer when those subsidies begin to go away, even if the product itself is sound.
The lesson "we" continually fail to learn is that not everything is a vindication of one's favorite economic-religious theory. Every time there is an increase in demand for something, investments pour into the relevant industries far in excess of that demand, and most of those ventures fail before a handful of them succeed and become the dominant players. There's nothing magical about either the market or subsidies. Subsidies are just market forces, like weather influencing crop prices or international trade policy influencing imports and exports. The theoretical free market in which prices are not "manipulated" does not and cannot exist in the real world because it is ultimately based on human beings, and humans manipulate everything they can. It doesn't matter whether the influx of cash comes from subsidies or sales: companies benefit while the cash flows in, and they suffer when it stops flowing. Money is money.
Email is simply not a medium I would even consider using for sending sensitive information precisely because there are countless places between me and my correspondents where a message could be intercepted. In such circumstances, encrypting my email would simply alert anyone watching that something sensitive is being transmitted. And since the only "anyone watching" that I'd worry about is the government, why bother attracting the attention? If they want to know what I'm sending, all they have to do is wait for me to go to work, enter my house, and install a keylogger on my box. It's not like they even need warrants nowadays for that crap.
If I were going to do something I wanted to hide from the government -- and let's face it, that would almost have to be a major federal felony -- and if I absolutely had to have documentation and accomplices, none of it would be in electronic form to begin with, never mind transmitted over the public internet. Encryption is useful for governments and major corporations that are basically above the law. It's not terribly useful for private citizens unless you're just trying to hide your porn folder from your roommate.