This will only work on a few people. When I google myself, William Douglas, I get a pile of hits and none of them are for me. Additionally, people can still change their name if they want to distance themselves from their past. It will not hide you from government agencies, but it will be good enough for everyone else.
On a side note, a question to the grammar Nazis. When using the word "Google" as a verb, should the first character be capitalized? And as a website that supposedly stays neutral, should it even be used as a verb within headlines?
Now if you will please forgive me, I must go monitor Bing to see if there is a sudden spike in searches for "William Douglas".
Since using X applications on Mac OS X did not work seamlessly at all, that statement is not very reassuring to me
You can't expect a program written for X11 to be seamless on another operating system. To integrate properly, one would have to get the X11 app to respond to native keyboard shortcuts, integrate drag-and-drop, and have knowledge of the host file system. The list goes on, but my point is that an X11 app will remain an X11 app even when run on a Mac/Win host. One cannot expect a seamless experience.
But things are different for Wayland. None of the previously mentioned problems will impact Wayland in the slightest. It is reasonable to assume that the user experience will be seamless.
QtOctave is not an integrated GUI for Octave - it's more of a graphically enhanced command line interface. It looks great on the surface but fails to have any depth. I don't mean to fault the QtOctave project - limitations in Octave are what prevented it from working well. From what I've read, Octave has been undergoing changes for the past two years in order to restructure it in such a way that it can work well with a GUI.
The situation is similar to GCC. GCC does not integrate well with IDEs - sort of, but not really. The CLANG project solves these problems because it was designed to be more modular and to facilitate interaction with other programs, not just the command line. CLANG-based IDEs have better support for syntax highlighting, variable highlighting, and anything else where the GUI needs to perform a task typically done by the compiler.
I only bring up GCC and CLANG because their differences help explain why the old Octave and QtOctave could never provide a good, integrated GUI. It sounds like Octave has changed to be more like CLANG, so we can now expect improvements over the stagnant QtOctave. Only, the design changes to Octave are so great that it is simpler to start a new project than to modify QtOctave.
F-150s tend to be grocery getters
Yes - this is exactly right. F-150s are almost always used as glorified cars, so the benefits of reduced weight are much higher than for other models. Anyone purchasing a work truck who plans on towing or hauling is much better served by a diesel F-350.
Interesting tidbit: in Canada (not sure about the US) a 1 tonne pickup is considered a work truck, whereas a 3/4 tonne is considered an expensive toy and is levied with a luxury tax. So while an F-350 is more expensive than an F-250, the difference is minimal and the F-350 is a better bargain. As a result you see lots of F-150s and F-350s but not so many F-250s.
When, as a kid, I first took an interest in computers, 300dpi laser printers were all the rage. Now they boast around 2400dpi. No one seems to complain that images and text are sharper. Now displays and the printed page may be different, but one doesn't generally hold a piece of paper 2 feet away to read a book.
Displays and the printed page are too different to compare - so why bring it up? If anything, you just provided another example that agrees with the GP's post. At 300dpi, laser printers produce almost perfect text, and increasing to 2400dpi does almost nothing for plain text. If it were not for printing graphics, the increase would be pointless.
When it comes to high resolution displays, so long as they continue to use more power than the lower resolution models, users will complain. When additional GPU hardware is required with no visible improvement, users will again complain.
They say engineering is the art of compromise. High resolution displays require one to compromise on battery life and purchase price (GPU), so there is a limit to what is practical. At a certain point the higher resolution becomes nothing more than a marketing gimmick.
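For what it's worth, the point of diminishing returns can be estimated with a bit of trigonometry. This is only a back-of-the-envelope sketch, assuming the common rule-of-thumb figure of roughly 1 arcminute of human visual acuity - it computes the resolution beyond which the eye can no longer resolve individual dots at a given viewing distance:

```python
import math

def dpi_limit(distance_in: float, acuity_arcmin: float = 1.0) -> float:
    """Approximate resolution (dots per inch) at which one dot subtends
    `acuity_arcmin` arcminutes at the given viewing distance in inches.
    Beyond this, finer dots are indistinguishable to the eye."""
    theta = math.radians(acuity_arcmin / 60.0)  # arcminutes -> radians
    dot_size = distance_in * math.tan(theta)    # size of one resolvable dot
    return 1.0 / dot_size

for d in (12, 18, 24):
    print(f"{d:2d} in -> ~{dpi_limit(d):.0f} dpi")
```

At a typical 12-18 inch reading distance this lands in the high-100s to high-200s of dpi, which is consistent with 300dpi text already looking close to perfect and 2400dpi buying nothing for plain text.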
In Canada mortgage interest isn't tax deductible, so there is more incentive to pay down the principal
Thank you, I thought it was something like that.
If you already have an ARM processor, the FPGA will likely be used for real-time interfacing with the outside world - for example, in many robotic applications. If this is the situation you are in, then it might be worth looking at the Lattice iCE40 line of FPGAs. They're small, cheap, use almost no power, and are programmed via SPI. The high-end versions have around 7500 LUTs, so they are reasonably powerful.
There are some very inexpensive iCE40 developer boards on the Lattice website - between $25 and $40 (I believe). They make for an inexpensive introduction to programmable logic. Just do not expect them to be as large and powerful as other FPGAs on the market. They were designed to complement a CPU by interfacing with and filtering sensor data, thereby allowing the CPU to remain asleep for as long as possible. Most other FPGAs were designed to implement CPUs...
... compared to other forms of lighting when taking into account his source of heating.
Incandescent light bulbs make for great heaters, so at times they are almost perfect. Significant heat can be lost through the lighting fixture, so hanging lights are best. The position of the lights can also reduce efficiency: electric heaters typically go under windows in an effort to reduce drafts, while putting heaters on the ceiling has the opposite effect.
The grandparent's point that incandescent bulbs are not as bad in colder climates is valid. But they are still worse than higher efficiency bulbs, so it is hard to argue against a nationwide ban. To do so is somewhat selfish and shortsighted. Even if one does rely on electric heat (and very few people in cold climates do), one still benefits from reduced load and pricing on the electric grid. Even if prices do not go down, at least the power company will not have to raise them in order to pay for additional infrastructure.
Yes, other parts will degrade far faster than the magnetic media, so the magnetic media essentially does not degrade.
But even in the absence of actual use, eventually, even on a hard disk, the bits are likely to get corrupted by random stray cosmic rays and possibly superparamagnetism. Of course, any real data loss is likely to take decades, if not centuries.
The bits on a hard disk do get corrupted with time - it is commonly referred to as bit-rot. You would be lucky to have a hard drive last for 20 years even if unplugged. Just like those old floppies go bad, so do drives. But bit-rot can be mitigated. ZFS can scrub the pool every so often, re-reading all data and verifying it against stored checksums. In addition, it automatically detects and corrects these errors - so long as you have redundancy configured on the ZFS drive(s). I don't know enough about btrfs, but it probably has the same feature.
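The detection side of this is simple in principle: the filesystem keeps a checksum for every block and compares it on each read (or during a scheduled scrub). Here is a toy Python sketch of that idea - the block contents and the single-bit flip are made up for illustration, and real repair additionally requires a redundant copy, which this sketch omits:

```python
import hashlib

def checksum(block: bytes) -> str:
    # ZFS records a checksum per block; SHA-256 is one of its options.
    return hashlib.sha256(block).hexdigest()

# Write a block and record its checksum, as the filesystem would.
block = bytearray(b"important data that must survive for decades")
stored_sum = checksum(bytes(block))

# Simulate bit-rot: a single stray bit flips on the platter.
block[5] ^= 0x01

# On the next read or scrub, the checksum mismatch exposes the rot.
corrupted = checksum(bytes(block)) != stored_sum
print("corruption detected:", corrupted)  # -> corruption detected: True
```

With redundancy (a mirror or parity), the filesystem can then rewrite the bad block from a good copy instead of merely reporting it.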
It's called putting money aside each month and saving for a rainy day
Sure, but then you are paying far too much for your health care. Hospitals charge individuals significantly more than they charge insurance companies. You're better off just buying insurance - it'll be less expensive and you don't have to worry about what happens when you get diagnosed with cancer (or some other unavoidable and expensive illness). Regardless of how much you put away, there are some conditions for which it will not be enough.
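As a made-up illustration (every figure below is hypothetical, not real billing data), even a disciplined saver can come up well short of a single uninsured hospital bill:

```python
# All figures are hypothetical, chosen only to illustrate the arithmetic.
monthly_savings = 500                 # put aside each month
years_saving = 10
rainy_day_fund = monthly_savings * 12 * years_saving    # $60,000 saved

insured_price = 100_000               # assumed rate negotiated by an insurer
uninsured_markup = 2.5                # individuals are often billed far more
uninsured_price = insured_price * uninsured_markup      # $250,000 billed

shortfall = uninsured_price - rainy_day_fund
print(f"fund: ${rainy_day_fund:,}  bill: ${uninsured_price:,.0f}  "
      f"short: ${shortfall:,.0f}")
```

The exact numbers don't matter; the point is that paying list price out of pocket scales with the illness, while savings scale only with time.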
but are they redesigning every detail of the ARM for higher performance like Qualcomm does w/ Snapdragon?
Yes. Apple has purchased several companies that specialize in ASIC design, and the latest A6 CPU is the fruit of their labor. It is very different from other ARM processors on the market, so this should not be much of a surprise.
an iPhone is basically just a systems integration job, with the sophisticated core tech bought from elsewhere
To a certain point, yes. But by that same logic, all other cell phone manufacturers are even worse. And there is nothing wrong with contracting out design when a company requires a unique part. There are times you want to do it in-house and times when contracting out is the preferred solution. Apple performs more in-house design than any other cell phone manufacturer. Only Samsung comes close - but their displays are developed by a separate division and sold to everyone, so it really doesn't count.
My point is that you are greatly under-appreciating the difficulty and importance of integrating and specifying the various system components. Combining the different hardware and software to produce an efficient final product is typically the hardest part - assuming it's done right. And this is precisely why the market for premium cell phones exists. If it were easy, everyone would do it.
By contrast Intel is still one of the leading CPU design companies in the world, and almost always has the most bleeding edge fabs. What's that worth?
Based on sales - less than iPhones. Don't forget that Intel is not the only designer and manufacturer of CPUs. IBM has their Power7 series, which outperforms Intel's offerings - if you're willing to pay for it. Numerous low cost, highly efficient, low power CPUs exist from other companies. So while Intel is impressive, their value is still limited by the market in which they operate - just like every other company. It's why they're trying so hard to break into the mobile market.
"Stupidity, like virtue, is its own reward" -- William E. Davidsen