Comment: Re:Advice (Score 1) 63

by lordcorusa (#46409945) Attached to: Canonical Ports Chromium To The Mir Display Server

Apt does feature an unattended-upgrades mode. It's not the default, which is annoying, but it's pretty easy to configure. It's one of the first things I configure on a new Debian box.
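For anyone who wants to try it, the setup is a short sketch like this (option names from Debian's apt; double-check the file paths against your release):

```shell
# Install the helper package (it is not part of the base install)
sudo apt-get install unattended-upgrades

# Turn on the periodic runs; this writes /etc/apt/apt.conf.d/20auto-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades

# The generated file contains just two lines:
#   APT::Periodic::Update-Package-Lists "1";
#   APT::Periodic::Unattended-Upgrade "1";
# Which origins actually get auto-upgraded is controlled in
# /etc/apt/apt.conf.d/50unattended-upgrades
```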

As for outdated packages: Debian unstable and experimental usually contain cutting- and bleeding-edge versions of most open source software that is packaged in Debian. Unlike the grandparent poster, I would not recommend running sid (AKA unstable) as your main repository, because doing an apt-get upgrade on sid has occasionally wrecked my system. I use Linux Mint Debian Edition, which is based on regular tested snapshots of testing, but I do occasionally need cutting-edge packages. I would recommend looking into apt preferences, which provides a nice way to grab specific cutting-edge packages from unstable and experimental while dragging in a minimal number of unstable dependencies. So far I have had almost no problems doing this, and certainly far fewer problems than with third-party repositories.
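A sketch of the pinning I mean, in /etc/apt/preferences (the exact priority numbers are my habit, not gospel):

```
Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=unstable
Pin-Priority: 300

Package: *
Pin: release a=experimental
Pin-Priority: 200
```

With that in place, "apt-get install -t unstable foo" pulls just foo (and whatever dependencies it genuinely needs) from unstable, while everything else stays on your default release.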

Comment: Re:Shuttleworth is a lunatic. (Score 1) 63

by lordcorusa (#46407735) Attached to: Canonical Ports Chromium To The Mir Display Server

I'm quite happy with Linux Mint Debian Edition. It provides a nice middle ground between stable-but-out-of-date-and-highly-political Debian and unstable-but-nice-features Ubuntu. LMDE is basically a tested snapshot of Debian testing (like CUT was supposed to be, but actually regularly maintained) with a lot of the ease-of-use features (proprietary drivers, etc.) of Ubuntu. Unlike Ubuntu or regular Linux Mint (which is based on Ubuntu), LMDE is completely compatible with Debian. I can use the apt preferences mechanism to pull in some more recent packages from Debian unstable and experimental, more or less seamlessly.

Both the MATE and Cinnamon editions of LMDE are good, depending on your GUI preferences. MATE seems a bit more mature, but it's definitely old technology. Cinnamon provides modern compositing coolness, and has been making some huge improvements, but it's still rough around the edges with respect to some of the GUI configuration screens.

They just cut the new ISOs a few days ago, so the packages on them are fresh.

Comment: Re:CFL are no savings (Score 2) 990

by lordcorusa (#36739300) Attached to: Congress Voting To Repeal Incandescent Bulb Ban

I agree with the parent that quality control is a problem with all new CFLs, and disagree that it is merely a matter of household electrical quality. I made the switch to CFLs throughout most of my apartment way back in 2002. I did it because the bulbs in my overhead lighting sockets were inconvenient to replace, and I was replacing incandescents on average once every 2 months. I bought some Philips (or Sylvania? I don't remember) 32-watt bulbs that emitted the lighting equivalent of 120-watt incandescents. They were expensive at the time, but they worked beautifully: bright light, perfect color, and no flicker. All 9 of them kept working for well over the rated lifetime; 2 of them still are. I was very impressed and encouraged friends and family to switch.

Fast forward to last year. The bulbs eventually began to show signs of dying: flickering when turned on, and black charring around the bases. Over a period of months they died, and I bought some new replacement 23-watt (100-watt equivalent) CFLs from Philips and Sylvania. Of the 7 replacements I purchased in the last year, 3 have failed so far, and another is showing signs of failing. Furthermore, every now and again I can detect a bit of flicker from the new bulbs, but still none from the old ones. For each failure I have complained and gotten a coupon for a new bulb, but it's annoying and inconvenient, and it puts me in the same place as incandescents regarding frequent replacement.

I don't believe that I have electrical problems in my apartment. These new bulbs are plugged into the same sockets that the old ones used, and 2 of the old ones still use. Furthermore, I have SmartUPSs attached to my computers and monitor their logs, and they *very* rarely record a power-related event. The power grid where I live is stable.

My conclusion is that, as with most things in the business world, in the early years of CFLs corporations put out the best product possible to gain market acceptance. Now that the market has shifted and CFLs have gained acceptance, I think corporations are cutting corners and cutting costs to maximize short-term profits, and if that means higher failure rates, then that's just another cost to be balanced on their books.

Comment: Re:How is this different than an ad-hoc wireless L (Score 1) 78

by lordcorusa (#34018372) Attached to: Wi-Fi Direct Gets Real With Product Certification

I hate to reply to my own posts, but I just thought of an additional comment to add.

In the example of the picture frame, likely all of the extra Wi-Fi Direct magic will be baked into the firmware.

On the other hand, for devices like laptops, I doubt that they would put this amount of software into firmware. It is likely that the extra components that turn plain Wi-Fi into Wi-Fi Direct will be entirely software, delivered as a package of drivers and helper programs provided by the OS or via a setup disc. This sort of all-in-one setup will likely be offered to Windows and Mac users. However, users of independent operating systems, like Linux, will likely not see this, and will still have to set up these subsystems manually. Therefore, for Linux users, I imagine that we will see no real difference at all between a plain Wi-Fi chipset and a Wi-Fi Direct chipset.

Comment: Re:How is this different than an ad-hoc wireless L (Score 3, Informative) 78

by lordcorusa (#34018222) Attached to: Wi-Fi Direct Gets Real With Product Certification

According to Wikipedia, Wi-Fi Direct is ad-hoc mode Wi-Fi plus a built-in Wi-Fi Protected Access setup daemon, optional access point software (e.g., routing to other networks), and an as-yet undefined service discovery mechanism (e.g., UPnP, Bonjour). Basically, they are writing a standard that ties together several existing standards and best practices. This sort of meta-standard is quite common.

One example they give is a picture frame, which offers only the required ad-hoc mode Wi-Fi and Wi-Fi Protected Access daemon, and a simple service for file upload. The user would connect to it, upload pictures, and then disconnect. Nothing else would be offered by the frame, but the user would not need to do any manual setup or buy any additional devices.

A more complicated example is a cell phone that offers tethering. In addition to the required ad-hoc mode Wi-Fi and Wi-Fi Protected Access daemon, it has full-blown bridging/routing and service discovery daemons built in. The user would expect to treat this device more like an infrastructure-mode network in a single package; perhaps some setup would be required on the Wi-Fi Direct device, but virtually no additional setup would be required on each connected device.

So basically they are just making a standard, the implementation of which requires doing all of the things we have done manually for our own networks. This is just one step further in simplifying network setup, but not any kind of new revolution.

Comment: research paper tips (Score 4, Informative) 279

by lordcorusa (#32681482) Attached to: Best Way To Publish an "Indie" Research Paper?

0) By "greatly improves the performance" do you mean an improvement in asymptotic complexity, or merely by a constant factor? For example, are you going from O(n^2) to O(n log n), or only from roughly 10n operations to 5n (a constant factor, which Big-O notation deliberately ignores)? Don't get me wrong, the latter can be useful, but the former would draw more attention from the research community. I assume you know Big-O notation and formal analysis of algorithms; otherwise you will need to learn it before submitting a research paper on algorithms.

1) If you have never even read a full research paper, then how do you know your approach is new or better than existing approaches? First, I would recommend getting a data structures and algorithms book and a computational geometry book. Read through those looking not only for things similar to your technique, but also just to make sure you have the vocabulary right. Then move on to Google Scholar and start looking into the more recent scholarly journals and conference proceedings on the topic. You will need subscriptions (probably via a university) to see a lot of that content, but you can try the "All X versions" link beneath most articles to see if the author published a PDF on a public web site. Books are usually years behind the state of the art, and a lot of newer research and algorithms appear fully only in papers. Also, a lot of research (most?) is not published on blogs, so your algorithm may not be as new or groundbreaking as you think. Or if it is, you may still find inspiration to improve it in related techniques.

2) Ditto what others have said about learning LaTeX for page layout. However, if you might want to publish in a specific journal or conference, then you might have to use their specific format, so you might just want to type your first draft as plain text and a collection of images, for import into a specific LaTeX template later.
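A minimal skeleton for that plain first draft might look like the following (the title and section names are placeholders, not anything from the question):

```latex
\documentclass{article}
\usepackage{amsmath}   % for formal algorithm analysis
\usepackage{graphicx}  % for figures

\title{A Faster Algorithm for Geographic Computation}  % placeholder title
\author{Your Name}

\begin{document}
\maketitle

\begin{abstract}
One paragraph: the problem, the result, and the asymptotic improvement.
\end{abstract}

\section{Introduction}
The plain-text draft goes here; figures live in \texttt{figure}
environments so a venue's class file can place them later.

\end{document}
```

When a venue is chosen, usually only the preamble needs to change to that venue's document class.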

3) Writing style: You must be *very* *formal* in your writing style to be considered credible in academic circles. Have an English teacher (or similarly-minded person) go over the paper with a fine-toothed comb looking for any spelling, grammar, or word-use errors. Absolutely no slang or colloquialisms whatsoever are acceptable in a research paper. Do not use contractions. Try not to use any analogies unless they are truly apt and likely to be universally understood. Try not to use first or second person in the paper. Remember, people from all over the world from different cultures, many of whom do not speak English as their primary language, will hopefully be reading your paper, and you don't want them to get confused by any culture-specific concepts or words.

4) If your algorithm really is new or groundbreaking, then I would strongly recommend trying to publish in a proper academic workshop or conference first (try ACM or IEEE conferences on computational geometry, location-driven computing, etc.), rather than a free online archive. You will get far more credibility and exposure in academia, and you just might get your employer to pay for a junket to a conference! Workshops are more for newer, less developed research, so you may have an easier time publishing there. Conferences are for more established research, so it's harder to get in them, but they carry much more respect. Also, most workshops and conferences have industrial tracks, if your paper focuses less on formal algorithmic analysis and more on real-world uses.

5) Be warned though, that although conferences are supposed to be submitter-blind, often it's much easier to get a publication when you have a known academic co-author on the paper. You might want to look up authors of papers related to yours, find the Ph.D.s on the paper, and approach them about a collaboration. This might take a bit more time, and you would have to share credit (just make sure you are first-author), but it may be worthwhile to get more exposure and credibility. They might also be able to help point you toward making further improvements to your algorithm.

6) Please, please, do not patent your algorithm! There is more than enough patented math already; the world does not need yet another algorithm that can't be used by anyone for 20 years.

Comment: Re:what do you call "truly open" there?? (Score 4, Interesting) 322

by lordcorusa (#30333438) Attached to: Why Open Source Phones Still Fail

Take it from someone who owned one: the OpenMoko was a terrible phone and a terrible handheld computer. It was nearly useless when not hooked up to a computer via SSH over USB. OpenMoko earned an A for vision in getting a fully open and documented hardware interface, although the results were dubious (crappy GPRS GSM modem in an era when 3G was just becoming popular, crappy non-accelerated drivers for the video chipset). However, OpenMoko's worst failing was the total inability of the company to push a singular stable and complete platform for development; there were about 20 different incompatible distributions in various states of disarray, and you cannot have a platform for end-user app development in that sort of environment. (Imagine how unsuccessful Apple's app store or Android's marketplace would be if developers and users had to choose between 20 different incompatible distributions, all in permanent alpha status...) I think I can live with a few proprietary blobs if it means having a useful device. All of the open technology in the world means nothing if the platform dies on the vine before ever taking off. OpenMoko's ideal of a fully open phone platform proved unsustainable, as the company canceled their "next-gen" (translation: 2.5G in an era of 3G) phone and switched to producing a ridiculous "WikiReader" device which contains no pesky radio or accelerated video modules.

After more than a year of trying to use it, I was finally overjoyed to get rid of my crappy Freerunner. On the other hand, even though my N800 does not have a cell radio, I still like to use it, and I am strongly considering buying an N900. I think the OpenMoko was for people who love putting together distributions and blogging about how much freer their device is than everyone else's. A platform like the N800/N900 is for people who like programming mobile computers to accomplish useful tasks and then distributing those programs to non-programmers.

Comment: Re:Wiping the Hard Drive After Litigation (Score 1) 470

by lordcorusa (#27864203) Attached to: Court Sets Rules For RIAA Hard Drive Inspection

I don't know what the protocol is for civil litigation, so I do not know whether some officer would seize your equipment at the time of service of litigation, as happens in criminal matters.

But assuming that you are able to retain control of your machines and autonomy in their use for some time after being served, then it would actually be quite difficult to securely wipe them and reinstall them without leaving behind some evidence that could be discovered by a forensics expert. Other posts in this thread do a good job of going into detail about specific ways of telling that such a wiping happened, such as looking for evidence of massive patching, or unusually large timestamp jumps. If you are caught, which is likely, then even assuming that you are not subject to criminal penalties for evidence tampering, you can still be nailed by a default judgment against you in the civil matter (where the evidence has merely to be more likely than not, rather than beyond a reasonable doubt).

So, trying to wipe a drive is a losing strategy.

Your best bet for handling this situation requires some advance planning, regularly updated. You must have a brand new hard drive available *before* you get served. Then, assuming you retain control of your computer for some time, *immediately* remove your hard drive and destroy it, and replace it with the brand new one. Then you claim in your affidavit, in response to the request for discovery, that your old hard drive died *before* you were served and that you destroyed it *before* you were served. You have to have bought the new hard drive *before* you were served, because they can track when the hard drive was manufactured and possibly even sold, and if the records say it was sold *after* you were served, you get nailed for perjury. Also, the hard drive should be reasonably recent, as one would be unlikely to install a 5-year-old "new" hard drive after a failure rather than buying a newer drive at the time of failure. Note that some forensic analyses can identify a specific instance of an operating system install based solely on network port scans and other traffic analysis; even though it is currently unlikely that the opponent would have run such a scan on you before serving you, to protect yourself against potential proof that your operating system instance remained the same up until the time of discovery, you should *always* have a hardware firewall between your computer and the Internet.

Of course, the above paragraph details a theoretical method to attempt to subvert the legal system. I do not support perjury and my advice to you is to not to tamper with evidence or lie about evidence.

Comment: Re:OpenMoko (Score 1) 176

by lordcorusa (#27004911) Attached to: Android Gathers Steam Among Open Source Developers

Minor point of interest - OpenMoko is the software company, AFAICT, and FIC are the hardware company.

I don't know if that is an accurate statement.

FIC is definitely the hardware manufacturer, but OpenMoko seems to be responsible for a significant portion, if not all, of the hardware design for the OpenMoko phones. (At least, this is what I glean from mailing lists where OpenMoko employees discuss the development of hardware revisions.) This would put them squarely in the hardware company category.

Furthermore, OpenMoko has said or implied that their primary software responsibilities are to write drivers for low level hardware, and to provide basic utilities for testing the hardware. It seems that they have more or less pushed the responsibility of creating a stable application development platform, AKA distribution, to the ephemeral "community".

The abdication of the development of a singular, stable, comprehensive platform for application development was a strategic blunder of monumental proportions that will ultimately be fatal to the company. In the absence of a single platform, the Free Software community has done what it naturally does: fragment. No serious mobile application developers will target the OpenMoko, because there does not exist one stable, comprehensive API that is guaranteed to exist on all phones. (Note: I am not advocating that no one else should be allowed to provide an alternative distribution, or that the official distribution should be closed-source; I advocate that OpenMoko should have allocated the necessary resources to ensure that there would exist one stable, usable, and comprehensive platform before the first phones ever shipped to the general public.)

In the absence of a single platform, we have many competing platforms such as Gnome Mobile, home-rolled FSO, various partially incompatible versions of Trolltech Qtopia, and a seriously hacked-up Android derivative. On top of that are many competing GUI libraries, such as GTK, Qt, Enlightenment, and others. What does a developer target to support all OpenMoko phones? Already, many distributions include all the libraries; this is wasteful on a device with only 256MB of built-in storage. What does a developer do to make sure his app fits in thematically with the rest of the phone? Nothing can be done, with so many competing GUI libraries with incompatible widget sets and theming mechanisms. OpenMoko apps look and feel like a Chimera, the mythical Greek monster made of an assortment of animals.

OpenMoko still lacks basic features such as a usable on-screen keyboard (none of the alternatives are particularly good), a usable web browser (again, many options, none particularly good), and a usable email client. In general, we lack most apps in finger-friendly format. Note to OpenMoko developers: I DO NOT WANT to carry around a stylus with my phone. Furthermore, there is no proper centralized repository (opkg.org is close, but not a real repository), unless you use a Debian distribution, in which case you've got a whole raft of usability problems. In many cases, you have to download important applications from some random hacker's website in Russia (over GPRS, no less). And there is no digital signature mechanism currently in use by anyone that I am aware of, so I have no idea whether any of the software I download has been trojaned.

The purported advantage of the OpenMoko design over Android was the native support for X11; I was at first convinced that this was awesome, because "existing GNU/Linux apps could be ported in no time". However, I have seen what most "ports" consist of: dumping the app onto the phone with little or no GUI changes. The OpenMoko screen is 480x640, and the overwhelming majority of X11 apps are no longer designed to work at this scale. Often, with dumped apps, I find that critical GUI elements are off-screen, with no known way to get to them. Even when the GUI fits onto the screen, often the widgets are so small that they can only be manipulated with a stylus. To be honest, a proper finger-friendly porting of most apps would require so much design and coding rework that having an existing desktop GUI in X11 is no advantage over writing the GUI from scratch.

There is still no official encouragement of cellular Internet (currently, GPRS), which is the primary selling point of a smartphone. There is no easy GUI method for setting up Bluetooth devices, which is another selling point of most phones nowadays. Furthermore, there are still obnoxious hardware/firmware bugs. For example, my Freerunner does not warn me with audio when it is running low on battery (a UI bug, most phones have a little tone they make every few minutes when the battery is low), and when the battery runs out, I cannot recharge it without jumping through kludgy hoops; this is a known bug in hardware or firmware, but it has not been fixed for no apparent reason, despite rendering the phone unusable to non-hackers.

In general, OpenMoko still feels like it is a project controlled by hackers who like the idea of a phone, but only use the phone when it is sitting on their desk, attached to a computer via USB. I cannot believe that management could be this clueless about what it takes to build a product, so I will assume that the project is running so low on funding that they simply do not have access to sufficient resources, and are desperately trying to keep it going, hoping for another round of funding.

Wow, that was long, and a bit more ranty than I wanted when I began writing it. I started out extremely enthused about the OpenMoko project, and even when the Freerunner arrived in my mail and was barely functional, I made excuses rather than being upset. The software has improved somewhat over the last 6 months, to the point where it is a usable basic phone for a hacker. However, now that Android is (almost) as Free as the OpenMoko, and the Android phone is available unlocked, and both the Android phone and the Android software are both so much better than the Freerunner and the various OpenMoko distributions, I can only wait until I have saved up enough money to buy an Android phone. I might even try Android on the Freerunner, especially if it becomes less of a hack to install. I simply see no reason for OpenMoko to continue to exist, either as a software or hardware platform. They had a great idea, much earlier than others, but they simply bungled it.

Comment: Re:Potential for Netbooks (Score 1) 244

by lordcorusa (#26920193) Attached to: Web-based IDEs Edge Closer To the Mainstream

As others have mentioned, `xhost +` is unnecessary in this scenario and potentially harmful.

Also, X is a bit painful to use over slow and/or high-latency connections. For this, you may want to set up FreeNX. Once set up, it works similarly to X, only optimized for low-speed, high-latency connections.
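For completeness, the usual xhost-free route is SSH's own X11 forwarding (hostname here is hypothetical):

```shell
# Forward X11 over SSH: no xhost needed, auth is handled with
# per-session xauth cookies
ssh -X devbox.example.com

# In the remote shell, DISPLAY is already set by sshd; just launch the app
xterm &

# -Y enables "trusted" forwarding, lifting the X SECURITY-extension
# restrictions; use it only for hosts you fully trust, since remote
# apps can then snoop on your local X session
```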

Comment: old adage (Score 2, Insightful) 231

by lordcorusa (#26775809) Attached to: Wikileaks Publishes $1B of Public Domain Research Reports

The end result is the old adage I first heard applied to the Chicago political machine of the 1960s: A government does not have to be good, and rarely is. It only has to be good enough that the populace will tolerate it.

An older version of the same adage:

Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn that mankind are more disposed to suffer, while evils are sufferable than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.

Comment: Re:Performance Crippled? (Score 1) 101

by lordcorusa (#25588795) Attached to: Triple Booting an Intel Mac the Right Way

The last time I installed using LVM, you still needed a separate /boot partition. Maybe that's changed, though.

I used to use LVM religiously. I still love its ability to resize partitions. However, when anything goes wrong with the filesystem, it's really, really painful to try to fix it. I finally quit using LVM because of that.
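To be fair, the resizing really is painless when it works; a typical grow operation is just this (volume group and LV names are hypothetical):

```shell
# Grow the logical volume by 10 GiB...
sudo lvextend -L +10G /dev/vg0/home
# ...then grow the ext3/4 filesystem to fill the new space
sudo resize2fs /dev/vg0/home

# Newer LVM tools can combine both steps:
sudo lvextend -L +10G -r /dev/vg0/home
```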

Cellphones

Neopwn, the World's First Pentesting Mobile Phone 103

Posted by timothy
from the data-rate-plan dept.
thefanboy writes "What do you get when you cross BackTrack Linux apps with a mobile phone? This is the first ever publicly available mobile phone running a full custom Linux network auditing distribution, and it runs it surprisingly well. One can literally go from phone to pwn in 2 seconds. Based off of the Openmoko Neo Freerunner, many steps have been taken to compensate for the lack of a QWERTY keyboard with automation scripts, dialogs, and a point-and-pwn menu. It runs applications such as Metasploit and the Aircrack suite quite well, especially given the fact that it supports a wide array of USB WLAN cards."
Education

Ben Stein's 'Expelled' - Evolution, Academia and Conformity 1766

Posted by timothy
from the ben-stein-is-smarter-than-you dept.
eldavojohn writes "Painting the current scientific community as just as bad as the Spanish Inquisition, an extended trailer of Ben Stein's "Expelled" has a lot of people (at least that I know) talking. It looks like his movie plans to encourage people to speak out if they believe intelligent design or creationism to be correct. In the trailer he even warns you that if you are a scientist you may lose your job by watching 'Expelled.' Backlash to the movie has started popping up and this may force the creationism/evolutionist debate to a whole new level across the big screen and the internet." adholden points out a site called Expelled Exposed, which asserts that 'Expelled' "is simply an anti-science propaganda film aimed at creating controversy where none exists, while promoting poor science education that can and will severely handicap American students."
The Internet

Scientology's Credibility Questioned Over Video Channel 450

Posted by ScuttleMonkey
from the scientology-and-shady-almost-synonyms-these-days dept.
stonyandcher writes to share that the Church of Scientology has come under fire for some items on their recently launched video channel. Most notably, claims have been leveled that dignitaries in one of their videos were faked and at least one of the people featured in the video is claiming their statements were taken out of context.
