
Comment: Re:Huh? (Score 1) 222

by Seyedkevin (#37705384) Attached to: We Finally Know Why Oil and Water Don't Mix

I suppose science never truly reaches the complete "why". Not yet, anyway.

For example, one could ask why adding vinegar to baking soda makes bubbles. Because it's an acid-base reaction: the transfer of hydrogen produces water and carbon dioxide. One can then ask why they react. In fact, one can keep asking "why" of each explanation until we get down into particle physics and eventually hit some property of matter that we don't truly understand and must explain with "because they just do."

In a sense, I think science is about trying to reduce the number of theories based on pure observation, in an attempt to understand things in terms of the 'core' properties. An analogy would be explaining how a bit of code running in a Java VM works in terms of machine code.
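
The analogy above is about the JVM, but the same layering can be sketched with Python's own bytecode (a rough stand-in, not JVM-specific): one line of source becomes a handful of interpreter instructions, which are in turn carried out by machine code.

    import dis

    def add(a, b):
        return a + b

    # Show the lower-level instructions the interpreter actually executes for
    # this one-line function; the interpreter itself is machine code below that.
    dis.dis(add)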

Comment: Re:Open Source Definition vs Free Software Definit (Score 1) 107

by Seyedkevin (#37429496) Attached to: Celebrate Software Freedom Today

Most people don't follow the Open Source Definition when calling something 'open'. The term 'open' has always been close to the dictionary definition and is a more relative term than Free Software. It's often coincidental that "open source", as the average person uses it, lines up with the OSI's Open Source Definition.

In addition, 'open' is often treated as a kind of consumer-inspired movement to go beyond source code licensing toward a business model that's closer to the "Bazaar" model in just about every aspect.

Remember, most of these open source projects are small, run on donations, and would rather have their bandwidth used for something useful. There is no obligation for Free Software/Open Source projects to let you use up massive amounts of bandwidth on their servers to be 'open', which, in this case, means little to their cause. If they wanted you to have offline documentation, they'd have put out download links or included the documentation in the source code, which your distro probably ships in its package as well.

That being said, you can still spider their site by ignoring robots.txt and changing your user-agent.
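
For what it's worth, a minimal sketch of that in Python (standard library only; the URL is a placeholder): urllib never consults robots.txt, and the User-Agent header is whatever you choose to send.

    import urllib.request

    # Placeholder URL; point this at the documentation page you want a copy of.
    url = "http://example.org/docs/index.html"

    # urllib doesn't look at robots.txt, and the User-Agent is whatever you set,
    # so a browser-like string slips past naive bot filters.
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req) as resp:
        page = resp.read().decode("utf-8", errors="replace")

    print(len(page), "bytes fetched")

A real spider would also extract links, queue them, and throttle itself so it doesn't hammer a donation-funded server.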

Comment: Re:nginx has its problems, too. (Score 1) 82

by Seyedkevin (#37211642) Attached to: Apache Warns Web Server Admins of DoS Attack Tool

Web servers run without root privileges so that the server isn't capable of doing overtly harmful things to the system, but it can still modify the things the web server is supposed to modify; in other words, it can still mess those up.

If you want to give scripts a separately isolated area, you can use suEXEC: http://httpd.apache.org/docs/2.0/suexec.html -- file system permissions take over from there.
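
Roughly, the suEXEC setup looks like this hypothetical httpd.conf excerpt (user, group, and paths are placeholders); from there, ordinary ownership and mode bits decide what those scripts can touch.

    # CGI under this vhost runs as its own unprivileged account via mod_suexec.
    <VirtualHost *:80>
        ServerName example.org
        DocumentRoot /srv/www/example
        SuexecUserGroup example-web example-web
        ScriptAlias /cgi-bin/ "/srv/www/example/cgi-bin/"
    </VirtualHost>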

I don't know too much about SQL servers, but couldn't you use Kerberos or something instead of database passwords directly?

Comment: Re:Openness? (Score 1) 167

by Seyedkevin (#37132976) Attached to: ARM Is a Promising Platform But Needs To Learn From the PC

I don't understand why people keep repeating the misconception that "open" means "incompatible is good".

Yes, there are people who create new standards, perhaps because they felt it was logical, found it impractical to adhere to pre-existing standards, or simply made a mistake. But the vast majority of open source is about implementing existing standards in different ways.

When open source applications *do* create new standards, it's very common for said standard to come with a nice little library so that anyone else can reimplement it. This isn't like proprietary software, where someone has to reverse engineer the program and fight through obfuscation (see Skype) before another application can communicate with it -- in other words, before anyone else can contribute to the universality of the standard.

But here we're talking about hardware, and you can't change hardware. You could say that every ARM chip is like a different standard that Linux is supposed to abstract away, which is redundant effort and a hassle for everyone. Openness, I think, is about allowing others to contribute, and standardization helps greatly to let that happen.

Heck, look at POSIX. It has done some of the most good for FLOSS, since it allows all compliant operating systems to contribute to one another.

It's *not* ironic, because openness thrives on standards.

Comment: Re:Another non-exploit (Score 1) 184

by Seyedkevin (#37021258) Attached to: Guide To Building a Cable That Improves iOS Exploits

"Unless it's a completely closed system, security is never perfect."

I'm baffled as to why you think a closed system is somehow more secure. Security through obscurity is by no means "perfect".

On a separate note, I think the security of a system only really matters within the scope of possible attacks. For example, a script that can crash the system isn't an 'exploit' if there's no other exploit that would let an attacker run that script in the first place. Clearly, a defense against such a crash should still be in place to harden the system, but it only becomes an exploit in combination with other exploits.

Nearly all systems should not be capable of a remote exploit and are, in that sense, secure. It's only when you start running services like an SSH server or an SQL server that the possibility of remote exploits becomes evident. Fortunately, mobile devices don't run these kinds of services, so any exploit depends on the user executing malicious code themselves, intentionally or not.

Additionally, having hardware access is usually regarded as having the system completely under your own control. That makes mobile a different ball game, since you can't practically lock the phone behind steel walls. If hardware hacks require software exploits that a malicious attacker cannot perform, I don't see how that's a problem. It's only a problem for those who voluntarily use software exploits, open up their system, and then let an attacker have physical access to it.

In a nutshell, calling this cable hack an exploit is like saying that it's an exploit if a root user can run "rm -rf /".

Comment: Re:In other words (Score 1) 156

by Seyedkevin (#36836122) Attached to: Mozilla Announces Enterprise User Working Group

"We're all missing the point here. Why does a web browser need to run a database for its bookmarks?"

Uh, what? It has to store data somewhere.

You make it sound like Firefox comes bundled with a full-on SQL server, when in reality it just reads and writes a SQLite database and some XML files. SQLite is essentially an embedded, indexed take on flat-file storage: a library operating on a single file, not a server process.
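
To make that concrete, here's a minimal sketch of reading bookmarks out of that database with nothing but Python's standard library; it assumes the moz_bookmarks/moz_places layout of Firefox's places.sqlite, and should be run against a copy of the file since Firefox keeps the live one locked.

    import sqlite3

    # Point this at a copy of the profile's places.sqlite file.
    conn = sqlite3.connect("places.sqlite")

    # moz_bookmarks holds the bookmark entries; moz_places holds the URLs.
    query = """
        SELECT b.title, p.url
        FROM moz_bookmarks AS b
        JOIN moz_places AS p ON p.id = b.fk
        WHERE b.type = 1
    """
    for title, url in conn.execute(query):
        print(title, url)

    conn.close()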

Although it'd be kind of cool if Firefox could save its config in a centrally maintained SQL database like KDE.

Comment: I actually liked Evolution... (Score 1) 283

by Seyedkevin (#36686788) Attached to: Thunderbird Unseats Evolution In Ubuntu 11.10

Evolution had all the features I wanted and none of the ones I didn't care for. I'm not saying it's perfect, but after spending hours trying to get Thunderbird to do what I wanted, I realized it was just too painful to use.

If, for example, you want PGP support, good message threading, forcing plaintext when available, Google Contacts and Calendar sync, script-generated signatures, ActiveSync support, mbox spool support, maildir support, and what have you, it's all integrated into Evolution by default. A lot of the Thunderbird add-ons that accomplish these things feel really subpar. I recall at one point accidentally deleting all my Google contacts via a Thunderbird add-on. Thankfully, I had my contacts backed up, so I didn't suffer too much from that.

That being said, if all you want, like the majority of people, is a simple client for reading mail, Thunderbird is much leaner and probably better. I feel Evolution is more suitable in an enterprise environment due to its ActiveSync support and its similarities to Outlook. Since Ubuntu is becoming more and more of a personal desktop OS rather than an enterprise OS like what Red Hat offers, I'd say the decision to adopt Thunderbird was a good one.

Additionally, Evolution used to be a lot slower than it is now; it was practically unusable if you had a high-volume inbox. The Windows build is based on an older version of Evolution (available here: http://www.dipconsultants.com/evolution/), which makes it easily inferior to Thunderbird on Windows.

Comment: Re:The unmentioned BIGGER mistake... (Score 2) 213

by Seyedkevin (#36635514) Attached to: The Most Dangerous Programming Mistakes

In UNIX operating systems it's common practice to use separate user accounts for daemons and to not run anything as root. Isn't that enough to do what is being asked? In addition, we have PAM, SELinux, PaX, and more, which are capable of locking down the system beyond filesystem permissions and user accounts.

These systems probably aren't configured by home users at all, but they're practically mandatory in enterprise environments.
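
To make the daemon-account practice concrete, here's a minimal sketch (the "daemon-user" account name is a placeholder) of a service that binds a privileged port as root and then drops to its own unprivileged user; PAM, SELinux, and PaX add policy on top of this.

    import os
    import pwd
    import socket

    # Bind the privileged port while we still have root...
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind(("0.0.0.0", 80))
    sock.listen(5)

    # ...then switch to a dedicated, unprivileged account before handling any
    # traffic, so a compromised worker can't touch the rest of the system.
    account = pwd.getpwnam("daemon-user")
    os.setgroups([])            # drop supplementary groups
    os.setgid(account.pw_gid)   # change group first, while still privileged
    os.setuid(account.pw_uid)   # after this, root privileges are gone for good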

Comment: There's more to privacy than just selling data (Score 1) 80

by Seyedkevin (#36611584) Attached to: Survey Shows Support For New Privacy Laws

The level of tracking that advertisers and the like do isn't really personally identifiable information -- they don't try to learn your identity so much as keep tabs on which other ad-carrying websites you've visited. If a company collects data from you, it should be, at the very least, for some technical purpose like showing relevant ads based on the "Likes" you have on Facebook.

That said, I wouldn't want anyone selling this kind of information to data miners for the sole purpose of stalking your online life. What's the point of privacy settings if they're just going to ignore them and sell all your data to any company that shows up at the front door with cash? And if they do sell it after you've given them permission, you should also be notified when and to whom your information was sold.

They really should also mandate that companies reveal how they process your information internally. Important details include how passwords are stored (plaintext, hashes, salted hashes, which algorithm, etc.), whether data is encrypted at all, how encryption keys are stored if encryption is used (Are they stored in client software? On the server? Are the keys on the server accessible to the admin, or are they encrypted using your password? What kind of keys are they?), how long collected data is kept, and how it can be used.
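
For the password point specifically, "salted hashes" in practice looks something like this minimal Python sketch (the iteration count is illustrative, not a recommendation):

    import hashlib
    import hmac
    import os

    def hash_password(password):
        # A fresh random salt per user keeps identical passwords from
        # producing identical stored hashes.
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
        return salt, digest

    def verify_password(password, salt, digest):
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
        return hmac.compare_digest(candidate, digest)  # constant-time comparison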

There are a lot of "should"s in here, but at the end of it all, it's probably safe to assume that the corporations selling data to data-mining companies will put up quite a fight.

Comment: Re:With all respect to Torvalds: (Score 2) 116

by Seyedkevin (#35582766) Attached to: Linus Says Android License Claim Is 'Bogus'

"Not to discount Linus, but I think RMS or FSF lawyers would be more qualified than Linus to speak on this particular area."

While I agree with you completely, as cited in TFA, RMS states the following:

I've talked with our lawyer about one specific issue that you raised: that of using simple material from header files. Someone recently made the claim that including a header file always makes a derivative work. That's not the FSF's view. Our view is that just using structure definitions, typedefs, enumeration constants, macros with simple bodies, etc., is NOT enough to make a derivative work. It would take a substantial amount of code (coming from inline functions or macros with substantial bodies) to do that.

Header files don't have much information in them. Without context as to how the program operates, they're just a collection of meaningless declarations and values.

Besides, even if this were to qualify as a derivative work, who in the kernel community would consider it significant enough for a lawsuit? In reality, this is just a non-issue brought up by a Microsoftie to spread FUD by trying to convince consumers that Google is stealing code. If the complaint had come from the owner of the code in question, or anyone related, this would of course be completely different.

Comment: Re:Forcing authors to lose rights over work (Score 1) 391

by Seyedkevin (#31399190) Attached to: Ask the UK Pirate Party's Andrew Robinson About the Issues

On the contrary, I think that in our current situation, shortening the copyright term would only make things worse for the GPL. Think of it this way: the GPL was designed to keep software from going proprietary, but once a work had been out for five years or whatever, it would be in the public domain. That means a company like Microsoft could just paste GPLed code in. Yes, the code might not be the most up-to-date version, but the initial release is probably all the company needs--patching up the program themselves to suit their needs is not that big of a deal. Plus, after that, they'd get patches entering the public domain on a daily basis.

Also, what if they infringe on the GPL and use works before they enter the public domain? It's difficult to tell whether the company wrote the feature or used a patch illegally, and as long as it's not a brand-new feature, there's a good chance the work will have entered the public domain before any legal action is resolved.

On the other hand, proprietary vendors do not release source code, so we'd never see the source, ever, even after it enters the public domain. In addition, they still have other terms we have to agree to, like EULAs and terms of use. At the end of the day, I think a proprietary vendor being forced to put their object code in the public domain doesn't actually hurt them much, but it does hurt the GPL.

15-Year-Old Student Discovers New Pulsar 103

Posted by ScuttleMonkey
from the sky-isn't-the-limit dept.
For the second time in as many years, a student has made a discovery while participating in the Pulsar Search Collaboratory (PSC), a joint program between the National Radio Astronomy Observatory and West Virginia University designed to get students and teachers involved in analyzing data from the Robert C. Byrd Green Bank Telescope (GBT). This time it was high school sophomore Shay Bloxton, who discovered a brand new pulsar. "For Bloxton, the pulsar discovery may be only her first in a scientific career. 'Participating in the PSC has definitely encouraged me to pursue my dream of being an astrophysicist,' she said, adding that she hopes to attend West Virginia University to study astrophysics. Late last year, another West Virginia student, from South Harrison High School, Lucas Bolyard, discovered a pulsar-like object called a rotating radio transient. His discovery also came through participation in the PSC."

Not Enough Women In Computing, Or Too Many Men? 686

Posted by timothy
from the rational-choices dept.
itwbennett writes "Do geeks really 'drive girls out of computer science,' as the headline of a LiveScience article contends? Blogger Cameron Laird doesn't think so. In fact, 'I don't think "gender issues in computing" is important enough to merit the attention it gets,' says Laird in a recent post. And maybe the problem isn't that there are too few women in computing, but that there are too many men. 'I'm waiting to read the headline: "Women too smart for careers with computers,"' says Laird, 'where another researcher concludes that only "boys" are stupid enough to go into a field that's globally-fungible, where entry-level salaries are declining, and it's common to think that staying up all night for a company-paid pizza is a good deal.'"
