Comment Re:Repositories for the win (Score 1) 718

The software center isn't really designed for developers, but rather for the "normal" user (i.e., the ones who reach out to geeks like us to fix their printer/monitor/iPod/foot pedal). For developers, using Synaptic, apt-get, KPackageKit, or whatever is probably more desirable, even though we generally end up needing to pick through all those extra libraries that aren't pertinent.

Comment Re:THIS IS A FARCE (Score 1) 510

I absolutely agree. Trying to encrypt every bit of data is silly, to put it lightly.

While I have no empirical data to support this, I would venture to guess that most data security breaches don't happen because somebody got a dump of the data from a database server, but because somebody found a way to exploit the server and retrieve the data after it has been decrypted. For instance, any administrative interface that allows searching for customers would probably retrieve unencrypted information, or have some relatively simple method of doing so.

In the event that every bit of "PII" data is encrypted, I would imagine that a "hacker" with enough access to view the raw data wouldn't have too much trouble finding the key. I would bet that it would take very little time before the programmers or sysadmins changed the encryption so it didn't require a password, or stored that password somewhere fairly easy to read, just to save a few seconds whenever the server (or application) has to be restarted. All it would take is getting called in once on vacation or on a weekend to find the server/service stopped at the "enter password" prompt. I have seen it all too often.

The system is only as strong as its weakest link. Imagine taking all this time to secure the data, only to get hacked because the server allowed root to SSH in and the password was simple (e.g., {company name} + "123"). A few minutes on Google would get just about anyone enough code to read data as it is being read/written, thereby circumventing the key entirely.

Comment Re:You know, I've dealt with this kind of problem. (Score 1) 306

That is truly a cool story, and a great example of how inefficiencies get into the system.

As an IT contractor, I'm accosted on a daily basis by the idea that I should only give the customer what they want and nothing more, or the idea that it is totally awesome when a project takes longer than expected or simply never ends. Most of the time it isn't thrown out that bluntly, but the end result is the same, and the basis for it is the need for job security. It encourages concentrating only on the little pieces and never taking the big picture into consideration: a security blanket bought at the price of real innovation.

"Hey, Jon. I wanted to show you this sweet software that could make Nancy's office way more efficient. No more paper-pushing!"

When the word "change" is thrown around in places where papers are routinely pushed and nothing electronic is involved, people get scared and start thinking things like, "they're going to replace me with a computer." The people doing the change aren't scared, because their JOB is to do the change; the people affected by it are scared because nobody tells them what the change involves... and now everybody starts talking even more.

"I heard Nancy and her whole office could get replaced by one computer program."

"I heard Nancy and her whole office are being replaced by one computer."

"I heard Nancy and her whole crew got replaced by computers."

"I heard Nancy's office was replaced by computers."

"I heard Nancy was replaced by a robot."

"Did you hear Nancy's whole building was destroyed by a huge robot?"

"Holy shit... is that Arnold?"

Comment Re:When do people get this (Score 2, Insightful) 613

The 'anecdote' to which you refer is completely plausible. Remember that the info wasn't based on any sort of raw processing power, frames per second, or disk throughput: it was based on human perception. Don't ever throw the two together and expect that they will always (or even mostly) match up.

I can tell you that a computer with 4x the specs of another doesn't necessarily run better, regardless of the (Windows) OS. This can have a lot to do with hard drive fragmentation: a low-spec system that is defragmented routinely can appear to run faster than a high-end machine with 50% file fragmentation. I know it sounds like I'm picking on Windows, but the issue with data fragmentation--and the need to defragment--rests wholly on the shoulders of Windows users. And again, the speed difference, especially when it comes to perception, can be hugely affected by how much the hard drive is thrashing when a program starts.

Another issue in perception is video performance. Two identical machines with the same model of high-end video card can have completely different apparent speeds if one has problematic drivers. Plenty of other factors play in, too. I remember when a person would say their high-end, I-just-spent-more-on-my-computer-than-my-car computer was running slow simply because their dial-up connection was slow: the computer was running perfectly, but the user perceived it as a problem with the computer (instead of the slow connection) because the data wasn't loading fast.

Comment Re:So Iran's standards then? (Score 1) 697

Purchasing these nude pictures constitutes crossing the state border? Wow... by that logic, the simple act of going to *ANY* website with *ANY* sort of obscene pictures, even if they're just poor-taste adverts, is sending obscene/illegal digital media across state borders. Holy shit.

I can see where this is headed.

Lawyer: "You're being sued by my client. You illegally pirated copyrighted digital media from my website."

Person: "What? What the hell are you talking about?"

Lawyer: "On March 10, 1993, you downloaded three pictures from rapemyart.com."

Person: "No I didn't."

Lawyer: "We have proof. Through a search you performed on DogBone.com for 'anal seepage', you downloaded a copyrighted picture through an advertisement."

Person: "I didn't download it."

Lawyer: "Your browser cached the picture. That constitutes downloading a copy. Downloading or copying copyrighted digital media through any means constitutes piracy."

Person: "Wait... my browser cached it, so I'm getting sued?"

Lawyer: "Yes. Also, you can't talk to anyone about this. The individual in the picture was a suspected terrorist, so we're seizing everything you own under the Patriot Act. All your base are belong to us. Resistance is futile."

Comment Re:So Iran's standards then? (Score 1) 697

Laws telling people what they can and cannot read/view in their own homes are wrong: these laws are, very simply, imposing moral/religious values upon people. The government and the laws it is built to uphold should NEVER impose moral or religious views on its people. I can't think of any reason why anything on the Internet would ever be illegal to view... that's like trying to tell me that "sector X" of the sky is okay to view with my telescope, but that I'll go to prison if I view "sector Y".

Comment Re:Looks good (Score 1) 401

*sigh*

GIMP will simply never be as popular as Photoshop. Even if Photoshop becomes less popular, it will still be the dominant verb, like "Coke" is to soda, "Kleenex" is to nose tissue, and "Q-tip" is to ear swabs.

"That picture looks Photoshopped" vs. "That picture looks GIMPed." I think we all know who the clear winner is, even if the program that actually did the work was GIMP (or some other program altogether).

Comment Re:Doesn't matter (Score 1) 370

Seriously, it really doesn't. I haven't cared about the default search provider for quite some time, thanks to the use of keywords in Firefox: when I want to search for something on Google, I type "gg something" and I get the results page, which is faster than tabbing or clicking over to the search box, and I know where the result will go. Just typing "something" into the browser's URL bar is asking for trouble (assuming you're not typing a literal URL).

FYI: to do keyword searching on Google, go to a results page, bookmark it, then change everything after the "q=" to "%s", and put "gg" into the Keyword field (you may have to click the "more" button to see it). The URL in my bookmark is "http://www.google.com/search?hl=en&q=%s&btnG=Search".

I've done this with a lot of sites I search on. Dictionary.com (dictionary.reference.com) is "dd", Thesaurus (thesaurus.reference.com) is "tt", Wikipedia is "wiki", IMDB is "imdb", and AcronymFinder is "ac". Now I don't have to type a longer search string for definitions, alternative words, Wikipedia articles, and whatever else. Super fast, very simple.
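For the curious, the substitution itself is simple enough to sketch: whatever follows the keyword gets URL-encoded and swapped in for the "%s" placeholder. A rough Python illustration (the function name is mine, not anything Firefox actually exposes):

```python
from urllib.parse import quote_plus

def expand_keyword(bookmark_url: str, query: str) -> str:
    """Substitute the URL-encoded query for the %s placeholder,
    roughly what Firefox does with a keyword bookmark."""
    return bookmark_url.replace("%s", quote_plus(query))

url = expand_keyword(
    "http://www.google.com/search?hl=en&q=%s&btnG=Search",
    "file fragmentation",
)
print(url)  # http://www.google.com/search?hl=en&q=file+fragmentation&btnG=Search
```

So "gg file fragmentation" lands you straight on the results page, no search box required.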

Comment Re:Again? (Score 1) 361

The truly ridiculous part is how bass-ackwards the industry is becoming.

First, they attempt to put safeguards into things to keep them from being copied, digitally or otherwise (e.g., Macrovision for VHS, or purposely writing bad blocks into the data on DVDs and CDs).

Next, they implement silly DRM in the media so you can copy it on their terms: Sony's rootkit fiasco on audio CDs; BS DRM that requires an internet connection before you can use your own digital file--assuming the DRM server is still running.

Now they've decided to "make it easy": you can watch your digital copies wherever you want, on whatever you want, as long as you're using the industry's "user-friendly" online systems to stream it, i.e. "D.E.C.E.".

OOH! I GOTS AN IDEA!

Let's take this DECE thing, but drop the requirement for an online system. Take out the part where special hardware has to encrypt it, so anybody can watch this content on any existing system. Now EVERYBODY can use it!

Now, set up an online content system so people can buy and download digital copies in a couple of clicks, and avoid putting any "codes" into the thing. No typing codes to activate the content or to copy it to another device. Now you've got a system where anyone can use these digital copies on any device without being online, and they can buy new copies whenever they want! Oh, wait... this scheme doesn't have any DRM in it... let's call the new scheme "NRM": "No Rights Management".

Comment Re:On Which Planet? (Score 1) 460

I think we're arguing semantics here. Placing a machine (or machines, or devices) behind a NAT firewall means it is implicitly NOT directly accessible from the Internet unless access is explicitly granted. So if BrokenMachineX has port 22345 open and exploitable, placing it behind a NAT-enabled router means things on the Internet won't be able to talk to it: the router will receive the request, but can't route it anywhere because it doesn't have any forwarding rules for that IP.

It is true that improperly set up NAT-enabled routers might put a device into the DMZ by default: if this happens to be BrokenMachineX, then port 22345 will be directly accessible from the Internet. Another machine/device on the network with the same problem will still not be accessible, however, due to the nature of NAT. Of course, if BrokenMachineX *initiates* a connection to something on the Internet, that machine can communicate with it.
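The mechanics are easy to sketch with a toy model (this is an illustration, not how a real router is implemented): the translation table only gains entries when an inside host initiates a connection, so an unsolicited probe at BrokenMachineX's port 22345 has nowhere to go.

```python
class ToyNat:
    """Toy model of a NAT translation table, for illustration only."""

    def __init__(self):
        self.table = {}          # public_port -> (private_ip, private_port)
        self.next_port = 40000   # arbitrary starting point for allocations

    def outbound(self, private_ip, private_port):
        """Inside host initiates a connection; allocate a public port."""
        public_port = self.next_port
        self.next_port += 1
        self.table[public_port] = (private_ip, private_port)
        return public_port

    def inbound(self, public_port):
        """Packet arrives from the Internet; forward only if a mapping exists."""
        return self.table.get(public_port)  # None means "dropped"

nat = ToyNat()
print(nat.inbound(22345))                    # None: unsolicited probe, dropped
mapped = nat.outbound("192.168.1.50", 22345)
print(nat.inbound(mapped))                   # ('192.168.1.50', 22345): reply traffic
```

The only way traffic reaches the broken box is a reply to a connection it opened itself, or an explicit forwarding rule--which is exactly the point.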

By your own words you denote the ability to protect private addresses through the gateway. If you think I'm still wrong, explain yourself.

Comment Re:On Which Planet? (Score 1) 460

Don't be an idiot. You're talking about something that you apparently either have inadequate or no knowledge about.

Using NAT means devices cannot be directly connected to from the Internet unless rules are specifically put in place to allow it. Using a public IP address means the world has access by default. There is a HUGE difference. Presuming that something is behind a firewall doesn't make it so.

Comment Re:On Which Planet? (Score 1) 460

... it allows all manner of insecure and misconfigured gear to avoid being probed from the other side of the planet...

Bravo! NAT absolutely does provide security features in exactly the way the parent described. To say that NAT does not provide security is like saying SSL does not provide security. Neither is the be-all-end-all of security, to be sure: NAT insulates broken network devices from the rest of the Internet, and SSL encrypts data to keep sensitive information from being readily used by malicious Internet users and devices.

Comment Re:We are better off without such charitable peopl (Score 1) 569

Absolutely. Further, the point isn't whether you support or oppose piracy: the point is that you shouldn't be giving up 99.9% of your FREEDOM so that a bunch of idiotic rich pricks can block the 0.1% of people that might possibly have engaged in some sort of piracy.

Yes, there are bad people out there--people who steal for the sheer purpose of depriving a person of something or causing them anguish. But "stealing" something over the Internet in the way that the RIAA, MPAA, and various other idiotic money-centric organizations/people explain it is ridiculous.

This whole thing revolves around the RIAA and, to a lesser extent, the MPAA. It's all a ruse in a (hopefully) futile attempt to exert more control in order to make more money. Instead of innovating and making technology that allows people to purchase their products with less effort than it takes to steal them (or with more quality), they blame their inadequacies on that which they cannot control.

What if everyone that used e-mail had to use a system that registered every email they sent or received with the government? People are now horribly inconvenienced and penalized for using what used to be a convenient method of communication while those that are abusing the system (spammers) are still sending unregistered emails. This is exactly what those money-grubbing jerks are trying to do to the Internet: punish the masses for the sins of the minorities in an attempt to recoup imaginary losses.

They spend massive bundles of cash to build broken technologies like DRM to keep people from copying these files. Joe Schmoe down the street doesn't know what DRM is, but gets pissed off when he has to spend a week without his computer after "installing" a music CD onto it, waiting for his grandson to come over and fix it. Meanwhile, those who had been copying files before DRM continue to do so with little mitigation. And now normal people find out about services that let them get digital versions of their stuff for free, requiring very little technical knowledge beyond being able to use a keyboard and search.

Comment Re:Not more safe (Score 1) 611

One of the biggest drawbacks of the Windows Registry is that it truly is a single point of failure. If something corrupts the registry, the computer won't boot anymore and oftentimes there is no way of fixing the problem. Since the registry holds configurations for pretty much every program installed, the user must not only reinstall Windows, but also all the programs.

In this way, Linux is truly more resilient than Windows. Should something similar to the aforementioned registry corruption happen on a Linux box, the user might still be unable to boot: in this case, however, they can generally boot into single-user mode (or off a boot CD) to fix or remove the problem. Since the problem would be with a configuration file, the worst case is that the file has to be removed.

But let us not bash Windows just for the sake of bashing it: I think a fair comparison is in order.

All that being said, I don't really think that Linux is 100% secure by design, though it has tons of advantages compared to Windows. For the most part, a Linux user has permission to edit things only in their home directory, so installed software generally can't take the whole system down; in Windows, a user can potentially access most of the system, especially since many Windows installs give the user administrative rights (or they log in as Administrator) by default. A user on a Linux box can FUBAR their system just as effectively as (or more so than) a Windows user if they have the root password or unlimited sudo access--it is usually just a bit more difficult than on Windows.

As Linux makes larger strides into the desktop market, I think some very important issues regarding "viruses" will emerge. For instance, I would bet money that I could write "a virus" that would break any modern Linux box in a heartbeat and never be detected by antivirus software. A script that asks for root permission and then recursively deletes "/" (kdesudo rm -rf /) probably wouldn't get picked up by antivirus. A script that opens a port on startup and executes commands as root would be almost trivial to create, and the user would never know.

The issue of detecting viruses is tricky. What is a virus? A/V companies (as far as I can tell) seem to believe that searching for "signatures" is the way to find them. The fundamental issue here is that a virus has to first be *discovered* (not to mention classified): at least one person has to be infected before others can be protected... meaning that a given virus, assuming it actually runs, has a virtually 100% chance of getting at least one system infected. Virus scanners would have to become more proactive to be useful on Linux: besides finding these signatures, they must also interpret a process's activities and be able to pre-emptively stop it from doing something bad.
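To make the weakness of signature scanning concrete, here is a deliberately naive sketch (the signature names and byte patterns are made up for illustration): a file is flagged only if it contains a pattern already in the database, so anything new or trivially altered sails through.

```python
# Made-up signature database: name -> byte pattern to search for.
SIGNATURES = {
    "EvilWorm.A": b"\xde\xad\xbe\xef",
    "RootKit.B":  b"rm -rf /",
}

def scan(data: bytes):
    """Return the names of all known signatures found in the data."""
    return [name for name, sig in SIGNATURES.items() if sig in data]

print(scan(b"#!/bin/sh\nrm -rf /\n"))   # ['RootKit.B'] -- known pattern, caught
print(scan(b"#!/bin/sh\nrm -fr /\n"))   # [] -- trivially reordered flags, missed
```

The second call is the whole problem in miniature: until somebody gets infected, analyzes the sample, and ships a new signature, the scanner has nothing to match.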

The bottom line? If there were one Linux desktop out there for every one Windows desktop (or laptop, or netbook, or whatever), we would find that Linux simply has a different set of vulnerabilities. And the biggest vulnerability is the user.

Comment Re:Subs and functions (Score 1) 477

Being able to separate your code into small logical pieces makes it easier to understand, test and maintain.

Well said! Bravo!

Breaking code into smaller logical pieces is paramount to making it maintainable. When you or someone else maintains it in the future and finds a problem with a specific portion, such as writing data to the database, they should be able to find the database portion and trace back from there.

A good example of this principle of logical separation is code that interacts with a database. I've worked with some custom PHP code that had a lot of database interaction, but all the connections were made directly when needed instead of through a class (or even just a function). This decision had a ripple effect:

  * parameters to the connections had to be given inline
  * connection strings had to be globally accessible, though they were regularly hard-coded
  * SQL had to be written and executed inline
  * checking for rows returned/rows affected/errors had to be done manually for each call
  * code was regularly created via copy/paste
  * variable re-use often caused errors (a previous call succeeded, the last one didn't, but the variable still looked successful)

The first thing I would have done, had I been allotted the time, was integrate an abstraction layer for the database. Separating this logic would have made it possible to write out SQL statements to a text file for analysis--which is very helpful for determining where a database can be optimized (e.g., better indexes).

This level of separation would also allow moving to a different database with a smaller impact on the code: modify a small set of code to connect to the new DBMS, then use it to track any issues resulting from the new database. Not only does it save time, but it avoids rewriting a ton of code!
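The shape of such a layer is small; here is a minimal sketch, in Python with sqlite3 for brevity rather than the PHP the original code used (class and method names are mine). Every query funnels through one method, so connection details live in one place and each SQL statement can optionally be logged for later analysis:

```python
import sqlite3

class Db:
    """Minimal database abstraction: one place for connections,
    one choke point for every query, optional SQL logging."""

    def __init__(self, dsn, log_path=None):
        self.conn = sqlite3.connect(dsn)
        self.log_path = log_path

    def query(self, sql, params=()):
        """Run a statement; return (rows, rows_affected)."""
        if self.log_path:
            with open(self.log_path, "a") as f:
                f.write(sql + "\n")   # write out SQL for offline analysis
        cur = self.conn.execute(sql, params)
        self.conn.commit()
        return cur.fetchall(), cur.rowcount

db = Db(":memory:")
db.query("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
_, affected = db.query("INSERT INTO customers (name) VALUES (?)", ("Nancy",))
print(affected)                             # 1
rows, _ = db.query("SELECT name FROM customers")
print(rows)                                 # [('Nancy',)]
```

Swapping DBMSes then means changing only the constructor and `query`, and the per-call error/rowcount checking that was copy/pasted everywhere collapses into one return value.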
