
Comment Re:Hope the trend continues. (Score 1) 263

If the company had a history of never patching vulnerabilities, or of being spotty about it and refusing to support even its newer products, then it would make sense to out them immediately.

But Microsoft has been issuing monthly patches for supported versions of Windows for years.

Yes, they'll delay or rescind a patch once in a while when it breaks things. Any company can be in that position though, and that's OK too provided they reissue a good patch when it's ready.

Instead of publishing exploit details and POC code automatically after 90 days, they should publish mitigation measures immediately (to actually help admins secure their assets) and sit on the more technical details for longer than 90 days if they reasonably expect the vendor to issue a patch. Maybe set a hard cap of 180 days to avoid being strung along indefinitely. While 90 days is a good starting point, no two bugs are the same.

An automatic one-size-fits-all approach is draconian and stupid. Some bugs require multiple rounds of testing because things get broken unexpectedly by the first "fix". Large software projects often end up with hidden dependencies that complicate bugfixing; it's a fact of life, and ignoring reality in favor of ideologically-driven rules usually ends poorly.

Comment Re:Try Again Next Time (Score 4, Insightful) 248

The fact that what they think went wrong was insufficient hydraulic fluid, and not their engineering process that allowed a major mistake to make it into the design and go undetected during testing, is the *real* problem.

It was detected during testing. Their entire retrievable/reusable concept is being developed and tested right now. Their contractual requirement is to put payload into orbit. The landing mechanism is merely an economic advantage for the company that will keep their costs lower; their contracts certainly don't specify it as a requirement.

Some shops use an iterative design process. It usually comes with being new to the market (and thus lacking the funds for extended pre-operational testing).

Some shops even do iterative design as standard practice when they are well-funded.

They were only required to launch supplies to the ISS. The ability to test and refine their landing mechanism is a bonus for the company. Hell, NASA's other contractor doesn't even have a reusable vehicle.

In conclusion: Do you know what we call a service that fulfills its contractual requirements? A success.

Comment Re:Application installers suck. (Score 2) 324

Pretty much.

The Windows Store has more granular permissions, restricted UI modes, and reduced legacy API support. These things will lead to apps using modern security and UI conventions, which is mostly a good thing.

A curated app store is probably good for normal users. As long as sideloading apps is always supported, this should make some headway on taming the burden of legacy software.

I expect to see an unending avalanche of shitty Win32 apps for the rest of my life, but the Windows Store at least offers some vague hope that it will diminish over time.

Comment Re:Application installers suck. (Score 2) 324

Applications and config/data files that need to be available to multiple users can be installed to C:\Users\Public by default without admin privileges. That location is exposed through the %PUBLIC% environment variable, which should be used in case the admin has moved it.

Applications with per-user installation or config files can use the %USERPROFILE% environment variable to find a safe place to store their data (defaults to C:\Users\username). Creating your own directory there is probably a good idea and is permitted by default.

There are guidelines for using the pre-established directories for Desktop, Documents, Downloads, Music, Pictures, and Videos though, since they are shared with the OS and other applications.
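
As a minimal sketch of the first two options, assuming a hypothetical application named ExampleApp (the directory names here are illustrative, not a prescribed layout):

```python
import os
from pathlib import Path

# Shared location, writable by all users (defaults to C:\Users\Public).
public_dir = Path(os.environ.get("PUBLIC", r"C:\Users\Public"))

# Per-user location (defaults to C:\Users\<username>).
user_dir = Path(os.environ.get("USERPROFILE", str(Path.home())))

# Create our own subdirectories rather than dropping files at the top level.
shared_data = public_dir / "ExampleApp"
per_user_data = user_dir / "ExampleApp"

for directory in (shared_data, per_user_data):
    directory.mkdir(parents=True, exist_ok=True)

print("Shared config/data:  ", shared_data)
print("Per-user config/data:", per_user_data)
```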

Comment Re:Application installers suck. (Score 1) 324

Chrome has a Windows installer that does not require elevation. The single-user installer unpacks to a directory in the user's personal profile and runs from there.

Since it cannot install the updater service without admin privileges, Chrome cannot upgrade seamlessly---the browser must be running to detect the update, so it must be restarted afterward. I suspect this is why the standalone installer is not the default option and not widely advertised.

The latest version is always linked at https://support.google.com/ins... if you need to grab a copy.

Comment Re:Seriously? (Score 1) 252

It's clear you don't know where to begin criticizing it. DVDs do it (very poorly) and Blu-Ray does it (less poorly).

You identify two systems as examples of your new "security" feature, but both of them have been laughably compromised. Neither scheme lasted more than a year in the wild, and with a PC security standard you'd need to manage a bit more than that.

A similar system would be trivial. As would be putting the PRIVATE KEYS on the mass produced hardware (encrypted and signed, of course). You do know how PKI works, don't you? You don't send someone your private key for them to authenticate you. You encrypt their public key with your private key and send that encrypted PRIVATE KEY derivative. So, *burn that encrypted key into the USB device as part of the driver*.

I bolded the part that is problematic. How does one burn a key into the device as part of a driver, exactly? With security, the devil is in the details, and your proposed system sounds no better than similar systems which have failed in the past.
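
For contrast, here is a minimal sketch of how standard public-key challenge-response authentication works, using the third-party Python cryptography package; the private key is generated on the device and never transmitted, and only a signature over a fresh random challenge crosses the wire. This illustrates conventional practice, not the scheme quoted above:

```python
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Device side: the private key is generated (or provisioned) on the device
# and never leaves it. Only the public key is shared with the host.
device_private_key = Ed25519PrivateKey.generate()
device_public_key = device_private_key.public_key()

# Host side: send the device a random, single-use challenge.
challenge = os.urandom(32)

# Device side: prove possession of the private key by signing the challenge.
signature = device_private_key.sign(challenge)

# Host side: verify with the device's public key.
# verify() raises cryptography.exceptions.InvalidSignature if the check fails.
device_public_key.verify(signature, challenge)
print("Device authenticated")
```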

That you are too dumb to understand an idea doesn't mean the idea is dumb.

Nice ad hominem, but maybe you should have provided a substantive argument instead.

I believe your explanation rather than my intelligence is at fault here. You identify two systems as functional examples of your new "security" feature---neither of which is effective in practice. AACS has been compromised repeatedly, which shows that simply revoking the exposed keys and hoping new equipment fares better is not an effective strategy.

Can you explain, clearly, how your system differs in such a way as to render it immune to similar attacks? If not, then there is absolutely no reason to take your proposal seriously.

Comment Re:Seriously? (Score 1) 252

Yes. Is that a problem?

If you don't see a problem with PRIVATE KEYS being distributed inside mass-produced hardware, I do not even know where to begin criticizing your position.

Every piece of equipment would need significant anti-tampering measures because as soon as the keys are retrieved from one device, it is game over.

This is why DRM software keeps getting cracked over and over in spite of the billions of dollars being spent on developing it. If your scheme requires a secret that the user needs to operate the device, it will be compromised.

People crack stuff like this for fun. We've seen it happen year after year. Do you think there will be more or fewer cracking attempts when there are serious espionage or financial incentives?

Comment Re:Seriously? (Score 1) 252

To be an HID, it must announce itself as one (called "driver" even when it just announces itself and requests the default OS driver). To do so, it must authenticate with the host OS. If not, the HID functionality will be disabled.

What? USB devices in general, and HIDs in particular, do not authenticate with the OS when plugged in.

You plug it in, and it negotiates with the host controller automatically. The host controller notifies the OS that the device is there, and then the OS queries the device for its properties. The device is perfectly capable of lying about what it is and what it does.

If the device identifies as a keyboard, mouse, Smart Card reader, or removable storage, by default the OS will load its native drivers and handle the device seamlessly. The device could have nefarious functionality, but the OS has no way of knowing about that.
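
As a rough sketch of that enumeration step, here is what reading a device's self-reported class looks like with the third-party pyusb library (it needs a libusb backend, and listing configurations may require elevated permissions on some systems). The point is that the host only ever sees whatever class codes the device chooses to report:

```python
import usb.core  # third-party: pyusb

USB_CLASS_HID = 0x03  # interface class code for Human Interface Devices

# Enumerate everything currently attached and print what each device
# *claims* to be. Nothing here is authenticated; the descriptors are
# supplied entirely by the device itself.
for dev in usb.core.find(find_all=True):
    print(f"Device {dev.idVendor:04x}:{dev.idProduct:04x}")
    for cfg in dev:
        for intf in cfg:
            label = "HID" if intf.bInterfaceClass == USB_CLASS_HID else hex(intf.bInterfaceClass)
            print(f"  interface {intf.bInterfaceNumber}: class {label}")
```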

Various OS security tools and third-party utilities can attempt to restrict the use of USB devices. None of them are pleasant to use---from the standpoint of either the administrator or the end user.

I've been told the problem is when the USB drive is actually a storage device, but leeches power (with no connectivity to the host computer) to broadcast the contents of the device over WiFi to a listening attack machine outside (but in WiFi range).

Not terribly practical or interesting; it sounds like something out of a Hollywood "hacker" movie. Anyone who is concerned about restricting USB devices probably already has a solution for detecting rogue WiFi clients and APs, and if not, they can buy one off the shelf.

Rogue USB devices are not something a hacker is going to use against some random citizen in hopes of scoring access to their checking account. This is something enterprises and governments are going to be worried about, and they have options for mitigating the threat.

Comment Re: Plain text e-mail... (Score 1) 141

Storing authentication credentials in a retrievable form anywhere is stupid, even in memory. Kerberos has been around for decades, and with it the plaintext password no longer needs to exist in memory once it has been hashed into a key.

If Kerberos is not practical for a particular application, the same principles can be used in proprietary authentication mechanisms.
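
A minimal sketch of that principle (the function and prompt here are hypothetical, and PBKDF2 merely stands in for whatever the real protocol uses; Kerberos has its own string-to-key functions): derive a key from the password immediately, then wipe the plaintext buffer so only the derived value remains in memory.

```python
import hashlib
import os

def derive_login_key(password: bytearray, salt: bytes) -> bytes:
    """Turn a plaintext password into a derived key, then destroy the plaintext."""
    try:
        key = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
    finally:
        # Overwrite the mutable buffer so the plaintext does not linger in memory.
        for i in range(len(password)):
            password[i] = 0
    return key

# Read the password into a mutable buffer, not an immutable str, so it can be wiped.
pw = bytearray(b"correct horse battery staple")
key = derive_login_key(pw, salt=os.urandom(16))
print(pw)         # bytearray of zeros: the plaintext is gone
print(key.hex())  # only the derived key remains
```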

A plaintext password should never persist. Period. It is the result of a stupid decision somewhere, and we have known better for a long, long time.

Comment Re:This should be free (Score 1) 170

Your theoretically sound system is not practical to implement. Plus, we already have a better solution.

The only problem you've clearly identified with the CA system is already addressed by certificate pinning. Your solution offers nothing of value beyond what I can accomplish with pinning---and your idea brings a whole lot of administrative overhead.

While certificate pinning does require local administration, it is significantly less burdensome than your approach. Even Microsoft supports it now, so it is not some niche security option anymore.

Certificate pinning takes ultimate trust back from the CAs yet works easily with the existing infrastructure for applications that you don't need to control as tightly. I have no idea why you are promoting a system that is more complicated and less compatible with no concrete advantages.
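
As a rough illustration of how little machinery pinning needs, here is a sketch that fetches a server's certificate and compares its SHA-256 fingerprint against a locally pinned value (the host name and fingerprint below are placeholders; real deployments usually pin the public-key hash and allow for planned rotation):

```python
import hashlib
import ssl

def certificate_fingerprint(host: str, port: int = 443) -> str:
    """Fetch the server's leaf certificate and return its SHA-256 fingerprint."""
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    return hashlib.sha256(der).hexdigest()

# Recorded out-of-band the first time, or distributed via configuration management.
PINNED = "0000000000000000000000000000000000000000000000000000000000000000"  # placeholder

actual = certificate_fingerprint("example.com")
if actual != PINNED:
    raise SystemExit(f"Certificate changed: expected {PINNED}, got {actual}")
print("Pinned certificate matches; proceeding with the connection.")
```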

Comment Re:This should be free (Score 1) 170

Yes, the lack of a theoretically sound system is a problem. Your "solution" was to disband the existing system without any sort of meaningful replacement.

We can always use PGP/PKI internally and with close associates. But we need some form of identity verification for everyone else in the world too.

The CA system is flawed, but it is better than nothing at all. Your "solution" returns us to having nothing for the rest of the world. In other words, it is neither a solution nor a meaningful improvement over what we already have.

Comment Re:Sure... (Score 2) 343

So your suggestion is, let's keep all of our super important stuff on a front-end facing system in the first place.

I never said that, but thanks for throwing an asinine straw man up there.

They could probably have locked things down better than they did, but I don't work at Sony and I haven't seen their network diagrams, so I can't really say. But the idea of air-gapping the financial systems of a company of Sony's size is mind-bogglingly stupid.

Even something as simple as warranty work breaks down without automation. Every authorized repair depot needs some way to order parts, submit claims, and receive payment at an absolute minimum. If you air-gap the systems for that, guess what happens to the time and cost of warranty repairs? And this is just one facet of the business.

So right there, you have network-accessible procurement, payment, and personally-identifiable information (customer name/address and product serial number are typically included in warranty documentation). Waving the magical air-gap wand as a security fix means nothing if it fundamentally breaks the way the business operates.

So yes, Sony probably fucked up somewhere. If they're like most businesses, there are probably multiple problems with their infrastructure. But pretending there's a simple answer is just ignorant and does absolutely nothing to advance the discussion or solve any real-world problems.

Comment Re:Sure... (Score 5, Insightful) 343

If you air-gap email and financial systems, you're stepping right back into the mid-1900s, back when it took an entire office of secretaries to process correspondence and another office full of accountants to handle billing and ledgers. If those systems are disconnected, someone has to manually transfer reams of data in and out of them, and that is no longer feasible.

Your suggestion is so completely impractical that I wonder why you joined Slashdot in the first place. You clearly have no understanding of modern IT.

Comment Re:This should be free (Score 1) 170

And your solution only works for entities with which you have a pre-established relationship and a shared secret (in this case, your personal information).

This does not solve the general problem of identifying an entity on the internet with whom you have no shared secrets.

This suggestion is nowhere near being a replacement for existing CAs as they are currently used.
