Comment Old problem (Score 5, Interesting) 200

Apple ran into something similar a long time ago for Mac OS X Server. The servermgrd daemon uses a self-signed SSL cert by default to secure communications with remote management tools. About four or five versions back the certificate was identical across all installations because it was contained in the installer package. Someone had to go down and show them that you could read all of the traffic by using sslsniff and the private key from your own copy of the installer. They changed to an individual, automatically generated certificate shortly thereafter.

--Paul

Submission + - Alleged FBI backdoors in OpenBSD IPSEC stack? (marc.info)

Aggrajag writes: According to Theo de Raadt: "It is alleged that some ex-developers (and the company they worked for) accepted US government money to put backdoors into our network stack, in particular the IPSEC stack. Around 2000-2001."

Comment Re:Good vs. Great (Score 1) 504

HEY YOU DISGUSTING PIECE OF *(&^*^&%&!!!!$#$#!! I CAN'T BELIEVE YOU WOULD HOLD THAT KIND OF RIDICULOUSLY IGNORANT AND BIASED OPINION IN THE FACE OF MY OWN MORAL RECTITUDE AND OBVIOUS SUPERIOR KNOWLEDGE. YOU SHOULD BOW DOWN BEFORE ME THAT I AM DEIGNING TO RESPOND TO YOUR POST!!!!

Happy now? ;-D;-D;-D;-D;-D;-D;-D;-D;-D;-D;-D;-D;-D;-D

--Paul

random text to get by the lameness filter.

9. Religious conditions were similar in Java but politically there was
this difference, that there was no one continuous and paramount kingdom.
A considerable number of Hindus must have settled in the island to
produce such an effect on its language and architecture but the rulers
of the states known to us were hinduized Javanese rather than true
Hindus and the language of literature and of most inscriptions was Old
Javanese, not Sanskrit, though most of the works written in it were
translations or adaptations of Sanskrit originals. As in Camboja,
Śivaism and Buddhism both flourished without mutual hostility and there
was less difference in the status of the two creeds.

In all these countries religion seems to have been connected with
politics more closely than in India. The chief shrine was a national
cathedral, the living king was semi-divine and dead kings were
represented by statues bearing the attributes of their favourite gods.

6. _New Forms of Buddhism_

In the three or four centuries following Asoka a surprising change came
over Indian Buddhism, but though the facts are clear it is hard to
connect them with dates and persons. But the change was clearly
posterior to Asoka for though his edicts show a spirit of wide charity
it is not crystallized in the form of certain doctrines which
subsequently became prominent.

The first of these holds up as the moral ideal not personal perfection
or individual salvation but the happiness of all living creatures. The
good man who strives for this should boldly aspire to become a Buddha in
some future birth and such aspirants are called Bodhisattvas. Secondly
Buddhas and some Bodhisattvas come to be considered as supernatural
beings and practically deities. The human life of Gotama, though not
denied, is regarded as the manifestation of a cosmic force which also
reveals itself in countless other Buddhas who are not merely his
predecessors or destined successors but the rulers of paradises in other
worlds. Faith in a Buddha, especially in Amitâbha, can secure rebirth in
his paradise. The great Bodhisattvas, such as Avalokita and Mañjurî,
are splendid angels of mercy and knowledge who are theoretically
distinguished from Buddhas because they have indefinitely postponed
their entry into nirvana in order to alleviate the sufferings of the
world. These new tenets are accompanied by a remarkable development of
art and of idealist metaphysics.

Comment Good vs. Great (Score 4, Interesting) 504

Just a quick comment from a former Apple employee; most people are familiar with the old saw, "Perfect is the enemy of good enough." I.e., instead of trying to get something perfect, you should get it good enough and then ship it. Within Apple the perspective is slightly different. There, it's more along the lines of, "Good enough is the enemy of great." I.e., good enough isn't acceptable -- for an Apple-branded product we're going to look for the next level of polish and care that differentiates our stuff from everybody else's.

I think this comes from the fusion of NeXT and Apple engineers. Most people recognize that NeXT brought a heckuva foundation for Apple's next generation operating system to the table in 1997. However, few people recognize what Apple brought to the table -- an engineering culture that regards rough edges as anathema. There was plenty of NeXT software, but much of it was very rough; it wasn't easy to pick up for the new user, was missing essential features, crashed often, or all of the above. This was a direct consequence of the fact that Foundation and AppKit allowed you to create apps quickly and easily, but then as a software developer you still have to trap errors, check for corner cases, add documentation, tweak the UI design so that common tasks are easy to accomplish, etc. This can easily take three to four times as long as standing up the initial core functionality, or more. Most NeXT apps never went through this stage and so they lacked the polish for mass market users. Once the NeXT technology went through the polishing process (and it took four years before the first consumer release, really five years and 10.2 Jaguar before it was truly ready for my mom!), the new OS was a completely different animal from OpenStep 4.2 -- much more polished and suitable for mass-market consumers.

--Paul

Security

Backdoor Malware Targets Apple iPad 196

An anonymous reader writes "Apple iPad users are being warned of an email-borne threat which could give hackers unauthorised access to the device. The threat arrives via an unsolicited email urging the recipient to download the latest version of iTunes as a prelude to updating their iPad software. Apart from opening up a backdoor, it also tries to read the keys and serial numbers of the software installed on the device, and logs the passwords to any webmail, IM or protected storage accounts."

Comment Re:I dont understand ... (Score 2, Insightful) 501

What education should be about is understanding; if you just train someone in one version of s/ware, many just adopt a point-and-click approach with little understanding of what they are doing. You need different sorts of s/ware to make them think. Schools should use a mixture of MS, Mac, and Linux PCs.

I think it's a little more subtle than that. 90% of the kids using these things will go on to be standard users in life, treating computers as one tool among many. Have you seen how regular users treat computers? Most of them are uncomfortable using a new app without formal training -- even today's twentysomethings. Even on a Mac (yes, I'm a Mac guy).

What concerns me more are the other 10%, who will become power users, sysadmins, and developers. If all they know is MS and their pitifully low standards for stability, security, and usability, I am scared of the outcome for the next generation of software; not for the 0.1% of brilliant developers whom you can't keep down, but for the rest who grind out code in obscurity producing internal-use-only enterprise apps and vertical-market apps.

I think of a kid in my son's Boy Scout troop who had no idea that "SQL" had a broader meaning than a Microsoft product named "SQL Server". He's a brilliant kid and will go far, but he needed to have his horizons broadened quite a bit. I don't fault him -- rather, I fault those who mentored him and didn't show him the alternatives.

--Paul

Comment Re:Cross-platform code signing costs (Score 1) 158

I just checked and a 1-year code signing cert from Comodo is $179.95, with discounts for multi-year certs. Other vendors also seem to have pretty reasonable prices.

That's at least on the order of $100 per platform. The certificate for Windows is $179.95 per year, and the certificate for a secure web site from which to distribute copies of the software is another $99 per year. It gets even more expensive to target more than one platform: the certificate for XNA is $99 per year, the certificate for iPod Touch is $99 per year, and by the time one has ported an application to all the platforms that his audience uses, he'd be out of his hobby money.

Hold on, most code signing CAs include both the codeSigning and msCodeCom usage extensions in the same certificate so there's no need to buy multiple code signing certs. Unless you're conducting an e-commerce transaction (in which case you're no longer a hobbyist), there's no need for a website cert -- and even then I've found website certs for as little as $15/year. Mac OS X/iPhone code signing certs just require the code signing extension, so they just work. Ditto XNA. To join the iPhone developer program is $99/year, so we're up to $280/year, or about $23/month. This is well within the budget for most hobbyists. Most hobbyists won't be faced with multi-platform issues anyway. I sure as heck don't have the time to write and maintain a cross-platform app -- keeping up with Mac OS X is enough for me.

--Paul

Comment Re:Start working at 9 AM (Score 1) 158

The price of a certificate from a trusted root makes it uneconomic for some people to sign their software. Or at least that's what I've seen from Authenticode on Windows: most non-corporate-backed free software and freeware and much shareware is distributed without a signature. Likewise, homebrew applications for video game consoles use holes in the operating system's signature verification to start executing. At least Mac OS X has an option for self-signed certificates, which do #2 (make sure two binaries have the same publisher) without having to do #1 (make sure each publisher is part of a private club).

Grr... I just checked and a 1-year code signing cert from Comodo is $179.95, with discounts for multi-year certs. Other vendors also seem to have pretty reasonable prices. Anyone who has the time to put together a serious app (even for freeware) can afford that amount. Verisign charges an unconscionable amount (around $900 for one year!) for a code signing cert. Bleah!

Video console homebrews are a different story, as the console makers won't sign an app that hasn't been through their (rather expensive) developer programs.

--Paul

Comment Re:I've heard that before.... (Score 0) 158

The intent is to improve performance in situations where running an anti-virus scan or back-up utility would result in otherwise recently-used information being paged out to disk, or disposed from in-memory caches, resulting in lengthy delays when a user comes back to their computer after a period of non-use.

In my opinion as an experienced application developer the user should never run into the problem that Superfetch attempts to solve. Anti-malware scans or backups are generally limited by I/O transfer rates, not by CPU. In such situations, using lots of memory to pre-load data makes no sense. It is relatively easy to write a two-buffer, threaded, streaming system for situations that are constrained by disk transfer rates without consuming scads of memory.

I don't think you understood it right: the perf problem is not for the anti-malware programs, but once they have run they have thrown everything out of the cache and subsequent applications have to re-populate it again, thus slowing everything down. There used to be the same kind of problem under Linux after the 'locate' cronjob.

In these situations it's usually not the disk cache that's the problem, it's what binaries the OS has loaded in RAM. If app A is inactive and app B requests a lot of memory, the OS will swap app A's binary out to the VM backing store. When app A becomes active again there is a delay while its code is reloaded from the VM backing store. There is no reason for an app such as an A/V program or backup program or locate (all of which are I/O-bound) to require huge amounts of memory other than the developer not realizing how to scale to large volumes of data.

What's happening is that the A/V app is just loading the entire application that it wants to scan into memory, then scanning it. This is unnecessary, especially if the app is really big and has lots of non-code resources (e.g., graphics). It's easy for the programmer, but bad for performance; indeed, it may cause problems for A/V performance as well if the app binary is larger than the amount of available memory, causing the A/V program to thrash in the VM system directly. Instead, create two buffers (each one around 1 MB as a good starting point, tune according to available memory, disk transfer rates, and system loads) and two threads. Thread 1 loads the buffers from disk -- its conditions are:

  1. Check buffer X -- if X is empty, load it from the disk and mark it as full. If X is full, go to step 2.
  2. Check buffer Y -- if Y is empty, load it from the disk and mark it as full. If Y is full, go to step 1.

Thread 2 does the actual A/V scans:

  1. Check buffer X -- if X is full, scan it and mark it as empty. If X is empty, go to step 2.
  2. Check buffer Y -- if Y is full, scan it and mark it as empty. If Y is empty, go to step 1.

You can further improve efficiency with more threads, blocking on semaphores, and waiting on locks, but you get the idea. Note that there is no point in more than two buffers, since Thread 2 will always be done long before Thread 1. If the malware scan requires comparing widely separated parts of the target binary you may need to cache portions of it, but there's still no reason to hold the vast majority of the target binary in RAM.
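The two-buffer handoff described above can be sketched in Python, using a bounded queue in place of the manual full/empty flags (the queue's blocking behavior gives the same back-pressure). The `b"EVIL"` marker and chunk size are illustrative assumptions, not a real A/V signature scheme, and as noted above a signature straddling a chunk boundary would need extra caching.

```python
import io
import queue
import threading

BUF_SIZE = 1 << 20                 # ~1 MB per chunk, a tunable starting point
buffers = queue.Queue(maxsize=2)   # bounded: at most two chunks in flight
DONE = object()                    # sentinel marking end of stream

def reader(f):
    # Thread 1: fill buffers from the file; blocks when both are full.
    while True:
        chunk = f.read(BUF_SIZE)
        if not chunk:
            buffers.put(DONE)
            return
        buffers.put(chunk)

def scanner(results):
    # Thread 2: consume buffers and run the (placeholder) signature check.
    while True:
        chunk = buffers.get()
        if chunk is DONE:
            return
        results.append(b"EVIL" in chunk)

# Simulated ~2.4 MB target binary with a marker in the middle.
data = io.BytesIO(b"clean " * 200_000 + b"EVIL" + b"clean " * 200_000)
results = []
t1 = threading.Thread(target=reader, args=(data,))
t2 = threading.Thread(target=scanner, args=(results,))
t1.start(); t2.start()
t1.join(); t2.join()
assert any(results)   # the marker was found in some chunk
```

At no point does the scanner hold more than two chunks in memory, regardless of how large the target file is.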

--Paul

Comment Re:Start working at 9 AM (Score 1) 158

But then you rely on the operating system to provide a method for applications to provide cache hints, and you rely on the antivirus software to provide such hints. SuperFetch tries to infer these even for applications developed prior to widespread knowledge of these hints or ported from systems that lack these hints.

As I understand it, neither function pointer uniquing caching nor Superfetch requires that the A/V or backup software provide cache hints. On Mac OS X, almost all apps load the libobjc.dylib library so caching the uniquing of function pointers is a big win for app launch times.

Having my applications ready to start at 08:57 when I'm about to grab the mouse at 08:58 improves my productivity. Consider that employees have sued their employers for requiring that employees be present during application startup time but not paid until the application has fully started up.

This may work for some, but not for others. The problem is the lack of consistency -- e.g., if I grab the mouse at 8:58 AM I get my e-mail quickly, but if I come in a little early at 8:30 AM I have to wait for it. This leads to user frustration and unnecessary force-quits of apps or hard power cycling.

But who signs the developer's certificate? And what keeps malware publishers from signing their trojans?

That's the point of the X.509 PKI system -- you have to be able to trace the signature on the app binary back to a known trusted root that signed the code-signing certificate. No, this won't prevent a malware writer from signing his or her code, but it accomplishes two things:

  1. It gives a traceable connection back to the author. Malware writers generally don't like this. ;-)
  2. It makes it impossible for the malware to inject code into an existing binary without disturbing the signature.

The second is really the key point -- Mac OS X won't run a signed binary if the signature is present but is not consistent.
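The tamper-evidence property in point 2 can be illustrated with a small sketch. The real system uses X.509 certificates and asymmetric signatures chained to a trusted root; the HMAC below is only a stand-in that demonstrates the same property, namely that injecting code into a signed binary invalidates the signature. The key name is a made-up placeholder.

```python
import hashlib
import hmac

# Assumption: a symmetric HMAC stands in for the real asymmetric
# X.509 signature; the property demonstrated is the same.
SIGNING_KEY = b"developer-private-key"   # hypothetical placeholder key

def sign(binary: bytes) -> bytes:
    """Compute a signature over the entire binary."""
    return hmac.new(SIGNING_KEY, binary, hashlib.sha256).digest()

def verify(binary: bytes, signature: bytes) -> bool:
    """Check that the binary still matches its signature."""
    return hmac.compare_digest(sign(binary), signature)

app = b"\x90\x90 original application code"
sig = sign(app)

assert verify(app, sig)                    # untouched binary verifies
assert not verify(app + b"injected", sig)  # injected code breaks the signature
```

Any byte the malware changes or appends changes the digest, so the loader can refuse to run the modified binary.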

Any malware that attempts to insert itself into applications will run into problems.

Unless an application tries to insert itself as, say, an assistive technology using the accessibility API.

The official accessibility API seems to be pretty safe from a security perspective. I haven't been able to find any reports of problems and a cursory look at the developer API docs doesn't send up any red flags. I would agree (and have long expressed the opinion to Apple) that Input Managers are a serious problem, and it looks like they've severely tightened the hole in Snow Leopard so that you need admin privs to install one. I'd like to see them gone entirely and have a proper system-level plug-in API. Now if only someone could get Microsoft to understand why ActiveX downloaded over the network is such a BAD idea... (*grumble*).

--Paul

Comment Re:I've heard that before.... (Score 5, Informative) 158

Moderators, please mod the parent down -- it completely misses the point.

Objective-C selector uniquing caching is NOT the same as Windows Superfetch.

Objective-C uses a two-phase dispatch for method calls. When you see a call in the Objective-C source code that looks like:

[myObject init];

the dispatch system:

  1. Looks up the function pointer for the method "init" in a table.
  2. Calls the "init" function via the function pointer.

The problem arises in the method dispatch table when you have multiple methods named "init" -- which is very common. When an application is loaded the dynamic loader ("dyld") needs to separately identify all of the methods named "init" (and any other methods with conflicting names) that apply to different classes. This is done by "tagging" each method in the dispatch table, a process called selector uniquing.

Now, this has to be not only for the application binary itself, but also for any Objective-C classes in shared libraries that are loaded. Almost all apps on Mac OS X load the libobjc.dylib library, which is cached to improve performance. As a part of the caching process, Snow Leopard now does the selector uniquing only once, and then stores the uniqued selectors in the cache. Thus, any application that links against libobjc.dylib (or any other library that is in the cache) only has to unique its own selectors, not those of the library as well. This significantly reduces the amount of overhead for launching an application compared to previous versions of Mac OS X.
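The two ideas above (two-phase dispatch, and uniquing selector names so they can be compared cheaply) can be sketched as follows. This is a hedged illustration, not the actual Objective-C runtime API: the names `unique_selector` and `DispatchTable` are inventions for the sketch.

```python
# Hypothetical sketch: selectors are interned ("uniqued") into one
# canonical object per name, so the runtime can compare them by
# identity; dispatch is then a table lookup plus an indirect call.
_selector_table: dict = {}

def unique_selector(name: str):
    # First sighting interns the name; later sightings reuse it.
    return _selector_table.setdefault(name, name)

class DispatchTable:
    def __init__(self):
        self._methods = {}

    def register(self, selector: str, func):
        self._methods[unique_selector(selector)] = func

    def send(self, selector: str, obj):
        # Phase 1: look up the function pointer for the selector.
        # Phase 2: call the function through that pointer.
        return self._methods[unique_selector(selector)](obj)

table = DispatchTable()
table.register("init", lambda obj: f"initialized {obj}")

# Two "init" strings built at runtime unique to the same object.
a = "".join("init")
b = "in" + str("it")
assert unique_selector(a) is unique_selector(b)
assert table.send("init", "myObject") == "initialized myObject"
```

Because uniquing only has to happen once per name, doing it ahead of time for the shared library cache (as Snow Leopard does) removes that work from every application launch.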

This process does not attempt to retain application binary code in memory in the face of page-outs as Superfetch does. Selector uniquing caching speeds application launch times by reducing the amount of computation that has to happen at launch, not by pre-loading the application's binary.

Thread-local garbage collection is NOT the same as Windows Superfetch.

Thread-local garbage collection is a third phase of garbage collection added on top of the Objective-C 2.0 garbage collection system, which speeds up the garbage collection system even further. By concentrating GC to what has occurred in a single thread, the GC system can delay and reduce the cost of a slow global sweep even beyond the generational GC algorithm.

Windows Superfetch is a response to poorly written software.

To quote from the Wikipedia article:

The intent is to improve performance in situations where running an anti-virus scan or back-up utility would result in otherwise recently-used information being paged out to disk, or disposed from in-memory caches, resulting in lengthy delays when a user comes back to their computer after a period of non-use.

In my opinion as an experienced application developer the user should never run into the problem that Superfetch attempts to solve. Anti-malware scans or backups are generally limited by I/O transfer rates, not by CPU. In such situations, using lots of memory to pre-load data makes no sense. It is relatively easy to write a two-buffer, threaded, streaming system for situations that are constrained by disk transfer rates without consuming scads of memory.

In the bigger picture, Superfetch attempts to learn the times of day when apps are used and pre-loads their binaries. This is a nice concept, but I have serious doubts as to how useful it really is. The penalty for guessing wrong is fairly high, and users are more tolerant of consistent small slowdowns than they are of occasional long hangs (see the Mac literature on the spinning beach ball).

Mac OS X is less likely to need such anti-malware scans in the first place as the application binaries are now digitally signed by the developer. Any malware that attempts to insert itself into applications will run into problems. This is not to say that the Mac is immune -- I can think of a number of holes that could be exploited (such as the fact that unsigned binaries will still run and are still common; or that a malware author could attempt to insert his or her own root certificate into the trusted certificate store). However, these take more work and tend to limit the propagation rate of malware, making the platform a less-attractive target.

--Paul

Security

Virginia Health Database Held For Ransom 325

An anonymous reader writes "The Washington Post's Security Fix is reporting that hackers broke into servers at the Virginia health department that monitors prescription drug abuse and replaced the homepage with a ransom demand. The attackers claimed they had deleted the backups, and demanded $10 million for the return of prescription data on more than 8 million Virginians. Virginia isn't saying much about the attacks at the moment, except to acknowledge that they've involved the FBI, and that they've shut down e-mail and a whole mess of servers for the state department of health professionals. The Post piece credits Wikileaks as the source, which has a copy of the ransom note left behind by the attackers."
Security

Researcher's Death Hampers TCP Flaw Fix 147

linuxwrangler writes "Security researcher Jack Louis, who had discovered several serious security flaws in TCP software was killed in a fire on the ides of March, dealing a blow to efforts to repair the problem. Although he kept good notes and had communicated with a number of vendors, he died before fixes could be created and prior to completing research on a number of additional vulnerabilities. Much of the work has been taken over by Louis' friend and long-time colleague Robert E. Lee. The flaws have been around for a long time and would allow a low-bandwidth 'sockstress' attack to knock large machines off the net."

Comment Plan for on-going testing (Score 1) 93

I can't answer the question about recommending a testing company. However, I can tell you that you will need to have your app re-tested at regular intervals, as well as after any change (no matter how small) to the code or infrastructure. You need to build that into your plan and budget, and you need to have the tests run against your staging/QA setup so that you can catch problems before they hit the production site, as well as against your production environment.

--Paul
