Great article from BBC News about the hazards of how people interpret statistical accuracy.
If a test to detect something (a terrorist, in this case) is 90% accurate, how good is it? Ninety percent sounds pretty good, until you flip it around and calculate how many errors it can generate, especially if the thing you are looking for is rare. Let's assume there are 300 terrorists in the US. That's approximately 0.0001% of a population of about 300 million.
Our test is 90% accurate, which means that 10% of the results will flag someone as a terrorist when in fact they are not. So now 30 million of your 300 million people are labeled as probable terrorists. That's not terribly good.
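Here's a quick back-of-the-envelope sketch of that arithmetic in Python, using the assumed numbers above (300 terrorists, 300 million people, 90% accuracy):

```python
population = 300_000_000   # rough US population
terrorists = 300           # assumed number of actual terrorists
accuracy = 0.90            # the test gets it right 90% of the time

false_positive_rate = 1 - accuracy
innocents = population - terrorists

# Innocent people wrongly flagged as terrorists
false_positives = innocents * false_positive_rate
print(f"False positives: {false_positives:,.0f}")  # ~30,000,000
```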
Even if you increased the accuracy to 99.9% and ran your test again, you would wind up with 300,000 positives, of which only 300 could possibly be correct. Possibly correct, because if the test can give false positives, it can (usually) also give false negatives.
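To make the punchline explicit: even at 99.9% accuracy, the odds that someone the test flags is actually a terrorist are about 1 in 1,000. A small sketch of that calculation, assuming (as a simplification) that the false negative rate matches the false positive rate:

```python
population = 300_000_000
terrorists = 300
accuracy = 0.999

false_positive_rate = 1 - accuracy
# Assume the test also misses 0.1% of real terrorists
true_positives = terrorists * accuracy
false_positives = (population - terrorists) * false_positive_rate

# Chance that a flagged person is actually a terrorist
# (the positive predictive value)
ppv = true_positives / (true_positives + false_positives)
print(f"Total flagged: {true_positives + false_positives:,.0f}")  # ~300,000
print(f"P(terrorist | flagged): {ppv:.2%}")                       # ~0.10%
```

This is just Bayes' theorem at work: when the condition you're testing for is rare enough, even a very accurate test produces mostly false alarms.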