The complete question was:
"I am trying to use bad_rsfit.omac to display the "bad" residues. I am not sure what cutoff I should be using. I am working on a 3 Å model. Another related question: when I use rsr_setup, how do I choose the correct scale (rsr_scale)?"
RS-fit values are resolution dependent; I suggest you use
O > db_statistics mol_residue_rsfit
to get the average and standard deviation, and then use (average - 2 * sigma) as a cutoff. In this way you relate your cutoff to the overall results: there is no point in using a 0.8 cutoff for a 3 Å map, whereas (average - 2 * sigma) tells you which residues are considerably worse than the average residue.
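The cutoff calculation itself is simple enough to sketch in Python. Note that the residue names and RS-fit values below are made-up illustrations; in practice they come from the mol_residue_rsfit datablock that db_statistics summarises:

```python
# Sketch: flag residues whose RS-fit value lies more than 2 sigma
# below the mean, i.e. value < (average - 2 * sigma).
# The residue names and values are invented for illustration.
from statistics import mean, stdev

rsfit = {
    "A1": 0.80, "A2": 0.82, "A3": 0.78, "A4": 0.81, "A5": 0.79,
    "A6": 0.83, "A7": 0.80, "A8": 0.77, "A9": 0.82, "A10": 0.30,
}

avg = mean(rsfit.values())
sigma = stdev(rsfit.values())
cutoff = avg - 2 * sigma

# residues scoring below the cutoff are the ones worth inspecting
bad = [res for res, value in rsfit.items() if value < cutoff]
print(f"average = {avg:.3f}, sigma = {sigma:.3f}, cutoff = {cutoff:.3f}")
print("suspect residues:", bad)
```

With these invented numbers, only the clear outlier A10 falls below the cutoff, which is exactly the behaviour you want: the threshold adapts to the overall quality of the map instead of being a fixed magic number.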
By the way, optimise the values of A0 and C before you use rs-fit ! See Q.030 on how to do this.
The scale is, first of all, unimportant, since rs-fit calculates a correlation coefficient; secondly, it is calculated automatically by rs-fit anyway. (This is not as big a contradiction as it seems: soon we will get the real-space R-factor back again, and then the scale is needed.) The only thing you have to check is that it is actually a number (i.e., not "NaN.000") ...
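The reason the scale does not matter for a correlation coefficient, but does matter for an R-factor, can be shown numerically. This is only a sketch with made-up density values, not O's actual implementation; the R-factor form sum|obs - scale * calc| / sum|obs| is one common convention:

```python
# Sketch: a Pearson correlation coefficient is invariant under a
# positive linear rescaling of one set of values, while an R-factor
# of the form sum|obs - scale*calc| / sum|obs| is not.
import math

def correlation(x, y):
    # plain Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def rfactor(obs, calc, scale):
    # R-factor-style residual: depends directly on the scale
    return sum(abs(o - scale * c) for o, c in zip(obs, calc)) / sum(abs(o) for o in obs)

obs  = [1.0, 2.0, 3.0, 4.0, 5.0]   # invented "observed" density
calc = [1.1, 1.9, 3.2, 3.8, 5.1]   # invented "calculated" density

# multiplying calc by any positive factor leaves the CC unchanged ...
cc1 = correlation(obs, calc)
cc2 = correlation(obs, [10.0 * c for c in calc])

# ... but the R-factor changes with the scale, so there it matters
r1 = rfactor(obs, calc, 1.0)
r2 = rfactor(obs, calc, 10.0)
```

So as long as rs-fit reports a correlation coefficient, any reasonable scale gives the same answer; once a real-space R-factor is computed, a sensible scale becomes essential.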
Other FAQs related to rs-fit are: Q.002, Q.026, Q.030, Q.824 and Q.832.
By the way: instead of using bad_rsfit.omac, I would urge you to use OOPS ! This program checks many different quality-related properties and produces a set of O macros which will automatically take you from one suspect residue to the next, telling you exactly what's wrong with each residue.