I would put it that you should set the (max) relative gains at levels that allow the system to stay out of interference under average nationwide lightning conditions, with no medium-intensity storms within 300 miles or so.
Then, keep those ideal relative gains, but adjust the gains and thresholds downward so that the thresholds are as low as possible given your local noise/interference situation, while still keeping the same computed relative gain.
So, for example, you may set your gains and thresholds to 16x10 at 120mV on both channels and find that the sensitivity of the system keeps you out of interference most of the time, given the lightning conditions mentioned above. Those settings compute to a relative gain of 16 * 10 * 40 * 100/120 = 5333. You have a low noise floor between detections and little local interference, so you decide to reduce the threshold to 65mV on both channels. To keep the same optimal relative-gain sensitivity, you would also need to readjust the gains lower, say to 10x8. That gives roughly the same relative gain (10 * 8 * 40 * 100/65 = 4923).
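The arithmetic above can be sketched in a few lines. This is just an illustration of the example numbers in this post; the function name is hypothetical, and the constants 40 and 100 are taken from the formula as written above.

```python
def relative_gain(gain_a, gain_b, threshold_mv):
    """Relative-gain figure as computed in the example:
    gain_a * gain_b * 40 * 100 / threshold (in mV)."""
    return gain_a * gain_b * 40 * 100 / threshold_mv

# Original settings: gains 16x10 at a 120 mV threshold
original = relative_gain(16, 10, 120)   # about 5333

# Lower the threshold to 65 mV and back the gains off to 10x8
# to land near the same relative gain
adjusted = relative_gain(10, 8, 65)     # about 4923

print(round(original), round(adjusted))
```

Both settings come out within about 8% of each other, which is the point: the threshold drops by almost half while the overall sensitivity figure stays roughly where you tuned it.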
Basically, that's part of the manual process that equates to what "auto adapt to noise" does. At some point, excessive noise, or excessive lightning strikes that look like noise, will force the "auto adapt to noise" algorithm to begin reducing the relative gains from what you have set.
Operating with lower thresholds has the advantage of requiring less gain in the amplifiers, which reduces amplifier noise and gives a better signal-to-noise ratio, and therefore better sensitivity, even though the relative gains of the two sets of settings are nominally equivalent.
Easier said than done, and there is no one setting that will account for all combinations of lightning/noise conditions.
Don
WD9DMP