If the scope is "perfect" and there are no atmospheric limitations, you can easily calculate the magnification required to make its line-pair resolution just large enough to be barely visible to an observer with a given eyesight acuity.
Divide 115 by the scope's aperture in millimeters to get its resolution in arc seconds, then divide the observer's acuity by that result. For a 50mm scope (115/50 = 2.3") and an observer with 20/20 line-pair acuity (120"), that works out to about 52x. In the real world I find the smallest line pairs are much easier for the eye to resolve when they are a bit larger than barely visible, so I would say 60x is a better target for that observer to easily discern the smallest details.
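In case it helps to see that arithmetic in one place, here is a minimal Python sketch of the calculation. The 115/D rule and the 120" figure for 20/20 line-pair acuity come from the post above; the function name and the roughly 15% "comfort" margin are just my own way of illustrating "a bit larger than barely visible."

```python
# Sketch of the magnification estimate described above.
# Assumes the 115/D rule of thumb for a scope's line-pair resolution
# in arc seconds; names and the comfort factor are illustrative only.

def min_magnification(aperture_mm, eye_acuity_arcsec=120.0):
    """Magnification at which the scope's smallest resolvable line pair
    just matches the observer's eyesight acuity."""
    scope_resolution_arcsec = 115.0 / aperture_mm   # e.g. 2.3" for a 50 mm scope
    return eye_acuity_arcsec / scope_resolution_arcsec

mag = min_magnification(50)        # ~52x for a 50 mm scope and 20/20 eyes
comfortable = round(mag * 1.15)    # ~15% more, i.e. ~60x, per the note above
print(f"barely visible at ~{mag:.0f}x, easier at ~{comfortable}x")
```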
Also, in the real world many people have better than 20/20 acuity, most telescopes are not "perfect," and the air is seldom perfectly steady, but I would start with that calculation and modify it to match reality. Personally, I wouldn't consider any scope, no matter how inexpensive, with arc-second resolution much worse than 140/D (where D is the aperture in mm), and I wouldn't consider a really expensive scope with resolution worse than about 120/D. Unfortunately, to add to the variables, there is considerable sample variation among scope specimens, so one unit may be worse than 140/D and another unit of the same model may reach 120/D. To paraphrase PYRTLE... you should get what you pay for (but don't count on it).
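To illustrate those two thresholds, here is a small sketch of the same idea. The 140/D and 120/D limits are as stated above; the function name and the wording of the verdicts are only my own illustration, not any standard test.

```python
# Hedged sketch of the quality thresholds mentioned above: 140/D as a
# floor for any scope and 120/D for an expensive one.

def resolution_verdict(aperture_mm, measured_resolution_arcsec):
    budget_limit = 140.0 / aperture_mm    # worst I'd accept at any price point
    premium_limit = 120.0 / aperture_mm   # worst I'd accept in a pricey scope
    if measured_resolution_arcsec <= premium_limit:
        return "fine even at a premium price"
    if measured_resolution_arcsec <= budget_limit:
        return "acceptable for an inexpensive scope"
    return "worse than I'd accept at any price"

# For a 50 mm scope: 120/50 = 2.4" and 140/50 = 2.8", so a measured 2.5"
# falls in the "acceptable for an inexpensive scope" range.
print(resolution_verdict(50, 2.5))
```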
Henry