To physically determine how a capacitor limits amperage at a particular AC voltage and frequency, you can perform a very simple experiment.
Take a nominal (say 25 uF) motor-run capacitor rated for your voltage (or higher) and wire it in series across the AC source with NO load (this would normally be considered a direct short). The capacitor prevents it from actually being a direct short by limiting the amperage. Put an AC ammeter in series with the capacitor to read the resulting amperage.
O = AC inputs
|| = capacitor
A = ammeter (can be direct wired or amp-clamp)
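The ammeter reading in this test circuit can be predicted from the capacitive reactance, Xc = 1 / (2·pi·f·C), with the current then being V / Xc. Here is a minimal sketch of that calculation; the 120 V / 60 Hz supply values are assumptions for illustration (use your own line voltage and frequency):

```python
import math

# Predict the ammeter reading in the 'test' circuit above.
# Assumed values: 25 uF motor-run capacitor on a 120 V / 60 Hz supply.
V = 120.0   # source voltage (V RMS) -- assumption, use your line voltage
f = 60.0    # line frequency (Hz)    -- assumption, use your line frequency
C = 25e-6   # capacitance (F)

Xc = 1 / (2 * math.pi * f * C)  # capacitive reactance (ohms)
I = V / Xc                      # current limited by the capacitor (A RMS)

print(f"Reactance: {Xc:.1f} ohms")  # ~106.1 ohms
print(f"Current:   {I:.2f} A")      # ~1.13 A
```

So on a 120 V / 60 Hz line, a 25 uF capacitor should limit the short-circuit current to roughly 1.1 A, which is what the ammeter should show.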
If you measure the wattage of the ‘test’ circuit in Note 1 with a wattmeter, you will find it to be near zero. This is because the capacitor ‘decouples’ the load voltage from the source voltage by returning any voltage the load didn’t use to the source on the alternate half-cycle.
The ‘test’ circuit shows no (or very little) wattage ‘consumed’ because there is no ‘load voltage’ (only the small voltage needed to overcome circuit resistance).
Wattage actually used (recorded on a wattmeter) is the load (circuit) voltage times the load (circuit) amperage.
So if the load (circuit) voltage is zero and the load (circuit) amperage is 1 amp, then the wattage will be zero (0 x 1 = 0).
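In AC terms, the reason the wattmeter reads near zero is that through an ideal capacitor the current leads the voltage by 90 degrees, so the instantaneous power (volts times amps at each moment) averages to zero over a full cycle. A short numerical sketch of that claim, with illustrative values (120 V, 1 A) assumed:

```python
import math

# Average real power when current leads voltage by 90 degrees,
# as in the no-load 'test' circuit. Values (120 V, 1 A) are illustrative.
N = 10000
samples = []
for k in range(N):
    t = k / N                                                # one full cycle
    v = 120 * math.sqrt(2) * math.sin(2 * math.pi * t)       # source voltage
    i = 1 * math.sqrt(2) * math.sin(2 * math.pi * t + math.pi / 2)  # leading current
    samples.append(v * i)                                    # instantaneous power

avg_power = sum(samples) / N
print(f"Average power: {avg_power:.6f} W")  # essentially zero
```

Energy is taken from the source on one quarter-cycle and handed back on the next, so nothing is consumed on average even though real amperage is flowing.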
Electrical engineers often have trouble grasping this concept because they are trained to think of the source voltage and the load voltage as the same thing. Normally they are.
But when you add a capacitor in series with the load, you ‘split the voltage’: the source voltage no longer sets the wattage, because any voltage that is not ‘consumed’ by the load is returned to the source ‘unused’.