The test current required on the test set relates to how low the resistance is and how accurately it needs to be measured. The lower the resistance, the higher the current required for the test.
In general it is the current not the voltage of the system under test that is important.
As an example, a 10 A micro-ohmmeter will generally be accurate down to about 0.1 mΩ, while a 600 A micro-ohmmeter will generally measure accurately down to around 0.1 µΩ.
You normally measure resistance to check that it does not exceed the acceptable values for your application. What counts as acceptable depends on the application: some applications have specifications (often stated by a component manufacturer), some have guidelines (and rules of thumb), and some have various formal or industry standards.
You normally measure low resistance for one (or more) of the following three reasons:

1. To check that a joint or conductor will not cause an excessive voltage drop.
2. To check that a joint or conductor will not dissipate excessive heat.
3. To check that an earth/ground bond provides a sufficiently low-resistance path.

Reasons 1 and 2 are generally dictated by the current flowing through the conductor; reason 3 is generally dictated by the resistance of the other paths to ground.
If you have a 500 A circuit with a joint that has a resistance of 0.1 Ω, then Ohm's law (V = IR) tells us that there will be a 50 V drop across the joint. If this joint is in the conductor feeding one phase of a three-phase motor running at 1000 V, and the other two phases do not have this resistance, then the motor will be electrically unbalanced by around 5%, which will change its power output, efficiency, and input power requirements by substantially more than 5%. This will lead to premature failure of the motor. In addition, the machine will be mechanically unbalanced, which will wear out components such as bearings (and generate a lot of heat in them). Furthermore, a lot of energy will be lost and heat generated in the 0.1 Ω joint itself, which is covered in the next section.
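As a quick sketch, the voltage-drop and unbalance figures above follow directly from Ohm's law (the function name here is just illustrative):

```python
def joint_voltage_drop(current_a: float, resistance_ohm: float) -> float:
    """Ohm's law: V = I * R, the voltage dropped across the joint."""
    return current_a * resistance_ohm

drop = joint_voltage_drop(500, 0.1)      # 50 V across the 0.1 ohm joint
unbalance_pct = drop / 1000 * 100        # as a fraction of the 1000 V supply
print(f"{drop:.0f} V drop, ~{unbalance_pct:.0f}% phase unbalance")
```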
There will also be a power loss across the joint (dissipated as heat), which can be calculated from P = I²R. In the above case this is 500 × 500 × 0.1, which is 25,000 W or 25 kW! That is the equivalent of more than ten three-bar radiators of heat from one joint! At 20 ¢ per kWh that is also $5 per hour of wasted energy (and probably more, since you will likely also have to use extra energy to cool the system). This is obviously bad, but what if this were a 3000 A system? That would be 3000 × 3000 × 0.1, which is 900,000 W or 900 kW of energy loss and heat dissipation (in both of the above cases this would generally just melt the joint). That is also $180 per hour of wasted energy from one bad joint. In a domestic or light commercial electrical switchboard with a 0.1 Ω joint in a circuit that might carry 20 A at full load, the loss would be 20 × 20 × 0.1 = 40 W of energy lost and heat generated in the switchboard. 40 W is a fair amount of heat and energy lost in the confines of a switchboard, but certainly not 25 kW or 900 kW.
Obviously 0.1 Ω is a large joint resistance for all of the above. If the joint were instead at, say, 0.1 mΩ (the lowest value you could accurately measure with a 10 A low-ohms meter), these figures would drop to 25 W for the 500 A circuit, 900 W for the 3000 A circuit, and 0.04 W for the 20 A circuit. Not really significant in the 20 A circuit, but still not good in a 3000 A circuit.
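The I²R losses for all of the scenarios above can be tabulated in a few lines (the function name is illustrative; 0.1 mΩ is written as 0.0001 Ω):

```python
def joint_power_loss_w(current_a: float, resistance_ohm: float) -> float:
    """P = I^2 * R, the power dissipated as heat in the joint."""
    return current_a ** 2 * resistance_ohm

# Each current from the text, at the bad joint (0.1 ohm) and the
# just-measurable joint (0.1 milliohm).
for amps in (500, 3000, 20):
    for r in (0.1, 0.0001):
        print(f"{amps:>5} A at {r} ohm -> {joint_power_loss_w(amps, r):,.2f} W")
```

Running this reproduces the 25 kW, 900 kW, and 40 W figures for the 0.1 Ω joint, and 25 W, 900 W, and 0.04 W for the 0.1 mΩ joint.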
So as you can see, the appropriate test will depend very much on the application and the currents involved.