Conductor Resistance Testing

2019-07-30


Conductor resistance is a key cable test, because a conductor with excessive resistance poses a safety threat. When current passes through a conductor, its inherent resistance causes a heating effect. If the resistance is too high, the heat may cause premature failure of the insulation, which may in turn result in a fire or short circuit.

Measure the d.c. resistance of the conductor(s), either on a complete length of cable or flexible cord, or on a sample at least 1 m long, at room temperature, and record the temperature at which the measurement is made. Adjust the measured resistance by means of the correction factors given in Table A.1. The test determines the d.c. resistance of Class 1, Class 2, Class 5 and Class 6 conductors for plain copper, metal-coated copper, aluminium and aluminium alloy, circular or shaped. The result is expressed in ohms/km.

The cable shall be kept in the test area for sufficient time to ensure that the conductor temperature has reached a level that permits an accurate determination of resistance using the correction factors provided. Conductor resistance varies with temperature (resistance rises, and conductivity falls, as the conductor warms), so conductor resistance testing is referenced to a conductor temperature of 20 °C. Calculate the resistance per kilometre of cable from the length of the complete cable, not from the length of the individual cores or wires. If necessary, correction to 20 °C and 1 km length shall be made by applying the following formula:

R20 = Rt × kt × 1000 / L

where:

R20 is the conductor resistance at 20 °C, in Ω/km;
Rt is the measured conductor resistance, in Ω;
kt is the temperature correction factor from Table A.1;
L is the length of the cable, in m.
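The correction above is a simple scaling, which can be sketched in a few lines of Python. The specific kt value used here (0.981, nominally for a measurement a few degrees above 20 °C) is illustrative only; the actual factor must be taken from Table A.1 for the recorded temperature.

```python
def resistance_at_20c(rt_ohm: float, kt: float, length_m: float) -> float:
    """Correct a measured d.c. conductor resistance to 20 degC and 1 km.

    rt_ohm   -- measured resistance of the sample, in ohms
    kt       -- temperature correction factor (from Table A.1)
    length_m -- length of the complete cable sample, in metres
    Returns the corrected resistance in ohms/km.
    """
    return rt_ohm * kt * 1000.0 / length_m


# Hypothetical example: 0.0185 ohm measured on a 1 m sample,
# with an illustrative correction factor of 0.981.
r20 = resistance_at_20c(0.0185, 0.981, 1.0)
print(f"{r20:.4f} ohm/km")
```

The result is then compared against the maximum conductor resistance permitted by the relevant standard for that conductor class and size.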

Traditionally, conductor resistance test equipment is either a Kelvin double bridge or a Wheatstone bridge, which balances the voltage across known resistors to determine the unknown resistance. Modern equipment instead applies Ohm's law directly: it measures the current through and the voltage across the sample and calculates the resistance from the two. Accuracy for the test is typically in the region of 0.2 % to 0.5 %.

If you’re interested in the full lists of required conductor resistance values at 20 °C from the relevant standards, please contact us at [email protected]
