The core challenge in power transmission is reducing losses. The power dissipated when current flows through a conductor is proportional to the square of the current (P_loss = I²R). To cut losses, one must either lower the resistance (by using thicker conductors, which is extremely costly) or lower the current. Since power is the product of voltage and current (P = UI), the current needed to deliver a given power is inversely proportional to the voltage: raising the voltage reduces the current, and the squared term makes the loss fall even faster. This is the core logic of high-voltage transmission.
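To make the scaling concrete, here is a minimal Python sketch of the loss argument. The 10 MW load, 5 Ω line resistance, and the 10 kV vs. 500 kV comparison are illustrative assumptions, not figures from the text.

```python
def line_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Resistive line loss P_loss = I^2 * R, with I = P / U at constant power."""
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

P = 10e6   # assumed delivered power: 10 MW
R = 5.0    # assumed total line resistance: 5 ohms

for U in (10e3, 500e3):  # compare 10 kV against 500 kV
    loss = line_loss(P, U, R)
    print(f"U = {U / 1e3:6.0f} kV -> I = {P / U:7.1f} A, "
          f"loss = {loss / 1e3:10.1f} kW ({100 * loss / P:.3f}% of P)")
```

Raising the voltage by a factor of 50 cuts the current by the same factor, so the loss falls by 50² = 2500 times: from half the delivered power to a negligible fraction of it.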

At this point, the key difference between alternating current (AC) and direct current (DC) becomes apparent: AC voltage can be raised and lowered easily and efficiently with transformers, whereas for a long time there was no comparably efficient way to do this with DC.
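The relation a transformer exploits is the ideal turns ratio: secondary voltage scales with n = N_s/N_p while, with power conserved, current scales by 1/n. A small sketch, with the 20 kV input, 500 A, and turns ratio of 25 all assumed values:

```python
def ideal_transformer(v_primary: float, i_primary: float, n: float):
    """Ideal transformer with turns ratio n = N_s / N_p:
    voltage scales by n, current by 1/n, so power is conserved."""
    return v_primary * n, i_primary / n

# Step 20 kV / 500 A up by a factor of 25: 500 kV at 20 A, same 10 MW.
v_s, i_s = ideal_transformer(20e3, 500.0, n=25.0)
print(f"secondary: {v_s / 1e3:.0f} kV at {i_s:.0f} A")
```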

The electrical energy generated by a power plant (usually at around 20 kV) is raised by step-up transformers to high and ultra-high voltages of 110 kV, 220 kV, or even more than 1000 kV. Transmitted over long distances at these voltages, the current is kept extremely low and losses stay within an acceptable range. Near the user, step-down transformers bring the voltage back to 220 V (residential) or 380 V (industrial), so equipment can use it safely and conveniently.
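Putting the chain together, this rough sketch shows how the current scales at each voltage level the paragraph describes. The 10 MW figure and the choice of 500 kV for the transmission stage are assumptions, and treating the plant's full output as a single flow is a simplification (in reality it splits across many feeders and users); the point is only how I = P/U shrinks and grows along the chain.

```python
P = 10e6  # assumed plant output: 10 MW, tracked as one flow for simplicity

# 20 kV, 380 V, and 220 V come from the text; 500 kV is one of the
# transmission levels it mentions.
stages = [
    ("generation",   20e3),
    ("transmission", 500e3),
    ("industrial",   380.0),
    ("residential",  220.0),
]

for name, voltage in stages:
    current = P / voltage  # at constant power, I = P / U
    print(f"{name:>12}: {voltage:10,.0f} V carries {current:12,.1f} A")
```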

The inherent weakness of direct current lies in the difficulty of voltage conversion. In the early days there was no efficient DC counterpart to the transformer: stepping DC up to transmission voltages required complex mechanical devices or expensive electronic converters, which were not only costly but also far less reliable than transformers. This seemingly simple "voltage conversion problem" directly determined alternating current's dominant position in the power grid.

Ultimately, the power grid chose alternating current because it met the core requirements of power transmission: large scale, long distance, and low cost.
