Why do electric chargers get bigger as their power increases?
Mike Munay
Something doesn't add up.
We live surrounded by increasingly efficient, miniaturized, and elegant technology. Mobile phones are thinner, chips are more powerful, and computers are more compact. Everything seems to be moving in the same direction: more power in less space. And yet, there's an everyday object that goes in the exact opposite direction. The more power an electric charger delivers, the larger and heavier it becomes.
The small mobile phone adapter fits in any outlet. The laptop adapter is already the size of half a brick. And high-power chargers look like pieces of industrial engineering.
Intuition tells us that shouldn't happen: if a charger's job is to reduce the mains voltage to whatever the device needs, shouldn't the charger for a more powerful device, which needs less of a voltage reduction, have less work to do? Shouldn't it be smaller?
The question is complex because it points to an apparent contradiction between what we think we understand about electricity and what we see on the table every day. But as is often the case in physics, when something seems absurd, it's not because reality is poorly designed.
It's because our mental model is too simplistic. And behind the increasing size of chargers there's no bad engineering, no unnecessary marketing, no clumsy design. There are transformers, magnetic fields, heat losses, and very specific physical limitations that are rarely explained.
Understanding them completely changes the way we look at that "brick" hanging from the socket.
How do electric chargers and transformers work?
A charger converts high-voltage alternating current (AC) from the electrical grid into low-voltage direct current (DC), which is the form of energy that electronic devices need.
- Alternating current (AC): electricity changes direction many times per second. This is the type of electricity that comes to your home from the electrical grid (230 V at the outlets). It is advantageous for transporting energy over long distances and powering electrical networks due to its efficiency and ease of voltage change.
- Direct current (DC): electricity always flows in the same direction. This is the type of current used by electronic devices (batteries, mobile phones, laptops), which need a stable, precise voltage to operate.
To perform the conversion, an internal transformer or converter circuit is used to reduce the voltage and adjust the current. Traditionally, this conversion was done with large, heavy 50/60 Hz transformers, which directly stepped down the 230 V mains voltage to a few usable volts.
Modern chargers work differently. They first convert electricity into a high-frequency signal and then transform it, allowing for the use of much smaller and more efficient components. These switched-mode power supplies explain why today's chargers can be so much more compact.
In both cases, the goal is the same: to deliver the appropriate power (watts) to the device without damaging it, by precisely regulating voltage and current.
A charger acts as an intelligent intermediary between the power outlet and the device, adapting the energy so that it arrives in the exact form it needs.
So far, everything seems simple. The problem starts when power comes into play.
Voltage, current and power: understanding the difference
This is where intuition often fails.
A common misconception is that reducing the voltage by less (for example, from 230 V to 19 V instead of to 5 V) should make the charger smaller. In reality, what matters is the power (watts) the charger delivers, which is the product of voltage and current (W = V × A).
Although a laptop charger reduces the voltage by less than a mobile phone charger does, it typically delivers far more current. For example, a phone charger might provide approximately 5 V × 2 A = 10 W, while a gaming laptop's charger delivers approximately 19 V × 6 A = 114 W. That much higher power output means the charger has to handle far more current and energy, even though the voltage reduction is smaller.
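To make that concrete, here is a minimal Python sketch using the example figures above (typical values for illustration, not specifications of any particular charger):

```python
# Power delivered is voltage times current: P = V * I (watts = volts * amps).
phone_volts, phone_amps = 5, 2     # typical phone charger
laptop_volts, laptop_amps = 19, 6  # typical gaming laptop charger

phone_watts = phone_volts * phone_amps     # 10 W
laptop_watts = laptop_volts * laptop_amps  # 114 W

print(f"Phone:  {phone_volts} V x {phone_amps} A = {phone_watts} W")
print(f"Laptop: {laptop_volts} V x {laptop_amps} A = {laptop_watts} W")
print(f"The laptop charger moves about {laptop_watts / phone_watts:.0f} times more power.")
```

Even though the laptop charger drops the voltage by less, it moves more than eleven times the power, and that, not the size of the voltage drop, is what the hardware has to be built for.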
The voltage difference is not wasted; the charger converts the energy from one form to another, and the more power (watts) it has to transfer, the more demanding the conditions become for its internal components.
Why does more power mean a larger charger?
Delivering more power at lower voltages requires more current (W = V × A). To conduct this current without overheating, the wires, windings, and internal traces must be thicker, and the transistors, diodes, and capacitors need to be larger or more numerous. In transformers, this higher current generates stronger magnetic fields, necessitating the use of larger magnetic cores to prevent saturation and transfer energy efficiently.
Added to this is the heat. Losses are unavoidable, and some of the energy is dissipated thermally. Even if a charger is very efficient, a small percentage of 100W generates much more absolute heat than the same percentage of 10W. That's why high-power chargers require more surface area and volume to dissipate the heat safely, while low-power chargers can be much more compact.
Delivering more watts requires bulkier hardware: more material, a larger cross-section, and more surface area to manage current, magnetic fields, and heat. This is driven by Joule's law (heating is proportional to I²R) and magnetic saturation limits, physical laws that necessitate scaling up the size as power increases.
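Here is a small sketch of that scaling, using the currents from the earlier example and an assumed (made-up) conductor resistance of 0.05 ohms purely to show the trend:

```python
# Joule's law: the heat dissipated in a conductor is P_loss = I^2 * R.
# 0.05 ohm is an illustrative, assumed value for internal wiring and traces.
R = 0.05  # ohms

phone_amps, laptop_amps = 2, 6  # currents from the earlier example

phone_loss = phone_amps ** 2 * R    # ~0.2 W
laptop_loss = laptop_amps ** 2 * R  # ~1.8 W

print(f"Phone  at {phone_amps} A: ~{phone_loss:.1f} W of heat in the same conductor")
print(f"Laptop at {laptop_amps} A: ~{laptop_loss:.1f} W of heat in the same conductor")
print(f"Three times the current means {laptop_loss / phone_loss:.0f} times the heating.")
```

That quadratic scaling is why designers respond with thicker conductors (lower resistance) and more surface area, both of which cost volume.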
The role of frequency: miniaturization vs power
For a long time, the size of chargers was limited by their operating frequency.
Conventional transformers operate at 50–60 Hz. At these frequencies, to transfer power without saturating the magnetic core, large cores and bulky windings are needed: more iron, more copper, and more weight. This is not a design problem, but a matter of physics.
Miniaturization begins when the mains frequency is abandoned.
Modern chargers first convert electricity into a high-frequency signal, typically tens or hundreds of kilohertz. At those frequencies the transformer can be much smaller for the same power output, because energy is transferred in many more, much shorter cycles, so the magnetic core has to carry far less flux on each one.
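One hedged way to see the trend is the standard sinusoidal transformer EMF equation, E = 4.44 × f × N × A × B_max: for a fixed voltage, number of turns, and peak flux density, the required core cross-section A scales as 1/f. The sketch below only illustrates that proportionality; real designs also juggle turns, flux density, and losses.

```python
# For a sinusoidal waveform, E_rms = 4.44 * f * N * A * B_max, so with voltage,
# turns and peak flux density held constant, the needed core area A scales as 1/f.
MAINS_HZ = 50
SWITCHING_HZ = 100_000  # an assumed, typical switched-mode frequency

def relative_core_area(frequency_hz, reference_hz=MAINS_HZ):
    """Core cross-section needed at frequency_hz, relative to a 50 Hz design."""
    return reference_hz / frequency_hz

shrink = 1 / relative_core_area(SWITCHING_HZ)
print(f"At {SWITCHING_HZ:,} Hz the core cross-section can be roughly {shrink:,.0f}x "
      "smaller than at 50 Hz, all else being equal.")
```

In practice the gain is smaller, because high-frequency ferrite cores run at lower flux densities and switching losses grow, which is exactly the trade-off described next.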
But increasing the frequency doesn't come without a price. It means more switching losses, more electromagnetic interference, and more heat generation. For years, conventional silicon set the practical limit: switching too fast caused efficiency to drop and temperature to rise.
Therefore, frequency became a compromise between miniaturization and power. It allows for smaller chargers, but only to the extent that materials and thermal management permit.
Modern technologies: GaN and new materials that shrink chargers
Before current technologies, the size of chargers was limited by the physics of the available materials. You could optimize designs, fine-tune components, improve efficiency… but there was a point beyond which you couldn't shrink them any further without paying a price in the form of heat or losses.
That began to change with the arrival of new technologies.
The key lies in the semiconductor materials. Traditional chargers use silicon, a reliable, inexpensive, and well-known material, but with clear limitations: it doesn't tolerate very high temperatures or extremely fast switching without generating significant losses.
This is where gallium nitride (GaN) comes into play.
GaN allows for operation at much higher frequencies, withstands higher voltages and temperatures, and significantly reduces energy losses. In practical terms, this means transistors can switch on and off millions of times per second without overheating. And that has a direct consequence.
At higher switching frequencies, transformers and inductors can be much smaller to transfer the same power. Less copper, less magnetic core, less volume. Furthermore, by losing less energy as heat, the need for large heat dissipation surfaces is also reduced.
That's why today there are USB-C chargers of 65, 100 or even more than 140 watts that fit in the palm of your hand, something unthinkable just a few years ago.
GaN doesn't eliminate the laws of physics. It pushes them, optimizes them, and makes better use of them. Even with efficiencies exceeding 95%, a high-power charger still generates several watts of heat that must be dissipated, and it still handles high currents that require safety margins. That extra volume isn't a design choice. It's energy obeying the laws of physics.
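A quick back-of-the-envelope check of that last point, assuming an illustrative 95% efficiency across the board:

```python
# If efficiency = output / input, then the heat to shed is input - output.
# The 95% figure is an assumption used only for illustration.
EFFICIENCY = 0.95

for output_watts in (10, 65, 140):
    input_watts = output_watts / EFFICIENCY
    heat_watts = input_watts - output_watts
    print(f"{output_watts:>3} W out -> about {heat_watts:.1f} W of heat to dissipate")
```

Half a watt of heat disappears into a tiny plastic shell; seven watts needs real surface area to escape.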
Modern chargers are smaller not because power no longer matters, but because current materials and designs allow them to get much closer to the physical limits without exceeding them. And yet, as power continues to increase, the size grows again.
Conclusion: Physics rules, although technology helps
The answer is simpler than it seems, although not always intuitive: chargers get bigger as their power increases because they have to handle more energy safely and stably.
It's not about how much they lower the voltage, but how many watts they need to transfer. More power means more current, stronger magnetic fields, components that can't be allowed to saturate, and heat that needs to be dissipated. All of that takes up space.
Physics doesn't negotiate here. Joule's law, magnetic saturation, and thermal management impose clear conditions: to move more energy without breaking anything, you need more material, more surface area, and a greater safety margin.
The good news is that technology is pushing those limits. Switched-mode power supplies, high-frequency designs, and materials like GaN are enabling increasingly smaller chargers for ever-increasing power outputs. But even with these improvements, the general rule still applies.
As long as there is heat to dissipate and current to control, more power will tend to mean more volume.