Hardware rules of thumb dictate that more power is better. The more juice you can keep consistently flowing to your components, the smoother your system will run. A wimpy, tiny power supply can cause all sorts of problems by not feeding your components enough electricity. Crashes and glitches can occur if your power supply isn't up to snuff; in fact, the word "glitch" in electronics refers to unwanted electrical pulses that can disrupt circuits. Logically, you want a power supply that can put out as much electricity as possible. But is it possible for a power supply to be too powerful?
"sweet spot" of power consumption is between 60 and 85 percent of your power supply's maximum rating; if your system's maximum power consumption is 450 watts, you should get at least a 550W power supply. If your system's maximum power consumption is 600 watts, you should get at least a 750W power supply. Calculating your system's potential max load isn't difficult--most component websites show you how many watts their parts will need on a full load (eg. max TDP for CPUs and GPUs).
There's very little reason to get a 1200W power supply when you're building a 400-watt workstation, but beyond the higher up-front cost, a well-made oversized power supply has no real disadvantage. According to Anandtech, efficiency varies from one power supply to the next, and a more efficient power supply will be more economical in the long run, but there's no real correlation between a power supply's capacity and its efficiency. Among the low-power (300-500W) units tested, the 300W models were generally more efficient, while among the high-power (600-800W) units, the 800W models were more efficient. It simply varies from power supply to power supply.
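To see why efficiency matters more than raw capacity for long-run cost, here's a hedged sketch of the arithmetic. The load, efficiency figures, usage hours, and electricity price are illustrative assumptions, not measured values from Anandtech's testing.

```python
def wall_power(dc_load_watts, efficiency):
    """Power drawn from the wall for a given DC load at a given efficiency."""
    return dc_load_watts / efficiency

# Illustrative assumptions (not from the article): a 400 W load,
# 8 hours of use per day, and $0.15 per kWh.
load_w, hours_per_day, price_per_kwh = 400, 8, 0.15

for eff in (0.80, 0.90):
    watts = wall_power(load_w, eff)
    yearly_kwh = watts * hours_per_day * 365 / 1000
    cost = yearly_kwh * price_per_kwh
    print(f"{eff:.0%} efficient: {watts:.0f} W at the wall, ~${cost:.0f}/year")
# The gap between the two yearly figures is the long-run saving
# from choosing the more efficient unit, regardless of its rated capacity.
```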