4 ohm vs 8 ohm speakers
Now, the power delivered is the square of the current times the resistance (P = I^2 x R).
So with an 8 ohm load, the square of the current is 200 watts / 8 ohms, which is 25. The square root of 25 is 5, so the amp has to deliver 5 amps to produce 200 watts into 8 ohms. The voltage, from Ohm's law, is the current times the resistance: 5 amps x 8 ohms = 40 volts.
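To make that arithmetic explicit, here is a minimal Python sketch of the same calculation (the helper function name is mine; the 200 watt / 8 ohm figures are the ones in the example above):

```python
import math

def drive_requirements(power_w, load_ohms):
    """Return (current in amps, voltage in volts) an amp must deliver
    to put a given power into a given load."""
    current = math.sqrt(power_w / load_ohms)   # from P = I^2 * R  ->  I = sqrt(P / R)
    voltage = current * load_ohms              # Ohm's law: V = I * R
    return current, voltage

i8, v8 = drive_requirements(200, 8)
print(f"8 ohm load: {i8:.3f} A, {v8:.1f} V")   # 5.000 A, 40.0 V
```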
Now let's take the 4 ohm load. The square of the current is 200 watts / 4 ohms, which is 50, and the square root gives a current of about 7.07 amps.
The voltage, from Ohm's law, is 7.07 amps x 4 ohms, which is about 28.28 volts.
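Reusing the same sketch for the 4 ohm case reproduces those figures:

```python
i4, v4 = drive_requirements(200, 4)
print(f"4 ohm load: {i4:.3f} A, {v4:.2f} V")   # 7.071 A, 28.28 V
```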
Now, if the amp did not drop its voltage when driving the 4 ohm load, the power delivered would be the square of the voltage divided by the resistance (P = V^2 / R): 40 volts squared / 4 ohms = 1600 / 4 = 400 watts. You might ask what the difference is. Well, if the amp's voltage sags when supplying the 4 ohm load, it is current limited. The output transistors will be stressed, the amp will likely hit a hard clip, and in the worst case it will clip at 5 amps, if that is its maximum current output. Run the calculation above with those numbers and you get only 100 watts into 4 ohms (5 amps squared x 4 ohms).
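Those two outcomes can be checked the same way (a sketch, using only the 40 volt and 5 amp limits quoted above):

```python
# Ideal case: the amp holds 40 V into 4 ohms -> P = V^2 / R
p_ideal = 40**2 / 4          # 400.0 W

# Current-limited case: the amp clips at 5 A into 4 ohms -> P = I^2 * R
p_limited = 5**2 * 4         # 100 W

print(p_ideal, p_limited)    # 400.0 100
```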
The point is that amplifiers that have to deliver more current require larger power supplies and bigger output devices, and it is the current capability of the amp that drives up the cost. The maximum current output determines how much power an amp can deliver into a given load. The voltage an amp can deliver is set by the operating voltage of the power transistors; the output voltage obviously cannot exceed that.
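As a rough illustration of that last point (the 40 volt / 10 amp amp here is hypothetical, not from the example above): the power an amp can put into a load is capped by whichever limit it hits first, its maximum output voltage or its maximum output current.

```python
def max_power(load_ohms, v_max, i_max):
    """Power into a load, limited by either the voltage or the current capability.
    v_max and i_max are the amp's maximum output voltage and current (assumed figures)."""
    p_voltage_limited = v_max**2 / load_ohms   # P = V^2 / R if voltage is the ceiling
    p_current_limited = i_max**2 * load_ohms   # P = I^2 * R if current is the ceiling
    return min(p_voltage_limited, p_current_limited)

# Hypothetical amp: 40 V maximum output, 10 A maximum current
for load in (8, 4, 2):
    print(load, "ohms:", max_power(load, 40, 10), "W")   # 200.0, 400.0, 200.0
```

Notice how the hypothetical amp gains power going from 8 ohms to 4 ohms, but runs out of current at 2 ohms and the power falls back again.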
I hope that has answered your question.