Originally posted by Fishchris

The amps should draw less current off higher-voltage supplies, though I'm not clear on how they designed them, or whether they have an internal DC-DC supply that regulates the internal rails to an expected voltage. The pages were mostly in Spanish. Assuming they do have a DC-DC switching supply to keep the output stage in its design range, the input current will be inversely proportional to the input voltage for the same power output.
For example, let's say 1000W RMS output at 2 ohms. That's about 22A RMS into 2 ohms, 45V RMS, 127V peak-to-peak. So by that rough reckoning the amp theoretically needs something like a 130V supply at 22A to keep that going. Fed from 12V, that works out to about 238A, but on a 48V system you only need about 60A for the same power. Those are all rough theoretical figures; the real input current is much lower (the 130V × 22A product is well above the actual 1000W of output power), plus nobody runs a full-scale sine wave indefinitely at full power.
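If anyone wants to play with the numbers, here's a quick sketch of that arithmetic. The 85% efficiency and the 1000W/2-ohm figures are just placeholders I picked for illustration, not specs from any particular amp:

```python
import math

def amp_supply_estimates(p_out=1000.0, r_load=2.0, v_in=12.0, efficiency=0.85):
    """Rough sine-wave math for a car amp with a DC-DC input stage.
    Assumed example values, not from any specific amplifier's datasheet."""
    i_rms = math.sqrt(p_out / r_load)    # ~22.4 A RMS into 2 ohms
    v_rms = math.sqrt(p_out * r_load)    # ~44.7 V RMS
    v_pp = 2.0 * math.sqrt(2.0) * v_rms  # ~127 V peak-to-peak
    # Input current if the DC-DC stage converts at the assumed efficiency;
    # this is the "real" figure, lower than the pessimistic rail x current guess.
    i_in = p_out / (efficiency * v_in)
    return i_rms, v_rms, v_pp, i_in

i_rms, v_rms, v_pp, i_in_12 = amp_supply_estimates(v_in=12.0)
_, _, _, i_in_48 = amp_supply_estimates(v_in=48.0)
print(f"{i_rms:.1f} A RMS, {v_rms:.1f} V RMS, {v_pp:.0f} V p-p")
print(f"input current: {i_in_12:.0f} A @ 12V vs {i_in_48:.0f} A @ 48V")
```

Either way, the 12V-to-48V comparison comes out the same: four times the supply voltage means a quarter of the input current for the same output power.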