
1w/1m vs 2.83v
Hey folks. I've searched PE along with the web and have not found a definite answer to my question. I know that 1w/1m and 2.83v/1m are the same at 8 ohms, but that 2.83v/1m at 4 ohms equates to 2w/1m, effectively doubling the power and resulting in a rating 3 dB higher than if the same 4 ohm driver were measured at 1w/1m. So, I have 3 questions...
1. If speaker A is rated at 93db at 1w/1m(4 ohms), and speaker B is rated 96db at 2.83v/1m(4 ohms), does that mean that they are both actually 93db at 1w/1m?
2. If speaker A is rated 93db at 1w/1m(this time it's an 8 ohm) and speaker B is rated at 96db at 2.83v/1m(this one a 4 ohm) am I correct in thinking that these 2 drivers are both actually 93db at 1w/1m as well?
3. Now, assuming the amplifier powering these speakers doubles its output power when its load goes from 8 ohms to 4 ohms, as many amps do, does that mean that speaker B from question 2 will actually play 3 db louder than speaker A in question 2?
This seems like it should be simple, and many of you probably think it is, but I've read so much about it in the last 24hrs that all the info is actually confusing my thinking as opposed to helping. I'm hoping someone here can help set this straight. Thanks.
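For reference, the two rating systems convert back and forth with a couple of lines of Python. This is just a sketch to make the arithmetic concrete; the function names are my own, and it assumes a purely resistive nominal impedance:

```python
import math

def watts_at_283v(impedance_ohms):
    """Power a 2.83 V signal delivers into a nominal impedance: P = V^2 / R."""
    return 2.83 ** 2 / impedance_ohms

def sens_1w_from_283v(sens_283v_db, impedance_ohms):
    """Convert a 2.83 V/1 m rating to the equivalent 1 W/1 m rating."""
    return sens_283v_db - 10 * math.log10(watts_at_283v(impedance_ohms))

# 2.83 V into 8 ohms is 1 W, so the two ratings coincide:
print(round(sens_1w_from_283v(93.0, 8), 1))   # 93.0
# 2.83 V into 4 ohms is 2 W, so the 2.83 V rating reads ~3 dB higher:
print(round(sens_1w_from_283v(96.0, 4), 1))   # 93.0
```

So a 96 dB @ 2.83v/1m rating on a 4 ohm driver and a 93 dB @ 1w/1m rating describe the same sensitivity.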

Re: 1w/1m vs 2.83v
Not important unless the amp is a very low-watt amp. There are many reasons why it is not important; here is one: many amps that claim 100 watts at 4 ohms and 50 watts at 8 ohms are not accurate in saying the power doubles.
Also, if you have an amp that can put out 100 watts at 4 ohms and a 93 db speaker that can handle the 100 watts, you can play it too loud in most cases.
As for your question, as I understand it: the second speaker (4 ohm) would play louder with the volume knob set at the same number on the amp.

Re: 1w/1m vs 2.83v
The advantage of the 2.83v rating is that you'll get an apples-to-apples comparison of how two speakers will compare driven by the same amp, as the output capability of amps is voltage limited. But if you have adequate headroom it's not all that important. The main reason why manufacturers rate a 4 ohm cab at 2.83v is that it will give a 3dB higher reading than 1 watt/2.0v, and that's purely a marketing tool.
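Worth noting where the odd-looking 2.83 figure comes from: it is just sqrt(8), the voltage that dissipates exactly 1 watt in 8 ohms. A quick sketch (my own, not from the post):

```python
import math

# 2.83 V is sqrt(8): the voltage that puts 1 W into 8 ohms.
v = math.sqrt(8)            # ~2.828 V
p_8ohm = v**2 / 8           # 1.0 W
p_4ohm = v**2 / 4           # 2.0 W
bump_db = 10 * math.log10(p_4ohm / p_8ohm)
print(round(v, 2), round(bump_db, 2))   # 2.83 3.01
```

That 3.01 dB is exactly the "free" bump a 4 ohm cab gets from being rated at 2.83v instead of 1 watt.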

Re: 1w/1m vs 2.83v
Originally Posted by billfitzmaurice
The advantage of the 2.83v rating is that you'll get an apples-to-apples comparison of how two speakers will compare driven by the same amp, as the output capability of amps is voltage limited. But if you have adequate headroom it's not all that important. The main reason why manufacturers rate a 4 ohm cab at 2.83v is that it will give a 3dB higher reading than 1 watt/2.0v, and that's purely a marketing tool.
Also, you may wonder why you don't just see 1 rating method? Why not rate every driver at 2.83v, OR at 1w?
Well, when you're using drivers in a multiway system with a passive crossover, you MUST use a common voltage rating (because that's what the amp is delivering), typically 2.83v. If you've got an FR curve made at 1w, you'll have to modify it, unless ALL your drivers are the same impedance.
But, when using a plate amp on a sub, the 1w rating tells you more, quicker (unless you're comparing an 8 ohm vs a 4 ohm and the amp's output varies with impedance, as most do).
chris
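The normalization step described above (shifting a 1 W FR curve to a common voltage) is just a fixed dB offset per driver. A minimal sketch, assuming nominal resistive impedances (the helper name is hypothetical):

```python
import math

def one_watt_spl_to_283v(spl_db_at_1w, impedance_ohms):
    """Shift a 1 W/1 m SPL figure to its 2.83 V/1 m equivalent.

    2.83 V drives (2.83^2 / Z) = (8 / Z) watts into a nominal impedance Z,
    so the offset is 10*log10(8 / Z) dB.
    """
    return spl_db_at_1w + 10 * math.log10(8.0 / impedance_ohms)

# Hypothetical two-way: 8-ohm woofer and 4-ohm tweeter, both measured at 1 W.
print(round(one_watt_spl_to_283v(90.0, 8), 1))  # 90.0 (unchanged)
print(round(one_watt_spl_to_283v(90.0, 4), 1))  # 93.0 (+3 dB at the same voltage)
```

Applying that offset to every point of a 1 W FR curve gives the 2.83v curve a passive-crossover design actually needs.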

Re: 1w/1m vs 2.83v
Originally Posted by billfitzmaurice
The advantage of the 2.83v rating is that you'll get an apples-to-apples comparison of how two speakers will compare driven by the same amp, as the output capability of amps is voltage limited. But if you have adequate headroom it's not all that important. The main reason why manufacturers rate a 4 ohm cab at 2.83v is that it will give a 3dB higher reading than 1 watt/2.0v, and that's purely a marketing tool.
Is it really apples to apples, though? 2.83 volts into an 8 ohm load equals 1 watt from an amp, and 2.83 volts into a 4 ohm load equals 2 watts from the same amp, so how does that make it apples to apples? Theoretically, shouldn't the 4 ohm driver in this scenario play/measure twice as loud? This is where I get confused, because it doesn't seem like an apples-to-apples comparison unless all the drivers one is trying to compare are the same impedance. I too think this is simply a marketing tool and find that the 1w/1m rating would be more of an apples-to-apples comparison, as it doesn't matter what impedance the driver is.
But aside from the reasoning why manufacturers are doing this, is there anything incorrect with my original post? Am I understanding it correctly?

Re: 1w/1m vs 2.83v
Originally Posted by billfitzmaurice
The advantage of the 2.83v rating is that you'll get an apples-to-apples comparison of how two speakers will compare driven by the same amp, as the output capability of amps is voltage limited. But if you have adequate headroom it's not all that important. The main reason why manufacturers rate a 4 ohm cab at 2.83v is that it will give a 3dB higher reading than 1 watt/2.0v, and that's purely a marketing tool.
Originally Posted by Chris Roemer
Also, you may wonder why you don't just see 1 rating method? Why not rate every driver at 2.83v, OR at 1w?
Well, when you're using drivers in a multiway system with a passive crossover, you MUST use a common voltage rating (because that's what the amp is delivering), typically 2.83v. If you've got an FR curve made at 1w, you'll have to modify it, unless ALL your drivers are the same impedance.
But, when using a plate amp on a sub, the 1w rating tells you more, quicker (unless you're comparing an 8 ohm vs a 4 ohm and the amp's output varies with impedance, as most do).
chris
I guess I didn't say this in the original post, but I am comparing individual drivers, not multiway speaker systems. Still not sure why that would matter, though.

Re: 1w/1m vs 2.83v
Originally Posted by philiparcario
Not important unless the amp is a very low-watt amp. There are many reasons why it is not important; here is one: many amps that claim 100 watts at 4 ohms and 50 watts at 8 ohms are not accurate in saying the power doubles.
Also, if you have an amp that can put out 100 watts at 4 ohms and a 93 db speaker that can handle the 100 watts, you can play it too loud in most cases.
Sorry, but I don't think that has anything to do with my question. The power of the amp does not matter; we are discussing the sensitivity ratings of speakers, and we are doing it in theory. I understand that not all amps double, or even increase, their power output when presented with lower-impedance loads, but again, I am just asking about the theory behind the 2 different sensitivity ratings that manufacturers are using. Thanks.

Re: 1w/1m vs 2.83v
Originally Posted by emilime75
Is it really apples to apples, though? 2.83 volts into an 8 ohm load equals 1 watt from an amp, and 2.83 volts into a 4 ohm load equals 2 watts from the same amp, so how does that make it apples to apples? Theoretically, shouldn't the 4 ohm driver in this scenario play/measure twice as loud?
Power doesn't govern loudness; voltage swing does. SS amps deliver the same voltage swing into any load. Using the same voltage to measure sensitivity gives one an accurate comparison of how loud the speakers/drivers will go driven by the same amp, assuming the amp can handle the current draw.
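That point (SPL tracks voltage swing, not power) can be put in numbers. A sketch under the idealized assumptions in this thread, with a hypothetical helper of my own:

```python
import math

def spl_at_voltage(sens_283v_db, volts):
    """SPL from a driver's 2.83 V/1 m sensitivity: level scales as 20*log10(V)."""
    return sens_283v_db + 20 * math.log10(volts / 2.83)

# Two drivers with the SAME 2.83 V sensitivity but different impedances,
# driven by the same voltage swing from a solid-state amp:
for z in (8, 4):
    v = 8.9                      # same voltage into either load
    watts = v**2 / z
    print(z, round(spl_at_voltage(93.0, v), 1), round(watts, 1))
# 8 ohms: 103.0 dB from  9.9 W
# 4 ohms: 103.0 dB from 19.8 W
```

Same voltage, same SPL; the 4 ohm load simply draws twice the current and hence twice the power to get there.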

Re: 1w/1m vs 2.83v
Originally Posted by emilime75
Is it really apples to apples, though? 2.83 volts into an 8 ohm load equals 1 watt from an amp, and 2.83 volts into a 4 ohm load equals 2 watts from the same amp, so how does that make it apples to apples? Theoretically, shouldn't the 4 ohm driver in this scenario play/measure twice as loud? This is where I get confused, because it doesn't seem like an apples-to-apples comparison unless all the drivers one is trying to compare are the same impedance. I too think this is simply a marketing tool and find that the 1w/1m rating would be more of an apples-to-apples comparison, as it doesn't matter what impedance the driver is.
But aside from the reasoning why manufacturers are doing this, is there anything incorrect with my original post? Am I understanding it correctly?
No. Amps are voltage sources. The only way to compare two drivers is with comparable voltage sensitivity. You cannot match drivers using the 1W/M rating unless you are converting based on a constant voltage and their impedance (which is not constant, anyway). 2.83V/M sensitivity is the method that normalizes everything so you are indeed comparing apples to apples.

Re: 1w/1m vs 2.83v
Originally Posted by emilime75
Sorry, but I don't think that has anything to do with my question. The power of the amp does not matter; we are discussing the sensitivity ratings of speakers, and we are doing it in theory. I understand that not all amps double, or even increase, their power output when presented with lower-impedance loads, but again, I am just asking about the theory behind the 2 different sensitivity ratings that manufacturers are using. Thanks.
OK. Let's say that we have an amp that is rated at 100w RMS into either a 4 ohm or 8 ohm load (skipping 2n, 6n, 12n, & 16n, or just saying that the amp is capable of 100w into any REASONABLE load). Also (even though this is NOT true), let's say that the driver's load does NOT change with frequency, but that it is a constant load across the spectrum, AND let's say that there is no crossover involved, like we're comparing two full-range speakers.
One at a time:
1. If speaker A is rated at 93db at 1w/1m(4 ohms), and speaker B is rated 96db at 2.83v/1m(4 ohms), does that mean that they are both actually 93db at 1w/1m?
A 4n 93dB/w = 93dB/2.0v
B 4n 96dB/2.83v = 96dB/2w
By Ohm's Law, if we apply a 2v signal:
volts / ohms = amps, and amps * volts = watts
2v / 4n = 0.5a, 0.5a * 2v = 1w
2.83v / 4n = 0.71a, 0.71a * 2.83v = 2.0w
Yes, both are actually 93dB/w.
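The Ohm's law steps above can be checked directly (a trivial sketch, just verifying the arithmetic for the two 4 ohm cases):

```python
# Ohm's law check: I = V / R, then P = I * V.
for v in (2.0, 2.83):
    i = v / 4          # current into a 4-ohm load
    p = i * v          # power dissipated
    print(v, round(i, 2), round(p, 1))
# 2.0  0.5  1.0   -> 2 V into 4 ohms is 1 W
# 2.83 0.71 2.0   -> 2.83 V into 4 ohms is 2 W
```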
2. If speaker A is rated 93db at 1w/1m(this time it's an 8 ohm) and speaker B is rated at 96db at 2.83v/1m(this one a 4 ohm) am I correct in thinking that these 2 drivers are both actually 93db at 1w/1m as well?
A 8n 93dB/w = 93dB/2.83v
B 4n 96dB/2.83v = 96dB/2w = 93dB/w
Yup. Both are 93dB/w.
3. Now, assuming the amplifier powering these speakers doubles its output power when its load goes from 8 ohms to 4 ohms, as many amps do, does that mean that speaker B from question 2 will actually play 3 db louder than speaker A in question 2?
Let's say that the amp can supply 10w @ 8n, and 20w @ 4n.
B 4n 93dB/w = 103dB/10w = 106dB/20w @ clipping.
A 8n 93dB/w = 103dB/10w @ clipping.
Yes. Driver B can play 3dB louder at clipping with this amp than driver A can.
The (obvious) answer is that with 2 drivers having the same sensitivity (both being 93dB/w), halving the load impedance doubles the power such an amp can deliver, which buys 3dB more output.
Furthermore:
By Ohm's Law: V = sqrt(W*n), so . . .
For speaker A: V = sqrt(10*8) = sqrt(80) = 8.9 volts @ clipping into 8n.
For speaker B: V = sqrt(20*4) = sqrt(80) = 8.9 volts @ clipping into 4n.
So, this amp can supply 8.9v into either 4 or 8 ohms @ clipping.
Into 8n: 8.9v / 8n = 1.11 amps, and 1.11a * 8.9v = 9.9 watts (rounding err)
Into 4n: 8.9v / 4n = 2.22 amps, and 2.22a * 8.9v = 19.8 watts (rounding err)
I'd want to have a bit heavier gauge wire with the larger current draw.
Chris
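The whole question-3 walkthrough above can be condensed into a few lines (a sketch under the same idealized assumptions; function names are my own):

```python
import math

def clip_voltage(watts, impedance_ohms):
    """Max output voltage implied by a power rating: V = sqrt(P * R)."""
    return math.sqrt(watts * impedance_ohms)

def spl(sens_1w_db, watts):
    """SPL at a given power for a 1 W/1 m sensitivity: +10*log10(P) dB."""
    return sens_1w_db + 10 * math.log10(watts)

v8 = clip_voltage(10, 8)   # driver A (8 ohms): amp clips at 10 W
v4 = clip_voltage(20, 4)   # driver B (4 ohms): same amp clips at 20 W
print(round(v8, 2), round(v4, 2))                    # 8.94 8.94 -> same swing
print(round(spl(93, 10), 1), round(spl(93, 20), 1))  # 103.0 106.0
```

Same clipping voltage into either load, but the 4 ohm driver turns it into twice the power, hence the 3 dB advantage at clipping.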