Kevin, I'm sure a lot of car audio guys on the internet "think" and "say" that 0.9 V is going to make a "big difference"... but has anyone actually done any scientific tests, or has there been any official statement from the guys who actually engineered the amplifier in the first place?
I just have a hard time believing that 0.9 V is really going to make a difference. If you do the math, that's barely 5 amps more current draw, and even then it's a wash, because there is no way your system is drawing (let's say, for argument's sake) 1,500+ watts for more than a split second at any given time. An acceptable voltage level for a running car is generally accepted to be anywhere between 13.2 V and 14.7 V, so anyone designing ANYTHING electronic for an automotive environment should treat that whole range as normal operating conditions, and the widget they are designing (whether it's an amplifier or a lockup controller) should run at 100% capacity anywhere within it, without any noticeable performance degradation or quirks.
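Just to show where the "barely 5 amps" figure comes from, here's a rough back-of-the-envelope sketch. It only compares supply current I = P / V at 13.5 V vs 14.4 V for a couple of example power draws; the wattage figures are assumptions for illustration, not measurements from any particular amp, and efficiency losses are ignored.

```python
# Back-of-the-envelope: supply current I = P / V for a fixed power draw,
# compared at 13.5 V vs 14.4 V. Example wattages only, not measurements.

def supply_current(power_w: float, volts: float) -> float:
    """Current (A) needed to deliver power_w watts at the given voltage."""
    return power_w / volts

for draw in (1000, 1500):
    i_low = supply_current(draw, 13.5)
    i_high = supply_current(draw, 14.4)
    print(f"{draw} W draw: {i_low:.1f} A @ 13.5 V vs "
          f"{i_high:.1f} A @ 14.4 V (delta {i_low - i_high:.1f} A)")

# 1000 W draw: 74.1 A @ 13.5 V vs 69.4 A @ 14.4 V (delta 4.6 A)
# 1500 W draw: 111.1 A @ 13.5 V vs 104.2 A @ 14.4 V (delta 6.9 A)
```

In other words, even at those (briefly sustained, at best) power levels, the lower voltage only costs you a handful of extra amps from the charging system.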
IF the amp in question REALLY is going to perform noticeably differently, or there is going to be an impact on performance or reliability between running on 13.5 V vs. 14.4 V, then quite frankly, that's a shoddily designed amplifier.
All I'm saying is that I would like to see some scientific data from an actual amplifier engineer showing that 0.9 V makes a real difference.
Of anything in a vehicle, the most voltage-sensitive (and electrically sensitive) components are the computers. If all of the electronics in our trucks can cope with a crappy power source and work properly anywhere from roughly 8 V to 16 V+, then I'm sure an amplifier is going to be fine.
As long as your electrical system isn't dropping below 13 V continuously, I really wouldn't worry about it.