I'm not versed in the arcane arts of EE. Could anyone give me a basic definition of voltage sag? And would there be any reason to build it into a charger as a feature?
Generally, if the voltage drops under load like this, it's a sign of a poorly designed charger that doesn't regulate well. But since Apple's charger appears otherwise well-designed, it's a bit of a puzzle. One possibility is that the exact voltage input to the iPhone doesn't matter, and the designers, knowing this, didn't care about the voltage sag. Another possibility is that for some unknown reason they deliberately designed the charger this way.
The interesting points to remember here are:
- USB specifies 5V, not the iPhone
- Voltage drop across the iPhone's internal regulators and charging circuits is probably less than 0.5V
- Most cell phone LiIon batteries are 3.7V
So, presumably they made a charger that allows droop to 4.6V because that voltage is still capable of charging the battery, while also allowing a less expensive charger design.
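The headroom argument above can be sketched in a few lines. Note the 0.5V regulator drop and 4.6V sag figures are the commenter's estimates, not measured values:

```python
# Sketch of the charging-headroom argument: does a charger sagging to
# 4.6 V still leave room above the cell voltage plus the (assumed)
# 0.5 V drop in the phone's charging circuit?
CHARGER_MIN_V = 4.6      # charger output under full load (sagged)
REGULATOR_DROP_V = 0.5   # assumed drop across the phone's charging circuit

for cell_v in (3.0, 3.7, 4.2):  # empty, nominal, fully charged Li-ion
    headroom = CHARGER_MIN_V - REGULATOR_DROP_V - cell_v
    print(f"cell at {cell_v} V -> headroom {headroom:+.1f} V")
```

On these assumptions the headroom is comfortable over most of the charge cycle and only gets marginal near a fully charged 4.2V cell, which is consistent with the "good enough and cheaper" theory.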
AFAIK the USB 2.0 spec is 4.75V-5.25V at the port (5V ±5%), with as little as 4.40V allowed at the device end after cable and connector losses. So the iPhone charger is within spec, and all conforming devices should be able to handle that.
While it's an interesting measurement, I'm not sure how relevant the voltage sag is to overall quality. Devices really shouldn't be using the charger in the constant-current range. I guess it does reflect the overall workmanship and thought put into the design.
Also, nice job doing those ripple measurements - that was interesting stuff! It's not super surprising that the cheap knockoffs fail miserably there. Probably cheap capacitors that are way too small.
I'm not sure why any end user would care about this, unless they are using a device with no battery.
The chargers look like they'll work interchangeably, but they don't. For example, the Kindle charger will make my pocket camera light up its charging light, but after a minute or so it goes out, as if the charge has completed. In fact, though, the battery is still dead. The other combinations of cell phone / camera / Kindle chargers behave similarly.
In real life charging is also limited by the battery temperature for safety reasons. If the battery becomes too hot during charging (because of the ambient temperature, poor ventilation, or just because the device draws a lot of power while charging), charging will stop and will start again when the battery cools down. This is why we do not see chargers with bigger amp ratings: the batteries are limited to a current that doesn't get them too hot.
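The stop/resume behavior described above is typically implemented with hysteresis so the charger doesn't flap on and off right at the threshold. A toy sketch, with made-up thresholds (real devices use values from the battery vendor):

```python
# Toy sketch of temperature-gated charging with hysteresis.
# The 45/40 C thresholds are illustrative, not from any real device.
def next_charge_state(charging, battery_temp_c,
                      stop_above_c=45.0, resume_below_c=40.0):
    """Return whether charging should be enabled after this temperature reading."""
    if charging and battery_temp_c > stop_above_c:
        return False            # too hot: pause charging
    if not charging and battery_temp_c < resume_below_c:
        return True             # cooled down enough: resume
    return charging             # otherwise keep the current state

# A warming phone pauses, then resumes once it cools back down:
state = True
for temp in (30, 44, 46, 43, 41, 39):
    state = next_charge_state(state, temp)
    print(temp, "->", "charging" if state else "paused")
```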
You can still find some on Amazon, it's just hard to tell whether the seller will give you the correct, genuine ones. http://www.amazon.com/Palm-Wall-Adapter-Charger-Treo/dp/B002...
After fiddling with those awful 30-pin Apple connectors, these feel like pure sci-fi.
Of course I'm not sure if this is applicable to USB chargers which all are relatively low power.
I did a quick test to see what really happens, plugging a Samsung phone into an iPhone charger and an iPad charger. In both cases, the charger drew 3.0 watts of wall power. (The phone was turned off while charging, since if it is turned on the load fluctuates a lot as the phone does random things.) So my conclusion is that the size of the charger doesn't affect efficiency.
An iPad charger is not a 'large' charger; it's a fairly small step up from an iPhone charger. And since you are reporting 3.0 watts of 'wall power' but your multiplication of scope-measured values does not correct for power factor, you are likely off by quite a bit on both measurements.
GP mentioned using an HP TouchPad charger to charge a phone. I don't have an HP TouchPad charger here, but the specs are quite terrible; you'd have to measure with that specific charger to answer the specific question, or you'd have to do a comparison of a large range of chargers with accurate measurement methodology in order to really answer the general question.
As it is, your conclusion contradicts practical engineering, and I'm afraid it will not hold up in a better test, which would be to try a number of switched-mode supplies of various sizes and designs with various loads. Plugging in one device and doing a hasty (wrong, ignoring phase shift) measurement does not warrant your conclusion.
To measure efficiency you're going to have to take the power factor into account, which can be quite hard to do; theoretical efficiency doesn't matter for a practical test (you're measuring, not theorizing).
The waveforms that switched-mode chargers output, and consequently the kind of load they present to the grid, are so irregular that most measurements that are neither calorimetric nor power-factor-corrected will give inaccurate values. The noise present on the output wires will be visible, to some extent, on the input side.
A normal watt meter works best with transformer-based supplies or resistive loads; accuracy for small switched-mode loads will be anywhere from 'so-so' to 'terrible' depending on the make and model of power meter. Good brands (for instance Fluke) do most calculations right and can deal with CFLs and other phase-shifted loads; bad brands (I won't name them, but they're killing it in the domestic watt meter department) will give wildly inaccurate results.
But even a quality meter like a Fluke will still have trouble with this kind of spiky load, especially if it is small.
It would probably be a good idea to (properly) describe your test rig along with the results. The article says:
"I measured the AC input voltage and current with an oscilloscope. The oscilloscope's math functions multiplied the voltage and current at each instant to compute the instantaneous power, and then computed the average power over time. For safety and to avoid vaporizing the oscilloscope I used an isolation transformer. My measurements are fairly close to Apple's, which is reassuring. "
But you can't really do it that way and get accurate results: the instantaneous power draw of a switching supply changes several hundred thousand times per second and is likely phase-shifted, so a simple multiplication is not going to work.
Accurately measuring (low) power draw from switched-mode loads is a really tricky problem. It's easy enough to read some numbers from a display, but I can assure you that this is not a simple problem to work on if you want meaningful results.
The main sources of error in my measurements are the cheap isolation transformer (which causes a bit of line voltage distortion under load), the current sense resistor, the tolerances of the voltage divider resistors, and noise in the measurements. So I wouldn't claim these measurements to be better than 10%.
You can take a look at one of the oscilloscope power graphs at https://picasaweb.google.com/lh/photo/pbrO8BQz38kDo9xU5ejffd...
Yellow is the input voltage, and turquoise is the input current. The non-sinusoidal current shows the non-unity power factor. Note that there's no phase shift, but instead the current flow happens only at the voltage peaks (which is a consequence of the input diode bridge, not of the switching power supply per se.) At the bottom of the image is the instantaneous power, computed from the instantaneous voltage and current.
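The distinction being drawn here, that the power factor can be well below 1 even with zero phase shift, just because the current is peaky, is easy to demonstrate numerically. A toy sketch (the waveform shapes and numbers are made up for illustration, not taken from the scope trace):

```python
# Real power is the average of instantaneous v(t)*i(t); apparent power is
# Vrms * Irms. A diode-bridge input that conducts only near the voltage
# peaks gives PF < 1 with no phase shift at all.
import math

N = 10_000
v, i = [], []
for k in range(N):
    t = k / N                                     # one mains cycle
    vt = 325 * math.sin(2 * math.pi * t)          # ~230 Vrms mains
    # Current flows only while the bridge conducts near the peaks
    # (illustrative 50 mA square pulse, same sign as the voltage):
    it = 0.05 * math.copysign(1, vt) if abs(vt) > 300 else 0.0
    v.append(vt)
    i.append(it)

real_power = sum(a * b for a, b in zip(v, i)) / N          # watts
vrms = math.sqrt(sum(a * a for a in v) / N)
irms = math.sqrt(sum(b * b for b in i) / N)
apparent_power = vrms * irms                               # volt-amps
pf = real_power / apparent_power
print(f"real {real_power:.2f} W, apparent {apparent_power:.2f} VA, PF {pf:.2f}")
```

Even though voltage and current here are perfectly in phase, the narrow conduction window alone pushes the power factor well under 1, which is why a simple Vrms x Irms reading overstates the wall power.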
For the iPad vs. iPhone measurement above, I didn't have the oscilloscope handy, so I used a Kill-A-Watt, which does in fact take the power factor into account.
Going back to your statement that "bigger charger -> larger dead load". By "dead load", do you mean the power consumption under no load, which I call "vampire power" in the article? This varies widely between chargers, having more to do with the design than the size of the charger. But in any case, this wasted power is pretty much irrelevant under load. For instance, 100 mW is a typical vampire power usage. So if a hypothetical larger charger has twice that wasted power, at a 3 watt load, this is only a 3% difference.
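The vampire-power arithmetic above can be checked back-of-the-envelope (the 100 mW and 3 W figures are the ones quoted in the comment):

```python
# How much does doubling a charger's no-load "vampire" draw matter
# once the charger is actually delivering 3 W?
vampire_w = 0.100         # typical no-load draw
bigger_vampire_w = 0.200  # hypothetical larger charger, twice the waste
load_w = 3.0              # active charging load

extra_fraction = (bigger_vampire_w - vampire_w) / load_w
print(f"extra waste under load: {extra_fraction:.1%}")  # -> 3.3%
```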
If you look a little more carefully at your scope trace, you'll see the coil's reactance at work in the lower trace: the peak is where the FET in the supply is closed and drawing real power; the purple trace past the peak and beyond the zero crossing is inverted and drops slowly back to 0 before the next peak hits. If you use the controls on your scope to zoom in on the bottom trace by increasing the vertical sensitivity, you'll get a much better idea of what I'm getting at here: you'll see '0' voltage and yet current still flowing.
You can't correct for power factor by simply increasing the resolution and averaging. The base frequency of your oscilloscope does not enter into the discussion here, it could be 500 Hz for all I care and that would be enough.
Furthermore, the power factor of a switched mode supply changes as a function of the load applied and gets (much) worse if that load is also reactive or capacitive. Under some circumstances it is possible to draw negative power from the wall socket if you do a naive measurement, or you'll see wall socket power decrease as output current increases.
All this is possible because voltage and current are more or less out of phase with each other.
The kill-a-watt will work well with some reactive loads (such as CFLs) as long as they're of the ballast type.
A switched-mode supply presents measurement challenges that can't be met within the cost constraints of a consumer device like that.
Vampire power is a new term to me; I'm not familiar with it. Dead load (or simply the losses) is anything that does not end up in your consumer (the live load); I'm not sure if that is an accurate translation of the terms. It normally goes up as a function of the amount of power consumed; the baseline (consumption without any load at all) is probably your 'vampire power'.
Total efficiency is 100 * ((output power) / (input power)) and will in practice be anywhere from 60% to 98%, depending on how well load and supply are matched; it can vary wildly from one power supply to another due to component variations.
Finally, classical power factor correction applies to sinusoidal waveforms. As you've already discovered, switched-mode supplies' waveforms on both the input and output side are anything but sinusoidal, further complicating an already hairy problem.
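The efficiency formula above, as a one-liner plus an illustrative measurement (the 5 W out / 6 W in numbers are made up for the example):

```python
# Efficiency = 100 * output power / input power.
def efficiency_pct(output_w, input_w):
    return 100.0 * output_w / input_w

# e.g. a charger delivering 5 V at 1 A while drawing 6 W from the wall:
print(f"{efficiency_pct(5.0 * 1.0, 6.0):.1f}%")  # -> 83.3%
```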
The only chargers I have that won't charge all of my modern iDevices are old ones meant for iPods (500mA, no signal on the data pins), and my old Belkin iPhone car charger doesn't work with my iPad (probably only 500mA with the signal for iPhones; I'd have to check).
Another interesting tidbit: It appears that my car's Pioneer DEH 9400 head unit (aftermarket of course) seems to inform my devices that it can supply 1A of power through its two USB ports. This is the first time I've seen that outside of dedicated chargers and Macs (I assume of course that some PCs do this as well but I haven't seen one yet).
My Asus motherboard can do that with the right software/driver installed, and it seems that all products made by Asus these days support this. It's an optional install you can find on the Asus support website for your device.
I had to uninstall it, when I upgraded from Windows 7 to 8, though.
Of course, the real question is, who promoted it? The Apple sycophants, or the folks who wanted to rub it in their faces?
Unbranded, common on ebay.
They were both awful. I would be interested to know about a known good one.
I tried out half a dozen 5V/1A adapters, and this one was the winner: best performance and a very good price for UL-listed. The most important points: it has the 'iDevice' resistors and has very good voltage regulation (extremely clean, no big voltage spikes). When loaded down to 1A it stays around 5.2V; I actually pulled 1.5A and it was still above 5V. I opened a few up to look at the construction and found them well designed, with good soldering, strain relief, etc. In the product description I explain why it's a 5.25V, not 5V, adapter: it's perfect for high-current devices like the Raspberry Pi, and we have thousands of customers who have happily used it for the Pi.
We have a photo of our label here
I did make the mistake, though, so sorry about that.
I believe that she is meticulous about what she sells in her store and tests items before making them available to her customers. She is not rebranding the products as hers; she is just assuring her customers that she has done due diligence prior to selling them on her site.
I like Fry's just like the next guy, and they have great deals, but I don't expect to get high-tech or necessarily high-quality stuff there. They are just another big-box store, albeit one also catering to the average hobbyist. I don't know if their purchasing department puts as much effort into checking out the products they sell.
You got what you paid for at Fry's.
Does anyone know what's the deal?
In some cases the chargers and devices have gone a third way. This is very typical for tablets, as they need a lot more current in order to charge within a reasonable time frame. At that point many manufacturers have made up their own way of signalling the capability.
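For Apple's variant of this, public teardowns (e.g. Adafruit's research) report that the charger advertises its current capability with resistor dividers on the D+ and D- lines. A rough sketch, with divider values taken from those teardowns; the level-to-current mapping shown is approximate and may vary by model:

```python
# Reported Apple charger signalling: fixed voltages on D+/D- set by
# resistor dividers from the 5 V rail. Values below are from public
# teardowns and should be treated as approximate.
def divider_v(r_top_k, r_bottom_k, vcc=5.0):
    """Voltage at the midpoint of a resistor divider from VCC to ground."""
    return vcc * r_bottom_k / (r_top_k + r_bottom_k)

v_low = divider_v(75.0, 49.9)   # ~2.0 V
v_high = divider_v(43.0, 49.9)  # ~2.7 V

# Commonly reported (approximate) D+/D- levels and what they advertise:
profiles = {
    (2.0, 2.0): "500 mA",
    (2.0, 2.7): "1 A (iPhone-class charger)",
    (2.7, 2.0): "2 A (iPad-class charger)",
}
print(f"divider outputs: {v_low:.2f} V and {v_high:.2f} V")
```

A device that checks for these levels will refuse to draw full current from a charger that leaves the data pins floating, which is consistent with the old "500mA, no signal on the data pins" iPod chargers mentioned elsewhere in the thread.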
It worked with my brother's iPod classic (maybe 5th gen?), but not my nano; later I saw something on Adafruit about how the iPhone needed specific voltages on the data pins to charge, and I figured that was what was going on.
But recently, I got a battery pack which can output to micro-USB, iPod, or some other things. It works with my iPod, the same one I had years ago. But the cables it came with have swappable ends, I think 2.something mm jacks. At any rate, they only have two points of connection. Here's what it looks like: http://i.imgur.com/8TliS.jpeg
So, how does this work? As far as I can tell, this should be exactly the same situation as a standard USB-iPod charger with its data cables cut, but it's not behaving the same.
It's possible I'm getting some details wrong, but can anyone shed some light on this?
You might have been caught by this same problem.
Also, the USB cable may have too high a resistance, causing a voltage drop.
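The cable-drop effect is plain Ohm's law; the resistance figures below are illustrative guesses for thin/cheap cables, not measurements:

```python
# V = I * R for the round-trip (both conductors) resistance of the cable.
def drop_v(current_a, round_trip_resistance_ohm):
    return current_a * round_trip_resistance_ohm

for r in (0.2, 0.5, 1.0):  # plausible round-trip resistances of cheap cables
    print(f"{r} ohm cable at 1 A drops {drop_v(1.0, r):.1f} V")
```

At 1A, even half an ohm eats 0.5V, a tenth of the 5V supply, which can be enough to trip a device's undervoltage detection.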
It'd be nice to see an even broader test
On a non-technical note, Apple chargers should lose simply because of how goddamned large they are. They basically eat two spots on a power strip and are easily knocked out of wall sockets because of their weight.
And god forbid if there are two Macs plugged into the same strip...