Glyph said:
I was wondering something about the AC that comes out of my outlet: I know that most circuits that convert AC to DC use a bridge rectifier that converts the full wave into DC. But let's say that one of those diodes in the bridge fails and you get essentially a half-wave rectifier. Is this bad for the rest of the AC line? And if it is, how would I detect it (other than hooking up an oscilloscope)?
At low currents this is probably inconsequential, but I'm talking about a system that would normally draw, say, 40 amps or more from the outlet.
You haven't filled in your location on your profile, so we don't have any clue where in the world you might be - but I'm not aware of outlets which provide 40 amps. You 'may' be in the USA; 'outlet' is probably more an American term than one used in other countries.
The usual failure mode of bridge rectifiers is one of the diodes going short circuit; in a correctly designed circuit this should blow a fuse.
If one of the diodes goes open circuit, you can usually tell by the increased mains hum on the circuit.
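That extra hum has a simple spectral explanation. A minimal sketch (my own illustration, not from the thread; it assumes ideal diodes, 50 Hz mains, a resistive load and no filter capacitor, and uses NumPy): with a healthy bridge the ripple fundamental sits at twice the mains frequency, but with one diode open circuit the output only follows the positive half cycles and the dominant ripple drops to the mains frequency itself.

```python
# Sketch: compare the ripple spectrum of full-wave vs half-wave rectification.
# Assumptions (mine, for illustration): ideal diodes, 50 Hz mains (60 Hz in
# North America), resistive load, no smoothing capacitor.
import numpy as np

f_mains = 50.0                 # mains frequency in Hz
fs = 10000.0                   # sample rate in Hz
t = np.arange(0, 1.0, 1 / fs)  # one second of signal
v = np.sin(2 * np.pi * f_mains * t)

full_wave = np.abs(v)            # healthy bridge: both half cycles rectified
half_wave = np.clip(v, 0, None)  # one diode open: positive half cycles only

def dominant_ripple_hz(x):
    """Frequency of the largest non-DC component in the spectrum."""
    spectrum = np.abs(np.fft.rfft(x - x.mean()))   # remove DC, then FFT
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return freqs[np.argmax(spectrum)]

print(dominant_ripple_hz(full_wave))  # 100.0 -> ripple at twice mains
print(dominant_ripple_hz(half_wave))  # 50.0  -> hum at mains frequency
```

So in practice, ripple or hum that has shifted from 100 Hz down to 50 Hz (or 120 Hz down to 60 Hz) is a classic sign of a lost diode.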
It was common practice many years ago in old UK television sets to use only a single-diode half-wave rectifier. Back when two-pin plugs were commonplace it was a random choice which way round a set was plugged in, so the loads effectively balanced out. Many of these sets actually used 'metal rectifiers', back before silicon rectifiers were available (at least at sensible prices).
The introduction of the 'modern' UK three-pin plug stopped all this: almost all sets 'should' now be connected the same way round, so you got an entire street (or block of flats) taking only the positive half cycles from the mains.
The result of this was that the mains tended to have a negative DC drift, which wasn't desirable. Because of this, new legislation was introduced requiring all new TVs to have bridge rectifiers on their mains inputs.