
Resistive divider speed

Status
Not open for further replies.

dark

Member
Hi, could someone help me with the maximum speed attainable (baud rate) with the following scheme?
cc90.JPG
 
Huh??

Maximum speed of the slowest UART might be a possibility...

Unless, of course, R101 was 0Ω, in which case I can safely state that the maximum speed would be zero baud...
 
Huh??

Maximum speed of the slowest UART might be a possibility...

It is obvious that a resistive divider (within limits... I don't mean megohms) is used to bring 5V down to 3.3V.
 
I apologize for being glib.

The voltage level(s) will have no effect on baud rate, which, as I said, is limited by the slowest UART, even if the receiving UART is a 3.3V unit.

Now, if what you're asking is "will a reduced TXD voltage level (from 5 to 3.3V) affect a receiving UART expecting a 5V signal?", then yes, the speed would have to be reduced as well.

This would be because the received signal level may very well be too close to the "is it a '1' or a '0'?" threshold, thus requiring massive CRC checking. Or, at worst, the receiver may not be able to decode the signal correctly at all.

Is this what you were asking?
 
The max baud rate will depend on the capacitive loading on the voltage divider tap. If, for example, you were to put the divider at the TX, and the RX were to be a hundred feet away on a twisted pair that has a capacitance of 50pF per foot, then you would have to consider the low-pass filtering effect of shunting the voltage divider tap with 5000pF. A sim of this case at 115 kbaud shows the voltage at the RX as a function of stepping the resistance values in the voltage divider as R = 500, 1K, 2K, 4K and 8K. In the example, I would say R = 500 and 1K would work, but R >= 2K is getting marginal. You will have to recalculate based on a realistic estimate of your line capacitance.

PS: it occurs to me that the RX has a line-terminating resistance which appears in parallel with R2 in my schematic, so you will have to consider that as well. Also, if you put the divider near the RX, then the line termination could be combined with the divider.
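The RC low-pass effect described above can be estimated with a quick back-of-envelope script. This is not the actual sim; the `tap_settling` helper, the 2:1 divider ratio (5V down to ~3.3V), and the "several time constants per bit" rule of thumb are my assumptions:

```python
# Back-of-envelope check of the divider-plus-cable low-pass described above.
# Assumptions (mine, not from the sim): R2 = 2*R1 to scale 5V to ~3.3V,
# and a bit is "clean" if it spans several RC time constants.

def tap_settling(r1, r2, c_load, baud):
    """Return (tau, bit_time, bit_time/tau) for a loaded divider tap."""
    r_thev = r1 * r2 / (r1 + r2)   # Thevenin resistance seen by the load cap
    tau = r_thev * c_load          # RC time constant at the tap
    bit_time = 1.0 / baud
    return tau, bit_time, bit_time / tau

c_cable = 100 * 50e-12             # 100 ft of twisted pair at 50 pF/ft = 5000 pF
for r1 in (500, 1e3, 2e3, 4e3, 8e3):
    tau, bit, ratio = tap_settling(r1, 2 * r1, c_cable, 115200)
    print(f"R1 = {r1:>6.0f} ohm: tau = {tau*1e6:5.2f} us, bit/tau = {ratio:4.1f}")
```

With R1 = 500, a 115 kbaud bit spans about five time constants; by R1 = 2K it is down to roughly 1.3, which matches the "getting marginal" verdict above.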
 
Sounds like you are not doing true RS232, but TTL levels?

Here are the specs for real RS232: From **broken link removed**.

RS232 Electrical Specs

The requirements for a driver (output) and receiver (input) are simple and are essentially DC requirements:

1. A receiver may present a load ranging from 3000 ohm to 7000 ohm and
expects signals to swing from below -3 volt to above +3 volt.

2. A driver must have an output resistance of at least 300 ohm and present a swing
of below -5 volt to above +5 volt to a receiver. Open circuit driver voltage
range is as high as -25 volt to +25 volt, but must drop to + and -15 volt
with receiver load.


Most receiver chips are designed to switch in the range of 0.5 volt to 2 volt. (This is within the RS232 spec.) It would be great if all of them were, since then we could cut corners and drive them from circuits that swing only down to ground. Unfortunately, you may find a receiver that must be driven below ground. In our work, the 4th serial port board we tested used a receiver design based on the 75174. It needed a voltage swing down to -1 volt. Later, we'll show you how we did it.
What about Noise Margin? If RS232 says to drive receivers to + and -3 volt, then don't we have to do this? Assuming that the device being powered has little or no cable between it and the RS232 port, it isn't likely to pick up much noise. That lets us get by with driving the RS232 receivers just a little past their switching thresholds.

The receiver resistance is typically 5 kohm, which is the center of the specification. However, semiconductor manufacturers specify that current can be as high as 1 mA with 3 volt applied (or as low as -1 mA with -3 volt applied). In other words, they're saying that the resistance could be as low as 3000 ohm. To be safe, we should assume that it can be this low, even though there will probably always be a guardband that keeps it somewhat higher.

The open circuit driver voltage of 25 volt looks scary. But we have never found any RS232 implementations that used supply rails of greater than 12 volt and are comfortable using circuitry that is rated for 12 volt maximum. The open-circuit voltage is usually only a problem in the sense that, if suddenly applied to your circuit, it could cause something to break down and cease to function. Usually, once your circuit is operating normally and drawing current from an RS232 line, the problem becomes one of not enough voltage. The requirement that driver output be no more than + and -15 volt with 7000 ohm load probably means that there are no RS232 implementations with supply rails outside of + and -15 volt. If you absolutely must have a 15 volt capability, you could either choose components that are rated for 15 volt or else put back-to-back zener diodes on the 3 driver lines (in your device). The 300 ohm (or more) of output resistance will have to absorb the power due to the voltage difference. So don't clamp at too low a voltage. Seven volt is probably OK.

If a driver with output resistance of at least 300 ohm must apply at least 5 volt to a receiver with resistance as low as 3000 ohm, the open circuit driver voltage must be at least 5.5 volt. This makes it impossible, without an on-board DC-DC converter, to create a true RS232 implementation with only a 5 volt supply rail. We have come across pseudo-RS232 driver designs using + and -5 volt rails (probably to get by with fewer power supplies). This seems to be used primarily in programmable logic controllers (PLCs) that are used in industrial control. No doubt they do the job they were intended to, because RS232 is so conservative. But they can cause problems at the RS232 limits. If your device must work with one of these, you should get one and test it to see what kind of power you will be able to draw from it.
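The arithmetic behind the "5.5 volt open-circuit" figure is worth making explicit. A sketch using only the worst-case numbers quoted above (300 ohm driver resistance, 3000 ohm receiver, 5 volt minimum at the receiver):

```python
# Minimum open-circuit driver voltage for true RS232, from the worst-case
# figures above: 300 ohm driver output resistance into a 3000 ohm receiver,
# with at least 5 V required across the receiver.

R_DRV = 300.0    # minimum driver output resistance (ohms)
R_RX = 3000.0    # worst-case (lowest) receiver input resistance (ohms)
V_RX_MIN = 5.0   # minimum swing required at the receiver (volts)

# Voltage-divider relation: V_rx = V_oc * R_RX / (R_DRV + R_RX)
v_oc_min = V_RX_MIN * (R_DRV + R_RX) / R_RX
print(f"Minimum open-circuit driver voltage: {v_oc_min:.2f} V")  # 5.50 V
```

Hence a single 5V rail cannot meet the spec without a DC-DC converter, as stated above.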
 
Mike, in your simulation is the digital source using push/pull drivers? Or is it just sourcing 5V in the ON period and then open circuit (and not sinking current) in the OFF period?
 
I apologize for being glib.
The voltage level(s) will have no effect on baud rate, which, as I said, is limited by the slowest UART...

It does have some effect, which I was not sure about. I think MikeMl is pointing it out...


The max baud rate will depend on the capacitive loading on the voltage divider tap...

Please note it's just a UART connection between an RF module (5V UART TX pin) and my MCU ATxmega128's (3.3V USART RX pin), not RS232. I think the capacitive loading should be on the order of a few picofarads. Rgds
 
Mike, in your simulation is the digital source using push/pull drivers? Or is it just sourcing 5V in the ON period and then open circuit (and not sinking current) in the OFF period?

The signal source is a Voltage Source with a zero output impedance, meaning that when it is at 0V, it sinks current out of the capacitor. A zero output-impedance voltage source is not a very good sim of a real TTL or CMOS output, but is close enough when the current being sunk/sourced is less than 10mA = (5V/500Ω). I did model a finite rise/fall time at the voltage source (see the two 100n parameters).

Note that a real TTL output would have asymmetric source and sink capability. CMOS has almost symmetric source/sink. Both of these can easily be modeled, but for a quick sim to make another point, the way I did it is sufficient. The trick is to know when you can get by with a simplistic model, and when it makes a difference so that you have to use a more complicated one...
 
Please note its just a UART connection between an RF module(5V UART- TXpin) and my MCU Atxmega128's(3.3V USART-RXpin) and not RS232. I think capacitive loading should be in order of few pico Farads. Rgds

Likely a 5V CMOS output to a 3.3V CMOS input. The total capacitive loading would be the sum of the output pin capacitance, the PC traces, and the input pin, likely about 20pF on each side of R1. I simmed again at 115 kbaud, stepping R2 from 10K to 100K. If this were a battery-powered CMOS circuit, you would likely be interested in making R1, R2 as high as possible. Even 100K would work...
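The on-board case can be checked the same way as the long-cable case, with ~20pF at the tap instead of 5000pF. A sketch only; the exact R1 values (sized here for a 5V to 3.3V ratio) are my assumption, not read from the sim:

```python
# Settling estimate for the on-board divider: ~20 pF of pin and trace
# capacitance at the tap, R2 stepped as in the sim above, R1 sized so the
# divider scales 5 V down to 3.3 V (R1 = R2 * (5/3.3 - 1), about 0.52 * R2).

C_TAP = 20e-12                     # pin + trace capacitance at the tap
BAUD = 115200

for r2 in (10e3, 47e3, 100e3):
    r1 = r2 * (5.0 / 3.3 - 1.0)    # ratio for a 5 V -> 3.3 V divider
    r_thev = r1 * r2 / (r1 + r2)   # Thevenin resistance at the tap
    tau = r_thev * C_TAP
    bits_per_tau = (1.0 / BAUD) / tau
    print(f"R2 = {r2/1e3:>4.0f}k: tau = {tau*1e9:6.1f} ns, bit/tau = {bits_per_tau:6.1f}")
```

Even at R2 = 100K a 115 kbaud bit spans more than ten time constants, consistent with "even 100K would work".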
 

Attachments

  • D203a.jpg
If, for example, you were to put the divider at the TX, and the RX were to be hundred feet away on a twisted pair that has a capacitance of 50pF per ft, then you would have to consider the low-pass filtering effect of shunting the voltage divider tap with 5000pF...
If you put a capacitor in parallel with R1, you can compensate for the load capacitance and turn the signal back into a square wave, just as the compensation on oscilloscope probes works.
 
Likely 5V CMOS output to 3.3V CMOS input... Even 100K would work...

Thanks, this gives me the confidence to go ahead.


If you put a capacitor in parallel with R1 you can compensate for the load capacitance and turn the signal back into a square wave, just like the compensation on oscilloscope probes is done.

Informative...

Rgds
 
If you put a capacitor in parallel with R1 you can compensate for the load capacitance and turn the signal back into a square wave, just like the compensation on oscilloscope probes is done.

You would have to "tune" it to match the parasitic capacitances. Here is the sim again, this time stepping the compensation capacitor C3 through 10pF (undercompensated), 40pF (just right for these parasitics), and 100pF (overcompensated, peaking).
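The "just right" value follows the same rule as scope-probe compensation: the divider is frequency-flat when R1·C1 = R2·C2, where C1 is the added cap across R1 and C2 is the parasitic capacitance from the tap to ground. The component values below are illustrative, not read from the schematic:

```python
# Scope-probe compensation rule applied to the divider: the response is
# flat at all frequencies when R1*C1 == R2*C2 (C1 across R1, C2 = parasitic
# capacitance from tap to ground). Example values are mine, not from the sim.

def comp_cap(r1, r2, c_parasitic):
    """Compensation cap across R1 that flattens the divider response."""
    return r2 * c_parasitic / r1

r1, r2 = 1e3, 2e3          # example 5 V -> 3.3 V divider
c_parasitic = 20e-12       # assumed stray capacitance at the tap
c1 = comp_cap(r1, r2, c_parasitic)
print(f"C1 = {c1*1e12:.0f} pF")  # smaller -> undercompensated, larger -> peaking
```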
 

Attachments

  • D203b.jpg
You would have to "tune" it to match the parasitic capacitances...
Unfortunately, you can't tune it directly with a scope, because the probe capacitance is of the same order of magnitude as the stray capacitance. I have done this by connecting a 1pF cap between the node in question and the probe. This reduces the probe loading to <1pF, obviously. The amplitude will be wrong, due to the capacitive divider, but the wave shape will be correct.
 
The signal source is a Voltage Source with a zero output impedance, meaning that when it is at 0V, it sinks current out of the capacitor.
...

Thanks for clearing that up; I was initially surprised the charge/discharge times were so well matched.

Re the real-world speeds possible: I recently did one from a 5V PIC output into a 3.3V CMOS input using a 1k and 1k5 resistor divider, and SPI comms were perfect at 10 MHz even with some messy cabling of about 12". However, the resistor divider was at the end of the cabling.
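Those reported values sanity-check nicely with the same RC estimate; the ~30pF figure for 12" of messy cabling is my guess, not a measurement:

```python
# Sanity check of the 1k/1k5 divider from a 5 V PIC into a 3.3 V input,
# with the 12" of cabling guessed (my assumption) at ~30 pF.

R1, R2 = 1e3, 1.5e3
V_HIGH = 5.0
C_CABLE = 30e-12                 # guessed cable + pin capacitance
F_SPI = 10e6                     # 10 MHz SPI clock

v_tap = V_HIGH * R2 / (R1 + R2)  # logic-high level at the 3.3 V input
r_thev = R1 * R2 / (R1 + R2)     # Thevenin resistance at the tap
tau = r_thev * C_CABLE
print(f"V_tap = {v_tap:.2f} V, tau = {tau*1e9:.0f} ns, bit/tau = {(1/F_SPI)/tau:.1f}")
```

About 3.0V high and roughly five time constants per bit at 10 MHz: marginal on paper, but workable, especially with the divider at the far end of the cable where it also acts as a partial termination.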
 