From what I gathered, capacitors draw more current at the start of charging, and the conductivity slowly falls as the capacitor charges.
Maybe. It depends. Sometimes.
"Conductivity slowly falls" is not a good way to put it; the capacitor itself isn't changing. The charge on a capacitor increases or decreases depending on the circuit it is in. I think you are asking about basic charging behavior with a voltage source, a capacitor, and a resistor.

If a capacitor is charged by a constant voltage source, such as a regulated power supply or battery, through a resistor, the voltage across the capacitor rises rapidly at the start (assuming zero initial charge) and more slowly as it approaches the source voltage. This is because as the voltage across the cap increases, the voltage across the resistor - and hence the current through it (Ohm's law) - *decreases*. And as that current decreases, the rate of change of the capacitor voltage decreases. This highlights an important fact: it is the current that charges or discharges a capacitor.
If you plot the capacitor voltage as a function of time, you get an exponential curve. Theoretically, the voltage on the cap never quite reaches the power supply voltage. IRL it effectively does - after about five time constants (5*RC) it is within 1% of the source voltage.
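To make that curve concrete, here is a minimal sketch in Python. The component values (5 V source, 1 kΩ resistor, 100 µF cap) are arbitrary assumptions chosen for illustration, not anything from the question:

```python
import math

# Assumed example values - not from the original question.
V_S = 5.0     # source voltage, volts
R = 1_000.0   # series resistance, ohms
C = 100e-6    # capacitance, farads
tau = R * C   # time constant, 0.1 s here

def cap_voltage(t):
    """Voltage across a cap charging from 0 V: V(t) = V_S * (1 - e^(-t/tau))."""
    return V_S * (1.0 - math.exp(-t / tau))

def cap_current(t):
    """Charging current: the voltage left across R, divided by R (Ohm's law)."""
    return (V_S - cap_voltage(t)) / R

# Sample at 0..5 time constants: the voltage climbs quickly at first,
# then flattens out, while the current falls toward zero.
for n in range(6):
    t = n * tau
    print(f"t = {n}*tau: V = {cap_voltage(t):.3f} V, I = {cap_current(t)*1000:.3f} mA")
```

Notice that by t = 5*tau the printed voltage is within 1% of V_S, which is why "IRL it does" is a fair summary.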
But speaking of that current ...
Another way to charge a cap is with a constant current source. In this case, the voltage across the cap changes linearly, because the current does not change as the cap charges or discharges. This is the core concept behind the ramp, triangle, and sawtooth stages in various waveform and signal generators.
ak