To check the current draw of any load, put an ammeter in series with the load. That means you disconnect your circuit from +12V, connect the + wire of the ammeter to +12V, and connect the - wire of the ammeter to your circuit. You must choose the correct range on the meter switch, and on some meters you must also move the + lead from the Volts/Ohms input to the Amps or MilliAmps input.
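In other words, the hookup looks roughly like this (just a sketch; your supply and meter labels may differ):

    +12V supply ----(+) AMMETER (-)---- circuit/load ---- GND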
When you select the AMPS or MilliAMPS function on your meter, you switch in a very low value resistor (a shunt) wired from +IN to -IN (or the equivalent labels on your meter). As current flows through this resistor, the meter actually measures the voltage across it and then translates that voltage, using Ohm's law, into the current value it displays. So the meter has a very low resistance when using the AMPS function. This is very different from the Volts (AC or DC) or Ohms functions, which make the meter look like a very high resistance. The shunt resistor is tailored to the range being used. For example, on a 10 A range the resistor is probably about 0.1 ohms and may be capable of dissipating 1 watt of heat. On a lower range, say 100 mA, the resistor value is probably higher, maybe 1 ohm or more, and again it can probably handle about 1 watt.

Now imagine what happens when you mistakenly put this resistance across your +12V supply rail by touching the + wire to +12V and the - wire to GND. Ohm's law says that if you are on the 100 mA range, the power supply tries to push 12 amps through that poor little internal resistor (12 V / 1 ohm). Of course, the supply probably isn't capable of delivering 12 amps, so it does what it can. Let's say it pushes 3 amps. The heat that poor little resistor has to handle is 3^2 x 1 = 9 watts! That's a lot of heat, and it's only rated for 1 watt, so guess what happens: it burns out very quickly. That is, unless there is a fuse that blows before the resistor fries. Most of the meters I've had don't have a fuse in series on the current ranges.
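If you want to sanity-check those numbers, here is a quick back-of-the-envelope calculation in Python. The shunt value and the 3 A supply limit are the illustrative guesses from the paragraph above, not your meter's actual specs:

    # Fault scenario: meter on the 100 mA range accidentally placed
    # across a 12 V rail. Values are illustrative guesses, not real specs.
    shunt_ohms = 1.0         # assumed internal shunt on the 100 mA range
    supply_volts = 12.0      # the rail you shorted across
    supply_limit_amps = 3.0  # what the supply can actually deliver

    demanded = supply_volts / shunt_ohms       # Ohm's law: 12 A "requested"
    actual = min(demanded, supply_limit_amps)  # supply current-limits at 3 A
    power = actual ** 2 * shunt_ohms           # P = I^2 * R = 9 W
    print(f"demanded {demanded:.0f} A, actual {actual:.0f} A, "
          f"shunt dissipation {power:.0f} W (rated ~1 W)")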
To avoid such problems, always be careful when using the Amps or MilliAmps inputs or ranges on a meter. Most used meters I have come across have blown internal resistors, all because the previous owners either did not understand how to use the meter or did not care.
I'm guessing your meter no longer works on the Amps or MilliAmps scales. The next easiest thing to do is to place an external resistor in series with your circuit (between the power supply and the regulator input) and then measure the voltage across that resistor. The likelihood that your meter's Volts range still works is quite good. But again, think about Ohm's law: since you are putting a resistor in series, you have to worry about the voltage it drops, and you need to keep that drop from becoming excessive. Usually a drop of 0.4 volts or less is considered harmless, so estimate the current draw you expect (using the IC data sheets and Ohm's law) and choose a series resistor that gives you about 0.4 volts of drop at that current. Most people choose values like 1 ohm or less.
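Here is that arithmetic as a short Python sketch. The 50 mA expected draw is a made-up example; substitute your own estimate from the data sheets:

    # Choosing a series sense resistor and reading current from its drop.
    # The expected draw below is hypothetical; use your own estimate.
    expected_amps = 0.050   # e.g. 50 mA estimated from IC data sheets
    max_drop_volts = 0.4    # the "harmless" drop suggested above

    r_max = max_drop_volts / expected_amps  # largest safe resistor: 8 ohms
    r_chosen = 1.0                          # a common convenient choice

    # After wiring it in, measure the voltage across the resistor:
    measured_drop = 0.047   # hypothetical reading in volts
    current_amps = measured_drop / r_chosen # I = V / R
    print(f"use R <= {r_max:.1f} ohm; {measured_drop} V across "
          f"{r_chosen} ohm means {current_amps * 1000:.0f} mA")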