Badar said:
Does a 32-bit processor simply mean it calculates 32 bits per unit time, or does it mean 2^32 calculations per unit time?
It has no relation to time or speed of calculation; it's simply how many bits the processor can deal with at one time. A 32-bit processor can take two 32-bit numbers and add them together in a single instruction, but a 64-bit addition will take multiple instructions and multiple registers. A 64-bit processor will do that same 64-bit addition in one operation.
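To make that concrete, here's a small sketch (not from the answer above, just an illustration) of what a 32-bit CPU effectively has to do to add two 64-bit numbers: split each operand into two 32-bit halves, add the low halves, then add the high halves plus the carry. The function name and structure are my own invention for the example.

```python
MASK32 = 0xFFFFFFFF  # a 32-bit register can only hold values up to this

def add64_with_32bit_ops(a, b):
    """Add two 64-bit values using only 32-bit additions, like a 32-bit CPU."""
    # Split each 64-bit operand into two 32-bit halves ("registers").
    a_lo, a_hi = a & MASK32, (a >> 32) & MASK32
    b_lo, b_hi = b & MASK32, (b >> 32) & MASK32
    # Step 1: add the low halves and note the carry out of bit 31.
    lo = a_lo + b_lo
    carry = lo >> 32
    lo &= MASK32
    # Step 2: add the high halves plus the carry from step 1.
    hi = (a_hi + b_hi + carry) & MASK32
    # Recombine the two halves into the 64-bit result.
    return (hi << 32) | lo
```

A 64-bit processor does all of this in a single add instruction; the 32-bit one needs the two adds (plus the carry handling) shown here.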
I'll give you a nice easy 'example':
Say you run a company that sells barrels of wine, and you have various sizes of truck for delivering them: an 8-barrel truck and a 32-barrel truck.
If I order 32 barrels from you, you can either send the 32-barrel truck once, or the 8-barrel truck 4 times. However, if I order 33 barrels, it becomes two trips for the large truck, or 5 for the small one.
If I only order 8 barrels, it's obviously much more efficient to send the 8-barrel truck!
Bear in mind that the larger truck costs much more money to run.
That's really what you're talking about: more bits tends to be intrinsically faster, but it can be much less efficient as well. Bear in mind that text and most other data is only 8-bit!