The wattage rating is how much power the resistor package itself can dissipate. The resistance of a 100 ohm 10 watt resistor and a 100 ohm 0.25 watt resistor is the same, but as the current increases, the power the resistor dissipates increases with the square of the current (P = I²R). The 10 watt part simply has much more physical bulk, or is made from materials that shed that heat better.

If you exceed the maximum wattage rating of a resistor, it will overheat. That shifts the resistance in the short term, and if it gets hot enough it will permanently alter the resistance value, so it's very important to always choose a proper wattage rating for the resistors in a given circuit.

Ambient temperature, and how easily the resistor package can actually shed the heat it's creating, affect the practical wattage of a resistor. A simple example: if you encapsulate a resistor in a bead of silicone, the silicone will insulate it and cause it to overheat MUCH faster than if that same resistor were in contact with something like an aluminum heat sink. This is an often overlooked facet of circuit construction because it's a materials science problem, not an electrical one.
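To make the sizing math concrete, here's a minimal sketch in Python (the function name and the 50% derating margin are just assumptions for the example, not a standard rule): it computes the dissipated power from the current and resistance, then flags whether a given package rating still leaves some thermal headroom.

def resistor_power_check(current_a, resistance_ohm, rating_w, derating=0.5):
    # Power the resistor dissipates: P = I^2 * R
    dissipated_w = current_a ** 2 * resistance_ohm
    # Only use part of the nameplate rating so there is thermal headroom;
    # 0.5 (50%) is just an example margin - adjust for your ambient/airflow.
    return dissipated_w, dissipated_w <= rating_w * derating

# 50 mA through a 100 ohm resistor: P = 0.05^2 * 100 = 0.25 W,
# which is right at a quarter-watt part's limit, so the check fails.
power_w, ok = resistor_power_check(0.05, 100, 0.25)
print(f"{power_w:.3f} W dissipated ->", "ok" if ok else "use a bigger package")

The same calculation with a 10 watt part passes easily, which is the whole point of picking the larger package even though the resistance value is identical.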