A given transformer, at a given current, will have roughly 50x greater
total losses (hysteresis, eddy current, proximity and resistive combined) at 400 Hz than it does at 50 Hz.
Whether this is important depends upon what ratio, power and frequency the transformer was designed for.
E.g. this overly conservative 20:1 (240 V to 12 V) E-core transformer, designed for 1 A output at 12 V from a 240 VAC input, has 0.25 mW of total losses when used at 50 Hz, but at 400 Hz it generates 13 mW of heat.
That is insignificant for this design and use, but in a less conservative design it could become significant.
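To see why an 8x increase in frequency can give a ~50x increase in losses, here is a minimal sketch (Python) using a lumped two-term loss model, P(f) = a·f + b·f². The coefficients are hypothetical: they are chosen only so the model reproduces the two figures quoted above, not measured values for that transformer.

```python
# Lumped two-term loss model: P(f) = a*f + b*f**2
#   a*f     -> losses growing roughly linearly with frequency (hysteresis)
#   b*f**2  -> losses growing with frequency squared (eddy current, proximity)
# The coefficients are hypothetical, fitted so the model reproduces the
# quoted figures (0.25 mW @ 50 Hz, 13 mW @ 400 Hz); they illustrate how an
# 8x frequency increase can plausibly produce a ~50x loss increase.

A = 1.07e-6   # W per Hz    (hypothetical, fitted to the quoted numbers)
B = 7.86e-8   # W per Hz^2  (hypothetical, fitted to the quoted numbers)

def total_loss(f_hz: float) -> float:
    """Total loss in watts for the lumped model above."""
    return A * f_hz + B * f_hz ** 2

for f in (50.0, 400.0):
    print(f"{f:5.0f} Hz: {total_loss(f) * 1e3:6.2f} mW")

print(f"ratio: {total_loss(400.0) / total_loss(50.0):.0f}x")  # ~52x
```

The f² term dominates at 400 Hz, which is why the increase lands much nearer to 8² = 64x than to 8x.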
I.e. your question does not contain enough information to give a definitive answer.
What frequency and power draw was it designed for?
What frequency and power draw do you wish to use it at?