Unless there is some characteristic noise or variability in one supply versus the other, it seems to me it would be rather difficult to differentiate the two. It is certainly possible that one source sits at a slightly higher voltage than the other, i.e. one may read 118 VAC with a voltmeter and the other may read 115 VAC (note a standard voltmeter reads RMS, not peak to peak; I'm using the American standard for the example).
The next step would be to get a datalogger or chart recorder designed and scaled to read MAINS voltage. A computer with an A/D interface and software can do the same thing, or better yet, many modern digital multimeters come with software and a USB port for data logging!
Here's a voltage chart recorder if you want a paper record:
Rustrak AC Voltage Strip Chart Recorders
Here is an inexpensive digital multimeter with a PC interface for datalogging:
Digital Multimeter with PC Interface
And many more expensive, brand name multimeters have datalogging and trend capture built into them.
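If you roll your own logger with a computer and an A/D interface, the core of the software is just computing an RMS voltage from each burst of samples. Here's a minimal sketch in Python, using a synthetic sine wave in place of a real A/D driver (the 6 kHz sample rate and 60 Hz/118 V figures are just example values):

```python
import math

def rms(samples):
    """Root-mean-square of a list of instantaneous voltage samples."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

# Simulate exactly one cycle of a 118 V RMS, 60 Hz sine, sampled at 6 kHz
# (100 samples per cycle). A real logger would read these from the A/D.
peak = 118 * math.sqrt(2)  # convert RMS to peak amplitude
samples = [peak * math.sin(2 * math.pi * n / 100) for n in range(100)]
print(round(rms(samples), 1))  # -> 118.0
```

Log one RMS value per second (or per minute) and you have the same trend data a chart recorder would give you.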
So: if you can identify, with certainty, a time when you know which generator is supplying the power, you can try to contrast the two sources by a slight voltage difference, or perhaps even a slight frequency offset. For example, your mains might nominally be 50 Hz, but when you analyze the two sources, one might run at 49.65 Hz while the other runs at 50.35 Hz (assuming your meter or software has that kind of resolution).
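To illustrate the frequency side of this, here's a rough sketch of estimating the mains frequency from a sampled waveform by timing upward zero crossings. The 10 kHz sample rate and the 49.65 Hz test signal are invented values for the example, not anything a specific meter produces:

```python
import math

def estimate_frequency(samples, sample_rate):
    """Estimate frequency by timing upward zero crossings of a sampled wave."""
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < 0 <= samples[i]]
    if len(crossings) < 2:
        return None
    cycles = len(crossings) - 1         # whole cycles between first/last crossing
    span = crossings[-1] - crossings[0]  # number of samples covering those cycles
    return sample_rate * cycles / span

# One second of a synthetic 49.65 Hz sine at 10 kHz -- stand-in for real data.
rate = 10_000
wave = [math.sin(2 * math.pi * 49.65 * n / rate) for n in range(rate)]
print(round(estimate_frequency(wave, rate), 2))  # -> 49.65
```

Averaging over many crossings like this is what buys you the sub-0.1 Hz resolution needed to tell 49.65 from 50.35.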
You might notice one supply source is very stable while the other has a slight drift.
etc etc.
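As a toy illustration of that stability test, suppose you've logged a frequency reading once a minute from each source (the numbers below are invented). A simple standard deviation comparison separates a tightly regulated grid from a drifting generator:

```python
import statistics

# Invented once-per-minute frequency readings (Hz) from each source.
utility_log   = [50.00, 50.01, 49.99, 50.00, 50.01, 50.00]  # tightly regulated
generator_log = [49.80, 49.95, 50.10, 49.70, 50.20, 49.90]  # drifting

# The source with the larger spread is the less stable one.
for name, readings in [("utility", utility_log), ("generator", generator_log)]:
    print(name, round(statistics.stdev(readings), 3))
```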
Once the pattern is determined, it would be an easy matter to identify "bump" transfers, where the power drops momentarily when switching from one power source to the other. These dips would be your datalogger "markers", and from there the time duration under each source could be measured.
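That last step can be sketched too, assuming a once-per-second voltage log and a made-up dip threshold of 80 V: find the momentary dips, then measure the run length between them:

```python
def find_transfers(log, dip_threshold):
    """Indices where voltage momentarily dips below the threshold --
    the 'markers' that flag a switch between supplies."""
    return [i for i, v in enumerate(log) if v < dip_threshold]

def segment_durations(log, dip_threshold, seconds_per_sample):
    """Durations (seconds) of the steady runs between dip markers."""
    markers = find_transfers(log, dip_threshold)
    bounds = [-1] + markers + [len(log)]
    return [(b - a - 1) * seconds_per_sample
            for a, b in zip(bounds, bounds[1:]) if b - a > 1]

# Hypothetical 1-sample-per-second log: source A, a dip, source B, a dip, A again.
log = [118.0] * 5 + [40.0] + [115.0] * 8 + [35.0] + [118.0] * 4
print(segment_durations(log, 80.0, 1))  # -> [5, 8, 4]
```

With real data you'd also look at the voltage level within each segment (118 vs 115 in this toy log) to decide which source each run belongs to.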