This is not as straightforward as you might first think. Measuring 6V to 16-bit accuracy (65536 steps) means you are measuring to a resolution of approximately 100uV (91.5uV in fact). The input offset voltage of your op amp will affect this accuracy, as will its drift. Therefore, to measure to half an LSB (say 50uV), your input offset voltage needs to be less than 50uV.
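To put numbers on that, here is a quick back-of-envelope check (a minimal Python sketch, assuming the 6V full scale from the question):

```python
# LSB size for a 6 V full-scale, 16-bit measurement
full_scale_v = 6.0
steps = 2**16                         # 65536 codes

lsb = full_scale_v / steps            # volts per code
print(f"1 LSB   = {lsb * 1e6:.2f} uV")      # ~91.55 uV
print(f"0.5 LSB = {lsb * 1e6 / 2:.2f} uV")  # ~45.78 uV offset budget
```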
You then have to worry about the accuracy (and drift) of the resistive divider. Even if you buy a 1% resistor, that tolerance alone corresponds to roughly 655 LSBs (1% divided by 1/65536). So you need a precision resistive divider. Have a look at this:
https://www.linear.com/product/LT5400
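For a feel of what divider tolerance costs in LSBs (illustrative figures; check the matching grade of the LT5400 on its datasheet):

```python
# Ratio error expressed in 16-bit LSBs
steps = 2**16
for tolerance in (1e-2, 1e-4):        # 1% discrete resistor vs ~0.01% matched network
    print(f"{tolerance * 100:g}% ratio error -> {tolerance * steps:.0f} LSBs")
```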
This will get you much closer in accuracy, but your system will still need calibrating.
So... feed the 6V into a buffer (op amp in the non-inverting configuration with a 0 Ohm feedback resistor). This gives you the high input impedance and unity gain. Then connect the LT5400 to the output of the op amp and take the centre tap of the LT5400 to give you a divide-by-2.
Feed this into your ADC. I am assuming (!) the input of the ADC will not load the resistive divider (and hence pull your signal down). If it does, you will need another op amp to buffer the output of the resistive divider.
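To check whether that extra buffer is needed, compare the divider's output impedance with the ADC's input impedance (a sketch with illustrative values; a 10k/10k LT5400 divider has a Thevenin resistance of about 5k):

```python
# Loading error from the ADC input impedance, in 16-bit LSBs
r_out = 5e3                           # divider Thevenin resistance, ohms
steps = 2**16
for r_in in (1e6, 100e6, 10e9):       # unbuffered SAR vs high-impedance inputs
    error = r_out / (r_out + r_in)    # fractional droop at the ADC pin
    print(f"R_in = {r_in:.0e} ohm -> {error * steps:.2f} LSBs")
```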
The lower you can make the input offset voltage of the op amps, the more accurate your system will be; indeed, I think input offset voltage will be the dominant source of error in your system.
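A rough comparison of offset classes makes the point (generic placeholder figures, not from any particular datasheet); note that a buffer placed after the /2 divider has its offset count double when referred back to the 6V input:

```python
# Input offset voltage expressed in 16-bit LSBs at the 6 V input
lsb = 6.0 / 2**16                     # ~91.55 uV per LSB
for vos in (1e-3, 100e-6, 10e-6):     # jellybean, precision, zero-drift op amps
    print(f"Vos = {vos * 1e6:4.0f} uV -> {vos / lsb:5.2f} LSBs")
```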
To get true 16-bit performance, you will then need to feed a voltage known to 16-bit accuracy into the input and calibrate your system to give the correct output.
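In practice that usually means a two-point calibration: apply two accurately known voltages, record the raw codes, and solve for gain and offset (a minimal sketch; the codes below are made up for illustration):

```python
# Two-point (gain + offset) calibration sketch
def calibrate(v_lo, code_lo, v_hi, code_hi):
    gain = (v_hi - v_lo) / (code_hi - code_lo)  # volts per code
    offset = v_lo - gain * code_lo              # volts at code zero
    return gain, offset

def code_to_volts(code, gain, offset):
    return gain * code + offset

# e.g. known 0.500 V and 5.500 V inputs read back as raw codes 5466 and 60075
gain, offset = calibrate(0.500, 5466, 5.500, 60075)
print(f"{code_to_volts(32768, gain, offset):.4f} V")  # corrected mid-scale reading
```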
Then there is the drift of the reference, but this is another issue...
It can be done quite easily, but getting true 16-bit resolution will be hard; hopefully I have outlined the things to watch out for...
(It is interesting to note that I tried downloading the TL074 datasheet from TI's website before writing this reply. It is still trying to download... Does anyone else have a problem with TI's hopeless website??)