My apologies in advance if this is a dumb question; I hope it makes sense.
With film cameras, the brightness of the image was determined by the film speed (ISO), the exposure time (shutter speed) and the aperture (f-stop).
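To put rough numbers on that relationship: the standard "exposure value" formula from photography combines all three settings, and trading one stop of aperture against one stop of shutter speed keeps the overall brightness (roughly) the same. This is general photography maths, not anything specific to the cameras mentioned here:

```python
import math

def exposure_value(f_number, shutter_s, iso=100):
    # Standard exposure value, adjusted for film/sensor speed:
    #   EV = log2(N^2 / t) - log2(ISO / 100)
    # Higher EV = less light reaching the film/sensor.
    return math.log2(f_number**2 / shutter_s) - math.log2(iso / 100)

# These settings admit (approximately) the same amount of light:
a = exposure_value(8, 1 / 125)
b = exposure_value(5.6, 1 / 250)  # one stop wider aperture, one stop faster shutter
c = exposure_value(11, 1 / 60)    # one stop narrower aperture, one stop slower shutter
# a, b and c agree to within a fraction of a stop (nominal f-numbers are rounded)
```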
My cheap 'Flip cam' records at 30fps. I assume its 'auto exposure' simply adjusts the exposure time. Understandably, I get motion blur when an object moves fast, and the blurring is spread evenly throughout the period of the frame.
But when comparing similar footage from a Kodak Zi8, there seems to be a huge difference. For example, in MarkPeter's recent Windsor Castle clip, although blurs occur, the result is more like one sharp frame superimposed with a semi-transparent copy from a few milliseconds later. That gives a sharper image overall.
I had assumed this was because of Kodak's image stabilisation system, which *might* be giving more weight to the beginning of the frame and then less as the frame time goes on.
Whilst it seems common for cams to record at 30 or 60fps, do proper (semi-pro and better) cams offer control over the exposure time? If so, what is the common range of shutter speeds? With old-style film cameras, exposure times of 1/1000th of a second were common. Is it simply that electronic video cams don't commonly achieve those speeds?
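For what it's worth, the frame rate only sets an upper bound on exposure time: at 30fps each frame lasts 1/30s, but the shutter can be open for any fraction of that. A minimal sketch of that arithmetic, using the "shutter angle" convention from motion-picture cameras (the 180-degree rule of thumb is a general film convention, not something from the cameras above):

```python
def max_shutter_s(fps):
    # The exposure can never exceed the frame period at a given frame rate.
    return 1.0 / fps

def shutter_from_angle(fps, angle_deg):
    # "Shutter angle": 360 degrees = the whole frame period.
    # The classic 180-degree shutter exposes for half the frame time.
    return (angle_deg / 360.0) / fps

longest = max_shutter_s(30)            # 1/30 s: longest possible exposure at 30 fps
filmic = shutter_from_angle(30, 180)   # 1/60 s: typical "filmic" motion blur
crisp = shutter_from_angle(30, 15)     # 1/720 s: short exposure, crisp but strobed motion
```

So a camera could expose for 1/1000s even at 30fps; motion would look sharp per frame but noticeably strobed between frames.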