What is a time constant and what is defined by the time constant?
The time constant defines the time in which a signal decays to 1/e (i.e. about 37%) of its initial value. It is determined by the time constant resistor and the range capacitor.
The time constant determines the cut-off frequency of the charge amplifier's high-pass characteristic. Many charge amplifiers allow switching between different time constants, typically labeled 'Short' and 'Long'.
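The relationship between the two components and the resulting cut-off frequency can be sketched numerically. The component values below are assumptions for illustration only, not values from any particular amplifier:

```python
import math

# Assumed component values (illustrative, not from a specific device):
R_t = 1e11   # time constant resistor: 100 GOhm
C_r = 10e-9  # range capacitor: 10 nF

tau = R_t * C_r                # time constant in seconds
f_c = 1 / (2 * math.pi * tau)  # high-pass cut-off frequency in Hz

print(f"tau = {tau:.0f} s")    # 1000 s -> a 'Long' setting
print(f"f_c = {f_c:.2e} Hz")   # ~1.59e-04 Hz

# After one time constant, the output has decayed to 1/e of its start value:
print(f"decay after one tau: {math.exp(-1):.3f}")  # ~0.368, i.e. about 37 %
```

A larger resistor or capacitor gives a longer time constant and therefore a lower cut-off frequency, which is why the 'Long' setting reaches further down in frequency.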
What does it mean when time constant 'Long' is selected?
Long time constants ('Long') enable the acquisition of low-frequency signals and thus cover a wide frequency range. Measuring with a long time constant is called 'quasi-static'. However, drift occurs, i.e. an undesired change of the output signal. Drift has many causes; often even the smallest leakage currents in the charge amplifier are enough to produce this phenomenon.
The time constant 'Long' is usually chosen when using piezoelectric force sensors if the whole frequency spectrum including low-frequency signal components is of interest, or if a largely static signal is to be measured over a longer period of time.
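Why a long time constant permits quasi-static measurement can be illustrated with the exponential decay of a step signal through an RC high-pass. The measurement duration and the two time constants below are assumed values for the sketch:

```python
import math

def remaining_fraction(t, tau):
    """Fraction of a step signal still present after time t
    for a first-order RC high-pass with time constant tau."""
    return math.exp(-t / tau)

t_meas = 10.0         # assumed measurement duration in seconds
tau_long = 100_000.0  # assumed 'Long' time constant in seconds
tau_short = 1.0       # assumed 'Short' time constant in seconds

# With a long time constant, the signal barely sags during the measurement:
print(f"Long:  {remaining_fraction(t_meas, tau_long):.4f}")   # ~0.9999

# With a short time constant, the static signal has practically vanished:
print(f"Short: {remaining_fraction(t_meas, tau_short):.2e}")  # ~4.5e-05
```

As long as the measurement window is much shorter than the time constant, the decay (and thus the error on a static reading) stays negligible, which is the practical meaning of 'quasi-static'.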
What does it mean when time constant 'Short' is selected?
The signal drift described above can be avoided by selecting the time constant 'Short'. In return, however, it is no longer possible to record low-frequency signals: shortening the time constant raises the cut-off frequency of the high-pass characteristic, so low-frequency signal components are filtered out. Measuring with short time constants is called 'dynamic'.
For dynamic or transient measurement processes, for example when measuring vibrations with acceleration sensors, the time constant 'Short' is selected.
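The trade-off of the 'Short' setting can be made concrete with the magnitude response of a first-order RC high-pass, |H(f)| = 2πfτ / √(1 + (2πfτ)²). The time constant and test frequencies below are assumptions for illustration:

```python
import math

def highpass_gain(f, tau):
    """Magnitude response of a first-order RC high-pass at frequency f (Hz)."""
    wt = 2 * math.pi * f * tau
    return wt / math.sqrt(1 + wt * wt)

tau_short = 0.1  # assumed 'Short' time constant in seconds

# A slow 0.1 Hz signal is almost completely suppressed:
print(f"{highpass_gain(0.1, tau_short):.3f}")    # ~0.063

# A 100 Hz vibration signal passes essentially unattenuated:
print(f"{highpass_gain(100.0, tau_short):.5f}")  # ~0.99987
```

Dynamic signals such as vibrations lie well above the raised cut-off frequency and are therefore unaffected, while quasi-static components are removed together with the drift.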