Objective / Subjective


In everyday conversation, we use the adjective objective to mean unbiased, based on fact, as opposed to a biased subjective viewpoint or opinion. In experimental psychology, those terms are used to make a distinction between a phenomenon that is objectively measurable by an instrument and a person's subjective perception as they report experiencing it. For example, a sound's intensity (power per unit area) is measurable, whereas its loudness can only be reported by a human subject as a response to their perception. This distinction is sometimes important when discussing objective measurements of sounds—frequency, amplitude, time interval between onsets—and relating them to our subjective impressions of pitch, loudness, and musical tempo.

The just noticeable difference (JND) is a term from psychology and physiology, defined as the smallest increase or decrease in a stimulus (pitch, loudness, etc.) that the average person can detect as a change. It has been shown empirically that this threshold is best expressed as a ratio (a multiplication factor) of the magnitude of the phenomenon, rather than as an arithmetic difference (an addition or subtraction).

The Weber-Fechner law is actually a pair of related principles, one by Ernst Weber and the other by Gustav Fechner, two 19th-century scientists. Weber was the originator of the concept of JND, and his law posits that the ratio of JND remains essentially constant regardless of the magnitude of the phenomenon being measured. For example, if the smallest change I can just barely notice in a 20-lb. weight is 1 lb., the JND factor is 1/20=0.05, so I could expect that the JND for a 100-lb. weight is 100*0.05=5 lb. Fechner's closely-related law is that the subjective perceptual impression of a phenomenon is proportional to the logarithm of its empirical intensity. For example, a linear increase in the measurable frequency of a tone results in a logarithmic change in one's subjective perception of the sound's musical pitch; conversely, a linear increase in musical pitch—say, an ascending chromatic scale—corresponds to an exponential increase in the tone's frequency.
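To make that arithmetic concrete, here is a minimal sketch in Python (not part of the original page). The formulas are the standard textbook statements of the two laws; the 0.05 Weber fraction is taken from the weight example above, and the function names and the constant k are merely illustrative. Weber's law scales the JND with the magnitude of the stimulus, and Fechner's law maps equal ratios of intensity to equal steps of sensation.

import math

def jnd(magnitude, weber_fraction=0.05):
    # Weber's law: the just noticeable difference is a constant
    # fraction of the stimulus magnitude.
    return magnitude * weber_fraction

def sensation(intensity, reference_intensity=1.0, k=1.0):
    # Fechner's law: subjective sensation grows with the logarithm
    # of the ratio of intensity to a reference intensity.
    return k * math.log(intensity / reference_intensity)

print(jnd(20))    # 1.0 lb., as in the 20-lb. example above
print(jnd(100))   # 5.0 lb., as in the 100-lb. example above

# Doubling the intensity repeatedly (equal ratios) adds a constant
# amount of sensation each time (equal steps).
for intensity in (1, 2, 4, 8, 16):
    print(intensity, round(sensation(intensity), 3))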

An explanation of this phenomenon, with respect to the weight of an object, is presented in a Khan Academy video titled "Weber's law and thresholds".

In computer audio we commonly express the linearly measured amplitude of a sound wave, ranging from 0 (silence) to 1 (full amplitude), in terms of the decibel, a unit proportional to the logarithm of the ratio of two amplitudes. Likewise, in computer music we express the linearly measured frequency of a wave in terms of its musical pitch (MIDI note number), which is proportional to the logarithm of the ratio of two frequencies.
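Here is a minimal Python sketch of those two conversions, using the standard formulas (decibels computed relative to a full-scale amplitude of 1.0, and MIDI note numbers computed relative to the convention that A440 is note 69); the function names are just illustrative.

import math

def amplitude_to_db(amplitude, reference=1.0):
    # Decibels: 20 times the base-10 logarithm of the ratio of two amplitudes.
    return 20 * math.log10(amplitude / reference)

def frequency_to_midi(frequency, reference=440.0, reference_note=69):
    # MIDI pitch: proportional to the base-2 logarithm of the ratio of two
    # frequencies; each doubling of frequency (an octave) adds 12 semitones.
    return reference_note + 12 * math.log2(frequency / reference)

print(amplitude_to_db(1.0))      #   0.0 dB (full amplitude)
print(amplitude_to_db(0.5))      #  about -6 dB (half amplitude)
print(amplitude_to_db(0.1))      # -20.0 dB (one tenth of full amplitude)

print(frequency_to_midi(440.0))  # 69.0 (the A above middle C)
print(frequency_to_midi(880.0))  # 81.0 (one octave = 12 semitones higher)
print(frequency_to_midi(261.63)) # about 60 (middle C)

Notice that halving the amplitude always subtracts about 6 dB, and doubling the frequency always adds 12 to the MIDI note number, regardless of the starting values; that constancy of ratio is exactly the Weber-Fechner relationship described above.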


This page is by Christopher Dobrian, dobrian@uci.edu.
Last modified March 31, 2019.