A multimeter is a standard tool for measurements in electrical and electronic work. It measures electrical parameters such as voltage, current, resistance, capacitance and frequency, and, with additional sensors, other quantities as well. A standard multimeter usually shows only the present value of the selected parameter. Quality and sensitivity vary with price. Specialized (typically digital) multimeters can store values, show trends, measure over longer periods and identify minimum, maximum and average values. Even more specialized models log data for long-term analysis, and some provide trigger functions to detect spikes or faults in the measured parameters.
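The min/max/average bookkeeping described above can be sketched as follows; the sample values are made up purely for illustration, and a real instrument would update these statistics continuously as readings arrive:

```python
# Sketch of the statistics a logging multimeter keeps over a
# measurement period. The voltage samples below are hypothetical.
samples = [4.98, 5.02, 5.01, 4.97, 5.00]  # readings in volts

minimum = min(samples)                    # lowest reading in the period
maximum = max(samples)                    # highest reading in the period
average = sum(samples) / len(samples)     # mean over the period

print(minimum, maximum, average)
```

A logging instrument performs the same computation incrementally, so it never needs to hold the full sample history in memory.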
A multimeter with standard functionality displays the measured value either with a pointer on a scale (analog multimeter) or as digits (digital multimeter). More specialized multimeters support automatic range detection and selection. Others provide communication interfaces for connecting to a computer, which simplifies analysis and further data processing. The above figure shows a laboratory digital multimeter that measures voltage and current simultaneously and displays the resulting power consumption. It also supports the functions mentioned above, such as minimum, maximum and average values, and offers communication interfaces such as USB, Ethernet and GPIB for convenient control and analysis from a PC.

Usage

A multimeter is connected to the points of interest in an electric circuit either in series or in parallel (current is measured in series, voltage in parallel), depending on the parameter being measured and on the circuit's properties and conditions.
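Instruments with USB, Ethernet or GPIB interfaces are commonly controlled with SCPI (Standard Commands for Programmable Instruments) text commands. The sketch below shows the response-parsing side only; the query string and the reply format are typical SCPI conventions, but the exact command set depends on the instrument, so consult its programming manual:

```python
# Sketch: handling a SCPI-style reply from a bench multimeter.
# SCPI instruments return numeric readings as ASCII in scientific
# notation, e.g. "+1.234500E+00\n".

def parse_reading(response: str) -> float:
    """Convert a SCPI numeric response string to a float in SI units."""
    return float(response.strip())

# Typical session (transport layer omitted, shown as comments):
#   send("MEAS:VOLT:DC?")          -> request a DC voltage reading
#   reply = "+1.234500E+00\n"      -> instrument's ASCII response
voltage = parse_reading("+1.234500E+00\n")
print(voltage)  # 1.2345
```

In practice the transport (USB, Ethernet socket or GPIB) is usually handled by a VISA library so the same control code works across all three interfaces.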