Positivism is a philosophy holding that the only authentic knowledge is scientific knowledge. It was central to the foundation of academic sociology.
Positivism may also refer to:
Logical positivism, a school of philosophy that combines empiricism with a version of rationalism