The Forum is open to everyone, including students, visitors, and faculty members from all departments and institutes!

The 60-minute lecture is followed by a 10-minute break and a 30–60-minute discussion. The language of presentation is English or Hungarian.

 

The scope of the Forum includes all aspects of theoretical philosophy, including:

  • logic and philosophy of formal sciences
  • philosophy of science
  • modern metaphysics
  • epistemology
  • philosophy of language
  • problems in the history of philosophy and the history of science relevant to the above topics
  • particular issues in the natural and social sciences that bear on the Forum's main topics.

Location
21 February (Wednesday) 5:00 PM  Room 226
Gábor Hofer-Szabó
 Institute of Philosophy, Research Centre for the Humanities, Budapest 
 
  Commutativity, commeasurability, and contextuality
Kochen–Specker theorems, allegedly proving quantum contextuality, rest on the assumption, standard in textbook physics, that commuting operators represent commeasurable (simultaneously measurable) observables. However, every version of the Kochen–Specker theorem involves commuting operators that can hardly be interpreted as representing simultaneously performable measurements. Lacking commeasurability, two extra assumptions are needed to derive the contradiction; however, the violation of neither of these assumptions implies contextuality of any sort.


28 February (Wednesday) 5:00 PM  Room 226
Maria Kronfeldner
Department of Philosophy, CEU, Budapest 
 
  Two reasons why the purist approach to science fails with respect to thick concepts
‘Thick concepts’ are value-laden: they have both an evaluative and a descriptive aspect. The concept of aggression is a standard example. When a psychologist writes that “boys are more aggressive than girls,” this is not a purely descriptive claim. Purists claim that scientists can and should purify thick concepts by establishing (what I call) ‘naked numbers,’ i.e., an operationalization, e.g., an A-index that objectively measures the behavioral trait meant by the term ‘aggression.’ This talk presents two problems, and thus two reasons, why the purist approach must fail with respect to thick concepts.

(1) The outsourcing problem: reducing thick concepts to naked numbers can be rejected as a way of making science value-free, since the connection to the original conception, and thus to the values involved (e.g., what lay people call aggression), needs to be made somewhere in the process of knowledge production so that the knowledge produced stays socially significant, i.e., provides reasons for action. Purification turns out to be inefficient.

(2) The under-determination problem: the conceptual decisions involved in reducing thick concepts to naked numbers cannot be made objectively (i.e., completely free of bias), because an operationalization is always underdetermined by the data. In effect, different operationalizations let one see different evidence: whenever a phenomenon is reconstituted and its definition re-operationalized, some evidence can be made to disappear. By analyzing the history of research on aggression, I will show that such disappearance of evidence is necessarily biased and often value-laden. Purification turns out to be impossible.