One of the main reasons quantum mechanics still feels elusive is that we do not have a complete understanding of exactly which concepts from our classical thinking must be amended. Is it the rules of classical logic that must be changed? The rules of probability theory? Or simply the way we construct physical models on top of these?
In recent work we showed that classical and quantum mechanics share a common logical structure once the correct parallel is drawn, a structure that emerges quite naturally from basic requirements of experimental verifiability.
The next question is whether the rules of probability theory need to be amended. Past approaches focused on probability spaces associated with measurements; if instead we extend to probability spaces over all preparations in a way that is consistent with quantum information theory, we find the need for nonadditive measures. That is, putting together two states that are not outcomes of the same measurement yields a measure strictly less than two. As measure theory is the foundation of both probability and information theory, this difference alone could explain the departure from classical ideas. The main task, then, is to develop a suitable extension of measure theory to thoroughly investigate and characterize this difference.
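As a toy illustration of this nonadditivity (not the measure developed in this work), one can count states in an equal-weight ensemble by the exponential of the von Neumann entropy of the mixture: two orthogonal states, which can be outcomes of a single measurement, count as exactly two, while two non-orthogonal states count as strictly less than two.

```python
import numpy as np

def effective_count(states):
    """Effective number of states in an equal-weight ensemble,
    computed as exp of the von Neumann entropy of the average
    density matrix. An illustrative toy measure only."""
    rho = sum(np.outer(v, v.conj()) for v in states) / len(states)
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    entropy = -np.sum(evals * np.log(evals))
    return float(np.exp(entropy))

zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Orthogonal states (outcomes of one measurement): the count is additive.
print(effective_count([zero, one]))   # → 2.0
# Non-orthogonal states (different preparation contexts): strictly < 2.
print(effective_count([zero, plus]))  # ≈ 1.52
```

The second value falls below two exactly because |0⟩ and |+⟩ are not perfectly distinguishable, which is the kind of departure from simple additive counting described above.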
Additionally, we want to identify the conceptual foundation of the problem. We believe the core is the interplay between the mere notion of counting and contextuality. The measure is nonadditive when combining states associated with different preparation/measurement processes precisely because the choice of state is not independent of the choice of context. A choice of state, then, is not in general a choice between two elements with all else being equal. This will lead us to a characterization of quantum phenomena that is more physically natural and intuitive.