This topic is going to be a bit more on the *“for people familiar with some philosophical jargon”* level. In other words, this won’t be a typical post of the sort I normally make for larger mass consumption. Perhaps at a later date I’ll create a whittled-down version, but this one is needed so I can point to it when others claim *ontic probability* and use it as a way to contrive free will.

I’ll be using words such as “ontic / ontological” and “epistemic / epistemological”, I’ll be talking a little bit about quantum mechanics, and I’ll be discussing the difference between “epistemic probability” and “ontic probability” and my problem with the latter.

If you can deal with a few jargon words and a post that may be a little more technical than normal, please stick with this, because it comes up more often than you’d suspect in free will debates. What am I addressing? It’s this idea that, according to *some* quantum interpretations (interpretations of the scientific findings of quantum mechanics, basically the teenie tiny particle level), probability is ontological.

Ontic or ontological means that the probability *really exists*. For those unfamiliar with these words, just think: ontic (or the ontological) addresses what “exists” or “is”, and epistemic (or the epistemological) addresses what we can or cannot “know”. Ontology is the study of existence, being, what is “real”, etc. Epistemology is the study of knowledge (what knowledge is, how we obtain it, standards, and so on). This article attaches these words to the word “probability”: ontic probability (or ontological probability) vs. epistemic probability (or epistemological probability). I’ll be using “ontic” and “epistemic” for their brevity and a few other reasons.

*Note to philosophy buffs: As an analytic logician, I don’t make any Heideggerian distinction between the words “ontic” and “ontological”. Since I will be addressing probability in physics, ontic also seemed a little more appropriate but for the most part no real distinction is being made.*

So what is probability? Basically, it means that an event either **has** a specific percentage chance, or a specific chance **is assessed**. For example, a flip of a coin may be assessed with a 50% chance of landing on tails and a 50% chance of landing on heads. If we were to say such probability was *ontic* we are saying that those probabilities aren’t simply due to our lack of knowledge, but that such probability actually exists inherent in the structure of the coin and toss. If we are saying it’s an *epistemic* probability, we are saying we are just assessing a probability because we don’t *know* all of the variables that lead the coin to the result.

This gets more complex when we start talking about quantum mechanics and wave functions, but before we do that let’s stay with the coin toss just so we can understand how this works for causal events. We’ll imagine that the universe is entirely causal (every event has a cause), and talk about the two different ways (epistemic vs. ontic) in light of the coin flip.

For epistemic probability, our knowledge is such that we can only assess a 50/50 probability for the coin toss, but in actuality, the coin must land on heads due to the way the flip happens, the velocity and trajectory of the coin, the weight of the coin, the atmospheric conditions, the gravitational pull, the shape and texture of the coin, the qualities of the surface the coin lands on, and so on. If we could know all of the different variables (causes) of the environment and coin, and all of the different variables surrounding the flip, we could theoretically understand that the coin didn’t really have a 50% chance of landing on heads or tails, but rather a 100% chance of it landing on a specific side (e.g. heads). The probability was epistemic only, meaning in our heads.
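The epistemic reading above can be sketched in a few lines of code. This is a toy model, not real coin physics; the `flip` rule and its variables are invented purely for illustration. The point is that with full knowledge of the variables the outcome is fixed (100%), and the 50/50 only appears when we sample over our own ignorance of the conditions.

```python
import random

random.seed(0)

def flip(speed, spin, air, surface):
    """Toy deterministic 'coin flip': the outcome is fully fixed by the
    input variables (a stand-in for the real physics of a toss)."""
    state = speed * 31.0 + spin * 17.0 + air * 7.0 + surface
    return "heads" if int(state * 1000) % 2 == 0 else "tails"

# Knowing every variable: the same toss always gives the same side (100%).
assert flip(2.0, 5.0, 1.2, 3.0) == flip(2.0, 5.0, 1.2, 3.0)

# Not knowing the variables: sampling over our ignorance yields ~50/50.
# That 50% lives in our heads (epistemic), not in the coin (ontic).
tosses = [flip(random.uniform(1, 3), random.uniform(0, 10),
               random.uniform(1.0, 1.3), random.uniform(0, 5))
          for _ in range(10_000)]
print(tosses.count("heads") / len(tosses))  # roughly 0.5
```

In other words, the “probability” here is a property of the sampler’s ignorance, not of the deterministic `flip` function itself.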

For ontic probability, it is somehow the case that, even given the variables, the coin could actually land on either heads or tails. Not just due to our lack of knowledge of the variables involved; in actuality, the coin has a 50% chance of landing on heads and a 50% chance of landing on tails. This doesn’t just apply to 50/50 probabilities; for example, it might be that the coin is weighted so it has an 80% chance of landing on heads and a 20% chance of tails. It doesn’t matter. What matters is that such probability, if ontic, exists inherently in the coin and toss; it’s not just an assessment.

Now right off the bat most people are going to recognize that, for a coin, such is most likely epistemic probability only. They will point out that this example is a classical physics example, but when we get into the itsie bitsie teensie weensie world of quantum particles, such particle behavior is *unlike the coin*. In other words, the assertion often is that, for quantum particles, the probability is not merely epistemic but ontic. That the behavior of quantum particles is probabilistic in the *“such probability exists”* sense, rather than the *“such probability is assessed because we can’t know the variables, only a probability for the event”* sense. In a moment I’ll be addressing quantum behavior.

First, let’s talk about the experiment and terminology of quantum mechanics that leads people to this sort of thinking. The most important experiment is the double slit experiment. To summarize just part of the experiment, particles are shot through two slits that are a certain distance from each other, and most end up on a screen that records where each particle ended up. What we notice as the particles build up is what’s called an “interference pattern”, in which there are bands, darker toward the center and lighter toward the edges. Put simply, this means that the particle is acting like a wave, rather than moving in a straight line from point A to point B (which is called, for some odd reason, behaving like a particle).

When we measure the location of the particle before the slit, however, the pattern converts to a simple straight-line formation, and no interference pattern appears.

Basically, as soon as we measure the location of a particle, it no longer behaves like a wave, but behaves like a “particle”. This has given the behavior of particles what is often referred to as “particle/wave duality”, meaning that sometimes a particle will act like a wave and sometimes it will **not** act like a wave (it will act like a particle), depending on whether or not the particle was in some way interacted with. In some interpretations of quantum mechanics, when the particle is measured, its conversion from wave to particle behavior is called “wave function collapse”; in others the wave function doesn’t really collapse.

The “wave function” in quantum mechanics is the function that assesses the probability for the entire system. That is because we cannot assess anything but a probability for the entire system. In other words, when we shoot an individual particle out of the gun, we can’t measure it, as the very act of doing so prevents the interference pattern. To determine where the particle will end up on the screen, the only thing we can assess is that there is a specific probability for each band, based on the wave function – but we have no idea where an individual particle will end up other than by assessing a probability.
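As a sketch of what this band-by-band probability assessment looks like, here is a toy far-field double-slit calculation. All the numbers (wavelength, slit separation, screen geometry) are made up for illustration and this is not a real optics computation: complex amplitudes from the two slits are summed, and the squared magnitude of the sum, normalized, gives the probability assessed for each screen position.

```python
import numpy as np

# Toy double-slit sketch: illustrative numbers only, not a real experiment.
wavelength = 1.0
slit_separation = 5.0
screen_distance = 100.0
screen_y = np.linspace(-10.0, 10.0, 2001)   # positions along the screen

# Distance from each slit to each screen position.
r1 = np.hypot(screen_y - slit_separation / 2, screen_distance)
r2 = np.hypot(screen_y + slit_separation / 2, screen_distance)

# Complex amplitudes from each slit add; |sum|^2 gives relative probability.
psi = np.exp(2j * np.pi * r1 / wavelength) + np.exp(2j * np.pi * r2 / wavelength)
prob = np.abs(psi) ** 2
prob /= prob.sum()   # normalize into a probability distribution over positions

# The alternating high/low bands are the interference fringes. For any
# single particle this distribution is all we get; it never tells us where
# that particle will actually land.
print(round(float(prob.sum()), 6), bool(prob.max() > 10 * prob.min()))
```

Whether that distribution describes the world itself or only our knowledge of it is exactly the ontic vs. epistemic question at issue.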

I don’t want to go too deeply here into the wacky realm of quantum mechanics, as it’s this wave function part that is often thought of as having an ontic probability rather than an epistemic probability. So we can, for now, ignore the detector scenario, and just understand that this problem (that we can’t measure where a particle is heading without changing its trajectory in the process) is part of the reason why we can’t know where it is heading.

Let’s stick with the understanding that where the particles will end up on the screen is assessed through the wave function. Another term for this is the “probability wave”. It is this type of term that often leads people to think that the “probability” part is a real (ontic) thing.

This is where we get into what are called quantum interpretations: basically, the different ideas about what the happenings at the quantum scale could mean. There are many different interpretations, and unfortunately, one that is a favorite in physics is the Copenhagen interpretation. This is an indeterministic interpretation that basically suggests there is no underlying variable that leads a particle to one specific band. In other words, the event doesn’t truly have a cause, or at least there is no good reason to assert one.

Per Bell’s theorem, if the experimental results are accepted, any cause that leads the particle to its specific place on the screen would have to be non-local, meaning instantaneous action at a distance. Defenders of Copenhagen suggest that if we don’t see a cause for the behavior, and if we’d have to inject something like non-locality, which has its own counter-intuitiveness, we shouldn’t be injecting in something non-evident as a cause.

Then you have non-local hidden variable interpretations such as Bohmian mechanics, which explain how particles instantaneously exchange information with other particles through a non-local mechanism. And at the more radical end you have the many worlds interpretation, which postulates that all possibilities exist, each splitting off into different universes never to be seen by the others.

There are many other interpretations, some deterministic (meaning entirely causal), others indeterministic (meaning some non-caused events), and some agnostic to both of these. If you are uncertain what the words determinism and indeterminism mean, read this article: “Determinism” and “Indeterminism” for the Free Will Debate

Each interpretation also treats the wave function (and its collapse or illusion of collapse) a little differently. What’s more important is that no matter which quantum interpretation is postulated, they are all equally incompatible with ontic probability, and here is why:

**The Problem with Ontic Probability**

The main problem with ontic probability is that it is incompatible with the only two ways events can happen: causally or acausally (without a cause). It matters not if you are a determinist and think all events must have a cause, or if you are an indeterminist and think some events might not have a cause; neither of these scenarios allows for ontic probability. To understand why, let’s assess the two types, starting with acausality.

**Acausal Probability**

The first thing I’m going to address is the idea that something with a probability assessment can be due to there not being a cause (what I call “acausal” events), because I find this idea very problematic. We need to understand what we mean by saying that the behavior of an existing thing (such as a particle) doesn’t have a cause: that a particle could end up in location A with such and such probability, or it could end up in location B with such and such probability. Let’s simplify those bands on the particle screen down to two, just to make this easier to understand.

If we say that the particles could end up in location A with a 75% chance or location B with a 25% chance, what are we saying? Basically, we are saying that there is some sort of mechanism that is forcing more particles to A than to B. If we are suggesting that there is no forcing factor here, we have a problem as to what accounts for the probability distribution.

There are two problems that will be dealt with separately:

- An acausal event can’t have a probability distribution with a range.
- An acausal event can’t be caused by an existing thing.

1) An acausal event can’t have a probability distribution with a range. If there is no forcing factor for an acausal event (no cause), it will either come about at some time in some location, or never come about. In other words, if there is nothing causing it to come about at a specific time or specific location, then if such comes about it would have no spatial or temporal determinacy that would drive it to be weighted with a percentage chance. It would just be an event that “popped” into existence. As soon as we say that such an event must happen within a range, we are injecting in something causing that range. For an acausal event to have a 75% chance of being in location Y at time B, such implies some sort of variables that funnel this likelihood.

2) An acausal event can’t be caused by an existing thing. This should go without saying: if we are saying that something happens without a cause (variable), then we are saying that the object that is behaving acausally can’t be the *cause* of such behavior. If the object is the cause, then it’s not acausal.

**Causal Probability**

This leaves us with causal probability. And this is where we run into even worse problems when people think that causality and ontic probability are compatible. The fact of the matter, however, is that they are no more compatible than free will is with causality.

The main problem is that ontic probability creates a cause with self-contradictory variables within it. If, for example, we say that a specific particle result has a 25% chance to be at location X and a 75% chance to be at location Y, the variables that cause such a result would be such that either they lead the particle result to X or they don’t lead to X (but to Y instead). This is for the same reason that an event (logically) can’t be both what causes X and what doesn’t cause X (but Y instead). It imposes a self-contradiction. For more information on that read here:

Keep in mind that this 75/25 example is simplified down for the sake of keeping things to only two different end results. In actuality the wave distribution assesses a number of different probabilities over the entirety of all variables, meaning that such an ontic claim is contradictory in a number of different ways. The variables within the particle of a causal event would be such that they would lead to X and the **very same** variables would not lead to X, would lead to Y and not lead to Y, would lead to Z and not lead to Z, and so on…all with their own percentages (10% to get to X, 20% to get to Y, 40% to get to Z, and so on). The problem should be obvious: if there are causal variables that lead the result to Y, then those causal variables couldn’t lead to X or Z, in which case, ontologically, Y would be 100% and the others 0%.

Since we can’t know those variables, the probability is epistemic only. We don’t know where the particle will end up; we can only assess a probability of where it will end up. But that is only due to our lack of *knowledge* of the variables. Otherwise, asserting that such an event is causal and has ontic probability is no different than asserting a self-contradiction (e.g. invisible visible square circles).

***********

In conclusion, if a quantum event is caused, it’s logically impossible for it to have an ontological probability of two or more different directions without holding contradictory variables. If it is acausal, then that has other problems in that the particle itself can’t cause an acausal event and such an event cannot contain a probability distribution other than *“at some point in some location”* or *“never”*. Ontic probability, like free will, is incompatible with the two possible ways events can happen.

**But what of a third “probabilistic” option between an event being caused and an event being acausal?**

Certain people will have you believe that probabilism is a third option. This, however, is not the case. In fact, causal events and acausal events are in opposition. If an event is caused, it is not uncaused. If an event is uncaused, it is not caused. There is no magical middle ground – this is a necessary dichotomy. To say that an event is probabilistic but not caused is to say it’s an uncaused probability (acausal) and to say that an event is probabilistic and *not uncaused* is to say it’s a caused probability. It’s a logical absurdity to say that something is probabilistic and neither caused nor uncaused.

**What does this have to do with the free will debate?**

Some people suggest that ontic probability provides a condition that leads to free will. They imagine that if a particle has a true probability for an event, our mind might work in the same fashion, where each option we have might be a viable option, and since the ontic probability is part of the particles in your brain, such can be considered “you” choosing that option. Of course this is in itself illogical, as if you make the decision of option A over B, you couldn’t have chosen B unless there were variables that led to B instead of A. And if we say there are no variables, we inject in an event without a cause. There is no way around this dichotomy. Ontic probability is just as logically incoherent as free will, no matter how many times we qualify it with the word “quantum”. Quantum mechanics does not sidestep logic as some would have you think; it actually depends on it.

But even if we accept the illogical magic that is ontic probability, such does not really help with free will. There would be no amount of willing that would drive a decision to a 25% probable outcome over a 75% one, or vice versa. Such would just be the same as throwing a weighted quantum die in which, somehow, the probability distribution is ontic rather than epistemic.

Illogical magic put aside however, there is NO ontic probability.

*Related Post: Existence Conflated with Knowledge and the Free Will Debate*

#### 'Trick Slattery


Hello Trick,

I followed your post above and I found it contradictory regarding the ontological behaviour of QM particles.

I argue that particles behave indeterministically (ontologically, so to speak) as the grounding understanding we now have, irrespective of QM interpretation.

You need to ask yourself first: who/what causes the particles, if you want determinism as your starting assumption.

But in order to make an argument about determinism and the basic principle of falsifiability in science, I propose to you a full step-by-step refutation of determinism in physics, as done by physicists below: “Farewell to determinism”

https://scientiasalon.wordpress.com/2014/09/11/farewell-to-determinism

Cheers

I address ontic probability as being problematic in both deterministic and indeterministic models. The fact of the matter is, we don’t know which quantum interpretation is actually the case. There are deterministic (albeit non-local) interpretations, indeterministic interpretations, and interpretations that are agnostic toward the question. The link you provide confuses the term “determinism” with epistemic models. This article clears up the confusion:

“Determinism” and “Indeterminism” for the Free Will Debate

To be more specific about your main chapter, “The Problem with Ontic Probability”: in physics we set up the framework a bit differently:

We all know that quantum mechanics is probabilistic, rather than deterministic. It describes physical systems using the wavefunction, which represents a probability amplitude for obtaining some result when measuring an observable. The evolution of the wavefunction has two parts — unitary and nonunitary — corresponding respectively to deterministic and nondeterministic. Therefore, if determinism is to be true in Nature, we have to assume that quantum mechanics is not a fundamental theory, but rather that there is some more fundamental deterministic theory which describes processes in nature, and that quantum mechanics is just a statistical approximation of that fundamental theory. Thus the concept of “state” described in the previous paragraph(https://scientiasalon.wordpress.com/2014/09/11/farewell-to-determinism/) is defined in terms of that more fundamental theory, and the wavefunction can be extracted from it by averaging the state over the hidden variables. Consequently, in this setup the “state” is more general than the wavefunction. This is also illuminated by the fact that in principle one cannot simultaneously measure both the position and the momentum of a particle, while in the definition above I have not assumed any such restriction for our alleged fundamental deterministic theory.

The mistake is in thinking these are mutually exclusive.

Which I’m saying is epistemic probability amplitude, not ontic.

Pilot wave theory is a deterministic interpretation of QM. We just have to assume contextual non-local variables. Regardless, both determinism and indeterminism are incompatible with ontic probability for the reasons I provided.

Later.

The uncertainty principle is treated differently depending on the quantum interpretation.

Hello again,

Now back to the “free will” position: I argue that you set up a false dichotomy.

Having to choose between determinism and randomness is a false dichotomy:

https://www.youtube.com/watch?v=JwaDYS5XC9Y

Cheers,

It’s not a false dichotomy. If an event has variables that force it, it does, if it doesn’t have variables that force it, it doesn’t. There is no third option between variables and no variables.

“Ontological probability”

From the wave function at some time, Schrödinger’s equation gives the wave function for all future time, and for all past time; in that sense quantum theory is as deterministic as classical physics. Quantum mechanics (the theory plus the experimental observations) has an intrinsic randomness, which arises with observation, something unexplained within the theory. The waviness in a region is the probability of finding the object in a particular place, but the waviness is not the probability of the object being in a particular place. The difference is that the object wasn’t there before you found it there. You could have chosen an interference experiment demonstrating it was spread out over a wide region.

So somehow, your looking CAUSED it to be in a particular place.

Although waviness is a probability, we must contrast quantum probability with classical probability.

Take this as an example:

Suppose you’re at a carnival, where a fast-talking man with faster hands operates a shell game. He places a pea under one of two inverted shells. After his rapid shuffling, your eyes lose track of which shell holds the pea. There is an equal probability that it is in either of the 2 places. The operator takes some bets and lifts the shell on the right; suppose you see the pea. Instantaneously it becomes a certainty that the pea was under the right-hand shell. The probability collapses to zero for the left-hand shell. Even if you move the left shell a thousand miles away, the probability is still 0 that it was in that shell. There is a huge difference between this classical probability and the quantum probability represented by waviness. Classical probability is a statement of one’s own knowledge. The shell operator had a 100 percent certain probability and the bettor had a 50 percent certain probability.
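The classical shell-game probability above can be sketched as a simple belief update (the numbers and names here are purely illustrative): nothing physical happens to the far-away shell, only each person’s information changes.

```python
# Shell-game probabilities as statements of knowledge, not features of the pea.
bettor = {"left": 0.5, "right": 0.5}    # eyes lost track: 50/50 assessment
operator = {"left": 0.0, "right": 1.0}  # he knows where he put it: 100%

def observe_right_shell(beliefs, pea_seen):
    """Conclusive observation: credences jump to certainty; the pea never moves."""
    if pea_seen:
        return {"left": 0.0, "right": 1.0}
    return {"left": 1.0, "right": 0.0}

# Lifting the right shell and seeing the pea "collapses" the bettor's 50/50
# to certainty, even if the left shell is a thousand miles away.
bettor = observe_right_shell(bettor, pea_seen=True)
print(bettor)   # {'left': 0.0, 'right': 1.0}
```

Note that the operator and the bettor assign different probabilities to the very same pea, which is only possible if those probabilities describe knowledge rather than the pea itself.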

In quantum probability, the wave function is the WHOLE story. The wave function of the atom is the atom. Observing an atom at a particular place CREATED its being there. If someone looked at a spot and saw the atom there, the atom “collapsed” and would be at that spot for everyone. If someone was at a different spot, he would not find the atom at the different spot, but the waviness of the atom existed at the different spot immediately before the observer collapsed it. Quantum theory insists this is so because an interference experiment could have established that the waviness of the atom existed there.

Just to be clear, I have no problem with the particle behaving like a wave, only with the idea of ontological probability for such a wave (as such fails non-contradiction). In other words, I think the wave exists; the probability, however, does not. If the particle collapses on a specific section of the target, it did not have an actual probability that it could have for another part. Quantum theory does not insist on ontic probability for the wave function, only epistemic.

“If the particle collapses on a specific section of the target, it did not have an actual probability that it could have for another part.” It collapses ONLY because of interference/interaction with an observer, like a photon, as per the double slit experiment. The particle was in a probabilistic wave state all over beforehand:

https://www.youtube.com/watch?v=fwXQjRBLwsQ

I have explained how the probability works in the example I gave you, more specifically here:

In quantum PROBABILITY, the wave function is the WHOLE story. The wave function of the atom is the ATOM. Observing an atom (by interfering with a photon) at a particular place CREATED its being there. If someone looked at a spot and saw the atom there, the atom “collapsed” and would be at that spot for everyone. If someone was at a different spot, he would not find the atom at the different spot, but the WAVINESS of the atom existed at the different spot immediately before the observer collapsed it. Quantum theory insists this is so because an interference experiment could have established that the waviness of the atom existed there.

So what I intend to highlight in my example is that BEFORE it collapses it is not in a definite position/determined location (as per the Heisenberg inequalities), aka it is a WAVE PROBABILITY. The double slit experiment that you also referred to is another QM experiment from which we observed the probabilistic nature/ontological behavior of subatomic particles like electrons/photons.

What we observed, aka the epistemic, is what we use to reason deductively about the ontology of QM particles, aka probability.

I hope I cleared it up; thanks for letting me post.

The reason it collapses is irrelevant to the understanding of why the probability is not ontological.

None of this suggests that there is ontic probability. Assessing a wave does not assess an ontic probability of any part of that wave or of any part of where the wave will collapse to. Rather, each part of the wave (the whole story) must be where it is, and the collapse does not have a 20% ontic chance to hit the place it did, only a 20% epistemic chance given that we can only assess the wave function. None of this am I disagreeing with, and again, none of this implies ontic probability.

The probabilistic nature is epistemic, not ontological.

What we observed, aka the epistemic, is what we use to reason deductively about the ontology of QM particles, aka probability.

There is no reason to do this in QM, and to do so creates a contradiction. It’s important not to conflate epistemic assessments with ontic assessments, especially when an ontic assessment is logically incoherent.

Regarding your philosophy of “cause and effect” epistemology as the only option:

“Cause and effect” remains part of modern parlance in philosophy of science. Sure, logic is a great guide, but there are phenomena and systems to which, I urge, it just isn’t applicable. I’d say any system which is quantitatively described as a set of coupled differential equations has escaped the realm where “cause and effect” is applicable. Which phenomena are the causes? Which the effects? Feedbacks break that idea.

Things get a LOT simpler if Nature’s probabilistic roots are embraced. I tend to use “probabilistic” since “non-determinism” is ambiguous. It could mean “probabilistic” or it could mean that, at each collapse of an N-way quantum state, the Universe forks into branches with one branch following each of the possibilities. I say the latter speculation is, in Peter Woit’s terms, “not even wrong”, because it cannot be falsified.

As soon as we remove identity all knowledge breaks down, including any scientific assessments of anything. Science itself is based on deduction, induction, falsification, and so on…all differing parts of logic. And they all have the base understanding of the law of identity and non-contradiction – as without such we cannot identify a single thing. No, there is no special “phenomena” that allows for self-contradiction.

First, I don’t know why you think these coupled differential equations have anything to do with ontic events having ontic probability, and second, if an ontic event is outside of “cause and effect” it’s acausal (has no cause). My article addresses both possibilities.

You can use the term “probabilistic” but understand that you are making an epistemic claim with such a word, not an ontic one.

Even if the world forked into many branches per a many world interpretation of QM, each would decohere into the only universe they could. The ontic probability would be identical for each branch from a superpositioned state (100%). A many worlds interpretation is actually deterministic.

“First, I don’t know why you think this coupled differential equations have anything to do with ontic events having ontic probability”

I will give you an example to illustrate what I meant:

Looking at the instantaneous change in, say, the number of prey left, what causes that? An increase in the number of predators? But, if the number of prey gets too low, some of the predators will starve, so there won’t be as many prey eaten. So what causes an instantaneous change in the number of prey then?

http://www.tiem.utk.edu/~gross/bioed/bealsmodules/predator-prey.html

http://en.wikipedia.org/wiki/Lotka%E2%80%93Volterra_equation

http://www.nature.com/scitable/knowledge/library/dynamics-of-predation-13229468

http://www2.hawaii.edu/~taylor/z652/PredatorPreyModels.pdf
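The feedback point can be made concrete with a minimal numerical integration of the Lotka–Volterra equations linked above. This is a rough Euler sketch with arbitrary parameter values, chosen only for illustration: causes and effects are entangled in a loop, yet the system is fully deterministic, since the same state always yields the same trajectory.

```python
# Minimal Euler integration of the Lotka-Volterra predator-prey model.
# Parameter values (a, b, c, d) are arbitrary, for illustration only.
def step(prey, predators, dt=0.001, a=1.0, b=0.1, c=1.5, d=0.075):
    """One deterministic time step of the coupled equations:
       dPrey/dt = a*Prey - b*Prey*Pred,  dPred/dt = -c*Pred + d*Prey*Pred."""
    new_prey = prey + (a * prey - b * prey * predators) * dt
    new_pred = predators + (-c * predators + d * prey * predators) * dt
    return new_prey, new_pred

def simulate(prey, predators, steps=10_000):
    for _ in range(steps):
        prey, predators = step(prey, predators)
    return prey, predators

# Feedback: prey numbers drive predator numbers and vice versa, so "which
# is the cause and which the effect?" has no clean answer at any instant.
# Determinism is untouched, though: identical initial states always give
# identical outcomes.
assert simulate(10.0, 5.0) == simulate(10.0, 5.0)
```

So the difficulty of labeling one variable “cause” and the other “effect” is a feature of the description, not an escape from causal (deterministic) evolution.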

Claiming that the epistemic description of phenomena IS probabilistic BUT the ontological is not is an obfuscating move, a philosophical bias, to save determinism from rebuttal.

You cannot ground anything ontological in determinism, since from epistemology you get probabilistic reasoning.

If you like we can discuss the MWI and the Bohmian interpretation, if you want to grant determinism ontology in these two.

I have no clue what this has to do with ontic probability. Evolution is a causal process, and there is no ontic probability distribution.

Again, even if the universe is indeterministic (which is in no way known by you or any physicist alive), such is equally as problematic for ontic probability.

Probability and determinism aren’t mutually exclusive. In fact, I’d argue causality is needed for a probability assessment (due to the problems with acausality and probability). It’s just that probability assessments (just like dice rolls) are epistemic.

We don’t need to, as ontic probability is equally incompatible with indeterministic interpretations. Again, I’m agnostic on whether the universe is deterministic or indeterministic, because we just don’t know (even if I suspect determinism to be the case). And either way, ontic probability is incoherent and needs to be abandoned, just as locality is abandoned based on Bell’s theorem. In fact, the incoherency of ontic probability is an even stronger case as Bell’s theorem injects some points in that can be contended if we cared to.

Thank you for reply, at least we have a start from epistemology onwards:

So I will take your definition of your ontology and then start from epistemology to get you there:

As you defined it:

Ontic or ontological means that the probability really exists. For those unfamiliar with these words, just think ontic (or the ontological) addresses what “exists” or “is”

And epistemic (or the epistemological) addresses what we can or cannot “know”.

Ontology is the study of existence, being, what is “real”, etc.

But what do “exist” or “is” mean in QM/physics? How can we get from what we know about QM particles to a study of what a particle “is” and how it behaves? By using experiments and formalism/logic/mathematical description, aka what we know about it, in lay terms.

More specific about ontological assessment: Harald Atmanspacher writes extensively about the philosophy of science, especially as it relates to Chaos theory, determinism, causation, and stochasticity. He explains that “ONTIC STATES DESCRIBE ALL PROPERTIES OF A PHYSICAL SYSTEM EXHAUSTIVELY. (‘Exhaustive’ in this context means that an ontic state is ‘precisely the way it is,’ without any reference to epistemic knowledge or ignorance.)”

So what do we know in QM about the nature/behavior of reality, aka the ontology of its components, the “quanta”?

https://scientiasalon.wordpress.com/2014/09/11/farewell-to-determinism

It is a fantastic achievement of human knowledge when it becomes apparent that a set of experiments (the double-slit experiment, Bell inequalities, Heisenberg inequalities, the Cauchy problem, and chaos theory) can conclusively resolve an ONTOLOGICAL question: what IS a PARTICLE and HOW does IT BEHAVE.

And moreover, the resolution turns out to be in sharp contrast to the intuition of most people. Outside of superconspiracy theories and “brain in a vat”-like scenarios (which can be dismissed as cognitively unstable), experimental results tell us that the world around us is not deterministic. We know that a deterministic description of nature does not exist.

The analysis presented in the article above suggests that we have only two choices:

(1) accept that Nature is not deterministic,

or (2) accept superdeterminism and renounce all knowledge of physics

Lastly, it is a non sequitur and a false analogy to rely on classical logic to refute QM, since QUANTUM LOGIC is the paradigm that applies to reality: a set of rules for reasoning about propositions that takes the principles of quantum theory into account, not the Newtonian laws:

For example:

“Quantum logic has some properties that clearly distinguish it from classical logic, most notably, the failure of the distributive law of propositional logic:” For a start:

http://en.wikipedia.org/wiki/Quantum_logic

I have another comment in moderation, I hope extra explanation helps.

Please join us on https://scientiasalon.wordpress.com/2014/09/11/farewell-to-determinism to discuss QM and its ontology in detail (since you mentioned the Bohmian interpretation, the pilot-wave theory).

Regards,

If he’s suggesting that an “ontic state is ontologically probabilistic”, then I disagree with him due to such being logically incoherent.

This doesn’t imply ontic probability.

Ontic probability is NOT a conclusion of these experiments and theories. You are confusing the distinction between being able to only assess a probability for an ontological event, and that event having ontic probability. These are not the same thing.

This is A) irrelevant, as ontic probability is equally problematic for indeterminism, and B) not true, as we don’t know whether a deterministic interpretation or an indeterministic interpretation of QM is in actuality the case. All we can assess is that if it is deterministic, it’s non-local determinism. The article mistakes what determinism and indeterminism are (for physics) with this definition:

“given the state of the Universe at some moment, one can calculate a unique state of the Universe at any other moment”. It’s not about being able to “calculate” such. Again, read here: http://breakingthefreewillillusion.com/determinism-indeterminism-confusions/

The fact that the article assesses determinism as being problematic via chaos theory, which is always an entirely deterministic model, leads me to believe the author doesn’t know what they are talking about. The others, such as the “inequalities”, are also not incompatible with determinism, only with local determinism.

Quantum logic is epistemic, not ontic… and it’s not something separate from “classical” logic in that it’s not a rejection of the law of identity or non-contradiction (it’s just not bivalent). The main distinction is that it’s multi-valued rather than bivalent, which only points to its epistemic (rather than ontic) nature. Most multi-valued logics are used due to epistemic uncertainty. None of this says anything about there being ontic probability, nor does it suggest that QM allows for a violation of non-contradiction (if it did, all theorems would collapse).
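For what it’s worth, the distributivity failure the Wikipedia article mentions can be checked concretely on the lattice of subspaces of a plane. Here is a minimal NumPy sketch (the three rays chosen are arbitrary illustrative examples):

```python
import numpy as np

def join_dim(u, v):
    """Dimension of the join (span of the union) of two subspaces,
    each given as a matrix whose columns span it."""
    return np.linalg.matrix_rank(np.hstack([u, v]))

def meet_dim(u, v):
    """Dimension of the meet (intersection) of two subspaces,
    via dim(U meet V) = dim(U) + dim(V) - dim(U join V)."""
    return (np.linalg.matrix_rank(u) + np.linalg.matrix_rank(v)
            - join_dim(u, v))

# Three rays in the plane (think: spin-z up, spin-z down, spin-x up).
p = np.array([[1.0], [0.0]])
q = np.array([[0.0], [1.0]])
r = np.array([[1.0], [1.0]]) / np.sqrt(2)

# Left side: p meet (q join r). Since q join r is the whole plane,
# this is p itself (dimension 1).
lhs_dim = meet_dim(p, np.hstack([q, r]))

# Right side: (p meet q) join (p meet r). Both meets are the zero
# subspace, so their join is also the zero subspace (dimension 0).
pq_dim = meet_dim(p, q)
pr_dim = meet_dim(p, r)
rhs_dim = 0 if (pq_dim == 0 and pr_dim == 0) else None

print(lhs_dim, rhs_dim)  # 1 0 -- the distributive law fails
```

The point is structural: the subspace lattice fails distributivity, but nothing in it violates identity or non-contradiction.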

Later.

I finally have a question for you, a sincere one. :)

How/what can get you to your ontology of QM if you claim that epistemology doesn’t lead to ontology at all?

What magic method do you use, then, to have ontological access to QM if you refute all experiments and the probability of the wave function?

How does your “state” of a physical system need to be formulated more precisely, in your terms, please?

Because you claim that the wave function of a particle in superposition is probabilistic in an epistemic approach but not in an ontological approach? :)

I’m not saying that we don’t require an epistemological standard (such as logic) to assess an ontological understanding, only that probability (if one is not illogical) is epistemic, meaning it’s only assessed because we don’t have access to or don’t know all of the variables. This is not just about a physical system; this applies to any system – as logical coherency is important for all states.

The probabilities assessed within a wave function are epistemic probabilities. This does not mean that the wave itself is epistemic; the wave is an ontic behavior. It’s only any notion of probability for such that needs to be epistemic.

Perhaps it’s better stated as

“ontic probability is illogical/contradictory” and “quantum mechanics is not outside of logic/the law of non-contradiction”. If one wants to imagine some area that can be outside of identity and non-contradiction, that is what would be “magic”, as there is no other word for it. 😉

I thought I understood your article, but Sile’s comments have left me confused. Flipping a coin produces an outcome with 100% probability. To aid our epistemic understanding, we have to use an abstraction: before an unbiased coin is flipped, it consists of two states (heads, tails), each having a 50% probability. The act of flipping the coin is effectively a mathematical transform from the abstract dual-state domain to the real single-state outcome domain. Of course, the transform itself is unpredictable, and it doesn’t matter whether it’s actually deterministic or indeterministic, provided that it’s stochastic.

What the heck is the coin’s ontic probability? To my mind, from an ontic perspective the coin is an item of currency, which is a domain that is totally incompatible with both probability and the heads/tails outcome after flipping it. In other words, it would be absurd to devise an ontic mathematical transform from the currency domain to the heads-tails outcome domain. Domain errors are a subset of category mistakes. I’m fairly sure that this mistake is committed in some of the QM interpretations — hence bizarre notions such as the many-worlds interpretation and “spooky action at a distance”. Bizarre notions are avoided by using Hilbert space to analyse aspects of QM, because this analysis domain is better suited to the subject than the space-time domain.

Similarly, we can analyse signals in the time domain, the frequency domain, or various other domains, depending on which properties of the signal or system we wish to evaluate and discuss. I might describe the source that generates the signal as being ontic, whereas the time domain and frequency domain representations of the signal are epistemic.
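The signal analogy above can be sketched in a few lines of NumPy (the 50 Hz test tone is an arbitrary illustrative choice): one source signal, two different representations of it, neither of which is the source itself.

```python
import numpy as np

# One 'ontic' source: a 50 Hz sine sampled for one second at 1 kHz.
fs = 1000                      # sampling rate, Hz
t = np.arange(fs) / fs         # one second of time samples
signal = np.sin(2 * np.pi * 50 * t)

# Two epistemic representations of the same signal:
time_domain = signal                         # amplitude vs. time
freq_domain = np.abs(np.fft.rfft(signal))    # magnitude vs. frequency
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The frequency-domain view makes the 50 Hz component obvious,
# while the time-domain view makes the waveform's shape obvious.
peak_hz = freqs[np.argmax(freq_domain)]
print(peak_hz)  # 50.0
```

Which representation we use depends on which properties of the signal we want to evaluate; neither domain is any more "the signal" than the other.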

References

Demystifying the Delayed Choice Experiments, by Bram Gaasbeek. arXiv:1007.3977

An Epistemic Model of Quantum State with Ontic Probability Amplitude, by Arun Kumar Pati, Partha Ghose, and A. K. Rajagopal. arXiv:1401.4104v2

Excellent thoughts Pete. Thanks for the visit.

Well said. The problem is when people suggest that the time domain, frequency domain, and even location domain (when a probability is assessed for such) can be ontic.

> If we say that the particles could end up in location A with a 75% chance or location B with a 25% chance, what are we saying? Basically, we are saying that there is some sort of mechanism that is forcing more particles to A than to B.

No, there is no separate mechanism that is forcing more particles to A than to B. This is just a description of the underlying physics. It is this assumption of yours that is misleading your logic.

> If we are suggesting that there is no forcing factor here, we have a problem as to what accounts for the probability distribution.

QM describes precisely what accounts for the probability distribution.
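To illustrate that claim, here is a minimal sketch of the Born rule, with complex amplitudes invented purely to reproduce the 75%/25% example under discussion (nothing here settles whether those probabilities are ontic or epistemic):

```python
import numpy as np

# A toy two-outcome state |psi> = a|A> + b|B>, with amplitudes chosen
# (hypothetically) so that the Born rule yields 75% / 25%.
amplitudes = np.array([np.sqrt(3) / 2, 1j / 2])

# Born rule: P(outcome) = |amplitude|^2, after normalization.
norm = np.sum(np.abs(amplitudes) ** 2)
probs = np.abs(amplitudes) ** 2 / norm

print(probs)  # [0.75 0.25]
```

The formalism supplies the distribution; the dispute in this thread is over what that distribution is a distribution *of*.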

> The problem should be obvious, if there are causal variables that lead the result to Y, then those causal variables couldn’t lead to X or Z

Indeed, which is why we don’t believe that there are causal variables that lead the result to Y, X, or Z. Experiments confirming the violation of Bell’s inequality support the position that there are no such variables. It is your assertion that such variables do exist that is the root of the problem with your logic.

Another problem is your insistence that the relations between events are either causal or acausal. Events at the quantum level don’t fit those definitions.

It isn’t an assumption. It’s a logical assessment. If there isn’t something that causes the probability distribution, then you are addressing an event that has no spatial or temporal determinacy (an acausal event). Such an event would have no probability distribution; it could happen at any time in any location, or never happen at all. That is its only possible distribution.

If you think this then you don’t know QM. The reason for all of the different interpretations is that we don’t know what accounts for the distribution.

It only supports the position that variables cannot be local.

Only given your interpretation of QM. Most interpretations do fit those definitions. Even the Copenhagen interpretation that you like to point to doesn’t imply an objective collapse theory. Your suggesting such (which, if you didn’t know it, is what you are doing) is a metaphysical bias on your part that goes against logic.

Keep in mind that I’m only making a logical case here. If you want to pretend that events on the quantum scale are special “outside of logic” cases, even though every theory we have in QM (including Bell’s theorem) is based on mathematics and logical tautologies such as identity, then you are basically taking a self-defeating POV.

If you want to stay inside of logic then there is no logical middle ground between an event that is caused and one that is not caused, as those are in opposition. This very article explains the logical reasons why probability isn’t ontic… and that includes QM.

Thanks for the visit Neal. 😉

> If you want to stay inside of logic then there is no logical middle ground between an event that is caused and one that is not caused, as those are in opposition. This very article explains the logical reasons why probability isn’t ontic….and that includes QM.

This very article has the flaws which I pointed out.

> Even the Copenhagen interpretation that you like to point to doesn’t imply an objective collapse theory.

Indeed, I didn’t mean to imply it did. It doesn’t matter whether the indeterminism is introduced as an effect of measurement (Copenhagen) or randomly/spontaneously (objective collapse), since it is the observations that we make which we interpret to be events. By your definition, the result may be a (caused) superposition of two states. But the observed state is neither causal nor acausal, because it is not an event, it is an observation. It is usually the observations of the world that we interpret to be the events that we speak of. We are never aware of observing a superposition.

My very point is that the criticisms you “pointed out” were actually addressed in the very article you are criticizing. You can assert they are “flawed” but assertions are not arguments. You have not even touched on the arguments. You simply make assertions like “events at the quantum level don’t fit those definitions (caused/not caused)” when they are logical oppositions. It would be like saying bottles do not fit in the definitions of moon and not moon. Of course a bottle would fall under one of those (namely not-moon).

Let’s be clear what we mean by “observation” in QM… we only mean “measurement (through particle interaction)”. I think you are off on a different tangent here.

> You simply make assertions like “events at the quantum level don’t fit those definitions (caused/not caused)” when they are logical oppositions. It would be like saying bottles do not fit in the definitions of moon and not moon. Of course a bottle would fall under one of those (namely not-moon).

Is the math field of statistics causal or acausal? Are bottles causal or acausal? They have to be one or the other, since the two are logical opposites.

> Let’s be clear what we mean by “observation” in QM…we only mean “measurement (through particle interaction)”. I think you are off on a different tangent here.

What, in your view, would be an example of an “event” that is not a particle interaction?

Causal and acausal refer to events. Bottles and moons are objects. Are those things you addressed “events”? Is there any example of an “event” that takes place, other than your mysterious and magical quantum phenomenon, that doesn’t fall under either being caused or not being caused?

We can say a single event in mathematical computation is either a causal or an acausal event. We can say that a bottle falling off a table was either caused to fall or not caused to fall (fell acausally). But make no mistake about it, when we refer to events, either something causes the event, or it happens without a cause.

As a physicalist and materialist, I think all causal events are “particle interactions” that have spatial and temporal determinacy, and if there is an acausal event, it would be one that comes into existence without a cause and has no spatial or temporal determinacy. But no matter what event is postulated, it either happens causally or without a cause. This is the case even if we postulate some non-material event. Either it is an output of something else, in which case it’s causal, or it is not an output of something else, in which case it’s not causal.

You, on the other hand, want to say that the probability distribution that happens in QM is not causal (but not not-causal?). The above article explains the problem with an acausal probability distribution that is more than “happens at some point in time, in some location, or never”: one that, rather, has a 65% chance of landing in this block and a 35% chance of landing in that block, both being temporally and spatially located, and that comes from an existing particle (rather than something coming into existence acausally and affecting the trajectory).

By your terminology, I don’t believe particles *have* precise, definite spatial and temporal determinacy, but are better described by a distribution of “possible” locations (which, collectively, are a description of the particle itself).

If one must divide events into those that are *caused* and those that are *uncaused*, I would say that most particle interactions that result in observable phenomena could be classified as *caused* (in the sense of influenced). More fundamentally, the *caused* outcome of the interaction is typically a superposition of what you would perceive as two (or more) distinct and mutually exclusive events. But the observation (of one or the other alternative) would be a separate event.

This is where the problem comes in. If they have definite spatial and temporal determinacy, then the probability distribution is epistemic and not ontic. Those other places that the particle does not distribute to were never real ontological possibilities. They were only a limitation on our epistemic knowledge.

Then the “collapse” or “decoherence” of the superpositioned state is caused as well, and it could not collapse to otherwise than what causality dictates. Remember, a cause cannot both be the cause of B and not the cause of B (but of C instead of B) – as that is a self-contradiction.

In such a case the probability distribution of QM is entirely epistemic. It is due to our lack of understanding the causal factors that bring the particle to the 70% location over the 30% location. The “measurement” is just a part of the causality (our perceptual observations are irrelevant).

> This is where the problem comes in. If they have definite spatial and temporal determinacy, …

Which, as I said, I don’t believe they do…

> Then the “collapse” or “decoherence” of the superpositioned state is caused as well, and it could not collapse to otherwise than what causality dictates.

“Collapse” and “decoherence” are parts of different theories. In the Copenhagen interpretation, collapse is a process that is caused by observation, but the result is not determinate; there is no “causality” that “dictates” the outcome. In the many worlds interpretation, decoherence is a description of the evolution of distinct “worlds” (possible futures) that evolve by becoming entangled with the environment (including the observer). In either case, neither observable outcome of an event is inevitable.

> Remember, a cause cannot both be the cause of B and not the cause of B (but of C instead of B) – as that is a self-contradiction.

Right… we only observe one future in which the event causes one or the other outcome (nondeterministically). Never both.

Oh sorry, my mistake. In that case we run into the other problem of what pushes the event to one location/time over another if there is no determinacy there. So, for example, if you have an event with a 50% probability of ending up at location A, a 30% probability for location B, and a 20% probability for location C… if nothing causes one over the other, how can there be a larger chance for location A? If you are saying that it’s part of the existing particle’s wave function, then you are implying the determinacy of the wave itself. If you are saying there are no variables, then you are implying something outside of the determinacy of the wave that makes it slip into one over the other (but that implies determinacy).

I know that, this is why I said collapse OR decoherence…meaning depending on the interpretation. Pilot wave theory is also a decoherence model.

You are using “cause” and “nondeterminism” in the same sentence. An event doesn’t “cause nondeterministically”. And the contradiction has nothing to do with our observations. A cause cannot have the variables that lead to B and those same variables that do not lead to B…as that implies the cause contains contradictory variables. If the cause has the variables that lead to B, you would need an acausal event to push it so B doesn’t happen (e.g. to C instead). If there are no causal variables for either B or C, then an acausal event would need to push the state to one over the other. The problem comes when thinking that B can have a probability of say 75% while C a probability of 25% based on an acausal event that has no spatial or temporal determinacy to weight such.

> If you are saying that it’s part of the existing particles wave function, then you are implying the determinacy of the wave itself.

Sure, but that is not what we observe. We observe Y (or Z, nondeterministically), not the (deterministic) wave function that describes the superposition. And we interpret Y (or Z) to be the result of the event.

> A cause cannot have the variables that lead to B and those same variables that do not lead to B…as that implies the cause contains contradictory variables.

Right, which is why we do not believe that there exist variables that “push the result” to one or the other.

> The problem comes when thinking that B can have a probability of say 75% while C a probability of 25% based on an acausal event that has no spatial or temporal determinacy to weight such.

The equations of QM do describe the “weights” associated with various (superposed) outcomes of a causal event.

The only thing we “observe” is a probability distribution on the particle screen. I don’t know why you keep bringing that up. It has nothing to do with ontic probability, only epistemic probability.

Right, which is why you have a problem when it comes to probability distribution without any temporal or spatial determinacy.

“Weight” is metaphorical. A probability distribution is said to be “weighted” toward a 75% over a 25% chance. It just addresses the distinction between higher and lower probabilities, which is entirely problematic when you are saying there are no “variables” that push one over the other.

The equations of QM simply show the distribution of the wave function. They say nothing about such being an ontic probability – in fact, I’d suggest they imply an epistemic probability, one that is actually dependent on how the wave “ends up” on screen interaction. But that is neither here nor there. The point is, not only do we not need to assess ontic probability in QM, it’s actually an illogical concept to suggest.