
Falsification in Math vs Science


In the beginning, it was thought that the statement 1+1=0 was false, and necessarily so.
However, with the birth of modular arithmetic, it was found that 1+1 does indeed equal 0 (in the mod 2 setting).



Now in the sciences, for example physics, it is understood that Newtonian mechanics has in a sense been falsified: while it remains correct in its regime of validity, there are other regimes in which it is not correct.



However, while most people would say that in the second case Newtonian mechanics has been falsified, we do not consider 1+1=2 to have been falsified (in its own regime of validity). Why is that?



























  • 1+1=2 is still true in the mod 2 setting. It just so happens that 2 and 0 are names for the same object in this setting. – Dan Staley, Apr 9 at 20:14

  • Newtonian mechanics is not considered to be falsified by any working physicist that I know. Where did you get that? Or do you think QM or GR falsified it? Neither is true. It is indeed possible that GR is falsified in the future, like if there is a dilaton after all. And in a sense the original version has already been falsified with the addition of a cosmological term of the opposite sign to one that Einstein considered and rejected. QM definitely does not falsify Newtonian mechanics. QM is built on the same foundation, and for sure F=ma is still true in QM. – Kostas, Apr 10 at 9:03

  • @Kostas Relativity isn't accounted for in Newtonian physics, so at high velocities strictly Newtonian physics does not properly apply. Quantum mechanics also gives rise to phenomena that are impossible in purely Newtonian mechanics. Luboš Motl has a great explanation on Physics SE: physics.stackexchange.com/a/97749/127931 – JMac, Apr 11 at 12:48

  • @Kostas "Newtonian mechanics is not considered to be falsified by any working physicist that I know" - you don't know many; it's been falsified by quantum mechanics and by relativity, and probably by other theories. However, for most "everyday" physics that people want to deal with, it remains a sufficient approximation. – UKMonkey, Apr 11 at 15:43

  • And once again, an argument arises due to vague use of terminology. Unless you agree on what the verb "to falsify" means, you cannot agree on whether or not it applies. – barbecue, Apr 11 at 16:55















philosophy-of-science philosophy-of-mathematics falsifiability






asked Apr 9 at 16:31









K9Lucario







6 Answers
In math, we define stuff like numbers and operators, then we go on to prove other stuff from those premises.



When you ask: "Is 1 + 1 = 0?", a mathematician will just ask back: "With what definition of +?"



  • If you assume natural numbers and the common definition of +, then this statement is false.


  • If you assume numbers modulo 2 and + meaning XOR, then this statement is perfectly true.


You cannot say that we falsified the claim that 1 + 1 = 2; we just came up with new definitions for what + could mean.
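To make the two readings of "+" concrete, here is a minimal sketch in Python (my addition, not part of the original answer): ordinary integer addition and addition modulo 2 (which coincides with XOR on bits) are simply two different operations that happen to share a symbol.

```python
# Ordinary addition on the natural numbers: 1 + 1 is 2, never 0.
assert 1 + 1 == 2

def add_mod2(a: int, b: int) -> int:
    """Addition in the integers mod 2; on bits this coincides with XOR."""
    return (a + b) % 2

# The same written statement "1 + 1 = 0" is true once "+" names this operation.
assert add_mod2(1, 1) == 0
assert add_mod2(1, 1) == 1 ^ 1  # bitwise XOR gives the same result
```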




For physics, the situation is a little different: Here we measure stuff we want to explain, then we whip up some fancy theory to explain the measurements, and finally, we test the theory by measuring more stuff, trying to falsify our theory.



In the example you gave, Newton had some measurements (like falling apples) that he wanted to explain, so he came up with his theory of force, acceleration and movement, and he could explain his measurements. We continued testing his theory until finally the Michelson-Morley Experiment produced numbers that could not be explained with Newtonian Physics anymore. So, Einstein came up with a new theory that could explain all that the old theory could explain, and which also explained the result of the Michelson-Morley Experiment.



Note that I said "could explain all that the old theory could explain". Newton's theory worked fine for small numbers (velocities far below the speed of light and weak gravitational fields), and Einstein's theory had to make the same predictions for those small numbers. More precisely, Newton's theory is nothing more or less than a convenient approximation of Einstein's theory in that regime.
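One standard way to make "approximation for small numbers" explicit is the low-velocity expansion of the relativistic energy (a worked illustration added here, not part of the original answer): the Newtonian kinetic energy appears as the leading velocity-dependent term, and the corrections are negligible when v is much smaller than c.

```latex
% Low-velocity expansion of the relativistic energy (v << c):
E \;=\; \frac{m c^2}{\sqrt{1 - v^2/c^2}}
  \;=\; m c^2 \;+\; \tfrac{1}{2} m v^2 \;+\; \tfrac{3}{8}\,\frac{m v^4}{c^2} \;+\; \cdots
```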



We do this a lot in physics: We know that some stuff obeys some complex rules, but we don't want to bother with deriving mathematically correct results, so we just use an approximation (and hopefully check that this approximation is indeed not too far off). The point is, in the end physicists can only falsify stuff by measurement, which includes measurement errors, and it does not help our cause to calculate stuff we can't measure. But the approximations allow us to make conclusions we cannot derive with mathematical rigor.



So, a physicist with a quartz clock, a ruler, and a scale will simply use Newtonian Mechanics to predict their measurements. A physicist listening in to gravitational waves does not have this luxury; they must use General Relativity to derive their predictions. The first physicist simply uses an approximation (Newtonian Mechanics) to an approximation (Special Relativity) to a theory (General Relativity) for which we do not yet know what theory it approximates (String Theory? Loop Quantum Gravity? Something else?).



In this sense, no physical theory is fully correct or incorrect. For some theories we know where they start producing numbers that actually disagree with our experiments, and for others we may not have discovered those places yet. But that does not reduce their usefulness when we apply them to problems where we know that they yield sufficiently precise results. In the end, any physical theory is just an approximation of reality.






– cmaster, answered Apr 9 at 22:12 (edited Apr 10 at 8:29)

























  • Comments are not for extended discussion; this conversation has been moved to chat. – Geoffrey Thomas, Apr 12 at 7:15

  • Very nicely put. I've bookmarked this so I can refer others to it in future. – Neil_UK, Apr 12 at 10:49

  • For example we use Pi. We rarely use every digit of Pi. – candied_orange, Apr 12 at 12:12

  • Awesome! You should title your post something like "induction vs deduction". – Joshua Ronis, Apr 12 at 14:24

  • @candied_orange we never use every digit of Pi, and never will. :D – Cássio Renan, Apr 12 at 20:54

































The hypothesis 1+1=0 is false in the domain of natural numbers. If the domain is the finite field of the integers mod 2, then one is no longer in the domain of the natural numbers and the statement 1+1=0 would be true in that domain.



The question is: why do we not consider these to be falsifications of each other?



These are not contradictions or falsifications if we view these statements in their separate domains. The domain of the natural numbers is not the domain of the integers mod 2. Although the statements may look the same, they are statements derived or not from different domains and so are different.




Falsification enters mathematics when one assumes something with the intent of arriving at a contradiction. If one can arrive at the contradiction, then one can conclude that what one assumed to be true was false. One name for this inference rule is proof by contradiction.



Wikipedia describes it as follows:




In logic, proof by contradiction is a form of proof that establishes the truth or validity of a proposition by first assuming that the opposite proposition is true, and then shows that such an assumption leads to a contradiction. Proof by contradiction is also known as indirect proof, proof by assuming the opposite, and reductio ad impossibile.





Wikipedia contributors. (2019, March 26). Proof by contradiction. In Wikipedia, The Free Encyclopedia. Retrieved 16:55, April 9, 2019, from https://en.wikipedia.org/w/index.php?title=Proof_by_contradiction&oldid=889548940
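As a standard illustration of this inference pattern (added here for concreteness; it is not part of the original answer), consider the classical argument that the square root of 2 is irrational, sketched in LaTeX:

```latex
% Proof by contradiction (sketch): \sqrt{2} is irrational.
Assume, for contradiction, that $\sqrt{2} = p/q$ with integers $p, q$ sharing no common factor.
Then $p^2 = 2q^2$, so $p^2$ is even and hence $p$ is even; write $p = 2r$.
Substituting gives $4r^2 = 2q^2$, i.e.\ $q^2 = 2r^2$, so $q$ is even as well.
This contradicts the assumption that $p$ and $q$ share no common factor,
so no such $p/q$ exists and $\sqrt{2}$ is irrational.
```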


















































    1 + 1 = 0 is false.



    Meanwhile, (1_2) +_2 (1_2) = 0_2 is true. Here +_2 is a different operation than +, and 1_2 and 0_2 are different things than 1 and 0. So it's not surprising that one equation is true while the other is false.



    The problem is that we do not like to write "_2" everywhere, so we often write 1 + 1 = 0 when we mean 1_2 +_2 1_2 = 0_2. This can possibly lead to confusion, though hopefully the author (or context) will make it clear what is meant by the equation 1 + 1 = 0, whenever it is written, so that ambiguity is avoided.



    I would not say that Newtonian physics is "false", but I would say that it does not accurately predict certain observations we make about our universe, while Einstein's Relativity does seem to predict these observations quite well. So Newtonian physics is apparently not the best theory for our physical universe.



    However, since there are no other universes(?), physicists frequently omit the phrase "for our physical universe" for convenience.





























    • Hmmm. I appreciate the effort @Eric Towers. Yet, when I wondered how other universes could matter to me, I wasn't referring to "mechanisms of action," but rather about how another universe could have any impact on my experiences, my purposes, my plans or my actions. I deal with whatever gravity affects me in any case, and when I do need to figure it into my calculations (very rarely!) its origin isn't relevant. I hope that's clear enough... – Rortian, Apr 11 at 1:10

    • @Graipher, I understand your point, and I appreciate your participation. "Maybe how you commute to work in the year 3000 depends on FTL travel exploiting that." Who? Me???? I don't think so... doesn't that seem rather unlikely to you? I won't be looking forward to that! I'm concerned about more immediate and pressing matters than considering theoretical aspects of abstract descriptions of events. "fundamental research...might lead to something new." I get that. I'm simply more concerned with people than I am with physics. – Rortian, Apr 11 at 11:19

    • @Rortian Even as such, you ignore the possibility of the butterfly effect. It may be the case that due to some essentially untraceable chain of events linked to a slight perturbation in an alternate universe, your interactions with your friends are vastly different. I don't see your point. This sounds like the "I don't get it, so it doesn't matter" argument. – Don Thousand, Apr 12 at 4:38

    • @DonThousand I appreciate your thoughts on this. I'm not ignoring chaos and complexity, I'm dealing with it. I'm simply focussing on what seems to be much more important to me and the people around me, rather than spending my (precious) time (I'm much older than you!) considering the theoretical effects of other possible worlds on our current activities. I'm not a physicist and I don't aspire to be one. ... "It may be the case"... Do you believe that anything is possible? If so, then the notion of possibility would lose its meaning... – Rortian, Apr 12 at 9:52

    • @DonThousand I think I "get" what you've said; can you tell me why I ought to believe that it's more important to me than it is right now? I'm working to promote educational philosophy and educational psychology, critical thinking and ethics. Would you care to discuss any topic within those categories? I'd be happy to! – Rortian, Apr 12 at 9:53
































    Your example from mathematics shows that to assess a mathematical statement one should first fix the context, the domain of validity of the symbols: in the context of the natural numbers the statement 1+1=0 is false, while in the context of Z/2Z the statement is correct. Except for the rare case of undecidable questions, in mathematics one can prove or disprove the correctness of a statement.



    In natural science, e.g. in physics, one can never prove a general statement. All "laws of nature" have the status of hypotheses. One can confirm a hypothesis, or one can disprove it by a counterexample. But one cannot prove it.



    IMO Newton's laws are a good approximation in a broad domain, i.e. for low energies and everyday length scales. One can deduce Newton's laws as a first-order approximation from the field equations of General Relativity. But anyhow, Newton's basic concepts like space, time and gravitation have quite a different meaning in Einstein's theory of curved spacetime.
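    For readers who want to see the step being referenced, here is a compressed, standard-textbook sketch (added here; it is not part of the original answer) of the weak-field, slow-motion limit in which General Relativity reduces to Newtonian gravity:

```latex
% Weak-field, slow-motion limit: g_{00} \approx -(1 + 2\Phi/c^2),
% with |\Phi|/c^2 \ll 1 and v \ll c.
\begin{align*}
  \frac{d^2 x^i}{dt^2} &\approx -c^2\,\Gamma^{i}_{00}
                        \approx -\frac{\partial \Phi}{\partial x^i}
  && \text{(geodesic equation $\to$ Newton's second law)} \\
  G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
  \;&\longrightarrow\;
  \nabla^2 \Phi = 4\pi G \rho
  && \text{(field equations $\to$ Poisson's equation)}
\end{align*}
```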



    To say that a statement turns out to be a first approximation is a refined and better version than saying that the statement is false. I prefer the refined version :-)




















































      Hmmm. What about 1 + 1 = 10?



      Is that equation, expressed in binary arithmetic, "false in the domain of natural numbers"?
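      (A quick concrete check, added here and not part of the original answer: in base 2 the numeral "10" is just another name for the number two, so once the notation is fixed the equation is the familiar 1 + 1 = 2.)

```python
# In binary notation, the numeral "10" denotes the number two,
# so "1 + 1 = 10 (binary)" is just "1 + 1 = 2" in a different notation.
assert int("10", 2) == 2
assert 1 + 1 == 0b10
```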



      My grounding in math and logic isn't very strong, but I understand the Wikipedia entry...I just don't think that the notions of truth and falsity can coherently apply to inductive inferences (abstract descriptions of unobservable things).



      I've also heard people (philosophy professors!) say that Einstein disproved Newton's stuff, but that seems incorrect to me in the postmodern age of philosophy. Newton wasn't mistaken; his principles describe his observations very well. His theoretical model wasn't poor or wrong, and scientific proof seems to me to be an oxymoron. People who have faith in that idea reject the notion of fallibilism, which is commensurate with the postmodern approach:




      Fallibilism is the epistemological thesis that no belief (theory,
      view, thesis, and so on) can ever be rationally supported or justified
      in a conclusive way. Always, there remains a possible doubt as to the
      truth of the belief. Fallibilism applies that assessment even to
      science’s best-entrenched claims and to people’s best-loved
      commonsense views.




      Stephen Hetherington, Internet Encyclopedia of Philosophy



      I've learned (is this correct?) that true/false distinctions are properties of formal languages (arithmetic and logic), where axioms and operations are strictly defined and the symbols are unrelated to observable phenomena (only to each other).



      As for discourses in natural languages, the idea of epistemic truth (especially the theory of correspondence to reality) has been quite thoroughly discredited...or so I've been told...



      I'm new here; I wouldn't be happy to hear that I (and my wisest philosophy instructors) have misinterpreted Kant, Kuhn and Popper, but if a wise expert disagrees I'll listen. I believe that it's theoretically impossible to denote the absolute truth about unobservable phenomena or complex abstractions, but I'm still learning...



      In any case, to me coherency is the gold standard of human understanding, not truth. Subjective beliefs and public discourses may be assessed according to so-called objective criteria: the reliability of relevant evidence and the justification for one's presumptions. I think that that's one thing upon which scientists and philosophers might agree.




















































        To understand why this is so, you have to understand the distinction between "false" and "incorrect". Your question is caused by conflation of the two. I must admit that my usage of the two terms is not universal, but the distinction is common even when people use other terms for it (confusingly enough, textbooks on formal logic tend to use "false" for what I call "incorrect" here!).



        All branches of science make models, which are representations of reality. These models contain statements. A statement is correct or incorrect with regard to a model, and true or false with regard to reality. Correctness here means a good fit to the rest of the model (usually, being derivable within the model, since scientists are usually interested in models as means of deriving predictions), and trueness means being a good fit to reality.



        As a first simple example, take the model of classical conditioning, demonstrated in the famous experiment of Pavlov. The model makes the prediction that "the dog will salivate when the light goes on", and this statement is both correct within its model, and true in reality.



        Note that a statement can be correct without being true, and vice versa. Imagine a situation in which a person of average build is standing upright in the middle of a room. I walk up to the person and push them forcefully on the sternum. A physics model predicts that the person will topple over. A model of interpersonal psychology predicts that the person will remain upright, and will slap me. Each of these statements is correct within its own model, but at most one is true in a given situation. This is not just about the domain of the model - a second model of interpersonal psychology can predict that I will get shouted at, but not slapped. They are simply different models, producing different predictions (statements). And note that, while in most real-life situations the psychological models are more likely to be true, one could imagine situations where the physics model is true, for example performing in a theater.



        Sometimes people naively assume that science is (or should be) looking for the truest models, that is, the ones which fit reality the closest. This is not true (no pun intended). Scientists value parsimony, which means that they prefer the simplest model whose prediction is true enough for their intended use case. As a famous statistician quipped, all models are wrong, but some are more useful than others. This is why we are using widely both Newtonian physics and quantum mechanics - each of them is the preferred model for a given application.



        It is also entirely possible that a model which describes one situation in reality perfectly (it produces true statements in that situation) is not at all applicable to other situations. The inverse square law is a true model when you consider the intensity of light, and not a true model when you describe the speed of a ball rolling on a horizontal plane.



        Now on to your mathematics example. Mathematical models are very abstract. They can be used to represent some situation present in reality, or they can be used as mere mental constructs without any claims to be representative of a real situation. Let's first use your examples as representations of reality. In situation A, you get some sheep delivered and put them in a paddock. You want to predict the number of sheep you will have at the end. In that case, the model of addition of natural numbers will provide you with a true statement, while the second model, that of modulo arithmetic, will provide you with a false statement. Note that both statements are correct within their own models - an incorrect statement would be "1 + 1 = 1" in modulo 2 arithmetic. But one is true and one is false in your use case. If your neighbour is getting sheep delivered to two paddocks and has to ensure equal number of sheep between the two paddocks, then I also have a model which will predict how many sheep she has left over - here the modulo 2 arithmetic produces the true statement, and the addition over natural numbers produces a false one.



        But if there is a pure mathematician who has no aspiration of applying your mathematics to real situations, the question of "is it true or false" doesn't even arise. Since true/false is about fitting a model to reality, and she is not fitting your model to any reality, she cannot measure the statement's trueness anymore than she can measure its physical weight or its radioactivity - it just doesn't have this quality. All she can decide about the statement is whether it is correct within its model or not.



        If you think that there is just "truth", it is easy to see why you would compare the two situations you described, "1 + 1 = 2" being false* in modulo 2 arithmetic, and "each electron is in one place at a time" being false in quantum mechanics. But in fact, the two statements are not false, but incorrect in their respective models. Besides, the second one is false in the physical reality of the universe we live in. The first one is not traditionally attached to any specific use case in reality, so cannot be said to be true or false before you also provide such a use case.



        * OK, not strictly false, but my point still holds with examples that are strictly false.





























          Your Answer








          StackExchange.ready(function()
          var channelOptions =
          tags: "".split(" "),
          id: "265"
          ;
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function()
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled)
          StackExchange.using("snippets", function()
          createEditor();
          );

          else
          createEditor();

          );

          function createEditor()
          StackExchange.prepareEditor(
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: false,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: null,
          bindNavPrevention: true,
          postfix: "",
          imageUploader:
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          ,
          noCode: true, onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          );



          );













          draft saved

          draft discarded


















          StackExchange.ready(
          function ()
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fphilosophy.stackexchange.com%2fquestions%2f61712%2ffalsification-in-math-vs-science%23new-answer', 'question_page');

          );

          Post as a guest















          Required, but never shown

























          6 Answers
          6






          active

          oldest

          votes








          6 Answers
          6






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          58














          In math, we define stuff like numbers and operators, then we go on to prove other stuff from those premises.



          When you ask: "Is 1 + 1 = 0?", a mathematician will just ask back: "With what definition of +?"



          • If you assume natural numbers and the common definition of +, then this statement is false.


          • If you assume numbers modulo 2 and + meaning XOR, then this statement is perfectly true.


          You cannot say that we falsified the claim that 1 + 1 = 2; we just came up with new definitions for what + could mean.




          For physics, the situation is a little different: Here we measure stuff we want to explain, then we whip up some fancy theory to explain the measurements, and finally, we test the theory by measuring more stuff, trying to falsify our theory.



          In the example you gave, Newton had some measurements (like falling apples) that he wanted to explain, so he came up with his theory of force, acceleration and movement, and he could explain his measurements. We continued testing his theory until finally the Michelson-Morley Experiment produced numbers that could not be explained with Newtonian Physics anymore. So, Einstein came up with a new theory that could explain all that the old theory could explain, and which also explained the result of the Michelson-Morley Experiment.



          Note that I said "could explain all that the old theory could explain". Newton's theory worked fine for small numbers, and Einstein's theory had to make the same predictions for these small numbers. More precisely, Newton's theory is nothing more or less than a convenient approximation of Einstein's theory for small numbers.



          We do this a lot in physics: We know that some stuff obeys some complex rules, but we don't want to bother with deriving mathematically correct results, so we just use an approximation (and hopefully check that this approximation is indeed not too far off). The point is, in the end physicists can only falsify stuff by measurement which includes measurement errors, and it does not help our cause to calculate stuff we can't measure. But the approximations allow us to make conclusions we cannot derive with mathematical rigor.



          So, a physicist with a quartz-clock, a ruler, and a scale will simply use Newtonian Mechanics to predict their measurements. A physicist listening in to gravitational waves does not have this luxury, they must use General Relativity to derive their predictions. The first physicist simply uses an approximation (Newtonian Mechanics) to an approximation (Special Relativity) to a theory (General Relativity) which we do not know what theory it approximates yet (String Theory? Loop Quantum Gravity? Something else?)



          In this sense, no physical theory is fully correct or incorrect. For some theories we know where they start producing numbers that actually disagree with our experiments, and for others we may not have discovered those places yet. But that does not reduce their usefulness when we apply them to problems where we know that they yield sufficiently precise results. In the end, any physical theory is just an approximation of reality.






          share|improve this answer

























          • Comments are not for extended discussion; this conversation has been moved to chat.

            – Geoffrey Thomas
            Apr 12 at 7:15











          • Very nicely put. I've bookmarked this so I can refer others to it in future.

            – Neil_UK
            Apr 12 at 10:49











          • For example we use Pi. We rarely use every digit of Pi.

            – candied_orange
            Apr 12 at 12:12











          • Awesome! You should title your post something like "induction vs deduction".

            – Joshua Ronis
            Apr 12 at 14:24






          • 1





            @candied_orange we never use every digit of Pi, and never will. :D

            – Cássio Renan
            Apr 12 at 20:54
















          58














          In math, we define stuff like numbers and operators, then we go on to prove other stuff from those premises.



          When you ask: "Is 1 + 1 = 0?", a mathematician will just ask back: "With what definition of +?"



          • If you assume natural numbers and the common definition of +, then this statement is false.


          • If you assume numbers modulo 2 and + meaning XOR, then this statement is perfectly true.


          You cannot say that we falsified the claim that 1 + 1 = 2; we just came up with new definitions for what + could mean.




          For physics, the situation is a little different: Here we measure stuff we want to explain, then we whip up some fancy theory to explain the measurements, and finally, we test the theory by measuring more stuff, trying to falsify our theory.



          In the example you gave, Newton had some measurements (like falling apples) that he wanted to explain, so he came up with his theory of force, acceleration and movement, and he could explain his measurements. We continued testing his theory until finally the Michelson-Morley Experiment produced numbers that could not be explained with Newtonian Physics anymore. So, Einstein came up with a new theory that could explain all that the old theory could explain, and which also explained the result of the Michelson-Morley Experiment.



          Note that I said "could explain all that the old theory could explain". Newton's theory worked fine for small numbers, and Einstein's theory had to make the same predictions for these small numbers. More precisely, Newton's theory is nothing more or less than a convenient approximation of Einstein's theory for small numbers.



          We do this a lot in physics: We know that some stuff obeys some complex rules, but we don't want to bother with deriving mathematically correct results, so we just use an approximation (and hopefully check that this approximation is indeed not too far off). The point is, in the end physicists can only falsify stuff by measurement which includes measurement errors, and it does not help our cause to calculate stuff we can't measure. But the approximations allow us to make conclusions we cannot derive with mathematical rigor.



          So, a physicist with a quartz-clock, a ruler, and a scale will simply use Newtonian Mechanics to predict their measurements. A physicist listening in to gravitational waves does not have this luxury, they must use General Relativity to derive their predictions. The first physicist simply uses an approximation (Newtonian Mechanics) to an approximation (Special Relativity) to a theory (General Relativity) which we do not know what theory it approximates yet (String Theory? Loop Quantum Gravity? Something else?)



          In this sense, no physical theory is fully correct or incorrect. For some theories we know where they start producing numbers that actually disagree with our experiments, and for others we may not have discovered those places yet. But that does not reduce their usefulness when we apply them to problems where we know that they yield sufficiently precise results. In the end, any physical theory is just an approximation of reality.






          share|improve this answer

























          • Comments are not for extended discussion; this conversation has been moved to chat.

            – Geoffrey Thomas
            Apr 12 at 7:15











          • Very nicely put. I've bookmarked this so I can refer others to it in future.

            – Neil_UK
            Apr 12 at 10:49











          • For example we use Pi. We rarely use every digit of Pi.

            – candied_orange
            Apr 12 at 12:12











          • Awesome! You should title your post something like "induction vs deduction".

            – Joshua Ronis
            Apr 12 at 14:24






          • 1





            @candied_orange we never use every digit of Pi, and never will. :D

            – Cássio Renan
            Apr 12 at 20:54














          58












          58








          58







          In math, we define stuff like numbers and operators, then we go on to prove other stuff from those premises.



          When you ask: "Is 1 + 1 = 0?", a mathematician will just ask back: "With what definition of +?"



          • If you assume natural numbers and the common definition of +, then this statement is false.


          • If you assume numbers modulo 2 and + meaning XOR, then this statement is perfectly true.


          You cannot say that we falsified the claim that 1 + 1 = 2; we just came up with new definitions for what + could mean.




          For physics, the situation is a little different: Here we measure stuff we want to explain, then we whip up some fancy theory to explain the measurements, and finally, we test the theory by measuring more stuff, trying to falsify our theory.



          In the example you gave, Newton had some measurements (like falling apples) that he wanted to explain, so he came up with his theory of force, acceleration and movement, and he could explain his measurements. We continued testing his theory until finally the Michelson-Morley Experiment produced numbers that could not be explained with Newtonian Physics anymore. So, Einstein came up with a new theory that could explain all that the old theory could explain, and which also explained the result of the Michelson-Morley Experiment.



          Note that I said "could explain all that the old theory could explain". Newton's theory worked fine for small numbers, and Einstein's theory had to make the same predictions for these small numbers. More precisely, Newton's theory is nothing more or less than a convenient approximation of Einstein's theory for small numbers.



          We do this a lot in physics: We know that some stuff obeys some complex rules, but we don't want to bother with deriving mathematically correct results, so we just use an approximation (and hopefully check that this approximation is indeed not too far off). The point is, in the end physicists can only falsify stuff by measurement which includes measurement errors, and it does not help our cause to calculate stuff we can't measure. But the approximations allow us to make conclusions we cannot derive with mathematical rigor.



          So, a physicist with a quartz-clock, a ruler, and a scale will simply use Newtonian Mechanics to predict their measurements. A physicist listening in to gravitational waves does not have this luxury, they must use General Relativity to derive their predictions. The first physicist simply uses an approximation (Newtonian Mechanics) to an approximation (Special Relativity) to a theory (General Relativity) which we do not know what theory it approximates yet (String Theory? Loop Quantum Gravity? Something else?)



          In this sense, no physical theory is fully correct or incorrect. For some theories we know where they start producing numbers that actually disagree with our experiments, and for others we may not have discovered those places yet. But that does not reduce their usefulness when we apply them to problems where we know that they yield sufficiently precise results. In the end, any physical theory is just an approximation of reality.






          share|improve this answer















          In math, we define stuff like numbers and operators, then we go on to prove other stuff from those premises.



          When you ask: "Is 1 + 1 = 0?", a mathematician will just ask back: "With what definition of +?"



          • If you assume natural numbers and the common definition of +, then this statement is false.


          • If you assume numbers modulo 2 and + meaning XOR, then this statement is perfectly true.


          You cannot say that we falsified the claim that 1 + 1 = 2; we just came up with new definitions for what + could mean.




          For physics, the situation is a little different: Here we measure stuff we want to explain, then we whip up some fancy theory to explain the measurements, and finally, we test the theory by measuring more stuff, trying to falsify our theory.



          In the example you gave, Newton had some measurements (like falling apples) that he wanted to explain, so he came up with his theory of force, acceleration and movement, and he could explain his measurements. We continued testing his theory until finally the Michelson-Morley Experiment produced numbers that could not be explained with Newtonian Physics anymore. So, Einstein came up with a new theory that could explain all that the old theory could explain, and which also explained the result of the Michelson-Morley Experiment.



          Note that I said "could explain all that the old theory could explain". Newton's theory worked fine for small numbers, and Einstein's theory had to make the same predictions for these small numbers. More precisely, Newton's theory is nothing more or less than a convenient approximation of Einstein's theory for small numbers.



          We do this a lot in physics: We know that some stuff obeys some complex rules, but we don't want to bother with deriving mathematically correct results, so we just use an approximation (and hopefully check that this approximation is indeed not too far off). The point is, in the end physicists can only falsify stuff by measurement which includes measurement errors, and it does not help our cause to calculate stuff we can't measure. But the approximations allow us to make conclusions we cannot derive with mathematical rigor.



          So, a physicist with a quartz-clock, a ruler, and a scale will simply use Newtonian Mechanics to predict their measurements. A physicist listening in to gravitational waves does not have this luxury, they must use General Relativity to derive their predictions. The first physicist simply uses an approximation (Newtonian Mechanics) to an approximation (Special Relativity) to a theory (General Relativity) which we do not know what theory it approximates yet (String Theory? Loop Quantum Gravity? Something else?)



          In this sense, no physical theory is fully correct or incorrect. For some theories we know where they start producing numbers that actually disagree with our experiments, and for others we may not have discovered those places yet. But that does not reduce their usefulness when we apply them to problems where we know that they yield sufficiently precise results. In the end, any physical theory is just an approximation of reality.







          share|improve this answer














          share|improve this answer



          share|improve this answer








          edited Apr 10 at 8:29

























          answered Apr 9 at 22:12









          cmastercmaster

          62625




          62625












          • Comments are not for extended discussion; this conversation has been moved to chat.

            – Geoffrey Thomas
            Apr 12 at 7:15











          • Very nicely put. I've bookmarked this so I can refer others to it in future.

            – Neil_UK
            Apr 12 at 10:49











          • For example we use Pi. We rarely use every digit of Pi.

            – candied_orange
            Apr 12 at 12:12











          • Awesome! You should title your post something like "induction vs deduction".

            – Joshua Ronis
            Apr 12 at 14:24






          • 1





            @candied_orange we never use every digit of Pi, and never will. :D

            – Cássio Renan
            Apr 12 at 20:54


















          • Comments are not for extended discussion; this conversation has been moved to chat.

            – Geoffrey Thomas
            Apr 12 at 7:15











          • Very nicely put. I've bookmarked this so I can refer others to it in future.

            – Neil_UK
            Apr 12 at 10:49











          • For example we use Pi. We rarely use every digit of Pi.

            – candied_orange
            Apr 12 at 12:12











          • Awesome! You should title your post something like "induction vs deduction".

            – Joshua Ronis
            Apr 12 at 14:24






          • 1





            @candied_orange we never use every digit of Pi, and never will. :D

            – Cássio Renan
            Apr 12 at 20:54

















          Comments are not for extended discussion; this conversation has been moved to chat.

          – Geoffrey Thomas
          Apr 12 at 7:15





          Comments are not for extended discussion; this conversation has been moved to chat.

          – Geoffrey Thomas
          Apr 12 at 7:15













          Very nicely put. I've bookmarked this so I can refer others to it in future.

          – Neil_UK
          Apr 12 at 10:49





          Very nicely put. I've bookmarked this so I can refer others to it in future.

          – Neil_UK
          Apr 12 at 10:49













          For example we use Pi. We rarely use every digit of Pi.

          – candied_orange
          Apr 12 at 12:12





          For example we use Pi. We rarely use every digit of Pi.

          – candied_orange
          Apr 12 at 12:12













          Awesome! You should title your post something like "induction vs deduction".

          – Joshua Ronis
          Apr 12 at 14:24





          Awesome! You should title your post something like "induction vs deduction".

          – Joshua Ronis
          Apr 12 at 14:24




          1




          1





          @candied_orange we never use every digit of Pi, and never will. :D

          – Cássio Renan
          Apr 12 at 20:54






          @candied_orange we never use every digit of Pi, and never will. :D

          – Cássio Renan
          Apr 12 at 20:54












          11














          The hypothesis 1+1=0 is false in the domain of natural numbers. If the domain is the finite field of the integers mod 2, then one is no longer in the domain of the natural numbers and the statement 1+1=0 would be true in that domain.



          The question is why do we not consider these to be falsifications of each other?



          These are not contradictions or falsifications if we view these statements in their separate domains. The domain of the natural numbers is not the domain of the integers mod 2. Although the statements may look the same, they are statements derived or not from different domains and so are different.




          Falsification would be involved in mathematics by assuming something with the intent to arrive at a contradiction. If one can arrive at the contradiction, then one can conclude that what one assumed true was false. One name for this inference rule would be proof by contradiction.



          Wikipedia describes it as follows:




          In logic, proof by contradiction is a form of proof that establishes the truth or validity of a proposition by first assuming that the opposite proposition is true, and then shows that such an assumption leads to a contradiction. Proof by contradiction is also known as indirect proof, proof by assuming the opposite, and reductio ad impossibile.





          Wikipedia contributors. (2019, March 26). Proof by contradiction. In Wikipedia, The Free Encyclopedia. Retrieved 16:55, April 9, 2019, from https://en.wikipedia.org/w/index.php?title=Proof_by_contradiction&oldid=889548940






– Frank Hubeny
answered Apr 9 at 17:04
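To make the domain point concrete, here is a minimal sketch (in Python, purely illustrative) of how the same string of symbols "1 + 1 = 0" receives different truth values once the intended domain and operation are fixed:

    def add_nat(a, b):
        # ordinary addition of natural numbers
        return a + b

    def add_mod2(a, b):
        # addition in the integers mod 2 (the finite field with two elements)
        return (a + b) % 2

    print(add_nat(1, 1) == 0)   # False: in the natural numbers, 1 + 1 = 2
    print(add_mod2(1, 1) == 0)  # True:  in Z/2Z, 1 + 1 = 0

Neither result falsifies the other; they are claims about two different structures that happen to share the same notation.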





















                  6














1 + 1 = 0 is false.

Meanwhile, (1_2) +_2 (1_2) = 0_2 is true. Here +_2 is a different operation from +, and 1_2 and 0_2 are different objects from 1 and 0. So it's not surprising that one equation is true while the other is false.

The problem is that we do not like to write "_2" everywhere, so we often write 1 + 1 = 0 when we mean 1_2 +_2 1_2 = 0_2. This can lead to confusion, though hopefully the author (or the context) makes it clear which equation is meant whenever 1 + 1 = 0 is written, so that ambiguity is avoided.

I would not say that Newtonian physics is "false", but I would say that it does not accurately predict certain observations we make about our universe, while Einstein's Relativity does seem to predict these observations quite well. So Newtonian physics is apparently not the best theory for our physical universe.

However, since there are no other universes(?), physicists frequently omit the phrase "for our physical universe" for convenience.






– mathmandan
answered Apr 9 at 20:41
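For readers who like to see the "_2" subscripts spelled out, here is a small illustrative sketch in Python (the class name Mod2 is invented for this example): 1_2 and 0_2 are modelled as different objects from the integers 1 and 0, and +_2 is literally a different operation, even though both are written with the symbol +.

    class Mod2:
        # an element of the integers mod 2
        def __init__(self, value):
            self.value = value % 2

        def __add__(self, other):
            # this is "+_2", not ordinary integer addition
            return Mod2(self.value + other.value)

        def __eq__(self, other):
            return isinstance(other, Mod2) and self.value == other.value

    one_2, zero_2 = Mod2(1), Mod2(0)

    print(1 + 1 == 0)               # False: ordinary integers
    print(one_2 + one_2 == zero_2)  # True: the mod-2 operation on mod-2 objects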























                  • Hmmm. I appreciate the effort @Eric Towers. Yet, when I wondered how other universes could matter to me, I wasn't referring to "mechanisms of action," but rather about how another universe could have any impact on my experiences, my purposes, my plans or my actions. I deal with whatever gravity affects me in any case, and when I do need to figure it into my calculations (very rarely!) its origin isn't relevant. I hope that's clear enough...

                    – Rortian
                    Apr 11 at 1:10











                  • @Graipher, I understand your point, and I appreciate your participation. "Maybe how you commute to work in the year 3000 depends on FTL travel exploiting that. " Who? Me???? I don't think so...doesn't that seem rather unlikely to you? I won't be looking forward to that! I'm concerned about more immediate and pressing matters than considering theoretical aspects of abstract descriptions of events. "fundamental research...might lead to something new." I get that. I'm simply more concerned with people than I am with physics.

                    – Rortian
                    Apr 11 at 11:19












• @Rortian Even as such, you ignore the possibility of the butterfly effect. It may be the case that due to some essentially untraceable chain of events linked to a slight perturbation in an alternate universe, your interactions with your friends are vastly different. I don't see your point. This sounds like the "I don't get it, so it doesn't matter" argument.

                    – Don Thousand
                    Apr 12 at 4:38











                  • @DonThousand I appreciate your thoughts on this. I’m not ignoring chaos and complexity, I’m dealing with it. I’m simply focussing on what seems to be much more important to me and the people around me, rather than spending my (precious) time (I’m much older than you!) considering the theoretical effects of other possible worlds on our current activities. I’m not a physicist and I don’t aspire to be one. … “It may be the case”…Do you believe that anything is possible? If so, then the notion of possibility would lose its meaning…

                    – Rortian
                    Apr 12 at 9:52












• @DonThousand I think I “get” what you’ve said; can you tell me why I ought to believe that it’s more important to me than it is right now? I’m working to promote educational philosophy and educational psychology, critical thinking and ethics. Would you care to discuss any topic within those categories? I'd be happy to!

                    – Rortian
                    Apr 12 at 9:53















                  4














Your example from mathematics shows that to assess a mathematical statement one should first fix the context, i.e. the domain of validity of the symbols: in the context of the natural numbers the statement 1+1=0 is false, while in the context of Z/2Z it is correct. Except for the rare case of undecidable questions, in mathematics one can prove or disprove a statement.

In natural science, e.g. in physics, one can never prove a general statement. All "laws of nature" have the status of hypotheses. One can confirm a hypothesis, or one can disprove it by a counterexample, but one cannot prove it.

IMO Newton’s laws are a good approximation in a broad domain, i.e. for low energies and moderate length scales. One can deduce Newton’s laws as a first-order approximation from the field equations of General Relativity. Even so, Newton’s basic concepts such as space, time and gravitation have quite a different meaning in Einstein’s theory of curved spacetime.

To say that a statement turns out to be a first approximation is a more refined and better account than saying the statement is false. I prefer the refined version :-)






– Jo Wehler
answered Apr 10 at 14:51, edited Apr 10 at 17:13
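Here is a rough numerical sketch in Python of the "first-order approximation" point, comparing the Newtonian kinetic energy (1/2)mv^2 with the relativistic (gamma - 1)mc^2 at a few arbitrarily chosen fractions of the speed of light:

    import math

    c = 299_792_458.0   # speed of light in m/s
    m = 1.0             # a 1 kg test mass

    def ke_newton(v):
        return 0.5 * m * v**2

    def ke_relativistic(v):
        gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
        return (gamma - 1.0) * m * c**2

    for v in (3.0e5, 3.0e6, 3.0e7):   # roughly 0.1%, 1% and 10% of c
        newton, einstein = ke_newton(v), ke_relativistic(v)
        print(f"v = {v:.0e} m/s   relative difference = {abs(einstein - newton) / einstein:.1e}")

The relative difference grows roughly like (3/4)(v/c)^2, so far below light speed Newton's formula agrees with the relativistic one to many decimal places, and the disagreement only becomes appreciable as v approaches c.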





















                          4














                          Hmmm. What about 1 + 1 = 10 ?



                          Is that equation, expressed in binary arithmetic, "false in the domain of natural numbers"?



                          My grounding in math and logic isn't very strong, but I understand the Wikipedia entry...I just don't think that the notions of truth and falsity can coherently apply to inductive inferences (abstract descriptions of unobservable things).



I've also heard people (philosophy professors!) say that Einstein disproved Newton's stuff, but that seems incorrect to me in the postmodern age of philosophy. Newton wasn't mistaken; his principles describe his observations very well. His theoretical model wasn't poor or wrong, and "scientific proof" seems to me to be an oxymoron. People who have faith in that idea reject the notion of fallibilism, which is in keeping with the postmodern approach:




                          Fallibilism is the epistemological thesis that no belief (theory,
                          view, thesis, and so on) can ever be rationally supported or justified
                          in a conclusive way. Always, there remains a possible doubt as to the
                          truth of the belief. Fallibilism applies that assessment even to
                          science’s best-entrenched claims and to people’s best-loved
                          commonsense views.




                          Stephen Hetherington, Internet Encyclopedia of Philosophy



I've learned (is this correct?) that true/false distinctions are properties of formal languages (arithmetic and logic), where axioms and operations are strictly defined and the symbols are unrelated to observable phenomena (only to each other).



                          As for discourses in natural languages, the idea of epistemic truth (especially the theory of correspondence to reality) has been quite thoroughly discredited...or so I've been told...



                          I'm new here; I wouldn't be happy to hear that I (and my wisest philosophy instructors) have misinterpreted Kant, Kuhn and Popper, but if a wise expert disagrees I'll listen. I believe that it's theoretically impossible to denote the absolute truth about unobservable phenomena or complex abstractions, but I'm still learning...



                          In any case, to me coherency is the gold standard of human understanding, not truth. Subjective beliefs and public discourses may be assessed according to so-called objective criteria: the reliability of relevant evidence and the justification for one's presumptions. I think that that's one thing upon which scientists and philosophers might agree.






– Rortian
answered Apr 9 at 18:19, edited Apr 10 at 19:18
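On the side question about 1 + 1 = 10: written in base 2, the numeral "10" denotes the number two, so that equation is the same true statement about the natural numbers as 1 + 1 = 2, merely in a different notation. A quick check in Python:

    print(int("10", 2))           # 2 -- the binary numeral "10" names the number two
    print(1 + 1 == int("10", 2))  # True
    print(bin(1 + 1))             # '0b10'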





















                                  2














To understand why this is so, you have to understand the distinction between "false" and "incorrect". Your question is caused by a conflation of the two. I must admit that my usage of the two terms is not universal, but the distinction is common even when people use other terms for it (confusingly enough, textbooks on formal logic tend to use "false" for what I call "incorrect" here!).



                                  All branches of science make models, which are representations of reality. These models contain statements. A statement is correct or incorrect with regard to a model, and true or false with regard to reality. Correctness here means a good fit to the rest of the model (usually, being derivable within the model, since scientists are usually interested in models as means of deriving predictions), and trueness means being a good fit to reality.



                                  As a first simple example, take the model of classical conditioning, demonstrated in the famous experiment of Pavlov. The model makes the prediction that "the dog will salivate when the light goes on", and this statement is both correct within its model, and true in reality.



                                  Note that a statement can be correct without being true, and vice versa. Imagine a situation in which a person of average build is standing upright in the middle of a room. I walk up to the person and push them forcefully on the sternum. A physics model predicts that the person will topple over. A model of interpersonal psychology predicts that the person will remain upright, and will slap me. Each of these statements is correct within its own model, but at most one is true in a given situation. This is not just about the domain of the model - a second model of interpersonal psychology can predict that I will get shouted at, but not slapped. They are simply different models, producing different predictions (statements). And note that, while in most real-life situations the psychological models are more likely to be true, one could imagine situations where the physics model is true, for example performing in a theater.



Sometimes people naively assume that science is (or should be) looking for the truest models, that is, the ones which fit reality the closest. This is not true (no pun intended). Scientists value parsimony, which means that they prefer the simplest model whose prediction is true enough for their intended use case. As a famous statistician quipped, all models are wrong, but some are more useful than others. This is why we widely use both Newtonian physics and quantum mechanics - each of them is the preferred model for a given application.



                                  It is also entirely possible that a model which describes one situation in reality perfectly (it produces true statements in that situation) is not at all applicable to other situations. The inverse square law is a true model when you consider the intensity of light, and not a true model when you describe the speed of a ball rolling on a horizontal plane.



                                  Now on to your mathematics example. Mathematical models are very abstract. They can be used to represent some situation present in reality, or they can be used as mere mental constructs without any claims to be representative of a real situation. Let's first use your examples as representations of reality. In situation A, you get some sheep delivered and put them in a paddock. You want to predict the number of sheep you will have at the end. In that case, the model of addition of natural numbers will provide you with a true statement, while the second model, that of modulo arithmetic, will provide you with a false statement. Note that both statements are correct within their own models - an incorrect statement would be "1 + 1 = 1" in modulo 2 arithmetic. But one is true and one is false in your use case. If your neighbour is getting sheep delivered to two paddocks and has to ensure equal number of sheep between the two paddocks, then I also have a model which will predict how many sheep she has left over - here the modulo 2 arithmetic produces the true statement, and the addition over natural numbers produces a false one.



But if there is a pure mathematician who has no aspiration of applying your mathematics to real situations, the question of "is it true or false" doesn't even arise. Since true/false is about fitting a model to reality, and she is not fitting your model to any reality, she cannot measure the statement's trueness any more than she can measure its physical weight or its radioactivity - it just doesn't have this quality. All she can decide about the statement is whether it is correct within its model or not.



                                  If you think that there is just "truth", it is easy to see why you would compare the two situations you described, "1 + 1 = 2" being false* in modulo 2 arithmetic, and "each electron is in one place at a time" being false in quantum mechanics. But in fact, the two statements are not false, but incorrect in their respective models. Besides, the second one is false in the physical reality of the universe we live in. The first one is not traditionally attached to any specific use case in reality, so cannot be said to be true or false before you also provide such a use case.



                                  * OK, not strictly false, but my point still holds with examples that are strictly false.
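The sheep example can be made concrete with a small sketch (Python; the two function names are invented stand-ins for the two models). Each computation is correct within its own model; which one is true depends on which question about the paddock is actually being asked:

    def total_sheep(first_delivery, second_delivery):
        # model 1: addition over the natural numbers ("how many sheep in total?")
        return first_delivery + second_delivery

    def leftover_after_pairing(first_delivery, second_delivery):
        # model 2: arithmetic mod 2 ("how many are left over after splitting into two equal paddocks?")
        return (first_delivery + second_delivery) % 2

    print(total_sheep(1, 1))             # 2 -- the true answer for counting the flock
    print(leftover_after_pairing(1, 1))  # 0 -- the true answer for the neighbour's leftover question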






                                      Two understand why this is so, you have to understand the distinction between "false" and "incorrect".nYour question is caused by conflation of the two. I must admit that my usage of the two terms is not universal, but the distinction is common even when people use other terms for it (confusingly enough, textbooks on formal logic tend to use "false" for what I call "incorrect" here!).



                                      All branches of science make models, which are representations of reality. These models contain statements. A statement is correct or incorrect with regard to a model, and true or false with regard to reality. Correctness here means a good fit to the rest of the model (usually, being derivable within the model, since scientists are usually interested in models as means of deriving predictions), and trueness means being a good fit to reality.



                                      As a first simple example, take the model of classical conditioning, demonstrated in the famous experiment of Pavlov. The model makes the prediction that "the dog will salivate when the light goes on", and this statement is both correct within its model, and true in reality.



                                      Note that a statement can be correct without being true, and vice versa. Imagine a situation in which a person of average build is standing upright in the middle of a room. I walk up to the person and push them forcefully on the sternum. A physics model predicts that the person will topple over. A model of interpersonal psychology predicts that the person will remain upright, and will slap me. Each of these statements is correct within its own model, but at most one is true in a given situation. This is not just about the domain of the model - a second model of interpersonal psychology can predict that I will get shouted at, but not slapped. They are simply different models, producing different predictions (statements). And note that, while in most real-life situations the psychological models are more likely to be true, one could imagine situations where the physics model is the true one, for example while performing in a theater.



                                      Sometimes people naively assume that science is (or should be) looking for the truest models, that is, the ones which fit reality most closely. This is not true (no pun intended). Scientists value parsimony, which means that they prefer the simplest model whose predictions are true enough for their intended use case. As the statistician George Box quipped, all models are wrong, but some are useful. This is why we make wide use of both Newtonian physics and quantum mechanics - each of them is the preferred model for a given application.



                                      It is also entirely possible that a model which describes one situation in reality perfectly (it produces true statements in that situation) is not at all applicable to other situations. The inverse square law is a true model of how the intensity of light falls off with distance from a point source, but not a true model of how the speed of a ball rolling on a horizontal plane decreases.
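                                      For concreteness, the light case is just the standard point-source formula (added here as illustration; the symbols P for radiated power and r for distance are mine, not the answer's):

    I(r) = \frac{P}{4 \pi r^{2}}

                                      Doubling the distance quarters the intensity. Nothing analogous governs the rolling ball, whose slowdown depends on friction rather than on any 1/r^2 relationship, so the same model is true of one situation and not of the other.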



                                      Now on to your mathematics example. Mathematical models are very abstract. They can be used to represent some situation present in reality, or they can be used as mere mental constructs without any claim to represent a real situation. Let's first use your examples as representations of reality. In situation A, you get some sheep delivered and put them in a paddock. You want to predict the number of sheep you will have at the end. In that case, the model of addition over the natural numbers will provide you with a true statement, while the second model, that of modulo arithmetic, will provide you with a false statement. Note that both statements are correct within their own models - an incorrect statement would be "1 + 1 = 1" in modulo 2 arithmetic. But one is true and one is false in your use case. If your neighbour is getting sheep delivered to two paddocks and has to ensure an equal number of sheep in each, then there is also a model which predicts how many sheep she has left over - here modulo 2 arithmetic produces the true statement, and addition over the natural numbers produces a false one.
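                                      A minimal sketch of the two models side by side (the function names and sheep counts are illustrative placeholders, not anything from the original question):

def total_sheep(first_delivery, second_delivery):
    # Model 1: addition over the natural numbers.
    # Answers "how many sheep end up in the single paddock?"
    return first_delivery + second_delivery

def leftover_sheep(first_delivery, second_delivery):
    # Model 2: arithmetic modulo 2.
    # Answers "how many sheep are left over after splitting the
    # delivery equally between two paddocks?"
    return (first_delivery + second_delivery) % 2

# Both results are correct within their respective models; which one is
# *true* depends on which question about reality you are asking.
print(total_sheep(1, 1))     # 2 - true for the single-paddock use case
print(leftover_sheep(1, 1))  # 0 - true for the two-paddock use case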



                                      But if there is a pure mathematician who has no aspiration of applying your mathematics to real situations, the question "is it true or false?" doesn't even arise. Since true/false is about fitting a model to reality, and she is not fitting your model to any reality, she cannot measure the statement's trueness any more than she can measure its physical weight or its radioactivity - it just doesn't have this quality. All she can decide about the statement is whether it is correct within its model or not.



                                      If you think that there is just "truth", it is easy to see why you would compare the two situations you described: "1 + 1 = 2" being false* in modulo 2 arithmetic, and "each electron is in one place at a time" being false in quantum mechanics. But in fact, the two statements are not false; they are incorrect within their respective models. In addition, the second one is also false in the physical reality of the universe we live in. The first one is not traditionally attached to any specific use case in reality, so it cannot be said to be true or false until you also provide such a use case.



                                      * OK, not strictly false, but my point still holds with examples that are strictly false.
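                                      To spell out what the starred caveat presumably refers to (this aside is mine, not the answer's): in arithmetic modulo 2,

    1 + 1 \equiv 0 \equiv 2 \pmod{2}

                                      so "1 + 1 = 2" is in fact still correct within that model, and a strictly incorrect statement there would be something like "1 + 1 = 1".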






                                      answered Apr 11 at 19:59









                                      rumtscho
