
  1. #1
    Senior Member reason's Avatar
    Join Date
    Apr 2007
    MBTI
    ESFJ
    Posts
    1,211

    Default Hypotheses and Probability

    For Evan,

    Note: c(e) refers to the logical content of e (not including tautologies).

    If p(h|e) is greater than p(h), then p(e|h) = 1.

    If p(e|h) = 1, then c(e) is a subset of c(h).

    If c(h1) = c(h) - c(e), then c(h1) + c(e) = c(h).

    Therefore, h = h1 + e.

    We can now replace any instance of h with h1 + e.

    If p(h1 + e|e) is greater than p(h1 + e), then p(e|h1 + e) = 1.

    However, if we break apart h1 + e, we can isolate h1: that part of h which is not equal to e. We can then ask whether h1 is supported by e:

    Is p(e|h1) = 1?

    Since h1 comprises c(h) minus c(e), the answer is no.

    In other words, the portion of h which goes beyond e is not made more probable by e. The apparent increase in the probability of the whole of h arises from treating hypotheses as indivisible elements -- a fallacy of composition. What probabilities you get, and what supports what, just depend on how you say it.
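    A tiny numeric sketch of the point (Python, with made-up numbers, and h1 assumed independent of e): conditioning on e raises the probability of the conjunction h1 + e, but leaves h1 itself untouched.

```python
# Sketch with assumed numbers: h = h1 & e, with h1 independent of e.
p_h1 = 0.3  # prior of h1, the part of h that goes beyond e (assumed)
p_e = 0.5   # prior of the evidence e (assumed)

p_h = p_h1 * p_e     # p(h) = p(h1 & e), by independence
p_h_given_e = p_h1   # p(h1 & e | e) = p(h1 | e) = p(h1)
p_h1_given_e = p_h1  # e says nothing about h1 itself

print(p_h, p_h_given_e)    # p(h|e) > p(h): the whole of h looks supported...
print(p_h1, p_h1_given_e)  # ...but p(h1|e) = p(h1): h1 gains nothing
```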

    Edit: I haven't actually tried writing this out in this form before, so I probably (heh) made some mistakes, knowing me.
    A criticism that can be brought against everything ought not to be brought against anything.

  2. #2
    FigerPuppet
    Guest

    Default

    Why didn't you just PM this?

  3. #3
    Senior Member reason's Avatar
    Join Date
    Apr 2007
    MBTI
    ESFJ
    Posts
    1,211

    Default

    Quote Originally Posted by SmileyMan View Post
    Why didn't you just PM this?
    Because I felt like it?
    A criticism that can be brought against everything ought not to be brought against anything.

  4. #4
    Occasional Member Evan's Avatar
    Join Date
    Nov 2007
    MBTI
    INFJ
    Enneagram
    1
    Posts
    4,223

    Default

    Quote Originally Posted by reason View Post
    For Evan,

    Note: c(e) refers to the logical content of e (not including tautologies).

    If p(h|e) is greater than p(h), then p(e|h) = 1.

    If p(e|h) = 1, then c(e) is a subset of c(h).

    If c(h1) = c(h) - c(e), then c(h1) + c(e) = c(h).

    Therefore, h = h1 + e.

    We can now replace any instance of h with h1 + e.

    If p(h1 + e|e) is greater than p(h1 + e), then p(e|h1 + e) = 1.

    However, if we break apart h1 + e, we can isolate h1: that part of h which is not equal to e. We can then ask whether h1 is supported by e:

    Is p(e|h1) = 1?

    Since h1 comprises c(h) minus c(e), the answer is no.

    In other words, the portion of h which goes beyond e is not made more probable by e. The apparent increase in the probability of the whole of h arises from treating hypotheses as indivisible elements -- a fallacy of composition.
    Ah, I see what you were trying to say yesterday. That's true.

    The cool part of Bayes' rule is when you have mutually exclusive hypotheses, because then their probabilities sum to 1. When evidence drives the probability of one of those hypotheses to zero, the probabilities of the others go up.

    Also, if you think about coin flips, you could have 3 hypotheses before you see an outcome: a head/head coin, a head/tail coin, and a tail/tail coin.

    You see a head. So tail/tail is out, and head/tail goes down compared to head/head because head/tail is less likely to produce heads.

    You see another head. Now the fair coin has only a 25% chance of having produced that sequence, while the head/head coin has 100%.

    So say you see 10 heads in a row and no tails. A fair coin would've had a 1/1024 chance of producing that sequence, whereas a heads/heads coin would have a 100% chance. Factor in how likely heads/heads coins are in general, and you have a sweet decision-making engine.
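    The 1/1024 figure above is just 0.5 raised to the tenth power; a one-line check (Python):

```python
# Likelihood of 10 heads in a row under each coin.
n = 10
p_fair = 0.5 ** n        # fair (head/tail) coin
p_two_headed = 1.0 ** n  # head/head coin: always heads

print(p_fair == 1 / 1024)  # True
print(p_two_headed)        # 1.0
```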

    Abstract from that, and you can at least produce a descriptive account of human reasoning.


    Edit: some math --

    h1 = heads/heads coin produced the sequence
    h2 = heads/tails coin produced the sequence

    p(h1 | d) / p(h2 | d) = (p(d | h1) * p(h1)) / (p(d | h2) * p(h2))

    If you assume that, before you see any evidence, a heads/heads coin is just less likely, you could start with something like this:
    p(h1) = .01
    p(h2) = .99

    So you see 5 heads in a row (the sequence HHHHH), and you're trying to figure out which of your two hypotheses is more likely given the assumptions.

    p(h1 | d) / p(h2 | d) = (1 * .01) / (.03125 * .99)
    = .323

    Say you see 10 heads in a row:

    p(h1 | d) / p(h2 | d) = (1 * .01) / (.0009765625 * .99)
    = 10.34

    At about 7 heads in a row, you should start favoring the heads/heads hypothesis. Before that, you should favor the fair-coin hypothesis. The further you go from the point where the ratio is 1, the more heavily you should weight the favored belief.
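    The odds ratio above is easy to script (Python, using the post's assumed priors of .01 and .99); it reproduces the .323 and 10.34 figures and confirms the crossover at 7 heads:

```python
# Posterior odds p(h1|d) / p(h2|d) after a run of n heads,
# with the priors assumed in the post.
p_h1 = 0.01  # prior: heads/heads coin
p_h2 = 0.99  # prior: fair (heads/tails) coin

def odds(n_heads):
    """Posterior odds in favor of the heads/heads coin."""
    return (1.0 ** n_heads * p_h1) / (0.5 ** n_heads * p_h2)

print(round(odds(5), 3))      # 0.323 -> still favor the fair coin
print(round(odds(10), 2))     # 10.34 -> now favor heads/heads
print(odds(6) < 1 < odds(7))  # True: the ratio first passes 1 at n = 7
```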



    Say you saw someone cough. You could think they had a random cold, or you could think they had some chronic, long-lasting cough. Since colds are more likely, you'd probably think they had a cold. But if you saw them cough every day for months, you'd probably reassess your original belief.

    This logic works for like everything, bro.

