  1.    #201  
    Originally posted by Toby
    I'm more of a Jeopardy person.
    No, it doesn't. What on earth would make you think that knowing one of the wrong doors means anything? Giving you a _perceived_ 50/50 shot at it hasn't increased the original odds of you winning.
    Let's see the math. If they always show a wrong choice that you didn't pick, your odds have not improved at all, and there's no incentive to switch, and AAMOF, the odds of pursuing a strategy of always _not_ switching are identical.
    If there is only one winner, what makes you think it isn't behind the door you already picked?
    Hey it worked. I backed you into a corner. The answer is you always switch.

    What're the odds of initially picking the right door? One in three.
    What're the odds of picking the right door when you switch? One in two. Why? Because one incorrect door was shown and so you won't switch to that!
  2. #202  
    Originally posted by KRamsauer
    Now what if this disease kills people within 10 seconds of some sign (say a seizure), which I am currently experiencing, unless $Medication is injected, and the test to determine my allergy takes half an hour?
    OK, so you're only brain-damaged now.
    Presume there is $Medication2 which cures the 1% allergic to $Medication but kills the 99% who aren't.
    Why don't we presume that little fairies will fly down from the ceiling while we're at it.
    What do you do? Clearly you use statistics!
    Glad you're not a doctor.
    That is all I'm saying here. You're going to be wrong often, but you will save more people (like me!) than if you randomly administered the two (in a 50/50 ratio) or administered none whatsoever. You should use your knowledge of the population to cure me.
    And if a few people here and there die, no problem.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  3.    #203  
    Originally posted by Toby
    And if a few people here and there die, no problem.
    But many more would have died had you followed a blind strategy of using $MEDICINE2 as often as $MEDICINE1. BTW, my added details were put in to solidify the math behind the problem and to attach penalties to both Type I and Type II errors (H0: the new patient is a member of the 99%; H1: he's allergic to $MEDICINE).
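    A rough expected-outcome sketch of that comparison (the 1%/99% split is from the thread; the assumption that getting the wrong drug, or no drug at all, is always fatal is added only to make the arithmetic concrete):
[code]
# Sketch of the expected-deaths comparison. Assumes (not spelled out in the
# thread) that $MEDICINE kills only the 1% who are allergic, $MEDICINE2 kills
# only the other 99%, and the untreated disease is always fatal.
P_ALLERGIC = 0.01

def expected_deaths_per_100(p_give_medicine):
    """Expected deaths per 100 patients when each patient gets $MEDICINE with
    probability p_give_medicine and $MEDICINE2 otherwise (no test available)."""
    per_patient = (p_give_medicine * P_ALLERGIC
                   + (1 - p_give_medicine) * (1 - P_ALLERGIC))
    return 100 * per_patient

print(expected_deaths_per_100(1.0))  # always $MEDICINE:   1 death per 100
print(expected_deaths_per_100(0.5))  # 50/50 coin flip:   50 deaths per 100
print(expected_deaths_per_100(0.0))  # always $MEDICINE2: 99 deaths per 100
# Administering nothing loses all 100 patients, since the disease itself is fatal.
[/code]
    Under those assumptions, always giving $MEDICINE loses the fewest people, which is the point being argued here.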
  4. #204  
    Originally posted by KRamsauer
    Hey it worked. I backed you into a corner. The answer is you always switch.
    No.
    What're the odds of initially picking the right door? One in three.
    You're correct here.
    What're the odds of picking the right door when you switch? One in two. Why? Because one incorrect door was shown and so you won't switch to that!
    You're wrong here. Why? The odds of the door you stay with being the right door are also one in two now. If I flip three coins and the first two come up heads, there is no higher chance that the next flip will come up tails.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  5. #205  
    Originally posted by KRamsauer
    But many more would have died had you followed a blind strategy of using $MEDICINE2 as often as $MEDICINE1. BTW, my added details were put in to solidify the math behind the problem and to attach penalties to both Type I and Type II errors (H0: the new patient is a member of the 99%; H1: he's allergic to $MEDICINE).
    I suggested no such blind strategy, nor even the existence of $MEDICINE2. The point was that assuming that a particular person fits into a mold simply because a certain percentage of the population does is flawed. I don't know why I bothered though.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  6.    #206  
    Originally posted by Toby
    You're wrong here. Why? The odds of the door you stay with being the right door are also one in two now. If I flip three coins and the first two come up heads, there is no higher chance that the next flip will come up tails.
    That's not equivalent because the events in Let's Make a Deal are not independent. You know there's only one prize, and the prize is placed there before you choose your first door. So when you choose your first door there is a 1/3 chance of being right, which we agree on. Now, you also agree that there is no way of telling which door is the right one, so if you are set on a strategy of never switching, the odds of you winning are 1 in 3. But if you always switch you only lose when you pick the right one right off, or 1/3 of the time. So you win 2/3 of the time. Understand? I'd like to make sure you understand it because it is such a wonderful problem that when you really understand it, it's a great feeling.
    Last edited by KRamsauer; 10/17/2002 at 02:51 PM.
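    For anyone who would rather check the claim than keep arguing it, here is a minimal simulation sketch of the setup described above, assuming the prize is placed uniformly at random, the host always opens a losing door you didn't pick, and the switch is always offered:
[code]
import random

def play(switch, trials=100_000):
    """Simulate the standard setup: prize placed at random, host always
    reveals a losing, unpicked door, and the switch is always offered."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens some door that is neither the contestant's pick nor the prize.
        revealed = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != revealed)
        wins += (pick == prize)
    return wins / trials

print("stay:  ", play(switch=False))   # settles near 1/3
print("switch:", play(switch=True))    # settles near 2/3
[/code]
    Over a large number of trials the two strategies settle near 1/3 and 2/3, the split claimed in the post above.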
  7.    #207  
    Originally posted by Toby
    I suggested no such blind strategy, nor even the existence of $MEDICINE2. The point was that assuming that a particular person fits into a mold simply because a certain percentage of the population does is flawed. I don't know why I bothered though.
    Flawed in the sense that it isn't perfect? Yes. But that's like saying chemotherapy is flawed because it doesn't save everyone. You do the best with what you have.
  8. #208  
    Originally posted by KRamsauer
    That's not equivalent because the events in Let's Make a Deal are not independent. You know there's only one prize, and the prize is placed there before you choose your first door. So when you choose your first door there is a 1/3 chance of being right, which we agree on. Now, you also agree that there is no way of telling which door is the right one, so if you are set on a strategy of never switching, the odds of you winning are 1 in 3. However, if you always switch, you effectively choose between two (because the one revealed will never be chosen) and the winner is always in that set of two. Understand? I'd like to make sure you understand it because it is such a wonderful problem that when you really understand it, it's a great feeling.
    I bet it's going to be a terrible feeling when you realize that you're wrong. Such a scenario is probably where Sam Clemens's saying came from, no doubt. The non-independence doesn't only favor the switch.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  9.    #209  
    Originally posted by Toby
    I bet it's going to be a terrible feeling when you realize that you're wrong. Such a scenario is probably where Sam Clemens's saying came from, no doubt. The non-independence doesn't only favor the switch.
    Okay, here you go:
    You stand a 1/3 chance of getting it right on the first pick.
    So if you don't switch you have a 1/3 chance of getting it right.
    If you switch, the only time you're going to get it wrong is if you happen to pick the right one the first time. However, we already found that the odds of that are 1/3. Therefore you have a 2/3 chance of winning when you switch.
  10. #210  
    Originally posted by KRamsauer
    Okay, here you go:
    You stand a 1/3 chance of getting it right on the first pick.
    So if you don't switch you have a 1/3 chance of getting it right.
    If you switch, the only time you're going to get it wrong is if you happen to pick the right one the first time. However, we already found that the odds of that are 1/3. Therefore you have a 2/3 chance of winning when you switch.
    You don't understand a damned thing about odds obviously, and I can't believe I let myself waste this much time on it. Even _if_ you stood a higher chance of getting it right by switching (which you don't), the _best_ your odds would be is one in two. This sounds like something you read out of a Marilyn Vos Savant column, and were gullible enough to buy into.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  11.    #211  
    Originally posted by Toby
    You don't understand a damned thing about odds obviously, and I can't believe I let myself waste this much time on it. Even _if_ you stood a higher chance of getting it right by switching (which you don't), the _best_ your odds would be is one in two. This sounds like something you read out of a Marilyn Vos Savant column, and were gullible enough to buy into.
    Why would your odds be 1 in 2 (still better than 1/3)? They are actually 2 in 3 (because the only way you can lose is to pick the "right" door initially)! Call up your local university and ask to speak to a stat prof. They will mention that this is one thing they go over extensively in class.

    Do you understand why you can only lose switching by picking the "right" door initially? It's because having picked the right one originally, the "host" will reveal one of two remaining losers, and you will pick the loser when you switch. However if you pick a loser initially (2/3 chance) when the host reveals the other loser you will always pick the winner when you switch.

    Okay, here are a whole host of links describing why it's best to switch:
    http://cartalk.cars.com/Tools/monty.pl
    http://cartalk.cars.com/About/Monty/proof.html
    http://dir.yahoo.com/Science/Mathema..._Hall_Problem/
    http://homepage.ntlworld.com/mr.foo/monty.html
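    The "you only lose by switching if your first pick was right" argument above can also be checked by exhaustive enumeration rather than by appeal to the links (a sketch; the door labels are arbitrary, and when the first pick is the winner the host's choice between the two losing doors doesn't change the outcome):
[code]
# Fix the initial pick as door 0 and walk through the three equally likely
# prize locations. Switching loses only in the one case where door 0 was right.
for prize in range(3):
    pick = 0
    revealed = next(d for d in range(3) if d not in (pick, prize))  # host shows a loser
    switched = next(d for d in range(3) if d not in (pick, revealed))
    print(f"prize behind door {prize}: "
          f"stay {'wins' if pick == prize else 'loses'}, "
          f"switch {'wins' if switched == prize else 'loses'}")
# Staying wins in 1 of the 3 cases; switching wins in the other 2.
[/code]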
  12.    #212  
    Originally posted by Toby
    You don't understand a damned thing about odds obviously, and I can't believe I let myself waste this much time on it. Even _if_ you stood a higher chance of getting it right by switching (which you don't), the _best_ your odds would be is one in two. This sounds like something you read out of a Marilyn Vos Savant column, and were gullible enough to buy into.
    Not knowing who that character was, I went searching. I came across the following, linked to from a site that shows all examples of her being wrong:
    http://www.dartmouth.edu/~chance/cou...onty_Hall.html

    Some highlights (emphasis added):
    " Mr. Hall said he realized the contestants were wrong, because the odds on Door 1
    were still only 1 in 3 even after he opened another door. Since the only other place
    the car could be was behind Door 2, the odds on that door must now be 2 in 3. "

    " "So her answer's right: you should switch," Mr. Hall said, reaching the same
    conclusion as the tens of thousands of students who conducted similar
    experiments at Ms. vos Savant's suggestion. That conclusion was also reached
    eventually by many of her critics in academia, although most did not bother to
    write letters of retraction. Dr. Sachs, whose letter was published in her
    column, was one of the few with the grace to concede his mistake.

    "I wrote her another letter," Dr. Sachs said last week, "telling her that
    after removing my foot from my mouth I'm now eating humble pie. I vowed as
    penance to answer all the people who wrote to castigate me. It's been an intense
    professional embarrassment.""

    " The results contradict most people's intution that, when there are only two
    unopened doors left, the odds on each one must be 1/2. But the fact that Mr.
    Hall opens another door doesn't affect the odds on Door 1: You had a one-third
    chance of being right to begin with, and you still have a one-third chance after
    he opens, say, Door 3.
    You knew he was going to open another door and reveal a
    goat regardless of what was behind Door 1, so his action provides no new
    information about Door 1. Therefore, since the odds on Door 1 are still
    one-third, and the only other place the car could be is behind Door 2, the odds
    on Door 2 must now be two-thirds. "
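    The conditional-probability step in that last paragraph can also be written out as an explicit Bayes calculation (a sketch of the quoted reasoning, assuming you picked Door 1, the host then opened Door 3, and the host flips a coin whenever both unpicked doors are losers):
[code]
# Doors are numbered 1-3 as in the article. You pick Door 1; the host opens Door 3.
prior = {1: 1/3, 2: 1/3, 3: 1/3}      # P(car is behind door d)
opens_3 = {1: 1/2, 2: 1.0, 3: 0.0}    # P(host opens Door 3 | car behind door d)

evidence = sum(prior[d] * opens_3[d] for d in prior)
posterior = {d: prior[d] * opens_3[d] / evidence for d in prior}

print(posterior)   # Door 1: 1/3, Door 2: 2/3, Door 3: 0
[/code]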
  13. #213  
    Originally posted by KRamsauer
    Why would your odds be 1 in 2 (still better than 1/3)? They are actually 2 in 3 (because the only way you can lose is to pick the "right" door initially)! Call up your local university and ask to speak to a stat prof. They will mention that this is one thing they go over extensively in class.
    The only time that the odds are 2 out of 3 is if the host is required to offer the switch _all_the_time_ (which was not the case on LMAD) and has foreknowledge of the location of the prize. Again, you're letting raw statistics fool you into a conclusion not supported by reality. I don't need a stat prof. I used to watch the show. Again, the flaw is trying to impose statistics on reality. It just doesn't work that way.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  14. #214  
    Originally posted by KRamsauer
    Not knowing who that character was, I went searching. I came across the following, linked to from a site that shows all examples of her being wrong:
    http://www.dartmouth.edu/~chance/cou...onty_Hall.html

    And all of that is irrelevant because Monty was not required to offer a switch. To quote Monty himself, "If the host is required to open a door all the time and offer you a switch, then you should take the switch," he said. "But if he has the choice whether to allow a switch or not, beware. Caveat emptor. It all depends on his mood."
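    Monty's caveat is easy to check as well. Here is a sketch with a deliberately adversarial host (an assumption of this example, not something taken from the quote) who only offers the switch when the contestant has already picked the winner; under that protocol an always-switcher never wins, so the 2/3 figure really does require a host who opens a losing door and offers the switch every single time:
[code]
import random

def always_switch_vs_picky_host(trials=100_000):
    """Host offers the switch only when the contestant's first pick is the winner."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        if pick == prize:
            # Switch offered; the always-switcher moves off the prize and loses.
            revealed = next(d for d in range(3) if d != pick)
            pick = next(d for d in range(3) if d not in (pick, revealed))
        wins += (pick == prize)
    return wins / trials

print(always_switch_vs_picky_host())   # 0.0 against this host
# A contestant who never switches still wins about 1/3 of the time regardless
# of what the host chooses to do.
[/code]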
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  15.    #215  
    Originally posted by Toby
    The only time that the odds are 2 out of 3 is if the host is required to offer the switch _all_the_time_ (which was not the case on LMAD) and has foreknowledge of the location of the prize.
    I'm sorry if you misunderstood, but the host does always offer the switch. Additionally, the host will only ever reveal an empty door. Going back to my original formulation:
    "Behind one of three doors is a prize. You get to pick one door. Then I will show you a door that doesn't have the prize behind it (of course I won't reveal your door). "
    Now I did say "when given the chance" regarding your switch, which was perhaps misleading. I give you the chance, always. Do you switch? And of course I know where the prize is; otherwise I wouldn't always be able to show you a door without a prize. Does this make it clear? If so, you can see why it is always better to switch.
  16.    #216  
    Originally posted by Toby
    And all of that is irrelevant because Monty was not required to offer a switch. To quote Monty himself, "If the host is required to open a door all the time and offer you a switch, then you should take the switch," he said. "But if he has the choice whether to allow a switch or not, beware. Caveat emptor. It all depends on his mood."
    Right, I guess I should have made it clear that I will always offer you the option of switching. I've gone over this problem so many times that sometimes I forget to completely clarify things, because it's so clear to me.
  17. #217  
    Originally posted by KRamsauer
    I'm sorry if you misunderstood, but the host does always offer the switch.
    No, on Let's Make A Deal, he didn't.
    Additionally, the host will only ever reveal an empty door. Going back to my original formulation:
    "Behind one of three doors is a prize. You get to pick one door. Then I will show you a door that doesn't have the prize behind it (of course I won't reveal your door). "
    Now I did say "when given the chance" regarding your switch, which was perhaps misleading. I give you the chance, always. Do you switch? And of course I know where the prize is; otherwise I wouldn't always be able to show you a door without a prize. Does this make it clear? If so, you can see why it is always better to switch.
    Only in the artificial reality of a manipulated example, and only in the largest aggregate sense. To illustrate the point: when I went to the "proof", I switched and lost. Again, you seem to think that being correct in the aggregate makes up for it. The point that has been beaten to death here is that when dealing with an individual, the aggregate goes out the window. Even if, statistically, people who switch win, that doesn't make up for the one chance you had at it and lost. People aren't statistics; that is the point you keep seeming to miss.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  18. #218  
    Originally posted by KRamsauer
    Right, I guess I should have made it clear that I will always offer you the option of switching. I've gone over this problem so many times that sometimes I forget to completely clarify things, because it's so clear to me.
    Therein lies the rub. You're so concentrated on the math that you lose sight of the reality and drown.
    ‎"Is that suck and salvage the Kevin Costner method?" - Chris Matthews on Hardball, July 6, 2010. Wonder if he's talking about his oil device or his movie career...
  19.    #219  
    Originally posted by Toby
    No, on Let's Make A Deal, he didn't.
    Only in the artificial reality of a manipulated example, and only in the largest aggregate sense. To illustrate the point: when I went to the "proof", I switched and lost. Again, you seem to think that being correct in the aggregate makes up for it. The point that has been beaten to death here is that when dealing with an individual, the aggregate goes out the window. Even if, statistically, people who switch win, that doesn't make up for the one chance you had at it and lost. People aren't statistics; that is the point you keep seeming to miss.
    Hm.... So you're saying we can say nothing about advantageous strategies because they don't always work? It seems like you're demonstrating a well-documented irrationality in human thought: valuing losses much too highly relative to gains. For instance, if a stock is overvalued at $70 a share, someone who bought at $50 is more likely to sell than someone who bought at $90, even though both hold stock of exactly the same value at the moment. Because it feels bad to admit you lost something, you take an irrational approach.
    Last edited by KRamsauer; 10/17/2002 at 03:39 PM.
  20.    #220  
    Originally posted by Toby
    Therein lies the rub. You're so concentrated on the math that you lose sight of the reality and drown.
    Drown? You're getting your examples mixed up. Analyzing either problem (the river or Monty) correctly will always yield solutions at least as good as those of someone who looks at it with a gut feeling.
