Just a note

Now if, in fact, our 7 observed values really came from a population with a proportion P of successes, the likelihood of ending up with that particular combination would simply be the product of their individual probabilities of success or failure. For 3 successes and 4 failures their combined likelihood would be P × P × P × Q × Q × Q × Q or, more concisely, P^3 × Q^4; or, if you prefer the log-likelihood, it is 3 × log(P) + 4 × log(Q).
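As a minimal sketch of that calculation, the two formulas above can be evaluated directly (the function names `likelihood` and `log_likelihood` are ours, not part of any library):

```python
import math

# Likelihood of 3 successes and 4 failures, given a hypothesised
# population proportion of successes P (with Q = 1 - P):
def likelihood(P, successes=3, failures=4):
    Q = 1.0 - P
    return P**successes * Q**failures      # P^3 * Q^4

# The equivalent log-likelihood, 3*log(P) + 4*log(Q):
def log_likelihood(P, successes=3, failures=4):
    return successes * math.log(P) + failures * math.log(1.0 - P)

P = 3 / 7  # the observed proportion
print(likelihood(P))       # product of the 7 individual probabilities
print(log_likelihood(P))   # its natural logarithm
```

Taking logs turns the product into a sum, which is why the log-likelihood is usually preferred for computation.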

None of which assumes that the population these 7 observations arose from actually contains a proportion P (= 3/7) of successes. We are merely saying that, given we observed p = 3/7, that is the value of P most likely to have produced this result (assuming P could lie anywhere between 0 and 1).
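A rough way to see this is to scan candidate values of P across (0, 1) and note where the likelihood peaks; this brute-force grid search is our own illustration, not a standard estimation routine:

```python
# Evaluate P**3 * (1-P)**4 on a fine grid of candidate P values and
# keep the one with the highest likelihood.
best_P = max((i / 1000 for i in range(1, 1000)),
             key=lambda P: P**3 * (1.0 - P)**4)
print(best_P)  # very close to the observed p = 3/7, about 0.4286
```

The grid maximum sits at the observed proportion p = 3/7, which is exactly the point: of all possible populations, the one with P = p makes our particular sample most probable.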

Notice that, whether you use likelihoods or log-likelihoods, both of these functions level off at their maximum. Maximum likelihood estimating algorithms assume the maximum likelihood is located where the likelihood is high and the slope equals zero. Likelihood functions that are not smooth upset this reasoning.
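That slope-equals-zero condition can be checked directly for our example. The slope of the log-likelihood 3 × log(P) + 4 × log(Q) is 3/P − 4/(1 − P), and setting it to zero gives P = 3/7. The bisection loop below is only a stand-in for the hill-climbing step a real maximum likelihood algorithm would perform:

```python
# Slope (first derivative) of the log-likelihood 3*log(P) + 4*log(1-P):
def slope(P):
    return 3.0 / P - 4.0 / (1.0 - P)

# Bisection on the slope: while the slope is positive we are still
# climbing, so the maximum lies to the right; once it turns negative
# we have passed the peak.
lo, hi = 0.01, 0.99
for _ in range(60):
    mid = (lo + hi) / 2
    if slope(mid) > 0:
        lo = mid
    else:
        hi = mid
print((lo + hi) / 2)  # converges to 3/7, about 0.42857
```

This works only because the log-likelihood here is smooth and single-peaked; on an unsmooth likelihood surface a zero-slope point may be absent, or may not be the true maximum.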