InfluentialPoints.com Biology, images, analysis, design... 

"It has long been an axiom of mine that the little things are infinitely the most important" 

Just a note: if, in fact, our 7 observed values really arose from a population with a proportion P of successes, the likelihood of ending up with that particular combination would simply be the product of their individual probabilities of success or failure. For 3 successes and 4 failures their combined likelihood would be P × P × P × Q × Q × Q × Q or, more concisely, P³Q⁴ (where Q = 1 − P). None of this assumes the population these 7 observations arose from actually contains a proportion P (= 3/7) of successes; we are merely saying that, given we observe p = 3/7, that is the population proportion most likely to produce this result (assuming P could lie anywhere between 0 and 1). Notice that, whether you use likelihoods or log-likelihoods, both functions level off at their maximum. Maximum-likelihood estimating algorithms assume the maximum likelihood is found where the likelihood is high and the slope equals zero; unsmooth likelihood functions upset this reasoning.
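As a minimal sketch of the idea above (assuming the 3-successes, 4-failures example, and using a simple grid search in place of a proper optimiser), the code below evaluates L(P) = P³Q⁴ and its logarithm over candidate values of P, and confirms that both peak at the observed proportion p = 3/7:

```python
import math

def likelihood(P, successes=3, failures=4):
    """L(P) = P^successes * (1 - P)^failures."""
    return P ** successes * (1 - P) ** failures

def log_likelihood(P, successes=3, failures=4):
    """log L(P); a monotone transform, so it peaks at the same P."""
    return successes * math.log(P) + failures * math.log(1 - P)

# Scan a fine grid of candidate P values strictly between 0 and 1
grid = [i / 1000 for i in range(1, 1000)]
mle = max(grid, key=likelihood)
mle_log = max(grid, key=log_likelihood)

# Both curves are maximised at the observed proportion p = 3/7 (~0.4286),
# where the slope of the (smooth) likelihood function is zero.
print(mle, mle_log)
```

Because the log transform is monotone, the two searches agree; this is why algorithms routinely maximise the log-likelihood instead of the likelihood itself.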
