Actually, Everett spent considerable effort in his original paper to
address this issue. (Later authors have muddied the waters, though,
particularly DeWitt's claim that the probabilities follow from the
formalism itself, which no one takes seriously today.)
The idea is that as the state function evolves into a superposition of
mutually non-interfering (decoherent) components, a measure can be
assigned to each component, based on its probability amplitude. Everett
shows that, in the limit, the probability results observed from a series
of measurements will match the squared-amplitude of the relative state,
exactly as required by conventional QM; this is true except in a set of
branches of total measure zero. So you then only have to assume that
branches of measure zero are never experienced, and you derive that
observed probabilities will follow the predictions of conventional QM.
(This is the basis for DeWitt's claim, since he takes the premise as
obvious.)
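To see the limit argument concretely, here is a small numerical sketch
(mine, not Everett's; the amplitude value is arbitrary). A system is
measured n times, with squared amplitude p for one outcome; a branch is a
string of outcomes, and branches with k occurrences of that outcome carry
total measure C(n,k) * p^k * (1-p)^(n-k). We sum the measure of all
"deviant" branches, those whose observed frequency differs from p by more
than some epsilon:

```python
from math import comb

def deviant_measure(p, n, eps):
    """Total squared-amplitude measure of branches (after n measurements)
    whose observed frequency of the outcome differs from p by more than eps."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n + 1)
               if abs(k / n - p) > eps)

p = 0.36  # hypothetical squared amplitude |a|^2
for n in (10, 100, 1000):
    # the measure of deviant branches shrinks as n grows
    print(n, deviant_measure(p, n, 0.05))
```

This is just the weak law of large numbers weighted by the branch
measure: as n grows, almost all of the measure concentrates on branches
whose relative frequencies match the squared amplitude, and in the
infinite limit the deviant branches have measure exactly zero.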
Other authors have suggested that this solution is not quite enough, since
it only applies in the limit of an infinite series of measurements. They
suggest that a more powerful assumption is needed, namely that the
measure of a branch, defined as amplitude-squared, determines the subjective
likelihood of a conscious observer finding himself in the branch.
(You might think of this as indicating that the "degree of reality" of
a branch is proportional to the square of the amplitude.)
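For what it's worth, Everett's own argument in the 1957 paper for why the
measure must be the squared amplitude can be sketched roughly like this
(my reconstruction from memory, not a quotation):

```latex
\text{Require } m = m(|a|) \text{ and additivity under subdivision of a branch:}
\qquad |a|^2 = \sum_i |a_i|^2 \;\Longrightarrow\; m(|a|) = \sum_i m(|a_i|).

\text{Setting } g(x) = m(\sqrt{x}) \text{ turns this into }
g\Big(\sum_i x_i\Big) = \sum_i g(x_i),

\text{so (assuming continuity) } g(x) = c\,x
\quad\Longrightarrow\quad m(|a|) = c\,|a|^2 .
```

So if you grant that a measure exists at all, depends only on the
amplitude's modulus, and adds up properly when branches subdivide, the
squared-amplitude form is forced; what remains contentious is why that
measure should govern subjective likelihood.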
In either case, it is necessary to add a postulate to the simple QM
rules which describe evolution by means of the Schrodinger equation. We
have to say something about this measure function and its implications.
Some argue that this destroys the Occam's razor argument that the Everett
interpretation is simpler than conventional QM. In place of the projection
postulate (state function collapse) we introduce a statement about
subjective probabilities. Both interpretations need something beyond
the deterministic evolution of the Schrodinger equation.
I feel, though, that the Everett interpretation still wins out on grounds
of simplicity. The projection postulate claims that an actual physical
process occurs during measurement, and that the results are distributed
with a certain probability. With the Everett interpretation we need
instead to append an understanding of how the different amplitudes
affect our perceptions of reality. Both include some comment about
squared-amplitude and observed probability, but Everett leaves out the
part about the universe changing. So it does have a simplicity advantage,
although not as great as some people claim.
The bigger problem with the conventional interpretation IMO is that it is
inconsistent; the two different processes produce different results, and
it is not clear exactly when we should use one or the other. Furthermore,
as more delicate measuring devices are used, it may be that the definition
of what constitutes a measurement will become less clear. Then there are
potential future experiments such as the one described by John Clark which
intentionally muddy the waters so much that the conventional interpretation
is pretty much at a loss.
Hal