Statisticians and electrical engineers are familiar with an analogous uncertainty between time and frequency in the analysis of time-series, and this obviously suggests the query: can a frequency ν be associated with an energy E? Physicists appeal to the relation E = hν, where h is Planck's constant, but quite apart from the qualms expressed by Schrödinger (1958) about this relation, it is at least arguable that the frequency ν is as fundamental in it as the energy E. I can therefore sympathize with (though I am sceptical of) the proposals by Bohm and de Broglie for a return to the interpretation of ψ in terms of real (deterministic) waves; I do not think these proposals will be rebutted until the statistical approach has been put on a more rational basis. Interesting attempts have been made by various writers, but none of these attempts has so far, to my knowledge, been wholly successful or very useful technically. For example, Landé keeps to a particle formulation, whereas it is the particle, and its associated energy E, which seem to be becoming the nebulous concepts.

Let me refer again to time-series theory, which tells us that the quantization of a frequency ν arises automatically for circularly-defined series, that is, if you will allow me to call it this, for periodic time (more precisely in a physical context, for the angle variables which appear in the dynamics of bound systems); a small numerical illustration is given at the end of this passage. A probabilistic approach via random fields thus has the more promising start of including naturally two of the features of quantum phenomena which were once regarded as most paradoxical and empirical: the uncertainty principle and quantization. This switch to fields is of course not new; the real professionals in this subject have been immersed in fields for quite a while. However, I am not sure that what probabilists and what physicists mean here by fields are quite synonymous, and in any case it is the old probabilistic interpretation in terms of particles that we lay public are still fobbed off with. It would seem to me useful at this stage to make quite clear to us where, if anywhere, the particle aspect is unequivocal; certainly discreteness and discontinuity as such are not very relevant.
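A minimal numerical sketch of this quantization, in modern computational terms: a series defined circularly on N points admits only the discrete frequencies k/N, k = 0, 1, ..., N − 1, as its harmonic analysis makes plain. The length N = 64 and the particular harmonics below are illustrative assumptions only.

```python
import numpy as np

# A circularly-defined (periodic) series on N points: its harmonic
# analysis can involve only the discrete frequencies k/N, k = 0..N-1.
N = 64
t = np.arange(N)
# An illustrative series built from the 3rd and 7th harmonics.
x = np.sin(2 * np.pi * 3 * t / N) + 0.5 * np.cos(2 * np.pi * 7 * t / N)

amplitudes = np.abs(np.fft.fft(x)) / N
freqs = np.fft.fftfreq(N)  # the admissible frequencies k/N, in cycles per step

# Only the discrete frequencies +-3/N and +-7/N carry any amplitude;
# no intermediate frequency can occur in a circular series.
for k in np.nonzero(amplitudes > 1e-9)[0]:
    print(f"frequency {freqs[k]:+.4f} cycles/step, amplitude {amplitudes[k]:.3f}")
```

The quantization is exact: a circular series simply has no intermediate frequency for any amplitude to occupy, just as the angle variables of a bound system admit only integral harmonics.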
Here I must leave this fascinating problem of probability in quantum mechanics, as I would like to turn to its function in the theory of information.

(3) The concept of information.

Information theory as technically defined nowadays refers to a theory first developed in detail in connection with electrical communication theory by C. Shannon and others, but recognized from the beginning as having wider implications as a conceptual tool. From its origin it was probably most familiar at first to electrical engineers, but its more general and essentially statistical content made it a natural adjunct to the parts of probability theory hitherto studied by the statistician. This is recognized, for example, in an advertisement for a mathematical statistician from which I quote: "Applicants should possess a degree in statistics or mathematics, and should if possible be able to show evidence of an interest in some specialized aspect of the subject such as, for example, decision theory, information theory or stochastic processes." It has not, I think, been recognized sufficiently in some of the recent conferences on information theory, to which mathematical statisticians per se have not always been invited.

The close connection of the information concept with probability is emphasized by its technical definition in relation to an ensemble or population; indeed, it may usefully be defined (cf. Good (1950), Barnard (1951)) as −log p, a simple and direct measure of the uncertainty which is removed when the event with probability p has occurred, although the more orthodox definition is the average information −Σ p log p, averaged over the various possibilities or states that may occur. It is also possible to extend this definition to partial or relative information, in relation to a change of ensembles or distributions from one to another. With this extended definition of −log(p/p′), where p′ relates to the new ensemble, the information can be positive or negative, and as the logarithm of a probability ratio it will look familiar to statisticians, although it should be stressed that the probabilities refer to fully specified distributions, and the likelihood ratio of the statistician (made use of so extensively by Neyman and E. S. Pearson) only enters if the probabilities p and p′ are interpreted as dependent on different hypotheses H and H′. For example, if p′ is near p, differing only in regard to a single unknown parameter θ, so that p′ refers to the value θ + δθ, then −Σ p′ log(p/p′) ≃ ½ I(θ)(δθ)², where I(θ) is R. A. Fisher's information function, under conditions for which this function exists.

Formally, the concept of information in Shannon's sense can be employed more directly for inferring the value of θ. To take the simplest case, shorn of inessentials: if we make use of Bayes's theorem to infer the value of a parameter θ which can take one of only k discrete values θ_r, then our prior probability distribution over the θ_r will be modified by our data to a posterior probability distribution. If we measure the uncertainty in each such distribution by −Σ p log p, we could in general expect the uncertainty to be reduced, but we can easily think of an example where the data would contradict our a priori notions and make us less certain than before, as the sketch below illustrates. This seems to me to stress the subjective or personal element in prior probabilities used in this way, and my own view is that the only way to eliminate this element would be deliberately to employ the convention that prior distributions are chosen to maximize the uncertainty. In the present example this would imply assuming a uniform prior distribution for θ_r, and would ensure that information was always gained from a sample of data; it is somewhat reminiscent of arguments used by Jeffreys in recent years for standardizing prior distributions, but I think it important to realize that such conventions weaken any claim that these methods are the only rational ones possible.
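The two situations just contrasted can be exhibited in a small sketch; the three-valued parameter and the particular likelihoods p(data | θ_r) are illustrative assumptions only. A prior concentrated on θ_1 is contradicted by the data and the posterior uncertainty −Σ p log p rises, whereas from the uniform prior, whose entropy log k is maximal, the posterior entropy cannot rise, so that information is always gained.

```python
import numpy as np

def entropy(p):
    """Uncertainty measure -sum p log p (natural logarithms)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def posterior(prior, likelihood):
    """Bayes's theorem for a parameter taking k discrete values."""
    post = prior * likelihood
    return post / post.sum()

# Illustrative likelihoods p(data | theta_r) for k = 3 parameter values.
likelihood = np.array([0.05, 0.50, 0.45])

# Prior concentrated on theta_1: the data contradict our a priori
# notions, and the posterior is MORE uncertain than the prior.
concentrated = np.array([0.98, 0.01, 0.01])
print(entropy(concentrated))                         # ~0.11
print(entropy(posterior(concentrated, likelihood)))  # ~0.56: uncertainty grew

# Maximum-uncertainty (uniform) prior: entropy log 3, which no
# posterior over 3 values can exceed, so information is always gained.
uniform = np.full(3, 1 / 3)
print(entropy(uniform))                              # log 3 ~ 1.10
print(entropy(posterior(uniform, likelihood)))       # ~0.86: uncertainty fell
```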
Whether or not the information concept in this sense finds any permanent place in statistical inference, there is no doubt of its potential value in two very important scientific fields, biology and physics.

This claim in respect of biology is exemplified by the symposium on information theory in biology held in Tennessee in 1956; and while we must be careful not to confuse the general function of new concepts in stimulating further research with the particular one of making a particular branch or aspect of a science more precise and unified, the use of the information concept in discussing the capacities of nerve fibres transmitting messages to the brain, or the coding of genetic information for realization in the developed organism, should be sufficient demonstration of its quantitative value. As another illustration of the trend to more explicit and precise uses of the information concept in biology, we may consider the familiar saying that life has evolved to a high degree of organization; that, in contrast to the ultimate degradation of dead matter, living organisms function by reducing uncertainty; that the significant feature of their relation with their environment is not their absorption of energy (vital of course as this is) but their absorption of negative entropy. An attempt to measure the rate of accumulation of genetic information in evolution due to natural selection has recently been made by Kimura (1961), who points out that a statement by R. A. Fisher, that natural selection is a mechanism for generating an exceedingly high degree of improbability, indicates how the increase in genetic information may be quantitatively measured. While his estimate is still to be regarded as provisional in character, it is interesting that Kimura arrives at an amount, accumulated over the last 500 million years up to man, of the order of 10⁸ bits, compared with something of the order of 10¹⁰ bits estimated as available in the diploid human chromosome set. He suggests that part of the difference, in so far as it is real, should be put down to some redundancy in the genetic coding mechanism.

With regard to physics, I have already mentioned negative entropy as a synonym for information, and this is in fact the link. Again we have the danger of imprecise analysis, and the occurrence of a similar probabilistic formula for information and for physical entropy does not by itself justify any identification of these concepts. Nevertheless, physical entropy is a statistical measure of disorganization or uncertainty, and information in this context a reduction of uncertainty, so that the possibility of the link is evident enough. To my mind one of the most convincing demonstrations of the need for this link lies in the resolution of the paradox of Maxwell's demon, who circumvented the second law of thermodynamics and the inevitable increase in entropy by letting only fast molecules move from one gas chamber to another through a trap-door. It has been pointed out by Rosenfeld (1955) that Clausius in 1879 went some way to explaining the paradox by realizing that the demon was hardly human in being able to discern individual atomic processes; but logically the paradox remains unless we grant that such discernment, while in principle feasible, at the same time creates further uncertainty or entropy at least equal (on the average) to the information gained. That this is so emerges from a detailed discussion of the problem by various writers such as Szilard, Gabor and Brillouin (as described in Brillouin's book).
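The quantitative form of this resolution, due to Szilard and elaborated by Brillouin, is that acquiring information must on average generate entropy of at least k log 2 per bit, where k is Boltzmann's constant; the demon's sorting can therefore at best break even against the second law. A minimal sketch of the arithmetic (the mole-sized example is an illustrative assumption):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def min_entropy_cost(bits):
    """Szilard-Brillouin bound: the least entropy (J/K) that must be
    generated, on average, in acquiring the given information."""
    return k_B * bits * math.log(2)

# One bit: a single fast-or-slow decision at the trap-door.
print(min_entropy_cost(1))          # ~9.57e-24 J/K

# Sorting a mole of molecules, one binary decision each, costs at least
print(min_entropy_cost(6.022e23))   # ~5.76 J/K, i.e. R log 2
```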
(4) The rôle of time.

I might have noted in my remarks on quantum theory that, whether or not time is sometimes cyclic, it appears in that theory in a geometrical rôle, reminiscent of time in special relativity, and not in any way synonymous with our idea of time as implying evolution and irreversible change. It is usually suggested that this latter rôle must be related to the increase of physical entropy; but when we remember that entropy is defined statistically in terms of uncertainty, we realize not only that evolutionary time itself then becomes statistical, but that there are a host of further points to be sorted out. Let me try to list these:

(a) In the early days of statistical mechanics, at the end of the last century, Maxwell's paradox was not the only one raised. Two others were Loschmidt's reversibility paradox, in which the reversibility of microscopic processes appeared to contradict the second law, and Zermelo's recurrence paradox, in which the cyclical behaviour of finite dynamical systems again contravened the second law. It should be emphasized that, while these paradoxes were formulated in terms of deterministic dynamics, they were not immediately dissipated by the advent either of quantum theory or of the idea of statistical processes; for I have just reminded you that time in quantum mechanics is geometrical and reversible, and stationary statistical processes based on microscopic reversible processes are themselves still reversible and recurrent. The explanations of the paradoxes rest, in the first place, on the difference between absolute and conditional probabilities, and, in the second, on the theory of recurrence times. The apparent irreversibility of a system is due to its being started from an initial state a long way removed from the more typical states in equilibrium, and the apparent non-recurrence of such a state to the inordinately long recurrence time needed before such a state will return; both points are exhibited by the urn model sketched at the end of this section.

(b) So far so good; but this conclusion applies to a system of reasonable size, for which the initial state is overwhelmingly atypical and the recurrence time enormous. For a microscopic system, by contrast, fluctuations dominate and no steady increase of entropy can be discerned; we conclude that microscopic phenomena have no intrinsic time-direction, at least if this can only be defined in relation to internal entropy increase (cf. Bartlett, 1956). This is consistent with theoretical formulations in recent years of sub-atomic phenomena involving time-reversals.

(c) We have also to notice that, while the entropy of our given system will increase with external or given time, this relation is not reciprocal; for, if we first choose our time, a rare state in our stationary process is just as likely to be approached as to be departed from.
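Both explanations under (a) are exhibited by the Ehrenfest urn model, devised for just this purpose: N molecules are distributed between two chambers, and at each step one molecule, chosen at random, changes sides. The process is stationary and reversible, yet from the extreme initial state it drifts almost inexorably towards equality, while the mean recurrence time of that extreme state is 2^N steps. A minimal simulation sketch, with the illustrative values of N = 100 molecules and 400 steps:

```python
import random

N = 100      # number of molecules (illustrative)
left = N     # the rare initial state: every molecule in the left chamber

# At each step a molecule chosen at random changes chambers, so the
# count moves down with probability left/N and up otherwise.
trajectory = []
for _ in range(400):
    if random.random() < left / N:
        left -= 1
    else:
        left += 1
    trajectory.append(left)

# Apparent irreversibility: from the rare initial state the count
# drifts towards the typical equilibrium value N/2 = 50 and stays near it.
print(trajectory[::40])  # e.g. 99, ~72, ~60, ~54, ~51, ~50, ...

# Apparent non-recurrence: the initial state does recur in this
# stationary process, but only after about 2**N steps on average.
print(f"mean recurrence time of the extreme state: {2**N:.3e} steps")
```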