Peter Cochrane OBE, technologist & entrepreneur

There was a time when ‘the truth’ would last a long time!  The Earth is flat; no, it is a sphere; no, it is flat; no, no, it is a sphere; whoops, it is an oblate spheroid; hmm, a dynamic oscillatory oblate spheroid.  This one took thousands of years to settle, but ‘a dynamic oscillatory oblate spheroid’ is the best picture we now have.  And so it is for scientific truths: they are only as good as our models and experimental verification, and often a closer look with better equipment changes our perspective.  But that is what science is about: achieving the most accurate picture of our universe – and therefore ‘truths’ tend to be transitory!

Science is only one class of truth, and there are many more. Conventional truths set by man-made rules of law: incest is illegal.  Grammar: ‘i’ before ‘e’ except after ‘c’.  Mathematics: the inverse of zero is infinity, or it may be indeterminate.  Then we have political, managerial and agency truths: this politician or manager said this or that; a government enacted this or that.  Then there are belief systems and doctrinal truths: there is only one God and he created heaven and earth.  Other belief systems embrace UFOs, ghosts, vampires, the supernatural, communication with the dead, and so on.

So, in the age of the internet and an exponential explosion of information, how do we know something is true?  We don’t!  But it looks as though we might get our machines to ‘quantify’ truth in the form of a probability statement. Several universities and Google are on the case, and the general line of attack takes a published ‘truth’ on the net, seeks out every related item, and then maps a derivation tree (representing the structure of the truth) based on the date of posting, authorship, organisation and content. Filtering out ‘the noise’ can easily reduce more than 20 million apparently original postings to fewer than 3 million credible ones.
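The noise-filtering step might look something like the following toy sketch. This is a hypothetical illustration, not the actual algorithm used by Google or any university project: it simply collapses near-duplicate postings so that only the earliest ‘original’ of each distinct claim survives.

```python
# Hypothetical sketch of the 'noise filter': collapse near-duplicate
# postings, keeping only the earliest occurrence of each distinct claim.
def filter_originals(postings):
    """postings: list of dicts with 'text', 'date' (ISO string), 'source'."""
    earliest = {}
    for p in postings:
        # Crude normalisation: lowercase, drop punctuation, collapse spaces.
        cleaned = "".join(c for c in p["text"].lower() if c.isalnum() or c.isspace())
        key = " ".join(cleaned.split())
        if key not in earliest or p["date"] < earliest[key]["date"]:
            earliest[key] = p
    return list(earliest.values())

posts = [
    {"text": "The Earth is an oblate spheroid.", "date": "2014-01-02", "source": "A"},
    {"text": "the earth is an oblate spheroid",  "date": "2013-06-20", "source": "B"},
    {"text": "The Earth is flat.",               "date": "2014-03-01", "source": "C"},
]
print(len(filter_originals(posts)))  # 2 distinct claims survive
```

A real system would of course need far more sophisticated matching (paraphrase detection, authorship and organisation signals), but the principle of reducing millions of postings to a credible core is the same.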

Now it is possible to apply a meaningful statistical analysis to reveal a ‘mean opinion’, with a standard deviation, that says there is an X% confidence that this particular truth is correct.  If that percentage is greater than 99%, for example, we can assume great credibility; but if it is significantly less, we should be very wary of making our judgement call!
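To make the statistical step concrete, here is a minimal sketch under some assumed simplifications (not the method used by any real truth engine): each credible posting is treated as a vote for or against the claim, weighted by an assumed source-credibility score, and a normal approximation gives the confidence interval around the mean opinion.

```python
import math

def truth_confidence(votes, weights):
    """Weighted 'mean opinion' and its standard error.
    votes: 1.0 if a posting supports the claim, 0.0 if it contradicts it.
    weights: assumed credibility score per source (illustrative only)."""
    total_w = sum(weights)
    mean = sum(v * w for v, w in zip(votes, weights)) / total_w
    # Effective sample size for weighted data (Kish's approximation).
    n_eff = total_w ** 2 / sum(w * w for w in weights)
    # Standard error of the weighted proportion, normal approximation.
    se = math.sqrt(mean * (1.0 - mean) / n_eff) if 0.0 < mean < 1.0 else 0.0
    return mean, se

votes   = [1, 1, 1, 0, 1, 1, 1, 1, 0, 1]
weights = [0.9, 0.8, 0.7, 0.4, 0.9, 0.6, 0.8, 0.9, 0.3, 0.7]
mean, se = truth_confidence(votes, weights)
print(f"support = {mean:.2f} ± {1.96 * se:.2f} (95% interval)")
```

With ten postings the interval is wide; with millions of credible postings the standard error shrinks, which is exactly why a high support percentage from a large filtered corpus can be read as high confidence.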

Rumour has it that Google will deploy ‘Truth Engine Technology’ in its search engine so that we get the ‘best picture’ on any search.  What a lifesaver!  However, there are still many algorithmic obstacles to overcome, and more refinement is required.  And watch out: the process could become a strange attractor for falsehoods and errors that persist far longer than they should.  Any truth test also has to be dynamic and continuous: truth is not static; it has to be tested in light of new evidence, non-stop!

As a professional scientist, engineer, technologist and entrepreneur with over 40 years of management, technology and operational experience, Peter has been involved in a number of fundamental developments and discoveries in the fields of optics and acoustics.

At BT he progressed to Head of R&D and then CTO. Appointed the UK’s first Professor for the Public Understanding of Science & Technology in 1988 at the University of Bristol, he currently holds a Visiting Chair in Complex Systems at The University of Hertfordshire.

A graduate of Nottingham Trent and Essex Universities, Peter has received notable recognition with the Queen’s Award for Innovation & Export in 1990, numerous Honorary Doctorates, and an OBE in 1999 for contributions to international communications.
