Research Interests

Wave-Particle Duality

According to quantum mechanics, energy is not continuous but comes in discrete levels. An analogy would be getting from one level of a house to another: on the stairs we can only stand on one tread or the next, whereas on a slope we could work our way up or down gradually. Energy was always thought to be continuous like the slope, but at the start of the 20th century experimental evidence showed that it was not. A quantum model was needed to explain the electronic structure of atoms, and the photoelectric effect could not be explained in terms of a classical energy distribution.

Quantum mechanics as it is generally interpreted has led to some very strange predictions: that an event does not happen until it is observed; that at every instant and every place in the universe an infinity of possibilities other than the one we actually experience is generated; and, more to the point here, that light can have both a wave and a particle nature. Conventionally this last is explained on the basis that there is a probability wave function that gives rise to particle-like photons. My contention, however, is that if an additional dimension is added there is no necessity to bring in a probabilistic explanation, and we can stay deterministic, just as pretty well everywhere else in science.

Introducing higher dimensions in this way is not so far-fetched. In the unification of field theories we have reached M-theory, which at the last count has eleven dimensions. To include one dimension more in order to remove the probabilistic interpretation of wave-particle duality seems a small price to pay. Thus far I have only this intuitive idea, but it is one of my research interests to try to produce a formal account of how the extra dimension would be added to quantum mechanics. It is generally accepted that imaginary numbers are needed to give an account of quantum wave-particle systems, and in this sense a further dimension is already present. I do think 'dimension' is rather an inflationary term, though, and 'degrees of freedom' is a much better phrase.
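As a minimal illustration of that last point (standard textbook material, not my proposed extension), the free-particle plane wave already carries two real degrees of freedom, its real and imaginary parts, which oscillate a quarter-cycle apart:

\[
\psi(x,t) = e^{i(kx-\omega t)} = \cos(kx-\omega t) + i\,\sin(kx-\omega t).
\]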

 

Quantum Teleology

It was mentioned in the previous section on wave-particle duality that the accepted 'Copenhagen' interpretation of quantum mechanics leads to some very strange predictions: an event does not happen unless it is observed; at every point and at every instant in the universe an infinite number of possibilities exists. It has been said that if you think you have understood quantum mechanics, either you haven't or you're crazy. The trouble is that quantum mechanics not only works but is used in just about every high-tech piece of electronic equipment on the planet. There is, however, an alternative interpretation which also works but does not lead to such bizarre predictions. It uses something called a Two State Vector. This removes the strange predictions, but it assumes that time is symmetrical at the quantum level. One result is that not only does the past affect the present, as we all accept, but there is a probability of the future affecting the present. The idea that the future affects the present is loosely used by biologists when they speak of an organism having evolved in a certain way to fulfil particular functions. The theoretical basis for the Two State Vector formalism of quantum mechanics is all worked out, but as yet no experiment has been designed to test it.
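To give the flavour of the formalism, here is the standard Aharonov–Bergmann–Lebowitz rule, quoted for orientation rather than from my own work. With a state \(|\psi\rangle\) selected in the past and a state \(\langle\phi|\) selected in the future, the probability of an intermediate measurement outcome \(a_j\) is symmetrical between the two:

\[
P(a_j) = \frac{\left|\langle\phi|P_j|\psi\rangle\right|^{2}}{\sum_k \left|\langle\phi|P_k|\psi\rangle\right|^{2}},
\]

where \(P_j\) projects onto the outcome \(a_j\). The forward-evolving \(|\psi\rangle\) and the backward-evolving \(\langle\phi|\) together make up the two-state vector.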

 

Spatiotemporal Communication with Gravitational Waves

When Newton's theory of gravity failed to explain the precession of the perihelion of the planet Mercury, a new theory was needed. Einstein's General Theory of Relativity not only gave us an alternative to Newton's theory and explained this problem, but revolutionised the way we look at the fabric of space and time. In his theory of Special Relativity he had already joined space and time together. This was not a mere whim. After a great many experiments by Michelson and Morley it was clear that, against the prevailing theory, there was no luminiferous aether permeating the whole universe. An extremely accurate interferometer was used, the object of the experiments being to see what difference there would be in its interference pattern when the interferometer was rotated through 90°. Despite countless tests over a ten-year period no change in the interference pattern was discovered, the opposite of what the prevailing aether view predicted. This result seemed absurd, and it was only some time later, when the Lorentz transform was used to analyse it, that it started to make sense. Einstein realised that the implication was that the speed of light was constant for any inertial frame, that is, a frame that is not accelerating: no matter how fast an inertial frame was travelling, the speed of light within it would always be the same. That meant that time itself depended on the speed of the inertial frame! This is now generally accepted and there is good experimental evidence to show that it is true. It meant that space and time interact with one another and are to some extent interdependent. In his General Theory, Einstein showed that the effect of gravity could be explained by the curvature of this space-time.

Einstein also postulated the possibility of gravitational waves. These might be produced, for instance, by two massive stars rotating around one another. There have been claims that such gravitational waves have been detected, but there is still great uncertainty as to whether they actually have. Electromagnetic waves are a useful, even essential, means of communication: radio waves, microwaves and light are all types of electromagnetic wave. "Is it possible that gravitational waves might be used for communication?" I asked myself. According to Einstein's equation for gravitational waves, the masses involved would have to be absolutely enormous for such waves even to be detected, let alone used for communication. Nonetheless, if you look back to the discovery of wireless, without special resonant circuitry even radio waves would not have been detectable. Indeed, resonant mass systems for detecting gravitational waves have been employed. I am considering the possibility that gravitational waves might be produced in a much higher frequency band than those produced cosmologically. The receiving system should be multiply resonant, and the signals should be digitally encoded so that background vibration from moving masses is not picked up as signal. If gravitational waves could be used for communication they would have the advantage over electromagnetic waves that they cannot be screened and will pass through anything. Even more intriguing is the possibility, however controversial, that gravitational waves would pass through time, given that space and time may be the same 'thing'. Of course, for us time is not the same as space. This may be because there seems to be a direction to time, at least at the macro level, because the universe changes towards higher entropy, or greater disorder.
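To give a sense of the scale problem mentioned above, the standard linearised ('quadrupole') estimate, quoted for orientation and not my own result, puts the dimensionless strain at distance \(r\) from a source of mass quadrupole moment \(Q\) at roughly

\[
h \sim \frac{2G}{c^4 r}\,\ddot{Q}, \qquad \frac{2G}{c^4} \approx 1.65\times10^{-44}\ \mathrm{s^2\,kg^{-1}\,m^{-1}},
\]

and it is that factor of \(2G/c^4\) which makes any laboratory-scale source so extraordinarily weak.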
What about the grandfather paradox? If I receive a message from my future self to destroy the transmitter, and I do, then how could my future self have sent the message? Provided you are prepared to think in four dimensions instead of three there is no problem: you are just looking at a space-time line through a different set of coordinates. I do wonder whether cosmological gravitational waves can in principle be detected. If the waves are perturbations in time as well as space, we cannot use time as an independent variable, and we would observe nothing. For my hypothetical transmitter it would be necessary that the space and time components be transmitted out of phase. Although I can think of no experiment to prove that gravitational waves travel through time, if they are indisputably detected from binary stars I concede that they do not, as the space and time components would there be in phase.
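A deliberately crude toy calculation (my own illustration, not a derivation) shows why in-phase modulation would be invisible. Suppose a passing wave stretched both the light-travel interval along an arm of rest length \(L\) and the local clock's unit of time by the same factor \(1+h(t)\). The clock reading, being the ratio of the two, would then be

\[
t_{\text{read}} = \frac{(2L/c)\,(1+h)}{1+h} = \frac{2L}{c},
\]

with the modulation cancelling exactly; this is the sense in which time could not be used as an independent variable.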

 

Logical Structure of Scientific Theories

It's easy to think that when a scientific theory is being used it is giving us reality. Actually it is just giving us one way of looking at phenomena. This is why we often find that more than one theory can be applied to a particular domain to explain and predict observational and experimental results. Traditionally, in Logical Empiricism, individual terms were related to objects in the real world on a one-to-one basis. However, this left no room for theory change in which the objects themselves come to be seen differently. Kuhn distinguished normal science, which adheres to a set of fixed rules and concepts, from revolutionary science, in which the rules and concepts have to be radically changed. The reason for such a 'revolution' would typically be that the original theory could not solve important problems in some new domain of application. For example, mass in classical mechanics is different from mass in relativistic physics, and the gene of traditional probabilistic genetics is different from that of molecular genetics. Kuhn's views do not, though, provide a formal, rigorous account of scientific theories and theory change in the way that Logical Empiricism does. Sneed formalised Kuhn's ideas using modern logic and model theory. A number of scientific theories have been reconstructed using his approach and the relations between them made clearer. My own research in this area was in genetics, where I was able to make the relation between classical and modern genetics clearer. I have written a book and papers with Professor Balzer of Munich University on this topic. Many of the arguments between adversaries in science are due to the fact that each believes their theory to be the real one, rather than just a model of what is really there, whatever that is.

 

Expert Computer Systems

Computer languages have been developed that are able to access a database of accepted knowledge to discover the truth of a conclusion and the validity of an argument. They allow the computer to behave as if it were an unfailing expert, and the resulting programs are called Expert Computer Systems. The languages themselves are often called Logic Programming Languages, as they use the principles of modern symbolic logic. Provided the database and the rules are not changing, the correct answers to enquiries, even of great complexity, can be given within a second. One of the foremost of such logic programming languages is Prolog. I did in fact write a book for Springer with my elder son called 'Prolog for Computer Sciences'. This was in the early 90s, but I am pleased to see it has stayed in print and was recently reprinted.
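To give the flavour of the idea, here is a toy rule engine sketched in Python rather than Prolog (the facts and rules are invented for illustration; Prolog itself answers queries by backward chaining over clauses, whereas this sketch chains forward):

```python
# A toy forward-chaining inference engine: conclusions are derived by
# repeatedly applying rules to a database of facts until nothing new follows.
facts = {"has_fur", "gives_milk"}

# Each rule pairs a set of conditions with a conclusion.
rules = [
    ({"has_fur"}, "is_mammal"),
    ({"gives_milk"}, "is_mammal"),
    ({"is_mammal", "eats_meat"}, "is_carnivore"),
]

changed = True
while changed:                      # keep applying rules until a fixed point
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("is_mammal" in facts)         # True: derived from the asserted facts
print("is_carnivore" in facts)      # False: 'eats_meat' was never asserted
```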

 

Glasses-free Lenticular 3D TV and Movies

Stereoscopic vision, or vision in depth, requires each eye to see a slightly different image of an object or scene. This can be achieved by using coloured or polarised glasses or, on modern 3D televisions, shutter glasses. An approach that does not require special glasses has been used for many years in printing: the image is split into fine strips, each containing a number of different views, and a sheet of similarly spaced cylindrical lenses (a lenticular sheet) is placed in front of it. Different eye positions then see different views of the original object or scene, which gives the impression of depth. It is currently used on the boxes of many 3D Blu-ray videos. Until recently the resolution of flat-screen video displays was not high enough to give a useful number of views at a worthwhile resolution. With the development of 2K, 4K and now 8K displays, multiple views with good resolution are now possible. The number of views can be increased by a factor of three by using colour multiplexing, in which each RGB sub-pixel corresponds to a slightly different view. Storing and transmitting the vast amount of information involved would appear to be a problem; however, the differences between neighbouring views are generally small, so a high level of electronic compression can be applied. Production of material for viewing has so far largely been computer generated, and it would seem that for live filming as many cameras would be needed as there are potential eye positions. This is not necessarily so: central and extreme left and right views can be computer-analysed to provide the missing views to an accuracy that is sufficient for normal viewing. Such technology is already well established for the production of 'panoramograms' such as those used for printed displays, as previously mentioned.
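A sketch of the basic spatial multiplexing step follows; the view count and image sizes are illustrative assumptions, and a real panel would also have to account for lens pitch, lens slant and the RGB sub-pixel multiplexing mentioned above:

```python
# Toy lenticular interleaving: N aligned views are sliced into vertical pixel
# columns and interleaved so that each lenticule covers one column per view.
import numpy as np

def interleave_views(views):
    """views: array of shape (n_views, height, width, 3), all views aligned."""
    n, h, w, c = views.shape
    out = np.empty((h, w * n, c), dtype=views.dtype)
    for col in range(w):
        for v in range(n):
            # Views are reversed so that a more leftward eye position
            # sees a more leftward view through each lenticule.
            out[:, col * n + v] = views[n - 1 - v, :, col]
    return out

# Example: eight synthetic 270x480 views -> one 270x3840 interlaced frame.
views = np.random.randint(0, 256, (8, 270, 480, 3), dtype=np.uint8)
frame = interleave_views(views)
print(frame.shape)  # (270, 3840, 3)
```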

 

Artificial Neural Networks

From looking at the way in which impulses are passed from neurone to neurone within the brain, an electronic analogue was devised. The details are quite complicated and you are referred to my video on this topic. Traditionally the 'Perceptron' was the starting point. People have argued that it is static whereas the human brain is dynamic: it is not only a question of whether impulses should be passed from one neurone to another and what their effective intensity is, but also of whether they arrive at the same time. Most researchers have taken this as the death knell for Perceptron-like Artificial Neural Networks and decided a completely new approach was needed. I am not convinced that the basic Perceptron idea cannot be modified in some way to account for this dynamic aspect. A neural network evolved from the Perceptron is actually used in Google's DeepDream for deep analysis, so it is by no means only of historical interest. Artificial Neural Networks are important in so-called 'data mining', in which, although the information is in the database, it may be difficult to retrieve or very context dependent. Originally a special electronic circuit was needed for Artificial Neural Networks, but as conventional sequential computers became faster and memory became cheaper and less bulky, emulator programs evolved that would do what an Artificial Neural Network could do but on a conventional computer.
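A minimal sketch of the classic Perceptron learning rule is given below; the training data, learning rate and epoch count are my illustrative choices:

```python
# A minimal classical Perceptron: a weighted sum of inputs followed by an
# all-or-none 'firing' decision, with Rosenblatt's error-correction update.

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of (inputs, target) pairs with targets 0 or 1."""
    n = len(samples[0][0])
    w = [0.0] * n          # one weight per input 'synapse'
    b = 0.0                # bias, playing the role of a firing threshold
    for _ in range(epochs):
        for x, target in samples:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            output = 1 if activation > 0 else 0
            error = target - output
            # Nudge the weights towards the desired output.
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Learn the AND function: the unit fires only when both inputs are active.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
for x, _ in data:
    print(x, 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0)
```

Note that the whole computation is static, as the critics point out: nothing in it represents the relative arrival times of the impulses.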

 

Superconductivity

All metals have electrical resistance. This can be useful if the intention is to control the flow of electric current, or if electricity is being used for heating. But electrical resistance can also be a nuisance, leading to unwanted power losses in the transmission of electricity, and to loss of energy and overheating in electric motors and electromagnets. An electric current could be stored indefinitely in a loop of superconducting material with no resistance, so that heavy and cumbersome chemical batteries would not be needed. Metals generally lose their resistance and become superconducting near absolute zero, but materials are being developed all the time to raise the 'critical' temperature at which superconductivity occurs. Indeed, there have been claims that superconductivity has been achieved in materials such as carbon nanotubes even at room temperature.

 

Non-Invasive Blood Glucose Testing

As type 2 diabetes becomes increasingly prevalent, the need for more convenient and less expensive ways to check blood sugar levels increases. Normally the body secretes more insulin when blood sugar is too high, causing the glucose to be stored as glycogen until the blood sugar level falls. In diabetes, either not enough insulin is being produced or cells are not receptive to it. Treatment involves artificially reducing the blood sugar level by injection or tablet, and frequently checking that the level is not too low (hypoglycaemia) or too high (hyperglycaemia). Checking is done by pricking the skin to obtain a small blood sample and then using a kit to measure the concentration of glucose chemically. Theoretically there is a non-invasive alternative. Infrared spectrometry is regularly used to identify the presence of organic chemicals such as glucose and to assay their concentration. It should be possible to analyse either the infrared emission or absorption spectrum of a diabetic to check their blood sugar level. I have carried out experiments on this, but the problem is one of variability. For the equipment to detect the small variations in infrared level that occur, it must be very sensitive; however, it is then also very sensitive to small changes of temperature in the immediate environment. For instance, in in vitro studies in which distilled water was compared with concentrated glucose solution, should the central heating system switch on, the measurements would go haywire.
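The assay rests on the standard Beer-Lambert law, stated here for orientation:

\[
A = \varepsilon\, l\, c \quad\Longrightarrow\quad c = \frac{A}{\varepsilon\, l},
\]

where \(A\) is the measured absorbance at a glucose-specific infrared band, \(\varepsilon\) the molar absorptivity, \(l\) the optical path length and \(c\) the concentration sought. The difficulty described above is that thermal drift in \(A\) can swamp the small glucose-dependent term.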

 

THE GENERAL RELATIVISTIC TIME COMPONENT OF GRAVITATIONAL WAVES: IMPLICATIONS FOR THE DETECTION OF GRAVITATIONAL WAVES.

Submitted for publication in General Relativity and Gravitation. Springer. 27th September 2015
Link: http://link.springer.com/journal/10714

ABSTRACT

The existence of gravitational waves was predicted by Einstein from the theory of general relativity. Despite extensive attempts over many years, direct observation is still elusive. Within a gravitational wave, time dilation occurs as the gravitational field strength is modulated: there is a modulation of time in phase with the modulation of space curvature. Measuring instruments used in attempts to detect gravitational waves generally take time as an independent variable. Since time is oscillating in phase with the gravitational wave, it would be theoretically very difficult, if not impossible, to detect gravitational waves directly with any measuring device that assumes time is an independent variable. A non-linear equation for gravitational waves with general relativistic time dilation is derived.
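For orientation, the weak-field clock-rate relation the abstract appeals to is usually written as the standard textbook result (this is not the paper's derived non-linear equation):

\[
\frac{d\tau}{dt} = \sqrt{1 + \frac{2\Phi}{c^2}} \approx 1 + \frac{\Phi}{c^2},
\]

where \(\Phi\) is the Newtonian potential, so a modulated field strength implies a modulated local clock rate.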

 

DAWE QUANTUM EFFECT IN NEURONS

21st September 2015. Submitted for publication in the International Journal of Biophysics.

ABSTRACT
Quantum fluctuations are known to occur in aqueous hydrogen bonds. Recent NMR and other observations have shown these to affect the clustering of water molecules. Further studies have shown an effect on the diffusion rate of dissolved ions. Measurements of action potential speed in isolated neurons, using micro electrodes have a greater spread than expected. In myelenated neurons the action potential speed is largely controlled by the diffusion rate of ions. It is thus reasonable to expect that the observed spread in action potential speed is at least partly due to quantum fluctuations of hydrogen bonds in the water of neuronal axons. This is of interest to biophysicists as it provides a link between neuronal activity and quantum effects. Although the quantum fluctuations are small, they will be amplified in at least two ways: First, positive feedback at Sodium ion channels in the axonal membrane. Second, by the nature of a neuronal network in which small changes in input may give rise to large changes in output

 

MEMORY. MICROTUBULE THEORY. DEMENTIA, ALZHEIMER'S DISEASE

The possibility that Alzheimer's disease has an underlying cause in the defibrillation of microtubules, based on the possibility that the ordered arrangement of tubulins may provide coded information for the destination of proteins in non-mitotic neurones.

 

 

QUANTUM BIOLOGY. TELEPATHY IN MICROORGANISMS

A controversial look at the possibility that, due to quantum entanglement and the quantum structure of microtubules, quantum telepathy may exist between microorganisms. An experiment is proposed.

 
