The original was posted on /r/singularity by /u/epikcentre on 2024-01-11 04:32:47+00:00.
Genuine nanotech research versus 'quantum computing' (31 Dec 2023)
Recently [the Department of Industry, Science and Resources (Australia)] announced that applications will be considered for $18.5 million in funding for local ‘quantum computing’ companies for the purpose of establishing the so-called Australian Centre for Quantum Growth (1). I write this today to make the point that quantum computing is not a worthwhile field of research, that those engaged in it are practising obfuscation in order to win grants, and that there is no realistic expectation that it will ever yield decent computers.
This argument has been made a number of times by those who have investigated the subject - for example by physicist Mikhail Dyakonov, who said:
Experts estimate that the number of qubits needed for a useful quantum computer, one that could compete with your laptop in solving certain kinds of interesting problems, is between 1,000 and 100,000. So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.
To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe. At this point in a description of a possible future technology, a hardheaded engineer loses interest... To my mind, quantum computing researchers should still heed an admonition that IBM physicist Rolf Landauer made decades ago when the field heated up for the first time. He urged proponents of quantum computing to include in their publications a disclaimer along these lines: “This scheme, like all other schemes for quantum computation, relies on speculative technology, does not in its current form take into account all possible sources of noise, unreliability and manufacturing error, and probably will not work." [Emphasis in original] (2)
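For anyone who wants to check Dyakonov's arithmetic, here is a short, purely illustrative Python sketch (the variable names are arbitrary and it is not taken from the quoted article) confirming that 2^1,000 works out to roughly 10^301, consistent with his "about 10^300":

```python
import math

# A register of n two-level systems is described by 2**n complex amplitudes,
# which is the count of continuous parameters Dyakonov refers to.
n_qubits = 1_000
state_dim = 2 ** n_qubits  # exact, since Python integers have arbitrary precision

print(f"2**{n_qubits} has {len(str(state_dim))} decimal digits")
print(f"2**{n_qubits} is approximately 10**{n_qubits * math.log10(2):.0f}")

# For comparison, estimates of the number of subatomic particles in the
# observable universe are usually put in the region of 10**80 to 10**90.
```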
But one does not need to be a practising theoretical physicist to arrive at such a conclusion. Anyone objectively assessing the facts would be sceptical. The idea that quantum computing should be considered a possibility dates back to a speech (and subsequent paper) given by physicist Richard Feynman forty-two years ago, although he had speculated about it earlier. Feynman was concerned that conventional computers could not effectively simulate molecules and chemical reactions, and he seems to have been sceptical that predicted increases in computer processing power would be of much help with this. In Section 4 of the paper he speculates about developing ‘quantum computers’ as a way around the problem, but he ends by saying this:
The question is, if we wrote a Hamiltonian which involved only these operators, locally coupled to corresponding operators on the other space-time points, could we imitate every quantum mechanical system which is discrete and has a finite number of degrees of freedom? I know, almost certainly, that we could do that for any quantum mechanical system which involves Bose particles. I’m not sure whether Fermi particles could be described by such a system. So I leave that open. Well, that’s an example of what I meant by a general quantum mechanical simulator. I’m not sure that it’s sufficient, because I’m not sure that it takes care of Fermi particles. (3)
Fermi particles - protons, neutrons and electrons - are what everything is made of. Feynman is saying that he is not sure this proposal would work for the one thing he wants to use it for. He could obviously envisage the massive benefits that would flow from being able to take a more systematic approach to designing chemicals and materials at the molecular level, but seems to have been frustrated by living at a time when computers were simply not good enough for the task. This is obvious in hindsight mainly because we can now simulate molecular structures and predict their properties to a level that is far in excess of our ability to physically control them - due to having better computers. Recent notable examples of this include the use of artificial intelligence (AI) - in the form of neural networks - to solve protein folding (4) and to identify hundreds of thousands of new materials (5). So clearly Feynman’s concerns are out of date - increases in computing power have rendered worries about the limits of ‘classical’ computing obsolete. And arguably, although Feynman made a great contribution to physics, in the case of quantum computing - due to something akin to desperation - he made a proposal that does not make sense.
He did, though, feel compelled to give his scepticism the last word - a caution promptly ignored by various researchers eager for the next easy breakthrough (and who, as a group, have a distinct tendency to hype their work). It is therefore worth listing some of the other problems with quantum computing (QC):
- Firstly, it seems to be motivated by magical thinking (6). Computing power has increased since the invention of the transistor chiefly by means of continued miniaturization. It follows that once we are building computers with molecular-scale components, there can be no further improvements of that kind (only those based on design changes). Some people seem to have an ideological objection to that fact, and some of those have become advocates of QC as a result;
- Secondly, QC has no basis in the theory of computing. The data computers process either directly (and approximately) represents some aspect of reality or is a digital simplification of that reality. All computers therefore run on either analog or digital logic, and almost all modern computers are digital devices based on interconnected binary logic gates (a minimal sketch of such gates composed into a circuit follows this list). In contrast, the earliest known computer (the Antikythera mechanism) was analog, as are some of today’s specialized AI chips - although until recently neural networks ran exclusively on digital hardware. But in spite of the all-encompassing nature of these categories, QC does not really belong to either (7);
- Thirdly, even if they do become viable, quantum computers can only ever be special-purpose devices. As one commentator put it: “Just a few years until the plastic-bag-full-of-wasps computer achieves supremacy in simulating the behavior of bags full of wasps.” To head off a broader awareness of such deficiencies, QC advocates are often reluctant to explain the details of their prototypes (by providing circuit diagrams, for example). Instead they rely on people’s incomprehension of quantum mechanics to neuter criticism. But understanding this branch of physics depends above all on an appreciation of the double-slit experiment (Feynman said as much), and many who do understand it disagree with QC;
- And fourthly, QC research and development has been a complete failure - its researchers have nothing useful to offer despite over four decades of work. Compare this to every other new computing technology in history, which either worked right away or quickly led to an improved device that worked (they were also often profitable right away). QC advocates cannot even describe a path whereby their ideas become viable in theory.
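To make the second point concrete, here is a purely illustrative sketch (the function names are arbitrary and it is not drawn from any of the cited sources) of the kind of binary logic-gate composition that underlies essentially all modern digital hardware: a half adder built entirely from NAND gates.

```python
def nand(a: int, b: int) -> int:
    """NAND gate on single bits: returns 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def half_adder(a: int, b: int) -> tuple[int, int]:
    """A half adder (sum bit and carry bit) composed entirely of NAND gates."""
    n1 = nand(a, b)
    n2 = nand(a, n1)
    n3 = nand(b, n1)
    total = nand(n2, n3)  # XOR of a and b
    carry = nand(n1, n1)  # AND of a and b
    return total, carry

# Every signal is a definite 0 or 1 at every step - there is no analogue of a
# qubit's continuous amplitudes anywhere in the construction.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, half_adder(a, b))
```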
There is no conceivable excuse for a failure of this magnitude, and it would be a tragedy if more public money were to be ploughed into the field. By contrast, developing technologies for arranging the molecular structure of materials (as Feynman ultimately wanted to do) has massive potential. Here is what he said in 1959:
But I am not afraid to consider the final question as to whether, ultimately - in the great future - we can arrange the atoms the way we want; the very atoms, all the way down! What would happen if we could arrange the atoms one by one the way we want them (within reason, of course; you can’t put them so that they are chemically unstable, for example).
Up to now, we have been content to dig in the ground to find minerals. We heat them and we do things on a large scale with them, and we hope to get a pure substance with just so much impurity, and so on. But we must always accept some atomic arrangement that nature gives us. We haven’t got anything, say, with a “checkerboard” arrangement, with the impurity atoms exactly arranged 1,000 angstroms apart, or in some other particular pattern.
What could we do with layered structures with just the right layers? What would the properties of materials be if we could really arrange the atoms the way we want them? They would be very interesting to investigate theoretically. I can’t see exactly what would happen, but I can hardly doubt that when we have some control of the arrangement of things on a small scale we will get an enormously greater range of possible properties that substances can have, and of different things that we can do. [Emphasis added] (8)
Note that the methods today’s material scientists have available to synthesize any of the 380,000 newly identified materials mentioned earlier are roughly equivalent to the natural conditions that Feynman implicitly complains about (heat, pressure, and elemental composition). That is to say, it is currently impossible to synthesize most of those materials, and even when we can synthesize the unit cell of a mineral (that arrangement of atoms which, when repeated...
Content cut off. Read original on https://www.reddit.com/r/singularity/comments/193t115/genuine_nanotech_research_versus_quantum_computing/