Concerning Dark Matters


Séadna

Legendary Member
Joined
Sep 3, 2018
Messages
6,304
Reaction score
13,752
One of the bigger papers in foundational physics in the last few years, the Frauchiger-Renner paradox concerning the impossibility of using quantum theory to describe observers themselves and their actions:


It follows a common recent trend: it's one of those things suspected by Heisenberg, Bohr and other early developers of QM, but only recently proven.
 

Nobby-W

Expert in the Dunning-Kruger effect
Joined
Oct 7, 2018
Messages
6,788
Reaction score
14,480
One of the bigger papers in foundational physics in the last few years, the Frauchiger-Renner paradox concerning the impossibility of using quantum theory to describe observers themselves and their actions:


It follows a common recent trend: it's one of those things suspected by Heisenberg, Bohr and other early developers of QM, but only recently proven.

Sounds like a quantum version of Gödel's incompleteness theorem, which states, roughly, that a formal (i.e. 100% rules-based) system can never be fully self-describing. The most famous application of this is perhaps Turing's halting problem, proved by Turing in the 1930s: at least in the general case, you can't predict what an algorithm or program will do by statically analysing the code without running it. This is why virus-scanning software is very hard to make, and why you should assume anybody claiming foolproof computer security is either lying or incompetent.
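Turing's argument can be sketched in a few lines of Python. The `halts` oracle below is hypothetical (no such function can exist — that's the point): feed any claimed oracle into the construction and it produces a program the oracle must misjudge.

```python
def make_troublemaker(halts):
    """Given any claimed halting oracle `halts(f)` (True iff f() halts),
    build a program the oracle necessarily gets wrong."""
    def troublemaker():
        if halts(troublemaker):
            while True:        # oracle predicted "halts" -> loop forever
                pass
        return "halted"        # oracle predicted "loops" -> halt at once
    return troublemaker

# Any oracle is refuted on its own troublemaker. E.g. one that always
# answers "loops forever":
pessimist = lambda f: False
t = make_troublemaker(pessimist)
print(t())  # the oracle said this would never return, yet it prints "halted"
```

If the oracle had instead answered True, the program would loop forever — so whichever answer it gives is wrong, which is the diagonal argument in miniature.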
 

Séadna

Sounds like a quantum version of Gödel's incompleteness theorem, which states, roughly, that a formal (i.e. 100% rules-based) system can never be fully self-describing. The most famous application of this is perhaps Turing's halting problem, proved by Turing in the 1930s: at least in the general case, you can't predict what an algorithm or program will do by statically analysing the code without running it. This is why virus-scanning software is very hard to make, and why you should assume anybody claiming foolproof computer security is either lying or incompetent.
That's cool about virus scanning stuff!

Yeah, it's basically that another observer in quantum theory can't be fully treated as simply another physical system; otherwise you run into contradictions. You need to treat "mental" or "knowledge" states separately from physical states in general.
 

Séadna

I've spoken to one of the writers of the PBS videos before; they put a lot of effort into them. I don't recall ever seeing one with an error, though they're occasionally out of date or present issues as more open than they really are.

In the entanglement video they have Bell's theorem ruling out one of Locality (no effects travel faster than light) or Realism (the quantities we measure have values before measurement).
This means you either reject realism, as quantum theory does, or keep realism and develop an alternative theory to quantum mechanics that drops locality. Like many presentations to the public, they present this as an open issue, i.e. we don't know which needs to be dropped. However, nobody ever developed a working theory that keeps realism, and work since Bell shows that keeping realism basically doesn't work*. So today we know that the world is local, but not realist.
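To put a number on it: for the singlet state, quantum theory predicts measurement correlations E(a,b) = −cos(a−b), and Bell's CHSH inequality says any local-realist theory must keep a particular combination of these correlations below 2. A quick plain-Python sketch with the standard textbook analyser angles shows quantum theory overshooting that bound:

```python
import math

# Correlation between spin measurements at analyser angles a and b,
# as predicted by QM for the singlet state:
def E(a, b):
    return -math.cos(a - b)

# CHSH combination: any local-realist (locality + realism) theory
# obeys |S| <= 2.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, -math.pi / 4
S = E(a1, b1) + E(a1, b2) + E(a2, b1) - E(a2, b2)

print(abs(S))  # ~2.828, i.e. 2*sqrt(2), clearly above the bound of 2
```

The 2√2 figure is Tsirelson's bound, the maximum violation quantum theory allows, and it's what the Aspect-style experiments observed.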

*It's been proven to be impossible to replicate particle decays and certain properties of light if you keep realism, but this work happened a few decades after Bell.
 

Séadna

A little app where you can play around with the quantum properties of light:

Its level-progression system isn't finished yet, but it is available in the older version:
 

Séadna

I meant to follow up on this. You asked this in the Cthulhu thread:
@Seadna, what do you think of M-theory and the notion of a holographic universe?
And I said this:
The holographic conjecture is interesting, but it's a property of String theories in universes unlike ours. It's cool, but not only is there ambiguity over whether String theory is right, there's also ambiguity over whether its properties in those fictitious universes hold in ours.
with this as additional detail:
I should say, since I think the media often exaggerates, that String theories are still quantum theories. So they're still theories of what Observers see. Directly related to your question above, current quantum theory doesn't say exactly what Observers will see at or inside a Black hole; we don't have any clear model of that. String theory is essentially the only extension of quantum theory that handles such cases without blowing up in our faces. I just want to mention this as it's often given the name "The Theory of Everything", as if it completely explained reality, whereas it is still an Observer-focused "subjective" theory.

A simple example is the holography principle itself. This is often presented as if it said the world is a projection/illusion/hologram and is really 2D or something. The principle really says that, in situations with high gravity, an observer's beliefs about events within a region are fixed by their beliefs about events on the region's boundary. It's like how your predictions for the weather in a country might be fixed by readings from weather stations along its border, but this doesn't imply the weather in the interior of the country is an illusion or something.
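As a loose illustration of "boundary data fixing the interior" (only an analogy — the real holographic statement involves quantum gravity, not grids): for Laplace's equation, the values on a region's border completely determine the solution inside it. A small pure-Python relaxation sketch on a made-up 8×8 grid shows that two wildly different starting interiors converge to the same interior once the boundary is pinned down:

```python
# Toy "boundary fixes the interior": Laplace's equation on a grid.
# Fix the border values, start the interior anywhere, and Jacobi
# relaxation converges to the SAME interior either way.
N = 8

def solve(interior_guess):
    g = [[interior_guess] * N for _ in range(N)]
    for i in range(N):                     # fixed boundary data
        g[0][i] = g[N - 1][i] = 1.0
        g[i][0] = g[i][N - 1] = 0.0
    for _ in range(2000):                  # Jacobi relaxation sweeps
        new = [row[:] for row in g]
        for i in range(1, N - 1):
            for j in range(1, N - 1):
                new[i][j] = (g[i-1][j] + g[i+1][j] + g[i][j-1] + g[i][j+1]) / 4
        g = new
    return g

a, b = solve(0.0), solve(42.0)
# Different starting interiors, identical final interior:
print(max(abs(a[i][j] - b[i][j]) for i in range(N) for j in range(N)))
```

The boundary doesn't make the interior an "illusion"; it just carries enough information to reconstruct it, which is the spirit of the weather-station analogy above.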

This was the state of play for the last twenty years, and then, one week after you asked, somebody released a paper that changed things: it seems holography also works in a universe like ours. So now it only depends on String Theory being true.

The interesting thing about the new paper is that it provides stronger mathematical arguments for something that comes up again and again in Quantum Gravity, namely that we have to remove the current notion of time from physical theories. Specifically, we have to remove any reference to intervals of time too small to be perceived by an observer with the aid of some clock. To most this is just part of the general quantum message that physics has to confine itself to statements about directly observable phenomena and not try to provide a "world story".
Niels Bohr said:
Physics is to be regarded not so much as the study of something a priori given, but rather as the development of methods of ordering and surveying human experience

I'm saying this in advance because I imagine stuff like this might appear in the general media under titles like "time is an illusion", whereas the actual message is that only the common everyday experience of time is concrete; the older mathematical notions of time invented for calculus are mental constructions.
 

Séadna

So physics is limited specifically to human ability/perception. That is fascinating.
If you enjoy a bit of a deeper dive:

This is specifically what bothered Einstein. It's sometimes said that he didn't like randomness, but it was more this. In a letter to Schrodinger he says:
Einstein said:
Most of [Bohr, Heisenberg, Pauli, Born, etc] do not see what a risky game they are playing with reality - reality as something independent of what is experimentally established
Or as Pauli described it in a letter to Born:
Pauli said:
Einstein's point of departure is 'realistic' rather than 'deterministic', which means that his philosophical prejudice is a different one

The real limitation as such is called "complementarity" or "the failure of unicity". It's like in everyday life: we can describe a sand dune either as a continuous fluid (useful for the large-scale behaviour of dune migration across the Sahara, say) or as a collection of sand particles (useful for describing small parts of a dune collapsing). However, we always have a way of combining them by finding out the true description, i.e. a sand dune is really just a collection of grains. Even if we can't experimentally check which one is right, our theories often tell us. So you could, either directly or via your theories, learn the "true" description.

However, quantum theory breaks this. You can prove that empirical evidence gathered in separate scenarios logically contradicts itself, so you are not permitted to combine it into a more unified and accurate picture. The French physicist Roland Omnès often used the example of a man who could choose either to see or to hear. He walks down a road. If he chooses to open his eyes he either finds himself in a cathedral or a cottage; if he chooses to hear, he either hears the sounds of being underwater or an explosion. Not only is each piece of sense data random, they cannot be combined: the cottage and cathedral are never seen to be exploded or underwater.

A proper example is nuclear reactions. The common picture presented is that neutrons hit nuclei and release energy.
However, nuclear reactions usually take place at about 800 degrees, and if you churn through the calculations they say that if you measure the neutron's position at this temperature then 99% of the time it is far away from the nucleus. So the nuclear reaction shouldn't be working at all. However, in nuclear reactors we don't measure the neutron's position, so energy production isn't logically constrained by it. If you do measure the neutron's location, the reaction actually slows down, since then there is an observable phenomenon which would contradict it.
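The "measuring it slows it down" behaviour resembles the textbook quantum Zeno effect, which is easy to sketch in a toy two-level model. Everything below is illustrative (the cos² survival law and the rate w are stand-ins, not reactor physics): checking the system repeatedly resets its evolution, pinning it in place.

```python
import math

# Toy quantum Zeno effect: a two-level system whose survival
# amplitude after time t is cos(w*t). Measuring it n times during
# the interval collapses and restarts the evolution each time.
def survival(total_time, n_measurements, w=1.0):
    step = total_time / n_measurements
    return math.cos(w * step) ** (2 * n_measurements)

unwatched = survival(1.5, 1)     # look once, at the end
watched   = survival(1.5, 100)   # keep checking as it evolves

print(unwatched, watched)  # frequent measurement keeps survival near 1
```

In the limit of continuous measurement the survival probability goes to 1: the watched pot never boils, which is the same logic as the measured neutron slowing the reaction.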

In general, physics must now confine itself purely to what is directly observable by our senses or our devices (which themselves must have directly observable readouts or reports). No longer "a photon produced in the sun struck my device" but, more accurately, "my photodetector left out in the sun developed a mark".
 

Nobby-W

I'm saying this in advance because I imagine stuff like this might appear in the general media under titles like "time is an illusion", whereas the actual message is that only the common everyday experience of time is concrete; the older mathematical notions of time invented for calculus are mental constructions.
As any fule kno, when taken from a non-linear, non-subjective point of view, it's more like wibbly-wobbly timey-wimey stuff.
Best line ever in Doctor Who, with the possible exception of reverse the polarity of the neutron flow.
 

Voros

Doomed Investigator
Joined
Sep 23, 2017
Messages
12,522
Reaction score
24,412
I'm a mathematical moron but fascinated with physics.

This book is a relatively accessible summary of Gödel's proof.

Reading it reminded me of the statement by Lewontin that a common cause of a lot of modern scientism is a lack of grounding in the philosophy of logic.

17136289.jpg

And this is a classic of popular philosophy and science that is still a thought-provoking and lyrical read.

61x9qySMjzL.jpg
 

Séadna

This is essentially the "canonical" calculation one has to be able to do by the end of a QFT course in graduate school. Schwinger's main contribution was figuring out how to prevent the infinities in the calculations that had blocked previous attempts at working out the anomalous magnetic moment; how he did it is still fairly controversial, since it sort of makes no logical sense but gets the right answer.
 

Séadna

Reading it reminded me of the statement by Lewontin that a common cause of a lot of modern scientism is a lack of grounding in the philosophy of logic.
If you like Hofstadter you might really enjoy "Are Quanta Real?" by J.M. Jauch which Hofstadter himself praised.

Scientism is a strange one, as the "peak" of its genuine scientific support came in the late 19th century, around the time of Boltzmann's work on atoms, when it was somewhat possible to imagine the world as a Lego set built from small components obeying fixed rules/laws. Back then, although that picture worked for some things, e.g. gases in equilibrium, nobody took it seriously. Ironically, today, when we know from quantum theory that it's explicitly false, it's much more prominent in popular thought.

I often see essays online waxing lyrical about things like determinism, atomism, reductionism, mechanism, materialism and how to cope with their "shocking truth", which is daft because we've known since 1925 that they're all false. It's impossible in quantum theory to conceive of atoms as actual objects that really exist; rather they are "poetic metaphors" that help in visualising certain experiments.
Bohr said:
We must be clear that when it comes to atoms, language can be used only as in poetry. The poet, too, is not nearly so concerned with describing facts as with creating images and establishing mental connections.
I suspect part of this (based on teaching the subject in different countries) is that Anglophone society in particular has constructed a very strong Mysticism/Materialism dichotomy, so that if you say materialism is false people immediately think you're saying ghosts, psychic powers etc. are real. I noticed much less of this sort of thing in Japan, and even in places like France the philosophical tradition is a bit different.
 

Séadna

Edwin Steiner, an Austrian programmer who has worked with LHC data, has an introductory series on QM. The series is intended for those who want to learn QM at the mathematical level, but even if one doesn't want that depth the introductory video is well worth a watch, even just for the quotes. The series is recommended by W.A. Zajc, head physics prof at Columbia, and I've seen it used in a few classes.

 

Séadna

Might throw up a few cool pics now and again.

The first ever evidence of antimatter: a single streak in a cloud chamber, seen in 1932 by Carl Anderson. The positron had been predicted the year before by Paul Dirac purely mathematically, i.e. Dirac didn't postulate it; he just tried to model how an electron behaves relativistically and the positron popped out, required by Relativity:

1024px-PositronDiscovery.jpg

Chien-Shiung Wu with her specially designed equipment that super-cooled cobalt so that she could check the directions of decays coming off it. It confirmed for the first time that the universe doesn't conserve Parity. That is, if you flip the image of a decay in a mirror you get another valid decay, but that mirror-image decay happens much more rarely in nature. Something impossible in classical physics.

dgwrzgmzu3ib7sycbpss.jpg

Richard Feynman and Julian Schwinger had independently figured out how to combine Relativity and Quantum Mechanics without generating all the infinities that plagued previous attempts. Here they are discussing and comparing their approaches on Shelter Island, New York, the night before the talks where they had to convey their results to the European "founders" of quantum theory such as Bohr and Heisenberg. Schwinger is crouching on the floor at the extreme right, Feynman in the middle speaking.

R.jpg

Anton Zeilinger with his table-top equipment that provided the first "logical" refutation of observation-independent reality. Earlier tests by the French physicist Alain Aspect had refuted objective theories statistically, i.e. they predict that certain groups of photodetector clicks in certain experiments had to occur less often than a certain bound, but he found they were more common than that. Zeilinger improved on this by performing an experiment where objective-realist theories said a certain type of click was impossible, but quantum theory predicted it always occurred, i.e. the biggest possible disagreement one could have. QM was confirmed, of course:

detection-loophole-768x512.jpg

What I think is a cool image from a paper by Wojciech H. Żurek, one of the heads of the Los Alamos laboratory where the atom bomb was designed and partially built.

Screenshot 2021-09-17 at 21-20-20 1808 08598 pdf.png
 

Voros

The Babbage podcast from The Economist has a good discussion on the decline of string theory due to evidence from recent experiments (nice to see the string theory skeptics finally winning the day) and something called entropic gravity.

 

Séadna

Hideki Yukawa and Richard Feynman in Kyoto in 1954. Yukawa was the first person to explain why the nucleus of an atom didn't just explode immediately since it contains positive charges packed close together. He proposed that protons and neutrons exchanged an additional particle, the pion, which kept them glued together with a strength the electromagnetic force couldn't overcome. He later received the Nobel prize for this work.

yukawa.jpg

The JUQUEEN supercomputer cluster in 2014, Europe's most powerful at the time. Running at full power with the second most powerful cluster, JUROPA, for several days, it managed to simulate a single proton well enough to compute its mass to within a percent.

juqueen-full.jpg

Chen-Ning Yang and Tsung-Dao Lee working out the early details of the Weak Nuclear force. Yang would go on to figure out the fundamental equations governing all forces except gravity. Both won Nobels.

296.jpg

Shin'ichirō Tomonaga, ardent Japanese traditionalist and the first person to correctly combine quantum theory and relativity. Julian Schwinger would do the same within a year in the United States, but post-war difficulties prevented communication and neither would know of the other's work for another year. Funnily enough, Tomonaga and Schwinger both mean "Shaker". Richard Feynman later combined quantum theory and relativity using very different methods, which are easier to use but less fundamental than those of Tomonaga and Schwinger. All three would later share the Nobel for this work.

shinichiro-tomonaga-smoking-a-cigarette-bettmann.jpg

A very recent and exciting experiment (July 2021)! A machine that pumps positrons into silicon crystals at Bar-Ilan University, Israel. This provided the first experimental confirmation that even the number of atoms present in a material is observer-dependent and non-objective, an effect first predicted by the Canadian professor Bill Unruh in British Columbia in 1976.
Stationary detectors perceived the crystal as having the usual number and structure of atoms, but a highly accelerated detector saw a "bath" of super-heated atoms within the crystal, confirming that there is no objective answer to "how many atoms are there in an object?", just contradictory subjective answers. Contradictory because, if you imagine the superheated atoms are objectively present, they should have superheated the stationary detector. The presence of two mutually contradictory truths like this is what Bohr called "Complementarity".

biu8-1-18part20172med.jpg
 

Nobby-W

[…]

The JUQUEEN supercomputer cluster in 2014, Europe's most powerful at the time, which running at full power with the second most powerful cluster the JUROPA for several days managed to simulate a single proton well enough to compute its mass to within a percent.

[…]

Supercomputers are funny things. The first teraflop supercomputer was ASCI Red in 1996. Today I think it would be possible to build a machine off eBay for maybe £10k or so that would outperform it on a fair variety of workloads.
 

Séadna

Supercomputers are funny things. The first teraflop supercomputer was ASCI Red in 1996. Today I think it would be possible to build a machine off eBay for maybe £10k or so that would outperform it on a fair variety of workloads.
I know for some scientific simulations they're being replaced by clusters of GPUs, but I don't know much beyond that.
 

Nobby-W

I know for some scientific simulations they're being replaced by clusters of GPUs, but I don't know much beyond that.
Pretty much all supercomputers since about 2000 or so have been clusters. Big SMP machines did get some play but the biggest have been clusters since the 1990s or so. You can put GPUs into a cluster. There are a few places where you can get bottlenecks.

GPUs are SIMD devices - essentially they have a bunch of cores, each with one execution unit and a bunch of arithmetic logic units. The execution unit dispatches the same instruction to all the ALUs - the group of lockstep threads this produces is called a warp in GPU speak, and the individual lanes of work are called threads. The execution unit can do the same thing to a bunch of data at the same time through this architecture. You can have conditionally executed instructions, but if the threads in a warp branch down different paths, the execution unit has to go back and run each path separately; this is called warp divergence (a "split warp"). A GPU will also have multiple execution units.
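A toy way to picture what a diverged warp costs (plain Python standing in for the hardware; the data and the two branches are made up): the warp executes both paths for everyone, and a per-lane mask then selects which result each thread keeps. Every lane pays for both branches:

```python
# Toy model of one warp executing a data-dependent branch.
# A diverged warp runs BOTH paths and masks out the lanes that
# didn't take each one - so every lane pays for both.
data = [3, -1, 4, -1, 5, -9, 2, -6]          # one value per "thread"

mask       = [x >= 0 for x in data]           # branch predicate, per lane
path_true  = [x * 2 for x in data]            # whole warp runs branch A
path_false = [-x for x in data]               # whole warp runs branch B

result = [t if m else f
          for m, t, f in zip(mask, path_true, path_false)]
print(result)  # [6, 1, 8, 1, 10, 9, 4, 6]
```

This is why branch-heavy kernels are slow on GPUs and why tuning often means restructuring code so all the threads in a warp take the same path.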

GPUs are good at tasks that involve doing the same thing to a lot of data - fortunately this is the core of matrix operations that sit behind a lot of numerical computation and graphics. Although they are heavily optimised for a certain type of use case, that use case turns up in a lot of different applications.

GPUs are fairly memory-bound, even with GDDRx memory, so you try to optimise code to do as much as possible at the register level without shoving large amounts of data into and out of memory. Tuning GPU code largely entails frigging it to avoid divergent warps, make as much use of in-register computation as possible, and minimise memory traffic.

GPUs also come in flavours that can split the individual floating point units into multiple 8 or 16 bit computations, which gives even more bang for buck in lower precision computations. Most number crunching uses 64 bit floats for precision. Most graphics uses 32 bit floats, and most neural net applications can use 8 or 16 bit floats. The current generations support the AI optimisations, and some have further optimisations for this type of work, although this is stretching my understanding of the intricacies of the hardware.
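You can see the precision trade-off directly in Python, whose struct module can round-trip a value through half (16-bit), single (32-bit) and double (64-bit) floating-point formats:

```python
import struct
import math

# Round-trip pi through the three common float widths to see what
# you trade away for the extra throughput of narrower formats.
def round_trip(fmt, x):
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

half   = round_trip('<e', math.pi)   # 16-bit
single = round_trip('<f', math.pi)   # 32-bit
double = round_trip('<d', math.pi)   # 64-bit

for name, v in [("half", half), ("single", single), ("double", double)]:
    print(f"{name:6s} {v!r}  error {abs(v - math.pi):.2e}")
```

Half precision already misses pi in the fourth digit (3.140625), which is useless for most number crunching but perfectly adequate for a neural-net weight.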

Clustered machines will have network bottlenecks if they have to shovel data over the network, and NUMA machines can have bottlenecks going across the CPU interconnects. NUMA means non-uniform memory access - essentially each CPU in a shared-memory machine has some local memory, but getting to memory on other CPUs involves going over an interconnect, so it's a bit slower. Older machines like early Sequent servers were much slower at this, but the overhead is not so much on modern kit. All AMD machines since the Opteron and all Intel machines since the Nehalem (G6)[1] are NUMA architectures.

Optimising code for NUMA architectures also involves aiming for CPU affinity (code running on a CPU uses its local memory where possible) and I/O affinity (keeping I/O operations like disk on peripherals connected to the local CPU). If you have to go over a network to a remote machine for data this will impose an overhead, so optimising code for clustered machines entails trying to minimise the need for this.

Infiniband has a bunch of features like RDMA (remote DMA) that lets you fake a single system image with a cluster - i.e. the cluster can present itself as a single large machine with thousands of CPUs. However, you still have to optimise your code to play nicely with the bottlenecks of the system. Having said this, a modern crossbar or minimal spanning switch can handle enormous aggregate throughput due to the intrinsic parallelism of the architecture. On modern systems, the main issue with the network is latency or the possibility of your computation developing hotspots with a lot of contention over a subset of the nodes.

ASCI Red used a parallel file system - i.e. one where the I/O was explicitly split across multiple I/O nodes and the processing pulled data from the individual nodes. Google's BigTable and anything in the Hadoop family use a similar architecture. Various other parallel file systems also exist.

I did a back-of-a-fag-packet calculation of the total CPU and memory throughput and came to the view that the total integer performance of ASCI Red was about equivalent to double what one could get from a modern 4-socket machine with 24-28 core CPUs, with aggregate memory bandwidth a bit greater. It had two parallel file systems, each of which could put 4GB/sec through and had about 6TB of storage. You could put two NVMe SSDs in a machine today and get comparable performance. The QuickPath interconnect on a modern Xeon is a bit slower than the maximum switching throughput on ASCI Red, so traffic that had to go across the QP bus might be a bit slower. With appropriately tuned code that wasn't subject to other bottlenecks, a GPU would have much faster floating-point performance than ASCI Red.

On some workloads ASCI Red would probably be faster; on others our hypothetical machine would probably outperform ASCI Red - for about 1/10,000th of the price.

TL;DR: If you really wanted to, you could build a PC with comparable throughput to a supercomputer that was state of the art maybe a couple of decades ago.
_________________
[1] Fun fact: There's a school of thought that 'Fly like a G6' actually refers to Intel's 6th-generation Nehalem architecture as a simile, as the move from the old front-side-bus architecture to QuickPath was a big performance win for folks running sound-mixing software.

 

Séadna

The Babbage podcast from The Economist has a good discussion on the decline of string theory due to evidence from recent experiments (nice to see the string theory skeptics finally winning the day) and something called entropic gravity.

He covers the stuff about the decline of String Theory well, but there are three small things I would expand on. This isn't a disagreement with anything he says, just an expansion since I think it might be of interest.

  1. The first is where he says General Relativity and Quantum Theory are contradictory. I would rather say we consider it "odd" that gravity is still treated as a classical objective force in the 19th-century sense, and most physicists "feel" it should be quantum like everything else, but that could easily be wrong. Some even think gravity plays a large role in the emergence of objectivity. Regardless, keeping gravity classical causes no contradictions or issues, and experimental evidence certainly doesn't rule it out.

  2. Second, and this is more an expansion on what he says: String theory actually isn't literally about very small vibrating strings. Most quantum theories say the subatomic level can manifest as particles or fields or strings or (loads of other stuff) in our devices. String theory differs from other theories by proposing that the string manifestation is most important for gravity, whereas our experience is that the field one is most important for the other forces. "Most important" here means "the theory is mathematically simplest when written in terms of that aspect".

  3. Finally, the thing about qubits being "up and down at the same time". This is worth focusing on a bit, especially using Schrodinger's cat, since it goes to the heart of many aspects of QM.

    As mentioned before on this thread, in QM (to keep this short) the theory doesn't give a value or description to anything outside of direct perception by observers. To quote Bill Unruh from above:
    Essay titled THOUGHTS ON NON-LOCALITY AND QUANTUM MECHANICS said:
    in quantum mechanics [...] one cannot regard the system in question as having a value for the quantity of interest in the absence of measurement
    Schrodinger's issue was that he could set up, and describe mathematically, a situation where this seemed to apply to a cat's being alive or not. Rather than being dead and alive at the same time, the issue was much stranger: the cat was neither dead nor alive but rather "undefined" until somebody chose to look in the box.

    This is really nothing more than applying the same principles from atomic physics to the everyday world. If an atom has no well-defined position until an observer checks, well, why not go whole-hog and say nothing is well-defined until an observer checks? Including something as stark as whether an animal is dead or not.

    The "dead and alive" language rather than the more correct "status undefined until somebody looks" arose in the 1980s when some people wanted a way to visualise how a quantum computer works. Imagining it as a suped-up parallel processing computer. It was a noble idea I would say, but we now know it doesn't work. Just like every quantum system a quantum computer isn't given "gears and wheels" as to how the computation is accomplished. It's just "if you do XYZ to a crystal, then measure the crystal in manner ABC, the crystal is 80% likely to spit out the answer to this computational problem encoded in the measurement data". Physical systems can just spit the completed answers for problems back to an observer without any computing or mechanism.

    As for Schrodinger's original question of whether the cat is alive or dead before somebody looks, this really reduces to who counts as an observer. I don't want to give my own view in detail, which would just be one biased position. However it is worth seeing this quote (different from my own view) from Chris Fuchs, head of the Quantum Information group at UMass Boston and probably the world's leading expert on Quantum Information:

    Essay titled RESPECTING ONE'S FELLOW said:
    our definition of an agent is broad: It does not rule out attributing agency to dogs, euglenas, or artificial life. However, it does exclude a computer program that deterministically “chooses” an action from a look-up table

    The last option has to be excluded due to certain theorems (the Frauchiger-Renner theorem I linked to above is related to this) showing that something controlled by an algorithm can't be an observer, but after that there is little we can say with certainty.
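To make the "no gears and wheels" point concrete, here's a toy sketch (generic textbook maths, nothing specific to any real device) of what quantum theory actually hands you for a single qubit: amplitudes in, outcome probabilities out, and nothing in between.

```python
import math

# A qubit state is just a pair of amplitudes (a, b) for outcomes 0 and 1,
# normalised so that |a|^2 + |b|^2 = 1.
def born_probabilities(a: complex, b: complex):
    """Born rule: the theory's entire output is outcome probabilities."""
    return abs(a) ** 2, abs(b) ** 2

# Prepare |0> and apply a Hadamard gate, giving amplitudes (1/sqrt 2, 1/sqrt 2):
# the state popularly described as "0 and 1 at the same time".
h = 1 / math.sqrt(2)
p0, p1 = born_probabilities(h, h)
# Each outcome is 50% likely. The formalism supplies no mechanism for how a
# particular result comes about -- only this probability assignment.
```

The 80% figure in the crystal example above is exactly this kind of number: a Born-rule probability, with no accompanying story of "how".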
 
Last edited:

Silent Green

Legendary Member
Joined
Jul 10, 2020
Messages
304
Reaction score
553
[…]

TL;DR: If you really wanted to, you could build a PC with comparable throughput to a supercomputer that was state of the art maybe a couple of decades ago.
Thanks for the enlightening read; could you please add a paragraph about GPUs and Bitcoin? I've read that part of the general computer chip shortage is a GPU shortage caused by people hoarding GPUs in farms to mine Bitcoin (which, I gather, is mainly finding prime numbers in the bazillion range). All of which is, at least to me, a pretty dark matter in many respects.
 

Nobby-W

Expert in the Dunning-Kruger effect
Joined
Oct 7, 2018
Messages
6,788
Reaction score
14,480
Thanks for the enlightening read; could you please add a paragraph about GPUs and Bitcoin? I've read that part of the general computer chip shortage is a GPU shortage caused by people hoarding GPUs in farms to mine Bitcoin (which, I gather, is mainly finding prime numbers in the bazillion range). All of which is, at least to me, a pretty dark matter in many respects.
GPUs can exploit parallelism to do the hashing calculations for crypto mining much faster than a CPU can. You can also now buy dedicated mining ASICs that have the hashing functions built into hardware and are faster still. The GPU shortage was caused by a couple of factors: demand from miners was one, but the other was manufacturers underestimating demand for GPUs, which spiked with COVID as more people were stuck at home.
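To illustrate why the workload suits GPUs and ASICs: Bitcoin mining is brute-force hashing (double SHA-256 over a block header in the real protocol, not prime hunting). Below is a deliberately simplified single-threaded sketch with a made-up payload and toy difficulty; the point is that every nonce can be tried independently, which is why the job parallelises so well.

```python
import hashlib

def mine(payload: bytes, difficulty: int) -> int:
    """Find a nonce such that sha256(payload + nonce) starts with
    `difficulty` zero hex digits -- a toy stand-in for proof-of-work."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(payload + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # each candidate is independent, hence trivially parallel

nonce = mine(b"toy block header", 4)  # ~65k hashes on average at this difficulty
```

Real-network difficulty is astronomically higher than this toy setting, which is why dedicated hardware wins.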

Supply is expected to catch up with demand in about a year or so, and a couple of provinces in China have banned crypto mining, which dumped a lot of used GPUs onto the Chinese market, depressing prices locally. When that happened, 3060 prices dipped a bit on eBay, but they seem to have come back now. Over the next year or so the inflated prices of GPUs should ease off significantly.

I expect that at some point 30 series GPUs will be readily available, which should put downward pressure on the market for used 10 series units. My pick is that if you have a big enough power supply, secondhand 1080s will be quite good value for money in the next 6-12 months or so. You can refurbish them by getting fan kits (i.e. replacement fans) and replacing the thermal compound on the heatsink.
 
Last edited:

Séadna

Legendary Member
Joined
Sep 3, 2018
Messages
6,304
Reaction score
13,752
Just a little visual contrast. The Standard Model Lagrangian, i.e. the expression describing matter and the three quantum forces (Electromag and the two Nuclear forces).

Section 1 is the Strong force, Section 2 the Weak and EM forces, Section 3 how matter feels the weak force, Section 4 is matter interacting with the Higgs field, Section 5 is a pile of terms describing the complicated geometry of the fields making up the weak and strong forces:

sml.jpg

This is gravity:

2d216d2cd6a2ebab1540022827da8e3d.jpg
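In case the attachment doesn't render: the compact tensor form of the Einstein field equations (presumably what the image shows, possibly in an equivalent form) fits on one line:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

One equation relating spacetime curvature (left side) to matter and energy (right side), versus the page of terms above.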
 

Nobby-W

Expert in the Dunning-Kruger effect
Joined
Oct 7, 2018
Messages
6,788
Reaction score
14,480
Just a little visual contrast. The Standard Model Lagrangian, i.e. the expression describing matter and the three quantum forces (Electromag and the two Nuclear forces).

Section 1 is the Strong force, Section 2 the Weak and EM forces, Section 3 how matter feels the weak force, Section 4 is matter interacting with the Higgs field, Section 5 is a pile of terms describing the complicated geometry of the fields making up the weak and strong forces:

View attachment 35734

This is gravity:

View attachment 35735
It's all Greek to me.
</dadjoke>
 
Last edited:

spittingimage

hawwwk-ptui
Joined
Sep 21, 2018
Messages
2,229
Reaction score
6,079
Just a little visual contrast. The Standard Model Lagrangian, i.e. the expression describing matter and the three quantum forces (Electromag and the two Nuclear forces).

Section 1 is the Strong force, Section 2 the Weak and EM forces, Section 3 how matter feels the weak force, Section 4 is matter interacting with the Higgs field, Section 5 is a pile of terms describing the complicated geometry of the fields making up the weak and strong forces:

View attachment 35734

This is gravity:

View attachment 35735
This must be what science looks like to muggles.
 

Klibbix!

Depraved Necromancer
Joined
Dec 18, 2020
Messages
613
Reaction score
1,460
He covers the stuff about the decline of String Theory well, but there are three small things I would expand on. This isn't a disagreement with anything he says, just an expansion since I think it might be of interest.

  1. The first is where he says General Relativity and Quantum Theory are contradictory. I would rather say we consider it "odd" that gravity is still treated as a classical objective force in the 19th-century sense, and most physicists "feel" it should be quantum like everything else, but that could easily be wrong. Some even think gravity plays a large role in the emergence of objectivity. Regardless, keeping gravity classical causes no contradictions or issues, and experimental evidence certainly doesn't rule it out.

  2. Second, and this is more an expansion on what he says, String Theory actually isn't literally about very small vibrating strings. Most quantum theories say the subatomic level can manifest as particles or fields or strings or loads of other things in our devices. String Theory differs from other theories by proposing that the string manifestation is the most important one for gravity, whereas our experience is that the field manifestation is the most important one for the other forces. "Most important" here means "the theory is mathematically simplest when written in terms of that aspect".

  3. Finally the thing about qubits being "up and down at the same time". This is worth focusing on a bit, especially using Schrodinger's cat, since it goes to the heart of many aspects of QM.

    As mentioned before on this thread, QM (to keep this short) doesn't assign a value or description to anything outside of direct perception by observers. To quote Bill Unruh from above:

    Schrodinger's issue was that he could set up and describe mathematically a situation where this seemed to apply to a cat's being alive or not. Rather than the cat being dead and alive at the same time, the issue was much stranger: the cat was neither dead nor alive but rather "undefined" until somebody chose to look in the box.

    This is really nothing more than applying the same principles in atomic physics to the everyday world. If an atom has no well-defined position until an observer checks, well why not go whole-hog and say nothing is well-defined until an observer checks? Including something as stark as an animal being dead or not.

    The "dead and alive" language, rather than the more correct "status undefined until somebody looks", arose in the 1980s when some people wanted a way to visualise how a quantum computer works, imagining it as a souped-up parallel-processing computer. It was a noble idea I would say, but we now know it doesn't work. Like every quantum system, a quantum computer isn't given "gears and wheels" for how the computation is accomplished. It's just "if you do XYZ to a crystal, then measure the crystal in manner ABC, the crystal is 80% likely to spit out the answer to this computational problem encoded in the measurement data". Physical systems can just spit completed answers back to an observer without any underlying computing mechanism.

    As for Schrodinger's original question of whether the cat is alive or dead before somebody looks, this really reduces to who counts as an observer. I don't want to give my own view in detail, which would just be one biased position. However it is worth seeing this quote (different from my own view) from Chris Fuchs, head of the Quantum Information group at UMass Boston and probably the world's leading expert on Quantum Information:



    The last option has to be excluded due to certain theorems (the Frauchiger-Renner theorem I linked to above is related to this) showing that something controlled by an algorithm can't be an observer, but after that there is little we can say with certainty.

Forgive me if this is a stupid question, but doesn’t the cat itself count as a (self-)observer? If the experiment is changed and it is instead a human being in the box, how could they be in a state of uncertainty if they themselves were observing that they were alive and well (or dead)? I understand that the observer outside of the box has no way of determining what is going on inside, but wouldn’t that uncertainty have already been ‘set’ by the observer within?

or is it all completely dependent on individual observances?
 

Séadna

Legendary Member
Joined
Sep 3, 2018
Messages
6,304
Reaction score
13,752
Forgive me if this is a stupid question, but doesn’t the cat itself count as a (self-)observer? If the experiment is changed and it is instead a human being in the box, how could they be in a state of uncertainty if they themselves were observing that they were alive and well (or dead)? I understand that the observer outside of the box has no way of determining what is going on inside, but wouldn’t that uncertainty have already been ‘set’ by the observer within?

or is it all completely dependent on individual observances?
It's certainly not a stupid question. In fact replacing the cat with a human is exactly what Eugene Wigner did in his 1961 "Wigner's Friend" essay to sharpen the question. It'll just take a few steps to answer; if you don't mind I'll break it into separate posts, since it'd just be a massive wall of text otherwise.

Sorry for the delay in answering, busy IRL. Feel free to ask follow up questions, if I don't answer it's similar IRL stuff delaying me but I'll get around to it. If any of this makes little sense, that's my failing and I'll try again.

The posts will be:
Complementarity
Interpretations of QM
The Cat Redux

So the first thing to discuss is the central notion of quantum theory, which is Complementarity (sometimes called "Contextuality"). As Pauli and Heisenberg mentioned a few times, if we could rewind the clock it should really have been called "Complementarity Mechanics" rather than "Quantum Mechanics". I'll discuss examples first and then sum up with the general definition.

Complementarity:
So for example, if we take a collection of photos of a house from different angles, we have a set of 2D images that we can combine into a single 3D picture of the house. That is, we can compose the partial information obtained from different photos into a more comprehensive description of the house. However we could also obtain that 3D info in one go, using say an ultrasound scan of the house.

Similarly we could photograph a passing car and somebody else could hold out a speedometer and combining them we'd know the position and speed of the car and so figure out the general motion of the car. However somebody else could capture the motion in "one go" by just filming the car.

In both cases you have a method where you combine bits of partial information to obtain the full picture, and another method where you just acquire the full picture directly. What Complementarity says is that the "combination" method is only valid provided some "in one go" method actually exists. You don't have to actually perform the "in one go" method yourself, and it needn't even be feasible with current technology; it just has to be possible in principle.

Now think about how often we take information extracted from different situations and use logic or reason or intuition to combine it. Just as Relativity bans faster-than-light travel, Quantum Theory bans this sort of "over-extension" of reason, i.e. attempting to use reason to go beyond what your direct data captures or could capture. To illustrate I'll use four cases, the last being the most extreme.

  1. The colour of light and photons. When we look at light with our eyes we see colour; if we put a photodetector in the way of a light beam we detect photons. However no machine or object can sense both colour and photons, i.e. nothing can function as both an eye and a photodetector. Thus it makes no sense to combine the concepts "colour" and "photon". Sentences like "Sight is caused by photons hitting your eye" are actually without content according to quantum theory, since they combine information from two incompatible situations.

  2. If you put a chemical sample in a bath of water you can measure how much energy it releases when reacting with the water, or check how many atoms it has, but never both at once. This means you can't say sentences like "The energy was released because the atoms did X, Y and Z".

  3. In a nuclear reactor you can either check the motion of the neutrons and whether they collide with other particles or how much energy the reactor is producing, but not both. Thus again QM says sentences like "The nuclear reaction releases energy because the neutrons collide with and split nuclei" are meaningless. In fact if you actually try to check where the neutrons are the reaction begins to shut down and give far less energy. The neutrons having a well-defined position is completely in contradiction with a functioning nuclear reaction. Note how this means the usual explanation of nuclear fission is simply false.

  4. An extreme example due to Bohr is the psychological information you can obtain from a person by asking "How do you feel today?", "What did you think of the latest episode of X?", etc. Now imagine the complete chemical information of their brain. A full list of every molecule, its chemical state etc. If you think about it, it is impossible to extract both sets of information at the same time. The total chemical information could only be obtained if you shredded the brain and fed it into a mass spectrometer or something, in which case you certainly couldn't ask the person what they thought of the latest episode of something.
    Thus Quantum Theory would say it is meaningless to pose questions like "How did the person's feelings arise from the chemical state of the brain?", since you are attempting to combine information obtained from two incompatible ways of extracting information.
You could say Complementarity is the idea that the scope of logic/reason is tightly controlled by how you actually learn information. Never try to reason about or even imagine the existence of information you could never directly learn in one go.

Randomness and Observers:

Quantum Theory is often presented with randomness or probability taking centre stage, but this is not the correct focus; the probability stuff is just a consequence of Complementarity. Determinism is based on the idea that, given enough information, you could completely predict the world, or even just a single stone. Is there a machine or device that could actually extract all that required information? No, according to basic physical principles. By Complementarity that information is then a meaningless extrapolation of human imagination, and determinism is false. This was directly checked in the Aspect experiments of the 80s, where determinism was experimentally refuted. Zeilinger, who I mentioned above, later did stronger tests proving the same thing.
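For the curious, what the Aspect-style experiments actually test is a Bell/CHSH inequality: any local deterministic ("hidden variable") model must satisfy |S| ≤ 2, while quantum theory predicts up to 2√2 ≈ 2.83. Here's a quick numeric check of the textbook quantum prediction for the singlet state (standard detector angles, not data from any particular experiment):

```python
import math

def E(a: float, b: float) -> float:
    """Quantum prediction for the singlet-state correlation with
    detector angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH settings that maximise the quantum value.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
# |S| works out to 2*sqrt(2) ~ 2.83, violating the local-deterministic
# bound of 2 -- and experiments agree with the quantum value.
```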

Secondly, if you cannot combine information to obtain a picture of things beyond the direct experimental or sense data you actually obtained, you have to be very careful with your language. If you look at the colour of light, then you chose to see "colour" and not "photons", and so you cannot talk about photons or even their existence without running into contradictions. If photons, atoms, energy, etc. can only be said to exist in the right observational situation, then they have no objective, observation-independent meaning. Which is what brings in observers.
 