The talk with both Leonard Susskind and Seth Lloyd is a good watch, thanks!
While diving into the entanglement subject (Leonard Susskind) I found this very nice summarizing image:
<Quoted Image Removed>
Source:
http://www.nature.com/news/the-quantum-source-of-space-time-1.18797

And a quote from the same source:
"The geometry–entanglement relationship was general, Van Raamsdonk realized. Entanglement is the essential ingredient that knits space-time together into a smooth whole — not just in exotic cases with black holes, but always."Thanks for the article youhn.
I had been thinking about the error-correction problem in conjunction with a fractal multiverse running the way I described, where the successive fractal representations run in sync, thereby reinforcing each Planck-time computation that produces the next iteration. Each iteration that is one step zoomed out from its fractal representation would receive the combined, reinforced information (which would be identical) from all levels "below" it. Any small deviation in any single representation would not be "allowed" to occur this way.
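As a loose classical analogy only (a repetition code, not real quantum error correction, and all names here are mine): redundant copies of the same information can outvote a local deviation, which is the flavor of the reinforcement I mean across fractal levels.

```python
# Toy classical analogy (NOT real quantum error correction):
# several redundant "levels" each hold a copy of the same state bit;
# a corrupted copy is outvoted by the identical copies "below" it.
from collections import Counter

def reinforce(levels):
    """Return the majority value across all fractal 'levels'."""
    return Counter(levels).most_common(1)[0][0]

# Five levels agree on 1; one level flips due to a local deviation.
levels = [1, 1, 0, 1, 1, 1]
corrected = reinforce(levels)  # the single deviation is outvoted -> 1
```

In a real quantum error-correcting code the redundancy is spread across entangled qubits rather than literal copies, which is exactly the "non-local" property the article describes.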
Where in the article:
"In principle, when the qubits interact and become entangled in the right way, such a device could perform calculations that an ordinary computer could not finish in the lifetime of the Universe. But in practice, the process can be incredibly fragile: the slightest disturbance from the outside world will disrupt the qubits’ delicate entanglement and destroy any possibility of quantum computation.
That need inspired quantum error-correcting codes, numerical strategies that repair corrupted correlations between the qubits and make the computation more robust. One hallmark of these codes is that they are always ‘non-local’: the information needed to restore any given qubit has to be spread out over a wide region of space."
(The restorative information is actually spread infinitely "downward" into the fractal.)
It may be that a model with extra-universal end-users, interfaced with observers of a Planck-time reiterative universe that "falls" into the fractal, answers this:
"Still, researchers face several challenges. One is that the bulk–boundary correspondence does not apply in our Universe, which is neither static nor bounded; it is expanding and apparently infinite. Most researchers in the field do think that calculations using Maldacena’s correspondence are telling them something true about the real Universe, but there is little agreement as yet on exactly how to translate results from one regime to the other." ----
where the reiterative universe, as in my model, is the very same "space" the article describes as one we DON'T experience (the one that apparently has no gravity, which would explain why it can be programmed to go from heat death to big bang to heat death instantly, or within one Planck time). The reiterative universe is the boundary and the experiential Universe is the bulk. Each reiteration of the boundary occurs one Planck time after the previous and is separated by one Planck length along the bulk, giving rise to experiential (bulk) time and E=mc² (a "bulk-only" phenomenon) as well as "expansion".

So the universe is "recalculated" each Planck time, with every "logical step, or operation, needed to construct the quantum state of a system" as it should exist one Planck length away. Since only one Planck space/time occurs prior to recalculation, only ONE "step" or "operation" need be computed for each waveform per iteration, rather than keeping track of the entire chain, or "motive". If a waveform undergoes no quantum change in a Planck space/time, it is simply resimulated and is one Planck spacetime removed from its prior position (the first c in c squared). If the waveform is a photon travelling at c, its E will equal c², since it theoretically has zero mass in the "bulk" Universe, and it will have reached its position for the next iteration in (observationally) zero time (time dilation).
Anything with mass in the bulk that is moving will never reach c without becoming massless, so at that slower-than-light speed the m is introduced: E=mc².
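For reference (these are the standard special-relativity relations, not something specific to my model), the textbook form of what the E=mc² talk above leans on is:

```latex
% Full energy-momentum relation:
E^2 = (pc)^2 + (mc^2)^2
% Massless particle (photon): E = pc, moving at speed c.
% Particle at rest (p = 0): E = mc^2.
% Moving massive particle:
E = \gamma m c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}
% gamma diverges as v -> c, so anything with mass never reaches c.
```

The last line is the standard statement of the point made above: a massive object cannot be accelerated to c.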
Also I believe my model accounts for time. I believe any model like the one described in the article (which is actually VERY close to mine) will require an interface between an observer (consciousness/perception/sensation) in the bulk and an extra-universal end-user.
The end-user would necessarily exist beyond both the bulk and the boundary. That, though, is the real difficulty physics will have to figure out. But since we DO experience time and observe time dilation and all those other effects that my model handles, it's the only proper explanation imo.
Also from the article:
"Another challenge is that the standard definition of entanglement refers to particles only at a given moment. A complete theory of quantum gravity will have to add time to that picture. “Entanglement is a big piece of the story, but it’s not the whole story,” says Susskind."
He thinks physicists may have to embrace another concept from quantum information theory: computational complexity, the number of logical steps, or operations, needed to construct the quantum state of a system. A system with low complexity is analogous to a quantum computer with almost all the qubits on zero: it is easy to define and to build. One with high complexity is analogous to a set of qubits encoding a number that would take aeons to compute."
(This assumes that the "computing" is NOT done each and every Planck spacetime, whereas the reiterative model greatly simplifies the computational load.)
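A toy illustration of that computational-load point (a sketch under my own assumptions, not physics): a memoryless per-tick update needs exactly one operation per waveform per iteration, no matter how many iterations have elapsed, whereas reconstructing the state from scratch each time costs the whole chain.

```python
# Toy sketch: one "waveform" advancing one unit per tick.

def step(position):
    return position + 1  # one operation per iteration

def run_memoryless(ticks):
    """Only the previous state is needed: ONE op per tick."""
    pos, ops = 0, 0
    for _ in range(ticks):
        pos = step(pos)
        ops += 1
    return pos, ops

def run_with_full_history(ticks):
    """Replay the entire chain ('motive') at every tick instead."""
    ops = 0
    pos = 0
    for t in range(1, ticks + 1):
        pos = 0
        for _ in range(t):  # recompute the whole chain so far
            pos = step(pos)
            ops += 1
    return pos, ops

# run_memoryless(1000) does 1000 ops; run_with_full_history(1000)
# does 500500 — the cost of keeping track of the entire "motive".
```

Both reach the same final position; only the bookkeeping differs, which is the sense in which a per-Planck-time recalculation is cheaper than tracking the full history.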
And what Susskind refers to here as "the number of logical steps, or operations, needed to construct the quantum state of a system" is what I have called "motive" in the model.
Then this part of the article--
"One potential consequence, which he is just beginning to explore, could be a link between the growth of computational complexity and the expansion of the Universe. Another is that, because the insides of black holes are the very regions where quantum gravity is thought to dominate, computational complexity may have a key role in a complete theory of quantum gravity."
--is directly related to my model's use of black holes as having the computational function of an "information filter" for each reiteration of the boundary. (In the model I use "universe" with a small u where the article uses "boundary", and "Universe" with a capital U where it uses "bulk".)
As each iteration of the universe (boundary) unfolds, there is a point at which the first "black hole" would form. All subsequent black holes would, upon refolding of the boundary, feed back to that initial black hole (via wormhole) and then back to the singularity. Calculation takes place and the boundary unfolds again with all the calculations necessary for coherent expansion.
There's also the matter of the spin of the singularity, and that may be where entanglement starts, which would place the origin of entanglement within the quantum computer itself: the "computer" being neither boundary nor bulk, the "original" computer rather than one of the infinite number of fractal representations that "error correct" up through the fractal.
https://en.wikipedia.org/wiki/Cyclic_model
https://en.wikipedia.org/wiki/Simulated_reality

Great article, thanks again.