

Well, you’re not half wrong. The LUMI supercomputer, currently nr. 9, runs Cray’s SUSE Linux Enterprise Server with minimal kernel daemons to reduce OS jitter on the compute nodes.


Add this to your list: https://en.wikipedia.org/wiki/Fock_space
Why not both?


You need to know which basis the sender used, so you can collapse and measure in the same basis. Then you sample a statistical distribution, and the desired information will be the average of that distribution. This is very well proven by the Bell inequality experiments and can definitely be used to gain information.
It is clearly not very efficient, in the sense that a lot of the transported bits are wasted to convey less information. But the advantages of instantaneous and secure communication would be worth it in some use cases.
That is, of course, if engineering issues such as quantum repeaters (a sort of range extender) and high-fidelity storage are properly solved. It has been a few years since I did any quantum information at uni, so I don’t know what the current state of things is.


To use the quantum teleportation algorithm, you first have to establish an entangled qubit pair, with one photon at the sender and one at the destination. This process takes at least the distance divided by the speed of light. The trick is that you can do this ahead of time and decide later when, and what information, to encode into the qubit, allowing for “instant” information transfer. Naturally, this requires a very good memory device that preserves the fidelity of the entangled qubits.
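The protocol can be sketched with a plain NumPy state-vector simulation (illustrative only; the gate conventions and helper functions are my own, not any particular library’s API). Note that the receiver still needs the sender’s two measurement bits over a classical channel before the state is usable, which is why “instant” belongs in quotes:

```python
import numpy as np

# Single-qubit gates
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def apply(gate, qubit, state, n=3):
    """Apply a single-qubit gate to one qubit of an n-qubit state vector."""
    full = np.array([[1.0 + 0j]])
    for q in range(n):
        full = np.kron(full, gate if q == qubit else I2)
    return full @ state

def cnot(control, target, state, n=3):
    """CNOT acting directly on computational-basis indices (qubit 0 = MSB)."""
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        out[sum(b << (n - 1 - q) for q, b in enumerate(bits))] += amp
    return out

# Qubit 0: the unknown state to teleport; qubits 1, 2: a shared Bell pair,
# distributed ahead of time (that distribution is the slow, light-speed part).
psi = np.array([0.6, 0.8j])
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)

# Sender: entangle the payload with her half of the pair, then measure.
state = cnot(0, 1, state)
state = apply(H, 0, state)

rng = np.random.default_rng(0)
outcome = rng.choice(8, p=np.abs(state) ** 2)   # simulated measurement
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1
keep = [((k >> 2) & 1, (k >> 1) & 1) == (m0, m1) for k in range(8)]
state = np.where(keep, state, 0)                # project onto the outcome
state /= np.linalg.norm(state)

# Receiver: corrections conditioned on the two CLASSICAL bits (m0, m1),
# which must arrive over an ordinary channel -- hence no faster-than-light.
if m1:
    state = apply(X, 2, state)
if m0:
    state = apply(Z, 2, state)

# Qubit 2 now holds the original state.
base = (m0 << 2) | (m1 << 1)
received = state[[base, base + 1]]
print(np.allclose(received, psi))
```

Whatever the two measurement bits turn out to be, the matching correction recovers the original amplitudes exactly on the far qubit.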


Not a search engine, but last week I learned of the European Open Web Search project, which is building a new free and open search index. It should already be available to try out. Hopefully we will see some search engines building on it soon.


Thought I would mention Guix. I don’t know about using it as a full OS, but the package manager alone is so nice for building reproducible software environments (disclaimer: I only discovered this myself a few weeks ago). It is at least as close as you can get without including proprietary hardware drivers. Building MPI applications on my laptop and moving them to an HPC cluster at full performance feels like magic.
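As a minimal sketch, a Guix manifest for an MPI build environment might look like this (the package names are real Guix packages, but the exact list is just an assumed example; pin channels for full reproducibility):

```scheme
;; manifest.scm -- declares the build environment declaratively.
;; Adjust the package list to your own project.
(specifications->manifest
 (list "gcc-toolchain"
       "openmpi"
       "cmake"))
```

Then `guix shell -m manifest.scm -- mpicc hello.c -o hello` should give you the same toolchain on any machine with Guix installed.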
Cool, can I come over and have a look at your super-strong monster-magnet neutron star?
I quite like .ion or .iot


The sun’s spectrum at the Earth’s surface peaks in the green range, which should make green the most efficient choice. Although, I wonder why they have to absorb only a single, narrow band of colors.
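As a back-of-envelope check, Wien’s displacement law puts the per-wavelength blackbody peak for the Sun’s effective surface temperature right around 500 nm, i.e. green (the spectrum actually reaching the ground is further shaped by atmospheric absorption, which this ignores):

```python
# Wien's displacement law: lambda_peak = b / T. Rough estimate only;
# atmospheric absorption shifts the at-surface spectrum somewhat.
b = 2.898e-3          # Wien displacement constant, m*K
T_sun = 5778          # Sun's effective surface temperature, K
lam_peak_nm = b / T_sun * 1e9
print(lam_peak_nm)    # ~501 nm, in the green range
```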

Without knowing much about psychology, I would imagine that separating a mindset into a set of orthogonal axes is pretty difficult, and the “normal” range would probably not follow a normal distribution along each axis. As a result, the N-dimensional volume would not be an N-sphere but some complex topological shape, possibly even consisting of multiple disjoint sets. If any of these assumptions hold, then the global average over the entire space may lie outside many of the “normal” ranges.
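A toy calculation shows the conclusion holds even under the idealised assumptions the comment doubts (independent, normally distributed axes); in high dimensions samples concentrate in a thin shell, so the global average point sits where almost nobody actually is. The dimension and sample counts below are made up:

```python
import numpy as np

# Hypothetical trait space: 100 independent standard-normal axes.
rng = np.random.default_rng(42)
n_dim, n_people = 100, 10_000
pts = rng.standard_normal((n_people, n_dim))

# Distance of each individual from the global average (the origin).
radii = np.linalg.norm(pts, axis=1)

# Samples concentrate in a thin shell of radius ~sqrt(n_dim) = 10, so
# the average point is far from every actual individual.
print(radii.mean(), radii.min())
```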


I feel like I met some recursive endgame boss… I made a penguapplepenguinpenguapplepenguapplepenguin partially from pineapples and penguins and something else I spam combined


0.5% of eluveitie at 1443 minutes, I suppose not too impressive considering 907k monthly listeners. But I’m a varied listener


This certainly could be part of the motivation for publishing it this way: to get noticed by the big players. Btw, publishing open access in Nature is expensive, something like 6,000-8,000 euros for the big journals, so there definitely is a reason.


While I’m not in the field either, I do know that it is quite unusual in computer science academia to publish in actual peer-reviewed journals. This is because it can be a long process, and the field moves very fast, so your results would be outdated by the time you publish. Thus, a “paper” is typically synonymous with a conference proceeding and can be found on arXiv. I found this paper on arXiv from 2017/2018, which seems to be when it was originally published for the scientific community and presented at a very “good” (if I had to guess) conference. Google Scholar says the paper has 650 citations, so it has probably had quite some impact. However, if it was truly disruptive, I would guess the method is by now well known and already implemented in many models.


The article linked here is rubbish: CrSBr is not a metamaterial, and not a superconductor either. It is a layered semiconductor. However, the Nature article they link to is quite interesting. The background is cavity engineering, where one tries to modify intrinsic material properties by coupling them “strongly” to light. This is usually done by creating a cavity (think two opposing mirrors around the material) and having light bounce back and forth.
Here they don’t need mirrors: the refractive index contrast is large enough to trap light inside the material itself, and the electronic properties turn out to be quite sensitive to that light, because the magnetic phase is sensitive to magnetic fields and the different magnetic phases have quite different electronic properties. So all in all they find strong light-matter coupling, but only below 132 K (the critical temperature of the magnetic phase).
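The “strong coupling” can be caricatured with the standard two-coupled-modes picture (all numbers below are invented for illustration, not taken from the paper): a light mode and a material excitation hybridize into two polariton branches split by twice the coupling strength at resonance.

```python
import numpy as np

# Toy two-coupled-modes model of strong light-matter coupling.
omega_photon = 1.35   # trapped-light mode energy (arbitrary units, made up)
omega_exciton = 1.35  # material excitation, tuned to resonance
g = 0.02              # light-matter coupling strength (made up)

Hmat = np.array([[omega_photon, g],
                 [g, omega_exciton]])
lower, upper = np.linalg.eigvalsh(Hmat)   # the two polariton branches
splitting = upper - lower
print(splitting)                          # equals 2*g at resonance
```

The measurable signature of “strong” coupling is exactly this splitting being larger than the linewidths of the two bare modes.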
In one Discord channel my alias is Bread, due to an old joke from when we played Monster Prom. I’m loving these bread-themed memes.
Same problem from Denmark