Molitor5901
21 hours ago
I sat through a briefing last week about quantum encryption and the threat that quantum computing poses to encryption in use today. It was stressed that nation states are hoovering up encrypted data now in order to decrypt it later with quantum computers, much the same way America decrypted old Soviet encrypted data. I wonder if it will take as long, and if anyone will still be alive to make use of that data.
fpoling
20 hours ago
If quantum computing keeps progressing the way it has for the last 30 years, it may take 300 years before it's useful.
3eb7988a1663
17 hours ago
As has been previously pointed out, the 2001 and 2012 quantum factorisation records may be easily matched with a dog trained to bark three times [33]. We verified this by taking a recently-calibrated reference dog, Scribble, depicted in Figure 6, and having him bark three times, thus simultaneously factorising both 15 and 21. This process wasn’t as simple as it first appeared because Scribble is very well behaved and almost never barks. Having him perform the quantum factorisation required having his owner play with him with a ball in order to encourage him to bark. It was a special performance just for this publication, because he understands the importance of evidence-based science.
I look forward to more dog-based science.
geoffpado
13 hours ago
> we also estimate that factorising at least two-digit numbers should be within most dogs’ capabilities, assuming the neighbours don’t start complaining first
mimasama
10 hours ago
This deserves an Ig Nobel Prize lol.
teruakohatu
10 hours ago
Ig Nobels go to actual research, not to satire.
wat10000
19 hours ago
If you know a better way to factor 35, I’d like to hear it.
mosura
6 hours ago
If anyone had made meaningful progress on QC in that time there is no way knowledge of it would be allowed to be public.
It is one of those domains where success would land you in a gilded prison.
mapmeld
5 hours ago
Like LLMs, this isn't the sort of thing where a small group would make a sudden advance that could be kept secret, and I doubt that the NSA can build theirs significantly faster than any industry team today. I think more likely you would need to get worried if someone got one scalable to hundreds or thousands of logical qubits and then stopped publishing.
mosura
5 hours ago
> I think more likely you would need to get worried if someone got one scalable to hundreds or thousands of logical qubits and then stopped publishing.
Consider the likelihood of managing that without alerting the authorities to what is going on.
bossyTeacher
5 hours ago
If we know anything, it's that development is never linear
lbourdages
19 hours ago
Thanks for sharing this, great read.
PeterisP
6 hours ago
This shouldn't be a major issue because of the Forward Secrecy (https://en.wikipedia.org/wiki/Forward_secrecy) principles built into modern TLS, which ensure that even if the public/private key scheme is vulnerable to (for example) quantum attacks, the attack has to happen now, as a MITM during the handshake; otherwise a full traffic capture is useless for future decryption without obtaining secrets from one of the endpoints.
That being said, it's not used 100% everywhere yet (Wikipedia mentions 92.6% of websites), and various means of tricking devices into downgrading to an older protocol would yield traffic that might be decrypted later.
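The idea can be sketched with a toy finite-field Diffie-Hellman exchange. This is illustrative only: real TLS uses vetted groups or elliptic curves plus an authenticated handshake, and the parameters below are deliberately tiny.

```python
import secrets
from hashlib import sha256

# Toy finite-field Diffie-Hellman, illustrative only.
P = 2**127 - 1  # a Mersenne prime; far too small for real security
G = 3

def ephemeral_keypair():
    priv = secrets.randbits(100)   # fresh secret per session
    return priv, pow(G, priv, P)   # (private, public)

# Each side generates a fresh key pair for this one session.
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()

# Both sides derive the same session key from the exchanged public values.
a_key = sha256(pow(b_pub, a_priv, P).to_bytes(16, "big")).hexdigest()
b_key = sha256(pow(a_pub, b_priv, P).to_bytes(16, "big")).hexdigest()
assert a_key == b_key

# The private values are then discarded. A recorded handshake contains
# only a_pub and b_pub; recovering the session key later means solving
# the discrete log, even if a server's long-term signing key leaks.
del a_priv, b_priv
```

Since only the public values cross the wire and the private values never persist, a passive recording made today contains nothing that a future key compromise unlocks; the attacker has to break the exchange itself.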
stephen_g
12 hours ago
What I want to know is how they guess which 0.001% of signals or internet traffic is actually worthwhile to keep. The biggest nation states could conceivably store about one year's worth of internet traffic right now, but they also need to store whatever other signals intelligence they're gathering for analysis, so in practice it will be less than a single year's worth.
But almost all of that data is going to turn out to be useless if or when they gain the quantum ability to decrypt it, and even the stuff that could be useful now gets less useful with every month it stays encrypted. Stuff that is very useful intelligence now could be absolutely useless in five years…
snowwrestler
2 hours ago
> What I want to know is how they guess which 0.001% of signals or internet traffic is actually worthwhile to keep?
By observing DNS lookups in centralized taps like Room 641A at AT&T.
Razengan
11 hours ago
2000 years in the future people will know which porn you slobbed it to.
qingcharles
14 hours ago
hiddencost
14 hours ago
An exabyte isn't as much as it sounds like.
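A quick back-of-envelope check of that claim (the 20 TB commodity-drive size is my assumption, not a figure from the thread):

```python
# How many commodity hard drives does one exabyte represent?
EXABYTE = 10**18     # bytes
DRIVE = 20 * 10**12  # one 20 TB drive (assumed commodity size)

drives_per_exabyte = EXABYTE // DRIVE
print(drives_per_exabyte)  # → 50000
```

Fifty thousand drives is a few aisles of a data center, well within a nation state's budget.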
mkl
14 hours ago
It was a lot more in 2014. Presumably they have upgraded it significantly since.
CaptainOfCoit
7 hours ago
I wonder if there is any way of defining a "data space inflation" metric, like monetary inflation but for drive space.
Then those of us who grew up with 500MB computers could properly communicate how big the drives "felt" at the time, compared to what we have today.
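One naive version of such a metric: express an old drive as a fraction of a typical drive of its era, then scale to a typical drive today. The per-era sizes below are my own rough assumptions, not measured data.

```python
# Naive "drive-space inflation": scale a drive by the ratio of typical
# drive sizes between eras. Era sizes are rough assumptions.
TYPICAL_DRIVE_BYTES = {1995: 500 * 10**6, 2024: 2 * 10**12}

def felt_size(size_bytes, era, today=2024):
    """How big `size_bytes` from `era` 'feels' in today's terms."""
    return size_bytes * TYPICAL_DRIVE_BYTES[today] // TYPICAL_DRIVE_BYTES[era]

# A 40 MB drive in 1995 "feels" like a 160 GB drive today.
print(felt_size(40 * 10**6, 1995) // 10**9)  # → 160
```

Like monetary inflation, the metric depends entirely on which "basket" you pick (consumer drives, enterprise storage, total shipped bytes), so any single number is only indicative.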
zahlman
4 hours ago
I was a few years into computer use before I got to experience a hard drive, a whopping 40MB.
monster_truck
10 hours ago
I don't think this is true at anything resembling a concerning scale.
Even trying to do something like saving 'just' the average yearly traffic Tor handles would account for 2-3% of all the storage currently available.
We're talking about the same government that quickly abandoned their quest of 'archiving every tweet in the Library of Congress'
somenameforme
5 hours ago
It's an interesting little nugget of evidence in favor of the simulation hypothesis. We're currently living through the first era in humanity's history where there will be enough raw information to create a workable global level simulation once that data is decrypted. Pair that with the fact that we're living through such a huge inflection point in history (birth of the internet, humanity becoming a multiplanetary species, and more) and you have a time where people both (1) can and (2) will want to simulate/experience. It's quite interesting.
I'm still convinced that the simulation hypothesis is just religion for the atheist or agnostic, because if it turns out to be correct and one day you 'wake up' only to find that it was all a simulation, well, how do you know that isn't now also just another simulation? It's a non-theory. But I find this quite compelling circumstantial evidence in favor of that non-theory. An arbitrary number of individuals may get to 'experience' this era throughout our species' future, yet only one group gets to actually live it, and that group will ostensibly be orders of magnitude smaller than the sum total of all who later 'experience' it. Statistically, you're rather more likely to belong to the simulation group than the real one, if these assumptions are correct.
spacecadet
7 hours ago
Sat in a similar briefing in 2018, sounds like the same talking points still.