
People also forget that analogue computers, mechanical devices to perform calculations, can also be faster than digital computers in some situations. That doesn't make them commercially viable.

Seems to me that almost all of the quantum computing community is trying to be in the right place for the moment they can start cracking current encryption standards at commercial scale. At that moment, anyone with a functional quantum computer will drown in money. Then a few weeks later new quantum-resistant algorithms will appear and the gold rush will end. All the other quantum projects seem like attempts to keep one's foot in the market while waiting for that day.



Analog computers have mostly been electronic rather than mechanical for 60 years. "Digital" and "electronic" are not synonyms; they are completely orthogonal. Analog computers 60 years ago were mostly built with op-amps rather than shafts and gears, despite the survival of WWII-era mechanical naval fire control computers. That's in large part because human-scale shafts and gears max out with signals in the hertz to kilohertz range, while even the most ordinary op-amps can handle signals in the tens of kilohertz range (which has been true for 100 years) and op-amps in the tens of megahertz range have been available for 60 years. Also, shafts and gears have inherent errors of around 1% from backlash, while op-amps can usually do better than 0.1% (again, for the last 100 years) and by 60 years ago better than 0.001%.

So with off-the-shelf electronics an analog computer can compute 1000 times faster with 1000 times better precision than if it were mechanical. Until the 01960s they used vacuum tubes and so used more power and were less reliable; since then electronics have used less power and been more reliable.

Today we still use plenty of analog computation, but it's pushed to the margins. Every sound card has an antialiasing analog filter on its front end before switching to the digital domain. Even software-defined radios still use analog electronics to upconvert and downconvert signals between baseband or IF and the RF. Your Wi-Fi card can't sample that 2.4 GHz signal at its 4.8 Gsps Nyquist rate; doing that is not impossible but still requires high-end digital electronics. Submillimeter-wave communication is very much dependent on precise analog signal processing to modulate your desired signal into the hundreds of GHz range.

("Precise" in this case doesn't mean with linearity errors as low as 1%.)


Quantum-resistant encryption is already available.


Would you be able to give any examples of quantum-resistant encryption algorithms? I'm not familiar with the field and my most recent knowledge is a post on HN saying that some post-quantum candidates had been broken by old laptops.


Symmetric-key cryptography as a whole (e.g. AES) is already quantum resistant. The problem only holds for public-key encryption, but as the other commenter pointed out, there are already promising algorithms.


That's nice, I didn't realise AES was quantum resistant.

However, an algorithm being promising doesn't mean it works. Do you know how well the development of these other techniques is progressing?


I'm not an expert in the field, but there is already a NIST competition going on to standardize post-quantum public-key ciphers. So I would say that we're at a good point in post-quantum cryptography development.


It's potentially quantum-resistant depending on how it's used. Grover's algorithm still reduces your effective key length by half in many situations.
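A rough sketch of the key-length-halving point: Grover's algorithm finds a marked item among 2^n candidates in on the order of sqrt(2^n) oracle queries, so an n-bit symmetric key offers roughly n/2 bits of security against a quantum brute-force attacker. The function names below are illustrative, not from any library:

```python
import math

def grover_queries(key_bits: int) -> float:
    """Approximate Grover iterations to brute-force an n-bit key:
    ~ (pi/4) * sqrt(2**n) oracle queries."""
    return (math.pi / 4) * math.sqrt(2 ** key_bits)

def effective_bits(key_bits: int) -> float:
    """Security level in bits against a Grover-equipped attacker."""
    return math.log2(grover_queries(key_bits))

# AES-128 drops to roughly 64-bit security; AES-256 keeps ~128 bits.
print(round(effective_bits(128)))  # ~64
print(round(effective_bits(256)))  # ~128
```

This is why the usual advice is to double symmetric key lengths (AES-256 instead of AES-128) rather than replace the cipher.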



No. Encryption that is "not provably quantum-insecure" is available. I doubt this will ever extend to "provably quantum-resistant".


You've got me intrigued -- what are examples of mechanical devices being faster than digital ones? Assuming you're talking man-made.

I'm trying to imagine and am totally stumped.


Fire control computers, like on a navy ship, were faster than digital computers of the day. YouTube has a number of videos on them.

An opamp performs multiplication faster than a digital computer (speed of light vs a few cycles). It's not super useful on its own, but it does fit the criteria.

In Veritasium's video 2/2 on analog computers [0] they show some startup products near the end.

[0]https://youtu.be/GVsUOuSjvcg?t=898


What no. Opamps don’t multiply and they don’t operate at the speed of light. They have some timescale that goes like their bandwidth, which depends on their feedback path.


Yes, feedback op-amps definitely have bandwidth limits. Although you can get ones in the gigahertz range now.

Analog multiplier ICs are available.[1] They're not common, and they cost $10-$20 each. Error is about 2% worst case for that one. There are several clever tricks used to multiply. See "Gilbert Cell" and "quarter square multiplier".

[1] https://www.digikey.com/en/htmldatasheets/production/1031484...


The propagation time around the feedback loop is still (length of loop) / (speed of light*slowdown constant), so yes, "at the speed of light".


This is absolutely not true, the speed of analog circuits is (by a significant margin) determined by parasitic capacitance, inductance, and resistance of the components. To put numbers to it, a typical high performance analog multiplier might have a loop length of 1cm for the feedback path. This circuit should theoretically operate at 30GHz, but realistically such circuits operate with a bandwidth measured in megahertz.


If your values are in the mechanical domain, doing a simple and fixed computation in the mechanical domain may be more efficient. An example would be a mechanical differential in a rear-wheel drive car [1], or a swashplate in a helicopter [2].

[1]: https://en.wikipedia.org/wiki/Differential_(mechanical_devic...

[2]: https://en.wikipedia.org/wiki/Swashplate_(aeronautics)


Those aren't examples of computation; they're examples of power transmission. If you found a way to compute the same information as a swashplate or a differential with a lower-cost, higher-speed, more reliable, lower-power device, it wouldn't replace the swashplate or differential. In fact we've had such devices for over a century, because the swashplate is just multiplying two quadrature sine waves by constants and summing them, and the differential is just adding (or subtracting).
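The "two quadrature sine waves" description can be written out as a couple of lines of arithmetic; variable names here are illustrative:

```python
import math

def blade_pitch(azimuth_rad, collective, cyclic_lat, cyclic_long):
    """What a swashplate 'computes': blade pitch as a constant
    (collective) plus two quadrature sine terms scaled by the
    cyclic control inputs."""
    return (collective
            + cyclic_lat * math.cos(azimuth_rad)
            + cyclic_long * math.sin(azimuth_rad))

# Pure collective: pitch is the same all the way around the rotor disc.
print(blade_pitch(0.0, 5.0, 0.0, 0.0))  # 5.0
# With cyclic input, pitch varies with blade azimuth.
print(round(blade_pitch(math.pi / 2, 5.0, 0.0, 2.0), 2))  # 7.0
```

Electronically that is two multiplications and two additions, trivial for a century; the hard part the swashplate does is transmitting the control forces, which is the power-transmission point above.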


These are bona fide examples of computation, with results immediately consumed. Computation without output is sort of pointless.

They are not very different from a computation inside an injection controller of an ICE, with its results consumed within microseconds, as motions of injection valves. The key difference is the intermediate use of an electronic computer, an MCU, instead of a purely mechanical and pretty inflexible device, the camshaft.

Certainly we could replace a swashplate with some electric or hydraulic actuators driven by an MCU if we needed to compute something more complex than what a swashplate currently computes, much as we did with the camshaft. This is not very probable though, because a new system should also work unpowered to allow auto-rotation, to say nothing of higher reliability requirements than a system for a car.


My point is that, in that scenario, what replaces the swashplate is mostly the electric or hydraulic actuators, not the MCU. If it wasn't, you'd make the swashplate mechanism much smaller, lighter, and cheaper, even if you had reliability requirements your MCU couldn't meet.

In the north-pointing chariot or the Antikythera mechanism, the differential performed a computational function, with its action of transmitting power quite peripheral to that; in your car's rear end, it performs a power-transmission function, with its action of computation quite peripheral to that.

The same situation holds with transistors. You can use a 2N7000 to toggle a light or control a relay or a motor, or you can use it for (digital or analog) computation.

If you're using it in an NMOS NOT gate or the input stage of an op-amp, you're using it for computation, and so you wish it were smaller; it would work better if it were smaller because then it wouldn't need so much energy to turn it on or off. (For analog computation, you only wish it were smaller up to a point, because at extremely small sizes that makes it more sensitive to noise, but you wish it were really a lot smaller than a 2N7000.) A 2N5457 is generally better for an amplifier input stage, and the no-longer-available discrete signal MOSFETs are probably better for NMOS NOT gates. The N-MOSFETs integrated into a chip are enormously better at computation than a 2N7000.

By the same token, though, a 2N5457 or signal MOSFET is much worse than a 2N7000 at power transmission. If you're using it to PWM a motor, you wish it were larger; it would work better if it were larger because then it would be at less risk of overheating, be more efficient at a given current level, and be able to control a bigger motor. An IRF630 is a better power MOSFET than a 2N7000; an IRF540N is better still. But they're enormously worse at computation than a 2N7000.

Helicopter swashplates and differentials are very much on the power-transmission end of the spectrum, not the computation end, even though they cannot avoid doing computation as part of their job.


> You've got me intrigued -- what are examples of mechanical devices being faster than digital ones? Assuming you're talking man-made.

You might be able to build a fluid device to test a property faster than you can simulate the fluid dynamics in full detail. Perhaps not on the first iteration, but iterating small changes to get a desired result could certainly be faster than simulating it, for simple systems.


> The Water Integrator was an early analog computer built in the Soviet Union in 1936 by Vladimir Sergeevich Lukyanov. It functioned by careful manipulation of water through a room full of interconnected pipes and pumps. The water level in various chambers (with precision to fractions of a millimeter) represented stored numbers, and the rate of flow between them represented mathematical operations. This machine was capable of solving inhomogeneous differential equations.

https://www.techspot.com/trivia/97-1930s-which-countries-bui...

https://en.wikipedia.org/wiki/Water_integrator


Here’s an analogue computer called the MONIAC from ~1949 that calculates monetary flow in an economy: https://www.engineeringnz.org/programmes/heritage/heritage-r... with a fairly naff video of it operating but captures the essence: https://m.youtube.com/watch?v=rAZavOcEnLg

I like the COMPAQ branding added to it!


It all comes down to the definition of "faster". Standard testing is based on binary computations, the idea that there is a finite answer. Take a fire control computer on a ship. It has maybe 30 inputs, all essentially analogue dials. It combines them into a continuous analogue answer, a firing solution for the guns (elevation + azimuth). It doesn't do that "X times per second" or to a particular level of accuracy. The answer is always just there, constantly changing and available to whoever needs it whenever they ask for it, measurable to whatever level of precision you want to measure. If you measure the output every microsecond, then it is a computer that can generate an answer every microsecond. But that speaks more to the method of measurement than the speed of the machine.


It's true that we measure the speed and precision of analog "computers" differently from how we measure them for digital computers, but it does not therefore follow that analog "computers" are all infinitely fast and perfectly precise. Any analog system has a finite bandwidth; signals above some cutoff frequency are strongly attenuated and before long are indistinguishable from noise. And analog systems also introduce error, which digital computation often does not. When digital computation does introduce error, you can decrease the size of the error exponentially just by computing with more digits, and there is no equivalent approach in the analog world.

For mechanical naval fire control computers the cutoff frequency is on the order of 100 Hz and the error is on the order of 1%. You won't learn anything interesting by sampling them every microsecond that you wouldn't learn by sampling them every millisecond.
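The "decrease the error exponentially just by adding digits" advantage is easy to put in numbers (a rough illustration, not a model of any particular machine):

```python
import math

# Worst-case relative quantization error shrinks exponentially with
# word length; an analog machine has no equivalent knob to turn.
for bits in (8, 16, 32):
    print(f"{bits} bits -> worst-case relative error ~{2.0 ** -bits:.1e}")

# A mechanism stuck at ~1% error is comparable to only about 7 bits:
analog_equiv_bits = round(-math.log2(0.01))
print(f"1% analog error ~ {analog_equiv_bits} bits")
```

So a fire control computer with ~1% error carries roughly as much information per reading as a 7-bit digital value, no matter how finely you measure its output shafts.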


Basically anything that has to do with processing an analog signal. It's always faster to do that with analog electronics rather than using an ADC, doing the computation in the digital domain, and then getting the result back to the analog world with a DAC.

One example, if I need something that when two switches are triggered will turn on a light bulb (basically an AND gate) it's obviously faster doing that with an analog (mechanical) device, that is the two switches wired in series, than acquiring the signal with a microcontroller and outputting a signal to turn on the light bulb.

Thinking about the industrial world, there are cases with speed and real-time constraints where it makes sense to do signal processing with analog components rather than digital ones. And that was always the case before computers were invented, by the way (missile guidance systems were purely analog, as one example; you can do a lot of stuff!)


> Spanish Catalan architect Antoni Gaudí disliked drawings and preferred to explore some of his designs — such as the unfinished Church of Colònia Güell and the Sagrada Família — using scale models made of chains or weighted strings. It was long known that an optimal arch follows an inverted catenary curve, i.e., an upside-down hanging chain. Gaudí's upside-down physical models took him years to build but gave him more flexibility to explore organic designs, since every adjustment would immediately trigger the "physical recomputation" of optimal arches. He would turn the model upright by way of a mirror placed underneath or by taking photographs.

http://dataphys.org/list/gaudis-hanging-chain-models/
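The curve the chains "compute" has a closed form, so the same arch can be sketched digitally in a few lines (a minimal sketch; the parameter `a` is an illustrative tautness constant):

```python
import math

def catenary_y(x, a):
    """Height of a hanging chain: y = a * cosh(x / a).
    Negating it gives Gaudi's optimal inverted arch."""
    return a * math.cosh(x / a)

# Sample an inverted arch over x in [-2, 2] with a = 1:
arch = [round(-catenary_y(x / 2, 1.0), 3) for x in range(-4, 5)]
print(arch)  # highest point at x = 0, lowest at the two ends
```

The physical model's advantage was never the single curve, of course, but that hundreds of interlinked chains re-solved the whole coupled system instantly after every adjustment.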


The simple sundial calculates the time based on the Sun's position relative to the Earth.


Cracking encryption is basically irrelevant as a quantum computing application. Post-quantum encryption algorithm development proceeds apace and the messaging is already "if you want this to still be encrypted 30 years from now, start using post-quantum encryption today." Anybody caught with their pants down the day quantum computers can actually crack 4096-bit RSA simply isn't serious about security.


“Anyone who doesn’t have weaponized anthrax isn’t serious about home defense.”

This forum gets more and more detached from reality every day.


It's definitely not true today: for example, there are no NIST standards (and I'm not sure about standards from other governments) for quantum-resistant key exchange. Several such systems have been developed, and NIST has even chosen one to standardize, but they aren't standardized or widely deployed yet.

But I expect that in 5-10 years, most security systems designed by competent professionals (up-to-date OS security services, TLS servers, SSH servers, VPN, firmware update systems etc) will have post-quantum crypto enabled by default. And I expect it will take longer than that to build a QC that can break classical crypto.

More likely it will play out like the SHA-1 break: all professional security engineers should have switched off SHA-1 (at least for unkeyed hashing) years before any collision was found, and users who apply security patches should therefore by mostly up to date, but I'm sure some are still using the older crypto.


Not this forum, but rather the US government:

“NSA intends that all NSS will be quantum-resistant by 2035, in accordance with the goal espoused in NSM-10.”

Source: https://media.defense.gov/2022/Sep/07/2003071836/-1/-1/0/CSI...


The list of things that have to be encrypted 30 years from now is very, very small. I doubt any (many?) people here have contact with any of it. I don't understand your analogy at all, sorry.


The message I sent to my girlfriend last night. In 30 years, when I am running for president, that email/text/signal message might come back to haunt me should anyone be able to decrypt the archived/encrypted copies held by state agencies.

Anything that is private today is private for a reason. That reason doesn't automatically disappear over time.


This is indistinguishable from hoarder logic. Such things straight up don't matter on the scale of decades. The US DOJ has a policy of automatic declassification after 25 years.


You do realize that because of #metoo, claims and evidence of people's actions 30, 40 years ago are being judged in the court of public opinion, if not in actual courts?

I don't think you've been paying attention to the news.

Also the US government isn't a great example. JFK was assassinated in 1963 and all records surrounding that still haven't been released.

The idea that people don't care about secrets across the span of decades is utterly wrong.


How does encryption impact #metoo? The person making the accusation would have a decrypted version of the message, and even if they didn't, they could accuse without proof.


>> hoarder logic.

And the US intelligence community is the greatest data hoarder on the planet, rivaled only perhaps by the combined forces of facebook/google.


What about my crypto-currency? Imagine a quantum computer could crash all them crypto-markets and bring about an economic collapse.


Yeah, people who are serious will have probably switched to / hybridized with PQC before a cryptographically relevant quantum computer is built. Unless some state agency has a secret one. So the main relevance might just be forcing everyone to switch / hybridize.

At the same time, from history it seems almost certain that, if indeed a CRQC ever gets built, a significant number of users will not have secure PQC rolled out on day 0.



