• FauxLiving@lemmy.world · 15 hours ago

    It looks like you’re doing this exact thing:

    sort of like asking the Manhattan Project physicists in 1943, “so when are you going to produce at least a small nuclear explosion?”

    There are a lot of engineering problems that need to be solved to get this to work. It isn’t as if they’ll figure out how to factor a 2-digit semiprime, then have a breakthrough a year later that lets them factor a 3-digit semiprime, and then find a tweak that allows a 4-digit one.
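
    For a sense of why “one more digit per year” isn’t how the problem scales: textbook circuit constructions put Shor’s algorithm at roughly 2n + 3 logical qubits for an n-bit modulus, and error correction multiplies that by a large factor (Gidney and Ekerå’s 2019 estimate for RSA-2048 was on the order of 20 million noisy physical qubits). A back-of-envelope sketch, treating those figures as assumptions rather than measurements:

    ```python
    # Rough resource estimate for Shor's algorithm. Assumptions, not
    # measurements: ~2n + 3 logical qubits for an n-bit modulus, and a
    # crude ~1,000x physical-to-logical overhead for error correction
    # (real overheads depend heavily on error rates and code choice).
    PHYSICAL_PER_LOGICAL = 1_000

    def shor_estimate(modulus_bits: int) -> tuple[int, int]:
        """Return (logical, physical) qubit counts for an n-bit modulus."""
        logical = 2 * modulus_bits + 3
        return logical, logical * PHYSICAL_PER_LOGICAL

    for bits in (8, 64, 1024, 2048):  # 2048 bits is a typical RSA modulus
        logical, physical = shor_estimate(bits)
        print(f"{bits:>5}-bit modulus: ~{logical:>5} logical qubits, "
              f"~{physical:>10,} physical qubits")
    ```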

    Expecting that kind of incremental advancement is like expecting the Manhattan Project to make a tiny nuclear explosion and then work its way up to a larger one… it shows a fundamental misunderstanding of the technology.

    You can’t just make a tiny nuclear bomb. You either have a critical mass and a big nuclear explosion, or you have no nuclear explosion at all. The early quantum computing experiments that factored small numbers are akin to the 1938 fission experiments showing that an atom could be split by neutron bombardment.

    The Manhattan Project didn’t simply take that research and then try to split 2 atoms, then 3 atoms, and so on until it reached the Trinity device.
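
    The threshold behavior is easy to see in a toy model: each neutron generation multiplies the population by the effective multiplication factor k. Below k = 1 the reaction dies out; above it, growth is exponential until the device blows itself apart. A minimal sketch (the k values and generation counts are illustrative, not real weapon parameters):

    ```python
    # Toy chain-reaction model: the neutron population after g
    # generations is n0 * k**g. k < 1 fizzles out, k > 1 runs away;
    # there is no stable "small explosion" regime in between.
    def neutron_population(k: float, generations: int, n0: float = 1.0) -> float:
        return n0 * k ** generations

    for k in (0.95, 1.00, 2.00):  # subcritical, critical, supercritical
        counts = [neutron_population(k, g) for g in (0, 10, 40, 80)]
        print(f"k = {k:.2f}: " + ", ".join(f"{c:9.2e}" for c in counts))
    ```

    That cliff edge at k = 1 is why there is no “tiny nuclear explosion” to scale up from.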

  • Passerby6497@lemmy.world · 15 hours ago (edited)

      I commented on this issue a couple of days ago here and linked a study arguing that the current methods of “factoring” via QC are not scalable:

      https://lemmy.world/comment/23267756

      https://www.nature.com/articles/s41598-022-11687-7

      The issue at hand is that there’s a fundamental limit on what we can effectively do at the moment, and a lot of the hype is driven by “factorization methods” that cheat: they only twiddle a few LSBs of a specially constructed number, which is not even remotely close to a real-world example.
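
      To illustrate the kind of shortcut being criticized (a classical caricature of the trick, not any specific paper’s method): if the factors of a semiprime are constructed to agree with known values on all but a few low bits, “factoring” it collapses into a brute-force search over those few bits, which says nothing about factoring a random RSA modulus where every bit is unknown.

      ```python
      from itertools import product

      # Caricature of the criticized trick. The searcher is handed
      # everything except a few low bits of each factor, so the "hard"
      # problem shrinks to a 2**(2*UNKNOWN_BITS)-entry table scan.
      UNKNOWN_BITS = 4
      p, q = 7919, 7927            # illustrative primes, not from any paper
      n = p * q

      p_hi = (p >> UNKNOWN_BITS) << UNKNOWN_BITS  # "known" high bits
      q_hi = (q >> UNKNOWN_BITS) << UNKNOWN_BITS

      def cheat_factor(n, p_hi, q_hi, bits):
          # Only 2**(2*bits) candidates -- 256 here, trivial classically.
          for lo_p, lo_q in product(range(1 << bits), repeat=2):
              if (p_hi | lo_p) * (q_hi | lo_q) == n:
                  return p_hi | lo_p, q_hi | lo_q

      print(cheat_factor(n, p_hi, q_hi, UNKNOWN_BITS))  # (7919, 7927)
      ```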

      To use the Manhattan Project analogy, this would be like saying “theoretically, if you smash enough radioactive stuff together into a critical mass it will fission, so we’re going to compress these bananas until we hit that point”.

    • FauxLiving@lemmy.world · 14 hours ago

        I agree that those experiments are not scalable.

        I just see them as demonstrating a proof of concept (like those early experiments demonstrating that an atom could be split by neutron bombardment) and not as an attempt to develop a path towards factoring arbitrary numbers.

        Whatever the eventual prototype looks like, it won’t be created by incrementally improving on those proof-of-concept demonstrations.

        “theoretically, if you smash enough radioactive stuff together into a critical mass it will fission, so we’re going to compress these bananas until we hit that point”.

        Potassium-40 does not produce neutrons as part of its decay process (it decays by beta emission and electron capture), so it is not even theoretically possible to achieve criticality that way.

        The proof-of-concept fission experiments used neutron bombardment, which IS theoretically a method of achieving criticality, but there was no path of incremental improvements from those specific experiments to anything resembling a weapon.

        There actually were weapons that used neutron initiators (polonium–beryllium “urchin” initiators in the early designs), but the source of those neutrons was not a particle accelerator. (Which is good, because it’s hard to carry an entire particle accelerator laboratory in an ICBM.)