• @[email protected]
    1
    2 years ago

    Because Turing created an actual computer for his computational model, while Church did not for the lambda calculus. So people adopted Turing’s model for the early digital computers and programming languages, and it remains pervasive to this day.

    • @[email protected]
      6
      2 years ago

      The machine that Turing made wasn’t exactly influential. EDVAC (Von Neumann’s machine) overshadowed it quite dramatically (to the point most people in the field don’t even know that ACE exists, but know the phrase “Von Neumann Machine” instantly).

      Turing’s main influence on computer science was theoretical, not in implementation, despite him technically being “first” with a stored-program computer.

      • @[email protected]
        2
        edit-2
        2 years ago

        Of course, but Turing influenced Von Neumann, and Turing’s model was probably more intuitive than Church’s as well. So the timeline is roughly: Turing > Von Neumann > imperative dominance.

        • @[email protected]
          2
          2 years ago

          I … think that’s exactly what I said. Turing’s influence was mainly theoretical, not practical. That Von Neumann was influenced by (and even plagiarized to some extent) Turing is indisputable, but Turing didn’t “[create] an actual computer for his computational model” in any way that was actually influential.

          Tragically.

          Because EDVAC was kind of lame compared to even Pilot ACE.

          • @[email protected]
            4
            2 years ago

            What I meant (but failed to say) was that Turing provided an actual model of a computing engine, which made it more straightforward to implement, while Church’s did not. Besides, pure lambda calculus was pretty convoluted even for representing things like natural numbers. Church’s work would only see serious implementation in the ’60s with McCarthy et al., a 20-year gap that defines computing to this day.
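            To make the “convoluted” point concrete: in pure lambda calculus a natural number is itself a function (a Church numeral), so even “3” is three nested function applications. A minimal Python sketch (the names `zero`, `succ`, and `to_int` are just illustrative):

```python
# Church numerals: a number n is encoded as a function that applies
# f to x exactly n times -- there is no primitive "number" at all.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))  # n + 1

def to_int(n):
    # Decode a Church numeral by counting applications of f.
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(three))  # prints 3
```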

            • @[email protected]
              1
              2 years ago

              Fair enough. Turing’s model was a more comprehensible machine from an implementation standpoint.
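              For contrast, a Turing machine reads almost like a hardware spec: a head, a tape, and a transition table. A toy sketch (the `increment` rule table is a made-up unary-increment machine, not anything of Turing’s):

```python
# Toy Turing machine: state + sparse tape + rule lookup, step until halt.
def run(tape, rules, state="start", blank="_"):
    cells = dict(enumerate(tape))  # sparse tape, indexed by position
    pos = 0
    while state != "halt":
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Hypothetical machine: scan right over the 1s, append one more 1, halt.
increment = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run("111", increment))  # prints 1111
```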