Seems like it should, and the result should be one. Does mathematics agree with me on that?

  • howrar · 1 year ago (edited)

    It does not. If you define 0/0 = 1, you end up in a situation where you can prove that any two numbers are equal to each other, which leaves you with a useless system, so we don't allow it.

    e.g. 0=0*2 -> 0/0 = (0/0)*2 -> 1=1*2 -> 1=2

    If you get into calculus though, you’ll have ways to deal with this to some extent using limits.
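
    To give a rough numerical picture of that, here's a small Python sketch (the specific functions are just illustrative picks of mine, nothing special about them): three expressions that all have the form 0/0 at x = 0, yet approach different values as x shrinks.

    ```python
    # Each expression below is "0/0" if you plug in x = 0 directly,
    # yet they approach different limits as x -> 0.
    def ratios(x):
        return {
            "x/x": x / x,            # tends to 1
            "(2*x)/x": (2 * x) / x,  # tends to 2
            "x**2/x": x**2 / x,      # tends to 0
        }

    for x in (0.1, 0.01, 0.001):
        print(x, ratios(x))
    ```

    Since each of these 0/0-shaped expressions settles on a different number, there's no single value you could consistently assign to 0/0 itself.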

      • howrar · 1 year ago
        Thanks. I already fixed it, but it seems Lemmy is just slow to propagate edits.

    • Spzi@lemm.ee · 1 year ago

      I see you replace two “0”s with “0/0”, but why? Since you assume it equals 1, why are you allowed to substitute it for 0?

      • howrar · 1 year ago

        I’m dividing both sides by 0.
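
        Spelled out, starting from 0 = 0*2: dividing both sides by 0 gives 0/0 = (0*2)/0 = (0/0)*2, and substituting 1 for each 0/0 (the assumption being tested) gives 1 = 1*2, i.e. 1 = 2.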

        • Spzi@lemm.ee · 1 year ago

          Ah, yes. Normally that's not allowed because it's undefined, but here you define it as 1. Alright, thanks.