• sad_detective_man@sopuli.xyzOP · 22 hours ago

    Yeah, I’m realizing now this is basically an LM but reversed: it asks the questions and you give the responses, and the responses are all value-weighted.

  • ChaoticNeutralCzech@feddit.org · 17 hours ago

      No, it’s not a language model. It does not process any language; the question strings are just static descriptions of the weighted values.

      If Akinator had a language model, it would never ask “is your character a sea animal?” after you said No to “is your character an animal?”, because you’ve already ruled out the larger set. But it does ask such questions, which means it can’t perform even the basic linguistic operation of recognizing that adding a qualifier creates a subset. For some of the currently most probable characters it simply doesn’t have a stored answer to the broader question, only to the narrower one, so it asks the narrower question to rule some out, even though to a human one answer obviously implies the other.
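      To make the “static weighted values, no language” point concrete, here is a minimal sketch (my own guess at the mechanism, not Akinator’s actual code): each character stores per-question answer statistics, unknown entries fall back to 0.5, and the system reweights candidates after each answer. Because the knowledge base below deliberately lacks the “animal?” answer for the sea animals, answering No to the broader question does not eliminate them, which is exactly why the narrower question still gets asked.

```python
# Hypothetical knowledge base: per character, known answers to some
# question strings (1.0 = yes, 0.0 = no). Missing entries model the gap
# described above: "sea animal?" is known but the broader "animal?" is not.
KB = {
    "Shark":    {"Is it a sea animal?": 1.0},  # "animal?" answer missing
    "Napoleon": {"Is it an animal?": 0.0, "Is it a sea animal?": 0.0},
    "Dolphin":  {"Is it a sea animal?": 1.0},  # "animal?" answer missing
}

def update(probs, question, answer_yes):
    """Reweight characters by how well they match the player's answer.
    Unknown entries fall back to 0.5 because, without language
    understanding, the system cannot infer them by implication."""
    new = {}
    for char, p in probs.items():
        p_yes = KB[char].get(question, 0.5)
        new[char] = p * (p_yes if answer_yes else 1.0 - p_yes)
    total = sum(new.values())
    return {c: p / total for c, p in new.items()}

probs = {c: 1 / 3 for c in KB}          # start with a uniform prior
probs = update(probs, "Is it an animal?", False)  # player says No
# Shark and Dolphin keep substantial weight because their "animal?"
# entry is missing, so asking "Is it a sea animal?" is still worthwhile.
```

      The strings here are opaque keys, which is the point: nothing in the update step knows that “sea animal” is a subset of “animal”.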

    • sad_detective_man@sopuli.xyzOP · 14 hours ago

        You’re right, I should have said “neural network” or “algorithm” or something. I didn’t mean to play it up; I actually meant that those systems are generally more rudimentary than people think they are.

        But I think it backtracks on questions like that because it expects people to give wrong answers or to change their minds. Also, some answers actually require you to follow one lead at first and then contradict yourself later in order to reach them, like “Mary Magdalene’s skull relic” instead of just “a skull”.
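        The “expects wrong answers” idea can be sketched as soft matching: a contradicting answer multiplies a character’s weight by a penalty instead of eliminating it, so an early misleading answer (or a changed mind) is recoverable later. This is my own illustration; the penalty value is an assumption, not Akinator’s real parameter.

```python
PENALTY = 0.2  # assumed tolerance factor for wrong/changed answers

def soft_update(weights, expected, answer_yes):
    """Demote characters that contradict the answer instead of
    eliminating them, so later specific questions can revive them."""
    out = {}
    for char, w in weights.items():
        known = expected.get(char)           # expected yes/no, if known
        if known is None or known == answer_yes:
            out[char] = w                    # consistent or unknown: keep
        else:
            out[char] = w * PENALTY          # contradiction: demote only
    return out

weights = {"a skull": 1.0, "Mary Magdalene's skull relic": 1.0}
# The player answers "no" to a question both candidates should match:
weights = soft_update(weights, {"a skull": True,
                                "Mary Magdalene's skull relic": True}, False)
# Both survive with reduced weight; a later "yes" to a narrower
# question can still pull the relic back to the top.
```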

      • ChaoticNeutralCzech@feddit.org · 13 hours ago

          Yeah, that’s definitely part of it. I think it reduces the weights of earlier questions as the game progresses.

          Also, it seems to have memory between games: if you answered “yes” to a very specific question, it is much more likely to ask that question in the next game. That makes sense, because it’s hard to think of a completely original character each time: if your first character was a British politician, the next one is very likely to also be British and/or a politician.

        • sad_detective_man@sopuli.xyzOP · 12 hours ago

            It does! I’m curious whether that memory comes from games played from your specific IP address or from other people who played recently.