Original tweet:

  • Amaltheamannen@lemmy.ml

    Liberals are right of center. America is almost the only place on Earth where liberalism is considered left-wing.

      • jaybone@lemmy.world

        I wonder if these are people from outside of the US referring to what we call libertarians.

        But anyway, the “left” in the US is certainly nowhere near as far left as the left in, say, European countries.

    • FlexibleToast@lemmy.world

      I always hear this, but it doesn’t make sense to me. It sounds like we define liberals differently. What specific stances make a liberal right of center? I would like to know if we’re defining the word the same way.

      • JungleJim@sh.itjust.works

        I’ve thought of it this way: a liberal is a master giving liberally, showering their workers with treats to keep them happy and productive. A leftist believes in class consciousness and the class struggle, and that human rights are innate and not something handed down by masters.

          • JungleJim@sh.itjust.works

            Then Google it. I gave you an overview of the two political philosophies. It’s not my job to educate you if you don’t like the answers you get.

            • FlexibleToast@lemmy.world

              You weren’t even the person I originally asked the question of. You went out of your way to give an answer that didn’t answer my question, and then you got upset at me for what you did?