Kind of a vague question, but I guess anyone who responds can state their own interpretation.

Edit: I guess I’m asking because everything I’ve learned about America since seems to contradict what I was told? Idk how to explain it. It feels like the USA is one event away from a civil war, outright corruption, and turning into a D-class country.

    • Eol@sh.itjust.works (OP) · 6 months ago

      I feel like touching up the history books, and even other areas of teaching, is a disservice to humanity. It’s like an active setup for failure or abuse. I was never taught anything realistic 20 years ago. It’s like I was cattle for someone else’s sick dream. Sometimes it feels like heartlessness is rewarded and masked as goodness.

      • Flummoxed@lemmy.world · 6 months ago

        I have to wonder if both of our teachers (the good ones in elementary, at least) meant to teach us how it should work because that was all we could grasp at the time. Maybe it was their (misguided?) attempt to make us feel serious anger and a call to action as we discovered the truth of the system for ourselves. I’m a teacher, and I have sometimes realized students are not capable of understanding a complex situation; in those cases I try to at least make clear that I’m giving them an idealized, simplified version that does not match how things work in reality. I try to plant the seeds for a critical understanding later on, but I’m sure there are students out there who believe I lied to them about how the world really works.

        ETA: added “good” to modify “ones” in first sentence for clarity