Kinda deep question for yall
I'm genuinely curious about this: when a relationship ends, why does everyone just say "oh, he's/she's bad" and trash-talk the other person? At some point you two genuinely had fun, enjoyed each other, and cared for each other. Then suddenly all you can see in that person is that they're awful, and nothing beyond that.
This is a situational question, so it depends on your experience of course, but let me be more specific. I'm not talking about relationships where someone was using you for their own gain or backstabbing you the whole time. I mean the ones that ACTUALLY were awesome and healthy. Then that ONE fight ends it all, and the only things you can say about the other person are the negative ones, even though you liked them a lot. So why is this?
Obviously it depends on how bad the fallout was, but I'm just curious: could you still see the person in a good light? Could you see them changing?
If I'm being honest, I hated them at some point and talked crap about them. It was immature of me, but it was the only way I could let go. Still, I know they're not that terrible a person. They made me want to live, and I appreciated life because of them, and I hope to god they know that. It's why I'm struggling to move on, because of the goo......
30.09.2024