What's something that, if you found out the actual truth about it, you'd feel like you knew all along but just didn't want to accept?
No politics please.
I'm also interested in why you thought it wasn't a big deal earlier, or at least why you didn't fully face the realization.