My parents have never taught me about sex ed, or how babies are made, or about protection. I learned that stuff on my own or at school. I signed my own permission forms for the sex ed lessons using their names, so I don't think they even know I took them? Maybe they just assume I learn it all at school? I've never had "the talk" with my parents, that's all I'm saying.
My parents never taught me about sex ed either: if I ask them where I came from, the reply I always get is "I picked you up from a garbage can on the side of the road."