When is it taught in schools, and from what viewpoint? Mine was in 6th grade, and the general premise was "Don't have sex, you will get AIDS and die."
Has the education system become any more liberal in the past decade+?
Do schools get to choose between comprehensive or abstinence-only programs?
What groups or third-party programs operate in the state to provide further education?
What about secondary education? Do high schools still offer health classes, and what do they cover?
Please know that I am asking for information, not opinions on the subject matter.