ninja_lord666 Posted March 19, 2007
I'm sure many of you have heard that breastfeeding is healthier for babies, but is it right in public? Personally, I don't see what the big deal is. The human body is a natural thing, and breastfeeding is even more natural. What about you? What are your opinions?
Dark0ne Posted March 19, 2007
I think I'm just gonna lock this one up before it even has the possibility of starting. No offense, but it's a crap topic.
This topic is now archived and is closed to further replies.