I hate it.
Sure, some women like to play a submissive role. Certainly not all of them, though, and any article that makes that generalization loses my respect.
Now, I dunno if it's just where I live, but most women I meet want partners who are in control of their own lives. Not ones who try to take control of them. Not even a little.
Anything billed as a lesson in how to control another person, unless it's related to the BDSM scene or something of that sort, immediately earns my mark of disgust.
...though I guess you could have called that a lesson on how to woo Tea Party women and it would have been pretty honest. (Okay, I'm sorry, that was a horrible political cheap shot.)