Do Women Have Agency?

Tuesday, 17 September 2019


Agency means "anything which acts or produces a result". Traditionally, nearly every culture has accepted that men have agency: men made things happen, which also meant that men were responsible for the outcomes of their actions. Men were responsible for the government, armies, religions, and families, because men were active.

About mobiuswolf

Aspiring writer of Zombie fiction.
