So, does watching dramas like Orange Is the New Black, Weeds, Breaking Bad, etc. merely entertain you, or is it something more? Is it opinion-forming in any way?
It doesn't have to be dramatically opinion-forming, like completely transforming your views on prisons, drugs, or the consequences of terminal illness in a system where medicine is entirely private/monetised.
Should TV be primarily about one or the other? Is it even possible for TV to be about one without traces of the other?
A mate of mine even told me that watching The Big Bang Theory influenced his views on jocks vs. nerds.