East Coast Stereotypes About The West Coast That Are Plain Wrong

For many east coast folks, the west coast is a mythical land full of sunshine, healthy people, and nothing but sandy beaches. And yes, you can find plenty of those things in California, Arizona, and other western states. But there are some things about the west coast that those back east don't understand. These stereotypes should be put to rest!
