I definitely think I look better with a nice tan, but why is that? Because society has molded it into the perfect skin tone? That Barbie doll look: blonde hair, dark skin, big boobs. I am fairly certain the pain of this sunburn and the damage I am doing to my skin are not worth the perk of
fitting into society's idea of attractive. I don't need to impress anyone; I have a wonderful boyfriend and am happy with myself. I don't want to end up like this lady.

One time I woke up the day after I went to the beach with a swollen eye that I couldn't open. The doctor said it was sun poisoning and told me to put some ice on it, and it went away in a few minutes. Just goes to show you how powerful the Florida sun is.
I don't plan on being a beach bum over the summer solely to tan anymore. If I get tan from being there and having fun, that's cool, but it's no longer a concern of mine. I probably sound like a feminist right about now, but that's okay!
