They express their concern to us on a regular basis... telling us that if we eat better, take supplements, or just get up and move, we will feel better. The gut reaction for most of us is to put up large and rather thick walls, because this advice makes us feel like our family members just "don't get it."
I think it boils down to something extremely simple... they just don't WANT us to be sick. They WANT us to be healthy, and having to face the fact that the child they raised for so many years may spend the rest of their life in pain is extremely hard for them to accept... so they grasp at straws, trying desperately to heal us but only driving us away. There is a strong desire within them to pretend that our illnesses do not exist, hoping that if the pain is ignored, it will go away.
What are your thoughts on the matter?