
Dietician Sheela Sherawat
Don’t forget to get some sun in the winter.
Most experts agree that the sun plays a vital role in winter: it provides warmth, and in many regions people are advised to spend some time in the sun during the cold months. Elders at home have long recommended warming your hands and face in the winter sunlight. It also helps to turn around and let the sun warm your back, which can ease the stress and stiffness that winter so often brings.
In short, make it a point to get some sun this winter. It will do you a lot of good, keeping you warm and your health and body in balance.