
Is organic food really healthier?

When you load up your shopping cart with organic leafy greens, are you getting more nutritional benefit than consumers on the other side of the produce aisle? More than half of Americans now believe organic food is healthier than conventionally grown produce, even though there is no evidence to support that belief.

Fifty-five percent of Americans said they believed organic food to be more nutritious, a recent Pew Research Center study found — and of the 40% of Americans who say that “some” of the food they buy is organic, 75% buy it because they believe it is healthier.

“That is what 20 years of intensive marketing will do,” said Steve Savage, an agriculture expert and writer on farming and sustainability.
