I am a big believer in organic food, and I eat mostly organic fruits and vegetables myself. Most Americans are happily and ignorantly poisoning themselves with bad food. But, like almost everything else in life, growing organic food should not be mandated by the government. The market should decide.
A great article I read today on this topic:
But wait, Germany is now also doing the same thing: