A recent Los Angeles Times article reports that wages for farm work in America have risen, but the number of workers has not. Farming is the foundation of America. Without farming, we are nothing, because farming gives us almost everything we need to live. And yet some people still look down on it, which I hate. Farming is a respectable job that one does to help the whole nation. Granted, it requires intense labor and will probably break your back, but it is for the benefit of the nation. It's a sort of role reversal: the Okies in The Grapes of Wrath were desperate for jobs, but the farmers couldn't take all of them. Now farmers are desperate for workers, but no one is heading their way. Like the Okies once did, farmers now have to travel long distances to recruit workers for their farms. All in all, I find it interesting how times have changed since the Great Depression.