I seem to remember that not so long ago the Democrats were all about unions and "workers' rights" — that employees had a right to stand up against the big corporations that employed them. Honestly, it was the one thing I admired and liked about the Democratic Party, and conversely I always hated how Republicans kissed the asses of corporations and favored mass illegal immigration to appease their corporate masters.
Now the Democrats/lefties are just pathetic shills for big corporations. They are literally cheering for big corporations to force their employees to get vaccinated against their will with experimental vaccines made by Big Pharma corporations that are immune to any liability. Are employees now just livestock owned by corporations?
Can any so-called liberal explain this?