Can anyone name one Democrat policy that strengthens the USA?
Can anyone name one Democrat policy that helps Americans, makes the country safer, or makes it more prosperous?
Because I've been doing pros and cons of open borders, no cash bail, transgender sports, and other Democrat policies, and the result was that every single Democrat policy weakens or undermines the country and its citizens.