Seriously? Conservatives have thoroughly discredited themselves. What does the American right actually stand for? Clearly not fiscal responsibility - deficits rise when Republicans hold the White House and fall when Democrats do (look it up if necessary). Pro-life? Only if you mean unborn life; after birth it's dog-eat-dog. Hating anyone LGBTQ, absolutely. You used to be the "Russia is the enemy" party, but now that Russia is run by a kleptocratic authoritarian who mouths Christian nationalism, you seem to be super cool with Putin.
The hint seems to lie in this notion of "religious liberty": for Christians, yes; for everyone else, no. A conservative is someone who "stands athwart history, yelling Stop!" The problem is, American history has always been - and this is not arguable - a white Christian patriarchy. Is it any wonder that people who don't fit that mold aren't thrilled by "Takin' my country back" and "Make America Great Again"? There are a lot of folks living here for whom the promise of America has only just begun to manifest. Going backward is not in their best interest.