Just be aware that that answer oversimplifies things considerably. "Left" and "right" are supposed to signify political ideas, not particular parties or politicians. You could say that Biden is to the "left" of Trump, but most people who would call themselves "left-wing" would consider Biden a "centrist" (between left and right) at best, perhaps even skewing right-wing.
Typical left-wing ideals involve working towards freedom and equality for everyone through solidarity and cooperation, whereas right-wing ideas usually centre on preserving one's own privileges by suppressing whomever one considers "outsiders" or "inferiors".
I'm just saying that the answer is centred on American politics. Defining the right as the Republicans and the left as the Democrats doesn't apply anywhere else.