What if this land were still inhabited by NATIVE AMERICANS only?
Would the world have been better off if the United States of America were never born?
It seems to be heading to he** and taking others with it.
I see no happy ending here. Only more disaster.
What do you see and why?
So please think about it, and think seriously: what has the United States of America contributed to the world? That country is gone...seriously damaged, like a lion that has been shot and is in pain. It becomes VERY DANGEROUS.
The United States of America is now VERY DANGEROUS. The underbelly is taking over everything. The dark side is putting out all the lights. There seems to be no stopping the progression.