meaning of America West

America West meaning in Urban Dictionary

(n) A derogatory term for when someone is bailed out of a challenging situation they would normally have no hope of beating on their own. "Did you see how 'America West' got out of a speeding ticket because his dad is a cop?"