
biology major meaning in Urban Dictionary

A major offered at nearly every college that seems designed to destroy lives.