What does biology major mean?

biology major meaning in Urban Dictionary

A major offered at nearly every college that seems designed to destroy lives.