Shannon’s Information Theory lets you measure the amount of information carried by a signal, and it applies quite naturally to genetics. However, information cannot be created randomly; only entropy can. Yes, mutations happen, and sometimes one makes an organism more fit by losing information (say, by producing a deformed protein that happens to confer immunity to some virus), but overall, information is still lost to entropy, and the organism is hindered by its “adaptation”.
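To make the measurement concrete: a minimal sketch of how one could compute the Shannon entropy of a nucleotide sequence in Python (the helper name `shannon_entropy` is just illustrative, not from any particular library):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A sequence using all four bases uniformly carries the maximum
# 2 bits per symbol; a constant sequence carries 0 bits per symbol.
print(shannon_entropy("ACGTACGTACGTACGT"))  # 2.0
print(shannon_entropy("AAAAAAAAAAAAAAAA"))  # 0.0 (or -0.0)
```

Note that this measures only the statistical information content of the symbol distribution; it says nothing by itself about whether a given sequence is biologically functional.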