organicism or·gan·i·cism (ôr-găn'ĭ-sĭz'əm)
n.
1. The theory that all disease is associated with structural alterations of organs.
2. The theory that the total organization of an organism, rather than the functioning of individual organs, is the principal or exclusive determinant of every life process.