No, genital warts generally do not make you ill. In most cases, the body’s immune system clears the warts on its own over time. However, the warts can cause mild irritation or discomfort, and if left untreated, they may be associated with an increased...