1. ReLU and PReLU (Parametric ReLU) activation functions, see https://arxiv.org/pdf/1502.01852.pdf
2. With a global flag, change all BP-based algos to support the cross-entropy error: if (cross_entropy==0){ ... }  # cross_entropy==1 means omit f' in the output layer
3. Dropout, with a per-layer rate, e.g. dropout[2]=0.2; if (rand() ...
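
A minimal Python/NumPy sketch of the three items above, for orientation only: the names prelu, output_delta and dropout_mask are placeholders of mine, not identifiers from this code base, and the per-layer dropout rate mirrors the dropout[2]=0.2 idea from item 3.

    import numpy as np

    def prelu(x, a=0.25):
        # PReLU (He et al., https://arxiv.org/pdf/1502.01852.pdf):
        # identity for x > 0, learned slope a for x <= 0 (a == 0 gives plain ReLU).
        return np.where(x > 0, x, a * x)

    def output_delta(y_pred, y_true, fprime_at_output, cross_entropy=1):
        # Error term of the output layer.
        # cross_entropy==1: sigmoid/softmax output + cross-entropy loss,
        #   the f' factor cancels, so the delta is just (y_pred - y_true).
        # cross_entropy==0: squared error, keep the f' factor.
        if cross_entropy == 0:
            return (y_pred - y_true) * fprime_at_output
        return y_pred - y_true

    def dropout_mask(shape, rate=0.2, rng=None):
        # Per-layer dropout: a unit is dropped when rand() < rate; the
        # surviving units are scaled by 1/(1 - rate) (inverted dropout),
        # so activations need no rescaling at test time.
        if rng is None:
            rng = np.random.default_rng()
        keep = rng.random(shape) >= rate
        return keep / (1.0 - rate)

During training the mask would multiply a hidden layer's activations, e.g. h = prelu(z) * dropout_mask(z.shape, rate=dropout[2]); at test time the mask is simply omitted.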