All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only feasible if the height and width of the feature maps remain unchanged, so the convolutions in a dense block all have stride one. Pooling layers are inserted between dense blocks for downsampling.
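A minimal sketch of the shape arithmetic, assuming 3x3 convolutions with padding 1 and a hypothetical growth rate of 12 channels per layer (zero arrays stand in for actual convolution outputs):

```python
import numpy as np

def conv_out_size(size, kernel=3, stride=1, pad=1):
    # Standard convolution output-size formula.
    return (size + 2 * pad - kernel) // stride + 1

# A stride-1, padding-1 3x3 convolution keeps H and W unchanged,
# so feature maps from successive layers can be concatenated channel-wise.
h = w = 32
assert conv_out_size(h) == h

# Hypothetical dense block: each layer contributes `growth` new channels,
# which are concatenated onto the running feature map.
growth = 12
x = np.zeros((1, 16, h, w))  # (batch, channels, H, W)
for _ in range(4):  # 4 layers in the block
    new_features = np.zeros((1, growth, h, w))  # stand-in for a conv output
    x = np.concatenate([x, new_features], axis=1)  # channel-wise concat

print(x.shape)  # (1, 64, 32, 32): 16 input + 4 * 12 new channels
```

With stride greater than one the spatial sizes would shrink and the concatenation would fail, which is why downsampling is deferred to the pooling layers between blocks.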