All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so all convolutions in a dense block have stride 1. Pooling layers are inserted between dense blocks to downsample the feature maps.
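A minimal sketch of the shape arithmetic above (the layer counts and growth rate are illustrative, not from the original text): a stride-1, padded convolution preserves height and width, which is what allows channel-wise concatenation inside the block, while each layer grows the channel count additively.

```python
def conv_out_size(size, kernel, stride, padding):
    # Standard convolution output-size formula:
    # out = floor((in + 2p - k) / s) + 1
    return (size + 2 * padding - kernel) // stride + 1

# A 3x3 conv with stride 1 and padding 1 preserves spatial size,
# so feature maps from successive layers can be concatenated channel-wise.
h = conv_out_size(32, kernel=3, stride=1, padding=1)
print(h)  # 32

# Inside a dense block, each layer's output (growth_rate channels) is
# concatenated onto everything before it, so channels grow additively.
def dense_block_channels(in_channels, growth_rate, num_layers):
    return in_channels + growth_rate * num_layers

print(dense_block_channels(64, 32, 6))  # 64 + 6*32 = 256

# A 2x2 pooling layer with stride 2 between blocks halves the spatial size.
print(conv_out_size(32, kernel=2, stride=2, padding=0))  # 16
```

The same formula explains why downsampling must happen *between* blocks: a stride-2 operation inside a block would shrink the maps and make concatenation with earlier, larger maps impossible.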