All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so the convolutions in a dense block all use stride 1. Pooling layers are inserted between dense blocks to downsample the feature maps.
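As a minimal sketch of this structure (assuming PyTorch; the class names `DenseLayer` and `DenseBlock` and the parameter `growth_rate` are illustrative, not taken from the original text), a dense block might look like:

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """One BN -> ReLU -> Conv layer inside a dense block.
    Stride 1 with padding 1 keeps height/width unchanged,
    so the output can be concatenated with the input."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1, bias=False)

    def forward(self, x):
        out = self.conv(torch.relu(self.bn(x)))
        # Channel-wise concatenation: only valid because the
        # spatial dimensions of x and out match.
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """Stack of dense layers; each layer sees all earlier feature maps,
    so the channel count grows by growth_rate per layer."""
    def __init__(self, num_layers, in_channels, growth_rate):
        super().__init__()
        self.layers = nn.Sequential(*[
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        ])

    def forward(self, x):
        return self.layers(x)

# Downsampling happens between dense blocks, not inside them:
# a pooling layer halves the spatial resolution here.
pool_between_blocks = nn.AvgPool2d(kernel_size=2, stride=2)
```

Keeping all strides at 1 inside the block and deferring pooling to the transitions between blocks is what makes the repeated concatenations shape-compatible.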