One Hundred Layers Tiramisu

Hi everyone, I know this topic is a bit stale, but I’ll ask anyway.
I’m trying to replicate the Tiramisu paper and got stuck on the Transition Up part of the model. I don’t quite understand how the authors get a reduction in the number of filters on the way back up to full resolution. How do you go from 1072 to 800 feature maps in the second upsampling block if dense blocks only ever add features? Do I use a 1x1 conv for feature reduction (which isn’t part of the described model)?
This is the part of the paper (page 4) where they explain it, but I don’t get it. Can someone point me to code that does this? Or explain it with a lot of hand waving? Thanks!

Since the upsampling path increases the feature maps spatial resolution, the linear growth in the number of features would be too memory demanding, especially for the full resolution features in the pre-softmax layer. In order to overcome this limitation, the input of a dense block is not concatenated with its output. Thus, the transposed convolution is applied only to the feature maps obtained by the last dense block and not to all feature maps concatenated so far.
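
Concretely, this means the upsampling-path dense blocks return only the feature maps they themselves produce (n_layers * growth_rate channels), and the transposed convolution upsamples just those before the skip connection is concatenated back in; no 1x1 bottleneck is involved. Here is a minimal PyTorch sketch of that idea (my own illustration, not the paper’s reference code; the growth rate and layer counts are placeholders):

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Sequential):
    """BN -> ReLU -> 3x3 conv, producing `growth_rate` new feature maps."""
    def __init__(self, in_channels, growth_rate):
        super().__init__(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, kernel_size=3, padding=1),
        )

class DenseBlockUp(nn.Module):
    """Dense block for the upsampling path: the block's input is NOT
    concatenated with its output, so it returns only
    n_layers * growth_rate feature maps."""
    def __init__(self, in_channels, growth_rate, n_layers):
        super().__init__()
        self.layers = nn.ModuleList([
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(n_layers)
        ])

    def forward(self, x):
        new_features = []
        for layer in self.layers:
            out = layer(x)
            new_features.append(out)
            x = torch.cat([x, out], dim=1)     # dense connectivity, internal only
        return torch.cat(new_features, dim=1)  # the block's input is dropped here

class TransitionUp(nn.Module):
    """3x3 transposed conv, stride 2, applied only to the dense block's
    new feature maps; the skip connection is concatenated afterwards."""
    def __init__(self, channels):
        super().__init__()
        self.up = nn.ConvTranspose2d(channels, channels, kernel_size=3, stride=2)

    def forward(self, x, skip):
        out = self.up(x)
        out = out[:, :, :skip.size(2), :skip.size(3)]  # crop to the skip's size
        return torch.cat([out, skip], dim=1)
```

So with, say, growth rate 16 and a 12-layer dense block, only 12 * 16 = 192 channels ever pass through the transposed convolution, no matter how wide the concatenated input was. Dropping the block’s input from its output, plus upsampling only the new features, is what makes the channel count fall on the way up (e.g. your 1072 -> 800) even though dense blocks only ever add features.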

Hi Brendan,

I’ve trained your model on my custom data, and it’s working nicely. Now I’m trying to convert it to Core ML using onnx-coreml, but I’m stuck on an unsupported-operation error:

TypeError: Error while converting op of type: Slice. Error message: Only single axis Slice is supported now

EDIT: To provide more context on where this is failing, here is the conversion log (it looks like it fails in TransitionUp, perhaps in center_crop()?):

1/224: Converting Node Type Conv
2/224: Converting Node Type BatchNormalization
3/224: Converting Node Type Relu
4/224: Converting Node Type Conv
5/224: Converting Node Type Concat
6/224: Converting Node Type BatchNormalization
7/224: Converting Node Type Relu
8/224: Converting Node Type Conv
9/224: Converting Node Type Concat
10/224: Converting Node Type BatchNormalization
11/224: Converting Node Type Relu
12/224: Converting Node Type Conv
13/224: Converting Node Type Concat
14/224: Converting Node Type BatchNormalization
15/224: Converting Node Type Relu
16/224: Converting Node Type Conv
17/224: Converting Node Type Concat
18/224: Converting Node Type BatchNormalization
19/224: Converting Node Type Relu
20/224: Converting Node Type Conv
21/224: Converting Node Type MaxPool
[... nodes 22/224 through 109/224 elided: the same BatchNormalization / Relu / Conv / Concat pattern repeats through the remaining dense blocks, with a MaxPool at each transition down ...]
110/224: Converting Node Type BatchNormalization
111/224: Converting Node Type Relu
112/224: Converting Node Type Conv
113/224: Converting Node Type Concat
114/224: Converting Node Type ConvTranspose
115/224: Converting Node Type Add
116/224: Converting Node Type Slice

On GitHub it was recommended that I “generate two concatenated slicing operations, one working on a dimension at a time”, but I’m not sure where (or, honestly, how) to do that. Any advice greatly appreciated!
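
For what it’s worth, the Slice the converter rejects is almost certainly the crop inside TransitionUp: it slices the height and width axes in a single indexing expression, so the exported ONNX graph contains one Slice node over two axes. Splitting the crop into two consecutive indexing operations, each over one axis, should make the export produce two single-axis Slice nodes instead, which is what the GitHub comment is suggesting. A minimal sketch, assuming a center_crop roughly like the one in the repo (exact names and signature may differ):

```python
def center_crop(layer, max_height, max_width):
    _, _, h, w = layer.size()
    top = (h - max_height) // 2
    left = (w - max_width) // 2
    # Before: one indexing op that slices both spatial axes at once,
    # which exports as a single two-axis ONNX Slice (unsupported here):
    #   return layer[:, :, top:top + max_height, left:left + max_width]
    # After: two ops, one axis each, exporting as two single-axis Slices:
    layer = layer[:, :, top:top + max_height, :]   # crop height (axis 2)
    layer = layer[:, :, :, left:left + max_width]  # crop width (axis 3)
    return layer
```

After patching and re-exporting the ONNX model, node 116 should show up as two separate Slice nodes, each over a single axis, which onnx-coreml should then be able to convert.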