Example code for my nn.Sequential block:
self._encoder = nn.Sequential(
    # 1, 28, 28
    nn.Conv2d(in_channels=1, out_channels=32, kernel_size=3, stride=3, padding=1),
    # 32, 10, 10: (28 + 2*1 - 3)//3 + 1 = 10
    nn.ReLU(True),
    nn.MaxPool2d(kernel_size=2, stride=2),
    # 32, 5, 5
    nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, stride=2, padding=1),
    # 64, 3, 3: (5 + 2*1 - 3)//2 + 1 = 3
    nn.ReLU(True),
    nn.MaxPool2d(kernel_size=2, stride=1),
    # 64, 2, 2
)
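As a quick sanity check on the shape comments, the same stack built standalone (a throwaway sketch, not part of the model) confirms the final feature map:

import torch
import torch.nn as nn

encoder = nn.Sequential(
    nn.Conv2d(1, 32, kernel_size=3, stride=3, padding=1),
    nn.ReLU(True),
    nn.MaxPool2d(kernel_size=2, stride=2),
    nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
    nn.ReLU(True),
    nn.MaxPool2d(kernel_size=2, stride=1),
)
print(encoder(torch.randn(1, 1, 28, 28)).shape)  # torch.Size([1, 64, 2, 2])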
Is there some structure similar to nn.Sequential that places modules in parallel instead?
I would now like to define something like:
self._mean_logvar_layers = nn.Parallel(
    nn.Conv2d(in_channels=64, out_channels=64, kernel_size=2, stride=1, padding=0),
    nn.Conv2d(in_channels=64, out_channels=64, kernel_size=2, stride=1, padding=0),
)
Its output should be two pipes of data, one for each element in self._mean_logvar_layers, which can then be fed to the rest of the network. Kind of like a multi-headed network.
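To make the desired semantics concrete: nn.Parallel does not exist in PyTorch, but a minimal hand-rolled container with the behavior I have in mind could look like this (the name Parallel and its nn.ModuleList backing are my own sketch, not an existing API):

import torch
import torch.nn as nn
from typing import Tuple

class Parallel(nn.Module):
    # Hypothetical container: applies every child module to the same
    # input and returns all outputs as a tuple.
    def __init__(self, *branches: nn.Module):
        super().__init__()
        self.branches = nn.ModuleList(branches)

    def forward(self, x: torch.Tensor) -> Tuple[torch.Tensor, ...]:
        return tuple(branch(x) for branch in self.branches)

With that, self._mean_logvar_layers = Parallel(conv_a, conv_b) would return a (mean, logvar) tuple from a single call.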
My current implementation:
self._mean_layer = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=2, stride=1, padding=0)
self._logvar_layer = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=2, stride=1, padding=0)
and
def _encode(self, x: torch.Tensor) -> Tuple[torch.Tensor, torch.Tensor]:
    for layer in self._encoder:
        x = layer(x)
    mean_output = self._mean_layer(x)
    logvar_output = self._logvar_layer(x)
    return mean_output, logvar_output
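For concreteness, each head maps the 64 x 2 x 2 encoder output to 64 x 1 x 1, so a dummy forward pass looks like this (model is a hypothetical instance of the enclosing module):

x = torch.randn(8, 1, 28, 28)    # dummy MNIST-sized batch
mean, logvar = model._encode(x)  # `model`: assumed instance of the enclosing module
print(mean.shape, logvar.shape)  # torch.Size([8, 64, 1, 1]) each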
I want to treat the parallel construct as a single layer. Is this doable in PyTorch?