# Building the multi-task graph

The output of Shared_layer is fed into both Y1 and Y2, and a separate loss is computed for each task.
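A minimal numpy sketch of this graph, with hypothetical layer sizes and loss choices (softmax cross-entropy for Y1, MSE for Y2) — the shared representation feeds two task heads, each with its own loss:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions: input 8, shared hidden 16, task heads of size 3 and 1
x = rng.normal(size=(4, 8))              # batch of 4 inputs
W_shared = rng.normal(size=(8, 16)) * 0.1
W_y1 = rng.normal(size=(16, 3)) * 0.1    # task-1 head (e.g. 3-class logits)
W_y2 = rng.normal(size=(16, 1)) * 0.1    # task-2 head (e.g. regression)

shared = np.maximum(x @ W_shared, 0)     # Shared_layer output (ReLU)
y1 = shared @ W_y1                       # Y1 takes the shared output as input
y2 = shared @ W_y2                       # Y2 takes the same shared output

t1 = rng.integers(0, 3, size=4)          # toy task-1 labels
t2 = rng.normal(size=(4, 1))             # toy task-2 targets

# Separate losses, one per task
p1 = np.exp(y1) / np.exp(y1).sum(axis=1, keepdims=True)
loss1 = -np.log(p1[np.arange(4), t1]).mean()   # cross-entropy for Y1
loss2 = ((y2 - t2) ** 2).mean()                # MSE for Y2
```

During training, gradients from both losses flow back into `W_shared`, which is what makes the layer "shared".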

# Training

1. Alternate training
2. Joint training
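The two schemes can be contrasted on a toy problem. Here two made-up task losses share one parameter `w` (task 1 prefers `w = 1`, task 2 prefers `w = 3`): alternate training updates on one task's gradient per step, while joint training sums both gradients (equivalently, minimizing `loss1 + loss2`):

```python
# Hypothetical toy losses: loss1 = (w - 1)^2, loss2 = (w - 3)^2
def grad1(w): return 2 * (w - 1.0)
def grad2(w): return 2 * (w - 3.0)

lr = 0.1

# 1. Alternate training: each step uses ONE task's gradient, switching every step
w_alt = 0.0
for step in range(200):
    g = grad1(w_alt) if step % 2 == 0 else grad2(w_alt)
    w_alt -= lr * g

# 2. Joint training: each step uses the gradient of the summed loss
w_joint = 0.0
for step in range(200):
    w_joint -= lr * (grad1(w_joint) + grad2(w_joint))
```

Joint training converges to the compromise `w = 2`; the alternating scheme hovers around the same region but keeps oscillating between the two tasks' preferred values.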

## How to choose?

### When is alternate training a good idea?

Alternate training is a good idea when you have a separate dataset for each task (for example, translating from English to French and from English to German). By designing the network this way, you can improve the performance of each individual task without having to find more task-specific training data.
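The separate-datasets case can be sketched as an alternating loop over two independent data iterators. Everything here is hypothetical (`train_step` is a placeholder for a real optimizer step on the shared layer plus that task's head; the word pairs stand in for En→Fr and En→De examples):

```python
import itertools

# Two distinct toy datasets, one per task
data_fr = [("hello", "bonjour"), ("cat", "chat")]   # English -> French pairs
data_de = [("hello", "hallo"), ("cat", "Katze")]    # English -> German pairs

def train_step(task, batch):
    # Placeholder: a real implementation would update Shared_layer
    # and the head for `task` on this batch's loss.
    return f"{task}:{batch[0]}"

it_fr = itertools.cycle(data_fr)
it_de = itertools.cycle(data_de)

log = []
for step in range(4):
    # Alternate: odd steps train the En->De head, even steps the En->Fr head;
    # each task consumes batches only from its own dataset.
    if step % 2 == 0:
        log.append(train_step("en-fr", next(it_fr)))
    else:
        log.append(train_step("en-de", next(it_de)))
```

Each dataset advances independently, so the tasks need not have the same size or even aligned examples — only the shared layer ties them together.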