[ifsheldon/stannum] Proxy `torch.nn.Parameter` in `Tin` for PyTorch optimizers

Now `Tin` is a subclass of `torch.nn.Module`, and it can have learnable parameters in the form of values in Taichi fields. However, these values currently cannot be optimized by PyTorch optimizers, since they are not PyTorch-compatible. One way to make them PyTorch-compatible is to use a proxy `torch.nn.Parameter` and sync its values with those in the corresponding Taichi field.
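The proxy idea could be sketched roughly as below. This is a hypothetical illustration, not stannum's actual implementation: `FieldProxy` and `sync_to_field` are made-up names, and a plain tensor stands in for the Taichi field so the sketch stays self-contained.

```python
import torch


class FieldProxy(torch.nn.Module):
    """Hypothetical sketch: a torch.nn.Parameter that mirrors values
    stored outside PyTorch (a plain tensor stands in for a Taichi field)."""

    def __init__(self, init_values: torch.Tensor):
        super().__init__()
        # learnable proxy that PyTorch optimizers can see and update
        self.proxy = torch.nn.Parameter(init_values.clone())
        # stand-in for the Taichi field holding the actual values
        self.field = init_values.clone()

    def sync_to_field(self):
        # push optimizer-updated proxy values into the external "field"
        with torch.no_grad():
            self.field.copy_(self.proxy)


# usage: the optimizer updates the proxy, then we sync before a kernel launch
m = FieldProxy(torch.tensor([1.0, 2.0]))
opt = torch.optim.SGD(m.parameters(), lr=0.1)
loss = (m.proxy ** 2).sum()
loss.backward()
opt.step()
m.sync_to_field()  # field now matches the optimized proxy
```

The reverse direction (copying gradients computed by Taichi's autodiff into `proxy.grad`) would be needed as well for a full round trip; that part is omitted here.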

If anyone comes up with a better solution, discussions and PRs are always welcome.
