Question 1 (20 mins)
Use PyTorch to find a numerical solution of $2x^2 - 4x + 1 = 0$.
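One possible approach (a sketch, not the only valid answer): treat root-finding as an optimization problem and minimize $f(x)^2$ with gradient descent, using PyTorch autograd. The helper name `find_root` and the hyperparameters are my own choices; the starting point determines which of the two roots, $1 \pm \tfrac{\sqrt{2}}{2}$, the iteration converges to.

```python
import torch

def find_root(x0: float, lr: float = 0.01, steps: int = 2000) -> float:
    """Minimize (2x^2 - 4x + 1)^2 by gradient descent; a minimum of 0 is a root."""
    x = torch.tensor(x0, requires_grad=True)
    opt = torch.optim.SGD([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = (2 * x**2 - 4 * x + 1) ** 2  # squared residual of the polynomial
        loss.backward()                      # autograd computes d(loss)/dx
        opt.step()
    return x.item()

root1 = find_root(0.0)  # converges toward 1 - sqrt(2)/2 ~ 0.2929
root2 = find_root(2.0)  # converges toward 1 + sqrt(2)/2 ~ 1.7071
```

Starting from two different initial guesses recovers both roots; the exact values $1 \pm \tfrac{\sqrt{2}}{2}$ follow from the quadratic formula and can be used to check convergence.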
Question 2 (20 mins)
Implement a custom activation function $f(x) = \ln(1 + e^x)$ (the softplus function), and integrate it into a simple neural network to approximate a simple function.
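A minimal sketch of one way to do this: wrap the activation in an `nn.Module` and drop it into a small network fitting $y = \sin(x)$ (the target function, network width, and training settings are my own assumptions). The forward pass uses the numerically stabler identity $\ln(1 + e^x) = \max(x, 0) + \ln(1 + e^{-|x|})$ to avoid overflow for large $x$.

```python
import torch
import torch.nn as nn

class MySoftplus(nn.Module):
    """Custom activation f(x) = ln(1 + e^x), written in an overflow-safe form."""
    def forward(self, x):
        return torch.clamp(x, min=0) + torch.log1p(torch.exp(-torch.abs(x)))

torch.manual_seed(0)  # for reproducibility of the fit

# Tiny network using the custom activation to approximate y = sin(x).
net = nn.Sequential(nn.Linear(1, 16), MySoftplus(), nn.Linear(16, 1))

x = torch.linspace(-3, 3, 64).unsqueeze(1)
y = torch.sin(x)
opt = torch.optim.Adam(net.parameters(), lr=0.01)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()
```

Because the activation is a plain `nn.Module` built from differentiable tensor ops, autograd handles its gradient automatically; no custom backward pass is needed.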
Question 3 (10 mins)
Export the resnet18 model from the torchvision library to ONNX, and use the Netron tool to visualize its architecture.
Question 4 (30 mins)
Design a neural network in PyTorch that has two parallel input branches, combines these inputs with additional data midway through the network, and then splits into two separate output branches.
- Input Layer:
  - The network starts with two parallel input branches.
  - Each branch accepts an input tensor of shape (N, 10), where N is the batch size.
- First and Second Branch:
  - Branch 1 and Branch 2 are identical in structure.
  - Each branch consists of the following layers:
    - A linear layer that expands the input from 10 to 20 features.
    - A ReLU activation layer.
    - Another linear layer that further expands from 20 to 30 features.
- Midway Additional Input:
  - After the first and second branches, introduce an additional input tensor of shape (N, 5).
  - This additional input represents extra features to be combined with the outputs of the two branches.
- Combination of Branch Outputs and Additional Input:
  - Concatenate the outputs of the two branches (each of shape (N, 30)) with the additional input (shape (N, 5)), resulting in a tensor of shape (N, 65).
- Shared Layers After Combination:
  - Pass the combined tensor through a shared linear layer that reduces the dimension from 65 to 50.
  - Apply a ReLU activation function.
- Two Separate Output Branches:
  - After the shared layers, split into two separate output branches.
  - Each output branch consists of a single linear layer that maps the 50 features to a single output feature (shape (N, 1)).