# Neural Network

Bases: `Module`
A Multi-Layer Perceptron (MLP) model that is compatible with GPflow, allowing its parameters (weights and biases) to be optimized as part of a GPflow model (e.g., within a custom kernel).
The network consists of multiple fully connected (dense) layers with specified activation functions.
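Because the class is a GPflow `Module`, its variables are tracked automatically when it is embedded in a larger model. Below is a minimal sketch of the custom-kernel use case mentioned above: an RBF kernel applied to NN-warped inputs (a deep-kernel pattern). The kernel class `NNWarpedRBF` and the layer sizes are illustrative assumptions, not part of sgptools.

```python
import gpflow
import tensorflow as tf
from sgptools.kernels.neural_network import NN

class NNWarpedRBF(gpflow.kernels.Kernel):
    """Hypothetical deep kernel: an RBF kernel on NN-warped inputs.

    Because NN is a GPflow Module, its weights and biases are tracked
    as trainable variables of this kernel and train with the model.
    """

    def __init__(self, dims):
        super().__init__()
        # Warping network; 'softplus' output keeps warped coordinates positive.
        self.nn = NN(dims=dims, activation_fn='selu',
                     output_activation_fn='softplus')
        self.rbf = gpflow.kernels.RBF()

    def K(self, X, X2=None):
        Z = self.nn(X)                        # (N, D_out)
        Z2 = Z if X2 is None else self.nn(X2)
        return self.rbf.K(Z, Z2)

    def K_diag(self, X):
        return self.rbf.K_diag(self.nn(X))

kernel = NNWarpedRBF(dims=[2, 16, 2])  # warp 2-D inputs through one hidden layer
```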
Attributes:

| Name | Type | Description |
|---|---|---|
| `dims` | `List[int]` | List of layer sizes, including input and output dimensions. |
| `activation_fn` | `Callable` | Activation function for hidden layers. |
| `output_activation_fn` | `Callable` | Activation function for the output layer. |
| `_weights` | `List[Variable]` | List of TensorFlow Variables for the weights of each layer. |
| `_biases` | `List[Variable]` | List of TensorFlow Variables for the biases of each layer. |
Source code in sgptools/kernels/neural_network.py
### `__call__(X)`
Performs a forward pass through the MLP.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `X` | `Tensor` | (N, D_in); The input tensor to the MLP. | *required* |
Returns:

| Type | Description |
|---|---|
| `Tensor` | (N, D_out); The output tensor from the MLP. |
Source code in sgptools/kernels/neural_network.py
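A quick shape check for the forward pass; the layer sizes below are hypothetical and chosen only for illustration.

```python
import tensorflow as tf
from sgptools.kernels.neural_network import NN

mlp = NN(dims=[3, 8, 2])        # D_in=3, one hidden layer of 8, D_out=2
X = tf.random.normal((4, 3))    # (N=4, D_in=3)
Y = mlp(X)                      # invokes __call__
print(Y.shape)                  # (4, 2), i.e., (N, D_out)
```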
### `__init__(dims, activation_fn='selu', output_activation_fn='softmax')`
Initializes the Multi-Layer Perceptron (MLP).
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `dims` | `List[int]` | A list of integers specifying the size of each layer. The first element is the input dimension, the last is the output dimension, and intermediate elements are hidden layer sizes. Example: `[2, 10, 1]` defines a 2-D input, one hidden layer of 10 units, and a 1-D output. | *required* |
| `activation_fn` | `Union[str, Callable]` | The activation function to use for hidden layers. Can be a string (e.g., 'relu', 'tanh', 'selu') or a callable TensorFlow activation function. Defaults to 'selu'. | `'selu'` |
| `output_activation_fn` | `Union[str, Callable]` | The activation function to use for the output layer. Can be a string (e.g., 'softmax', 'sigmoid', 'softplus') or a callable TensorFlow activation function. Defaults to 'softmax'. | `'softmax'` |
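Both activation arguments also accept callables, as documented above. A minimal sketch, assuming standard `tf.nn` functions are passed directly:

```python
import tensorflow as tf
from sgptools.kernels.neural_network import NN

mlp = NN(dims=[2, 16, 1],
         activation_fn=tf.nn.relu,            # callable instead of 'relu'
         output_activation_fn=tf.nn.sigmoid)  # callable instead of 'sigmoid'
```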
#### Usage
```python
from sgptools.kernels.neural_network import NN
import tensorflow as tf
import numpy as np

# Example: A simple MLP with one hidden layer
mlp = NN(dims=[2, 10, 1], activation_fn='tanh', output_activation_fn='sigmoid')

# Input data
input_data = tf.constant(np.random.rand(5, 2), dtype=tf.float32)

# Pass input through the network
output = mlp(input_data)  # (5, 1)
```
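Continuing the example, the parameters are ordinary trainable variables, so gradients flow through the network with a standard `tf.GradientTape`; the squared-sum loss below is a toy stand-in for a real model objective.

```python
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(mlp(input_data) ** 2)  # toy loss for illustration

# Gradients w.r.t. the MLP's weights and biases, tracked via the Module base.
grads = tape.gradient(loss, mlp.trainable_variables)
```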