API / Model / Container?
-
API / Container / Organizer / Model is a way of arranging layers to create a neural network.
API Types
| | 1. Sequential (Simple, linear models) | 2. Functional (Complex, production-ready models) | 3. Model Subclassing (Research, highly custom architectures) |
|---|---|---|---|
| Description | Layers are stacked in a linear fashion, giving a network with a linear topology and exactly one input and one output | Non-linear data flow with complex topologies, shared layers, and multiple inputs/outputs | Highest level of flexibility: everything is implemented from scratch by subclassing tf.keras.Model. Suited to dynamic architectures that need custom loops or conditional logic in the forward pass; also supports multiple inputs/outputs |
| Examples | Multi-layer perceptrons (MLPs), basic Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs) | Multi-input/multi-output models (e.g., a model that takes image and text inputs to produce a classification and a regression output) | Research models or custom architectures not available out of the box, such as a Tree-RNN |
| Code Example | See the sketch below the table for all three styles | | |
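A minimal sketch of the three API styles on a toy binary classifier; the 20-feature input, layer sizes, and model names are illustrative assumptions, not from the notes above.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# 1. Sequential: layers stacked in a line, exactly one input and one output
sequential_model = models.Sequential([
    tf.keras.Input(shape=(20,)),
    layers.Dense(64, activation='relu'),
    layers.Dense(1, activation='sigmoid'),
])

# 2. Functional: tensors flow through layers explicitly, which allows
#    shared layers, branching, and multiple inputs/outputs
inputs = tf.keras.Input(shape=(20,))
x = layers.Dense(64, activation='relu')(inputs)
outputs = layers.Dense(1, activation='sigmoid')(x)
functional_model = models.Model(inputs=inputs, outputs=outputs)

# 3. Model Subclassing: the forward pass is written by hand in call(),
#    so it can contain custom loops or conditional logic
class SubclassedModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = layers.Dense(64, activation='relu')
        self.dense2 = layers.Dense(1, activation='sigmoid')

    def call(self, inputs):
        x = self.dense1(inputs)
        return self.dense2(x)

subclassed_model = SubclassedModel()
```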
Layers in Keras?
-
A layer is a component used to build a neural network.
Multiple layers are stacked together to create a network; the output of one layer is fed into the next (see the sketch below).
A layer is composed of neurons.
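A small sketch of that flow, assuming an arbitrary 10-feature input; the unit counts are illustrative only.

```python
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(10,))                   # 10-feature input vector
hidden = layers.Dense(32, activation='relu')(inputs)   # first layer produces an output tensor...
outputs = layers.Dense(1)(hidden)                      # ...which is fed into the next layer
model = tf.keras.Model(inputs=inputs, outputs=outputs)
```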
Types of Layers in Keras
| Layer | Description | How to create |
|---|---|---|
| 1. Dense | Fully connected layer: every neuron is connected to every neuron in the previous layer | layers.Dense(64, activation='relu') |
| 2. Convolutional(Conv2D, Conv1D) | For image/sequence processing | layers.Conv2D(32, (3, 3), activation='relu') |
| 3. Pooling (MaxPooling2D, AveragePooling2D) | For downsampling | layers.MaxPooling2D((2, 2)) |
| 4. Recurrent (LSTM, GRU) | For sequential/temporal data | layers.LSTM(64, return_sequences=True) |
| 5. Dropout | For regularization | layers.Dropout(0.3) |
| 6. Batch Normalization | For stabilizing training | layers.BatchNormalization() |
| 7. Embedding | For text/word representations | layers.Embedding(10000, 128) |
| 8. Flatten | For reshaping a tensor to 1D | layers.Flatten() |
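A sketch combining several of the layers above into one small image classifier; the 28x28x1 input shape, unit counts, and 10 output classes are illustrative assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                 # e.g. grayscale 28x28 images
    layers.Conv2D(32, (3, 3), activation='relu'),   # convolutional feature extraction
    layers.MaxPooling2D((2, 2)),                    # downsampling
    layers.BatchNormalization(),                    # stabilize training
    layers.Flatten(),                               # reshape feature maps to 1D
    layers.Dense(64, activation='relu'),            # fully connected layer
    layers.Dropout(0.3),                            # regularization
    layers.Dense(10, activation='softmax'),         # 10-class output
])
model.summary()
```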