API / Model / Container?

An API (also called a container, organizer, or model) is a way of arranging layers to create a neural network.

API Types

1. Sequential API (simple, linear models)
   Description: Layers are stacked in a straight line, giving a linear topology with exactly one input and one output.
   Multiple inputs/outputs: No
   Examples: Multi-layer perceptrons (MLPs), basic Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs)

2. Functional API (complex, production-ready models)
   Description: Non-linear data flow with complex layers, shared layers, and multiple inputs/outputs.
   Multiple inputs/outputs: Yes
   Examples: Multi-input/multi-output models (e.g., a model that takes image and text inputs to produce a classification and a regression output)

3. Model Subclassing (research, highly custom architectures)
   Description: Provides the highest level of flexibility, letting you implement everything from scratch by subclassing tf.keras.Model. Suited to dynamic architectures, such as models that need custom loops or conditional logic in the forward pass.
   Multiple inputs/outputs: Yes
   Examples: Novel research models or custom architectures such as a Tree-RNN

SEQUENTIAL API - STRAIGHT LINE FLOW
┌─────────┐     ┌─────────┐     ┌─────────┐     ┌─────────┐
│ INPUT   │────▶│ LAYER 1 │────▶│ LAYER 2 │────▶│ OUTPUT  │
│ (Shape) │     │         │     │         │     │         │
└─────────┘     └─────────┘     └─────────┘     └─────────┘
     │               │               │               │
     └───────────────┴───────────────┴───────────────┘
                 Data flows in ONE direction
          

FUNCTIONAL API - GRAPH STRUCTURE
                           ┌─────────┐     ┌─────────┐
                      ┌───▶│ Dense 3 │────▶│ OUTPUT 1│
┌─────────┐     ┌─────┴┐   └─────────┘     └─────────┘
│ INPUT 1 │────▶│      │
└─────────┘     │ Dense│
                │   1  │
┌─────────┐     │      │   ┌─────────┐     ┌─────────┐
│ INPUT 2 │────▶│      │──▶│ Dense 2 │────▶│ OUTPUT 2│
└─────────┘     └──────┘   └─────────┘     └─────────┘
Code Examples

from tensorflow.keras import Sequential
from tensorflow.keras import layers

model = Sequential([
    layers.Dense(64, activation='relu',
                 input_shape=(10,)),        # Layer 1
    layers.Dropout(0.2),                    # Layer 2
    layers.Dense(32, activation='relu'),    # Layer 3
    layers.Dense(1)                         # Layer 4 (Output)
])
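After defining a Sequential model it is usually compiled and trained. A minimal sketch with the same architecture, trained on random dummy data; the optimizer and loss choices (adam, mse) are illustrative:

```python
import numpy as np
from tensorflow.keras import Sequential, layers

# Same architecture as the example above; the model builds its
# weights automatically on the first call to fit()
model = Sequential([
    layers.Dense(64, activation='relu'),
    layers.Dropout(0.2),
    layers.Dense(32, activation='relu'),
    layers.Dense(1),
])

# Compile with an optimizer and a loss, then train briefly
model.compile(optimizer='adam', loss='mse')
X = np.random.rand(100, 10).astype('float32')  # 100 samples, 10 features
y = np.random.rand(100, 1).astype('float32')
history = model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```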
          

from tensorflow.keras import Input, Model
from tensorflow.keras import layers

# Define two inputs
text_input = Input(shape=(100,), name='text_input')
image_input = Input(shape=(32, 32, 3), name='image_input')

# Process text branch
text_branch = layers.Dense(64, activation='relu')(text_input)
text_branch = layers.Dense(32, activation='relu')(text_branch)

# Process image branch  
image_branch = layers.Conv2D(32, (3, 3))(image_input)
image_branch = layers.Flatten()(image_branch)
image_branch = layers.Dense(32)(image_branch)

# Combine branches
combined = layers.concatenate([text_branch, image_branch])
output = layers.Dense(1, activation='sigmoid')(combined)

# Create model
model = Model(inputs=[text_input, image_input], outputs=output)
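The table above also describes Model Subclassing, for which no example is shown. A minimal sketch of a subclassed model; the class name SimpleMLP and the layer sizes are illustrative:

```python
import tensorflow as tf
from tensorflow.keras import layers

class SimpleMLP(tf.keras.Model):
    """A small MLP defined by subclassing tf.keras.Model."""

    def __init__(self):
        super().__init__()
        self.hidden = layers.Dense(64, activation='relu')
        self.dropout = layers.Dropout(0.2)
        self.out_layer = layers.Dense(1)

    def call(self, inputs, training=False):
        # The forward pass is plain Python, so custom loops and
        # conditional logic are allowed here
        x = self.hidden(inputs)
        x = self.dropout(x, training=training)  # dropout only when training
        return self.out_layer(x)

model = SimpleMLP()
predictions = model(tf.zeros((4, 10)))  # batch of 4 samples, 10 features
```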
          

Layers in Keras?

A layer is the basic building block of a neural network. Multiple layers are stacked together to form a network, and the output of one layer is fed as input to the next. Each layer is composed of neurons.

Types of Layers in Keras

1. Dense: Fully connected layer. Every neuron in this layer is connected to every neuron in the previous layer.
   How to create: layers.Dense(units=1, input_shape=(11,))  # units = number of output neurons

INPUT LAYER          DENSE LAYER (Output)
   ↓                        ↓
┌─────────┐            ┌─────────┐
│ x₁      │───────────→│         │
├─────────┤            │         │
│ x₂      │───────────→│ Neuron  │──→ Output
├─────────┤            │         │
│ x₃      │───────────→│         │
├─────────┤            └─────────┘
│ ...     │                ↑
├─────────┤           (weights & bias)
│ x₁₁     │───────────→ w₁₁x₁₁ + w₁₀x₁₀ + ... + w₁x₁ + b
└─────────┘
            
2. Convolutional (Conv2D, Conv1D): For image/sequence processing.
   How to create: layers.Conv2D(32, (3, 3), activation='relu')
3. Pooling (MaxPooling2D, AveragePooling2D): For downsampling.
   How to create: layers.MaxPooling2D((2, 2))
4. Recurrent (LSTM, GRU): For sequential/temporal data.
   How to create: layers.LSTM(64, return_sequences=True)
5. Dropout: For regularization.
   How to create: layers.Dropout(0.3)
6. BatchNormalization: For stabilizing training.
   How to create: layers.BatchNormalization()
7. Embedding: For text/word representations.
   How to create: layers.Embedding(10000, 128)
8. Flatten: Reshapes a tensor to 1D.
   How to create: layers.Flatten()
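Most of the layer types above can be combined in a single small model. A sketch of an image classifier stacking Conv2D, MaxPooling2D, BatchNormalization, Flatten, Dropout, and Dense; the input shape (32x32 RGB) and the 10-class output are illustrative:

```python
import numpy as np
from tensorflow.keras import Sequential, layers

model = Sequential([
    layers.Conv2D(32, (3, 3), activation='relu'),  # feature extraction
    layers.MaxPooling2D((2, 2)),                   # downsampling
    layers.BatchNormalization(),                   # stabilize training
    layers.Flatten(),                              # 3D feature maps -> 1D vector
    layers.Dropout(0.3),                           # regularization
    layers.Dense(10, activation='softmax'),        # scores for 10 classes
])

# The model builds its weights on the first call
scores = model(np.zeros((2, 32, 32, 3), dtype='float32'))  # batch of 2 RGB images
```

Because softmax is the final activation, each row of `scores` is a probability distribution over the 10 classes.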