Symbolic Neural Modeling


This notebook illustrates how to use PyGlove to symbolically manipulate Keras layers for neural modeling.

!pip install pyglove

Symbolizing Keras Layers

Before we can manipulate a composition of Keras layers, we symbolize the Keras layer classes via pg.symbolize.

import tensorflow as tf
import pyglove as pg

# Symbolize Keras layers.
Sequential = pg.symbolize(tf.keras.Sequential)
Conv2D = pg.symbolize(tf.keras.layers.Conv2D)
Dense = pg.symbolize(tf.keras.layers.Dense)
Flatten = pg.symbolize(tf.keras.layers.Flatten)
ReLU = pg.symbolize(tf.keras.layers.ReLU)
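Conceptually, a symbolized class records its constructor arguments so they can be inspected and rewritten before the underlying object is built. The following is a minimal plain-Python sketch of that idea, not PyGlove's actual implementation; the `symbolize` helper and `Conv` class here are illustrative stand-ins:

```python
# A toy stand-in for what symbolization provides: the wrapper captures
# constructor arguments so they can be read and rewritten before the
# underlying object is (re)built. Illustrative only, not PyGlove's code.
def symbolize(cls):
  class Symbolic:
    def __init__(self, *args, **kwargs):
      self.sym_args = list(args)        # editable record of positional args
      self.sym_kwargs = dict(kwargs)    # editable record of keyword args

    def build(self):
      # Materialize the wrapped class from the current argument record.
      return cls(*self.sym_args, **self.sym_kwargs)
  Symbolic.__name__ = f'Symbolic{cls.__name__}'
  return Symbolic

class Conv:
  def __init__(self, filters, kernel_size):
    self.filters = filters
    self.kernel_size = kernel_size

SymConv = symbolize(Conv)
layer = SymConv(16, kernel_size=(5, 5))
layer.sym_args[0] = 32                # manipulate before building
print(layer.build().filters)          # -> 32
```

PyGlove's real symbolized classes go further: they validate values, track locations in a tree, and rebuild affected objects automatically on change.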

Creating a Symbolic Model

By using the symbolic layer classes, we can create a symbolic neural model that classifies 2D images into 10 classes.

def create_model():
  return Sequential([
     Conv2D(16, (5, 5)),
     ReLU(),
     Conv2D(32, (3, 3)),
     ReLU(),
     Flatten(),
     Dense(10)
  ])

model = create_model()

# The symbolized Keras layers can be printed in human readable form.
# For clarity, we hide the default values of the layers.
print(model.format(hide_default_values=True))
Sequential(
  layers = [
    0 : Conv2D(
      filters = 16,
      kernel_size = (5, 5)
    ),
    1 : ReLU(),
    2 : Conv2D(
      filters = 32,
      kernel_size = (3, 3)
    ),
    3 : ReLU(),
    4 : Flatten(),
    5 : Dense(
      units = 10
    )
  ]
)

Manipulating Models

What if we want to widen the model by doubling the number of filters in each Conv2D layer?

def double_width(k, v, p):
  """A rebind rule that doubles the filters of each Conv2D layer.

  Args:
    k: A `pg.KeyPath` object representing the location of the current node.
    v: The value of the current node.
    p: The parent of the current node.

  Returns:
    The output value for the current node.
  """
  if isinstance(p, Conv2D) and k.key == 'filters':
    return 2 * v
  return v

# `rebind` manipulates a symbolic object with transform rules.
# It modifies the object in place and also returns it, so the doubled
# filters persist in `model` below.
print(model.rebind(double_width).format(hide_default_values=True))
Sequential(
  layers = [
    0 : Conv2D(
      filters = 32,
      kernel_size = (5, 5)
    ),
    1 : ReLU(),
    2 : Conv2D(
      filters = 64,
      kernel_size = (3, 3)
    ),
    3 : ReLU(),
    4 : Flatten(),
    5 : Dense(
      units = 10
    )
  ]
)
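The rebind mechanism can be pictured as a depth-first traversal that offers every (key, value, parent) triple to the rule, which returns either the original value or a replacement. A plain-Python sketch of this idea over nested dicts and lists (the `rebind` helper and the dict-based layer records here are illustrative, not PyGlove's API):

```python
def rebind(node, rule, parent=None, key=''):
  # Depth-first rebuild: offer every (key, value, parent) triple to the
  # rule, which returns the original value or a replacement.
  if isinstance(node, dict):
    node = {k: rebind(v, rule, node, k) for k, v in node.items()}
  elif isinstance(node, list):
    node = [rebind(v, rule, node, i) for i, v in enumerate(node)]
  return rule(key, node, parent)

model_cfg = {'layers': [
    {'type': 'Conv2D', 'filters': 16, 'kernel_size': (5, 5)},
    {'type': 'ReLU'},
    {'type': 'Conv2D', 'filters': 32, 'kernel_size': (3, 3)},
]}

def double_width(k, v, p):
  # Double 'filters' values whose parent is a Conv2D node.
  if k == 'filters' and isinstance(p, dict) and p.get('type') == 'Conv2D':
    return 2 * v
  return v

doubled = rebind(model_cfg, double_width)
print([l.get('filters') for l in doubled['layers']])  # -> [32, None, 64]
```

PyGlove's `rebind` additionally re-validates the modified values and notifies affected parent objects, which plain dicts cannot do.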

What if we want to remove the ReLU activations?

def remove_activations(k, v, p):
  if isinstance(v, ReLU):
    # `pg.MISSING_VALUE` is a placeholder for deleting a value from a container.
    return pg.MISSING_VALUE
  return v
print(model.rebind(remove_activations).format(hide_default_values=True))
Sequential(
  layers = [
    0 : Conv2D(
      filters = 32,
      kernel_size = (5, 5)
    ),
    1 : Conv2D(
      filters = 64,
      kernel_size = (3, 3)
    ),
    2 : Flatten(),
    3 : Dense(
      units = 10
    )
  ]
)
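Deletion via a sentinel value can also be sketched in plain Python: the traversal drops any container entry for which the rule returns the sentinel. `MISSING` below is an illustrative stand-in for `pg.MISSING_VALUE`, and the helper is a sketch, not PyGlove's implementation:

```python
MISSING = object()  # illustrative stand-in for pg.MISSING_VALUE

def rebind(node, rule, parent=None, key=''):
  # Rebuild containers bottom-up, dropping entries the rule marks MISSING.
  if isinstance(node, list):
    out = []
    for i, v in enumerate(node):
      r = rebind(v, rule, node, i)
      if r is not MISSING:
        out.append(r)
    node = out
  elif isinstance(node, dict):
    node = {k: r for k, v in node.items()
            if (r := rebind(v, rule, node, k)) is not MISSING}
  return rule(key, node, parent)

layer_cfgs = [
    {'type': 'Conv2D', 'filters': 32},
    {'type': 'ReLU'},
    {'type': 'Conv2D', 'filters': 64},
    {'type': 'ReLU'},
    {'type': 'Flatten'},
    {'type': 'Dense', 'units': 10},
]

def remove_activations(k, v, p):
  if isinstance(v, dict) and v.get('type') == 'ReLU':
    return MISSING
  return v

pruned = rebind(layer_cfgs, remove_activations)
print([l['type'] for l in pruned])  # -> ['Conv2D', 'Conv2D', 'Flatten', 'Dense']
```

Note how the surviving entries are re-indexed automatically once the ReLU entries are dropped, matching the renumbered layers in the printed output above.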

What if we want to change the number of classes for the classification head from 10 to 100?

# `pg.query` returns a dict that maps key paths to matched values.
# Query all Dense layers and take the last match as the classification head.
result = pg.query(model, where=lambda v: isinstance(v, Dense))
classification_head_location = list(result.keys())[-1]
model.rebind({
    f'{classification_head_location}.units': 100
})
print(model.format(hide_default_values=True))
Sequential(
  layers = [
    0 : Conv2D(
      filters = 32,
      kernel_size = (5, 5)
    ),
    1 : Conv2D(
      filters = 64,
      kernel_size = (3, 3)
    ),
    2 : Flatten(),
    3 : Dense(
      units = 100
    )
  ]
)
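The key-path addressing that rebind accepts (paths like 'layers[3].units') can be sketched as a small resolver over nested dicts and lists. The `set_by_path` helper and its path grammar below are illustrative assumptions, not PyGlove's actual parser:

```python
import re

def set_by_path(root, path, value):
  # Resolve a dotted path with [index] steps, e.g. 'layers[1].units',
  # then assign `value` at the final location. Illustrative only.
  steps = []
  for part in path.split('.'):
    steps.append(re.match(r'\w+', part).group(0))          # attribute name
    steps.extend(int(i) for i in re.findall(r'\[(\d+)\]', part))  # indices
  node = root
  for s in steps[:-1]:
    node = node[s]
  node[steps[-1]] = value

cfg = {'layers': [{'type': 'Flatten'},
                  {'type': 'Dense', 'units': 10}]}
set_by_path(cfg, 'layers[1].units', 100)
print(cfg['layers'][1]['units'])  # -> 100
```

In PyGlove, the same addressing is handled by `pg.KeyPath`, and rebind applies all path-to-value updates in a dict as one transaction.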