September 3, 2025
Observation:
Displayed an image in VSCodium, loaded from a local file.
Deleted the file; VSCodium still displays the image.
An ls in the shell confirms the file no longer exists.
Is there some sort of cache?
Answer:
"The keras.utils.get_file function downloads a file from a specified URL if it is not already present in the
local cache.
By default, the file is saved to the directory ~/.keras/datasets/"
https://keras.io/api/utils/python_utils/#getfile-function
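The caching idea behind keras.utils.get_file can be sketched in plain Python. This is a toy stand-in, not the real implementation; the function name, the download step, and the cache layout are simplified assumptions:

```python
import os
import urllib.request

def get_file_sketch(fname, origin, cache_dir):
    """Toy sketch of the keras.utils.get_file caching behavior:
    return the cached path if the file already exists locally,
    otherwise download it from origin first."""
    os.makedirs(cache_dir, exist_ok=True)
    fpath = os.path.join(cache_dir, fname)
    if not os.path.exists(fpath):
        # cache miss: fetch from the URL and store it in the cache dir
        urllib.request.urlretrieve(origin, fpath)
    # cache hit: the file is returned as-is, no download happens
    return fpath
```

This is why deleting the working copy of the image does not remove the cached copy under ~/.keras/datasets/; a stale file can still be served from the cache.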
September 10, 2025
Observations:
If we remove the line
optimizer.apply_gradients([(grads, combination_image)])
then the loss remains constant throughout the loop and the combination image never changes.
This confirms that apply_gradients is the statement that modifies the combination image.
September 11, 2025
Starting with the optimization loop and working backward from these statements to understand the rest of the
algorithm:
The following two lines in the optimization loop make up the generation of the combined image:
loss, grads = compute_loss_and_grads( combination_image, base_image, style_reference_image )
optimizer.apply_gradients([(grads, combination_image)])
The first statement computes the loss and the gradients of the loss with respect to the combination image.
For this, tf.GradientTape is used.
Then, with the second statement, the optimizer moves combination_image to a new point in image space, stepping
against the gradients and therefore toward a lower loss.
In the next iteration, the new value of combination_image serves as an input for the next gradient generation,
thus creating a loop that leads to a minimum.
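The structure of this loop can be illustrated with a tiny pure-Python analogue. The toy loss (x - 2)^2 stands in for compute_loss, the hand-derived gradient stands in for what tf.GradientTape produces, and the manual x -= lr * grad update stands in for optimizer.apply_gradients; none of these names come from the actual style-transfer code:

```python
def compute_loss_and_grads_toy(x):
    """Toy stand-in for compute_loss_and_grads:
    loss (x - 2)^2 and its gradient 2 * (x - 2)."""
    loss = (x - 2.0) ** 2
    grad = 2.0 * (x - 2.0)  # d(loss)/dx
    return loss, grad

x = 10.0   # plays the role of combination_image
lr = 0.1   # learning rate
for _ in range(100):
    loss, grad = compute_loss_and_grads_toy(x)
    x = x - lr * grad  # step against the gradient, toward lower loss
# x converges to the minimum at 2.0
```

Each iteration feeds the updated x back into the gradient computation, exactly the feedback loop described above.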
What do tf.GradientTape() and apply_gradients() do?
Using TensorFlow and GradientTape to train a Keras model
https://pyimagesearch.com/2020/03/23/using-tensorflow-and-gradienttape-to-train-a-keras-model/
grads = tape.gradient(loss, model.trainable_variables)
opt.apply_gradients(zip(grads, model.trainable_variables))
What is the purpose of the Tensorflow Gradient Tape?
https://stackoverflow.com/questions/53953099/what-is-the-purpose-of-the-tensorflow-gradient-tape
September 12, 2025
Neural Transfer Using PyTorch
https://docs.pytorch.org/tutorials/advanced/neural_style_tutorial.html#style-loss
tf.GradientTape in TensorFlow
https://www.geeksforgeeks.org/deep-learning/tf-gradienttape-in-tensorflow/
Simple example computing the derivative of x^2
import tensorflow as tf

scalar = tf.Variable(3.0)
with tf.GradientTape() as tape:
  y = scalar**2
dy_dx = tape.gradient(y, scalar)  # 2 * 3.0 = 6.0
tape.gradient computes the gradient (derivative) of y with respect to scalar.
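A quick finite-difference check (plain Python, no TensorFlow needed) confirms the value the tape returns for scalar = 3.0:

```python
def f(x):
    return x ** 2

# central finite difference: (f(x+h) - f(x-h)) / (2h) approximates f'(x)
h = 1e-5
x = 3.0
dy_dx_approx = (f(x + h) - f(x - h)) / (2 * h)
# analytic derivative of x^2 is 2x, so f'(3.0) = 6.0
```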
fchollet style transfer example:
with tf.GradientTape() as tape:
  loss = compute_loss(combination_image, base_image, style_reference_image)
grads = tape.gradient(loss, combination_image)
optimizer.apply_gradients([(grads, combination_image)])
tape.gradient computes the gradient of loss with respect to combination_image. This works because
combination_image is a tf.Variable, which the tape watches automatically; apply_gradients then takes one
optimizer step that nudges combination_image toward lower loss.