Computing gradients in AutoGraph

Hey, all. I’m in the process of getting familiar with TF’s AutoGraph — the series of posts by @pgaleone was super helpful.

If I want to compute and return gradients inside the graph definition using the tf.function decorator, is tf.GradientTape a sensible approach in TF 2? This is what I have so far for the simple graph illustrated below:

```python
@tf.function
def create_graph(a, b, get_grads):
    with tf.GradientTape(persistent=True) as tape:
        # Plain tensors are not watched by default (only tf.Variables are),
        # so watch the inputs explicitly or their gradients come back None.
        tape.watch(a)
        tape.watch(b)
        c = a + b
        d = b + 1
        e = c * d
    if not get_grads:
        return [e, {}]
    de_da = tape.gradient(e, a)
    de_db = tape.gradient(e, b)
    de_dc = tape.gradient(e, c)
    de_dd = tape.gradient(e, d)
    de_de = tape.gradient(e, e)
    return [e, {'d_da': de_da,
                'd_db': de_db,
                'd_dc': de_dc,
                'd_dd': de_dd,
                'd_de': de_de}]

# Note: the @tf.function decorator already builds the graph, so a separate
# graph = tf.function(create_graph) wrap is not needed.
```
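A quick way to sanity-check this is to call it with concrete values: with a = 2 and b = 1, e = (a + b)(b + 1) = 6, de/da = b + 1 = 2, and de/db = c + d = 5. This sketch assumes the inputs arrive as plain tensors (hence the explicit `tape.watch` calls; `tf.Variable` inputs would be watched automatically):

```python
import tensorflow as tf

@tf.function
def create_graph(a, b, get_grads):
    with tf.GradientTape(persistent=True) as tape:
        # Watch the plain-tensor inputs so the tape records ops on them.
        tape.watch(a)
        tape.watch(b)
        c = a + b
        d = b + 1
        e = c * d
    if not get_grads:
        return [e, {}]
    return [e, {'d_da': tape.gradient(e, a),   # de/da = d = b + 1
                'd_db': tape.gradient(e, b),   # de/db = c + d
                'd_dc': tape.gradient(e, c),   # de/dc = d
                'd_dd': tape.gradient(e, d),   # de/dd = c
                'd_de': tape.gradient(e, e)}]  # de/de = 1

e, grads = create_graph(tf.constant(2.0), tf.constant(1.0), True)
print(e.numpy())              # 6.0
print(grads['d_da'].numpy())  # 2.0
print(grads['d_db'].numpy())  # 5.0
```

Passing `get_grads` as a Python bool means tf.function traces a separate graph per value, which is fine here since the two branches return different structures.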
