I am trying to use gradient_override_map with TensorFlow 2.0. There is an example in the documentation, which I will also use as the example here.

In 2.0, GradientTape can be used to compute gradients as follows:
import tensorflow as tf
print(tf.version.VERSION)  # 2.0.0-alpha0

x = tf.Variable(5.0)
with tf.GradientTape() as tape:
    s_1 = tf.square(x)
print(tape.gradient(s_1, x))
There is also the tf.custom_gradient decorator, which can be used to define the gradient of a new function (again, using the example from the documentation):
import tensorflow as tf
print(tf.version.VERSION)  # 2.0.0-alpha

@tf.custom_gradient
def log1pexp(x):
    e = tf.exp(x)
    def grad(dy):
        return dy * (1 - 1 / (1 + e))
    return tf.math.log(1 + e), grad

x = tf.Variable(100.)
with tf.GradientTape() as tape:
    y = log1pexp(x)
print(tape.gradient(y, x))
However, I would like to replace the gradient of a standard function such as tf.square. I tried the following code:
import tensorflow as tf

@tf.RegisterGradient("CustomSquare")
def _custom_square_grad(op, grad):
    return tf.constant(0.0)

with tf.Graph().as_default() as g:
    x = tf.Variable(5.0)
    with g.gradient_override_map({"Square": "CustomSquare"}):
        with tf.GradientTape() as tape:
            s_2 = tf.square(x, name="Square")
    with tf.compat.v1.Session() as sess:
        sess.run(tf.compat.v1.global_variables_initializer())
        print(sess.run(tape.gradient(s_2, x)))
However, there are two problems: the gradient replacement does not seem to work (it is evaluated as 10.0 instead of 0.0), and I need to resort to session.run() to execute the graph. Is there a way to achieve this in "native" TensorFlow 2.0?
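For completeness, the closest workaround I can see in eager mode is to wrap tf.square in my own tf.custom_gradient function, along the lines of the rough sketch below (custom_square is just an illustrative name); but that defines a new function rather than overriding the gradient of the existing tf.square:

@tf.custom_gradient
def custom_square(x):
    def grad(dy):
        # illustrative: force the gradient to zero, like the override above
        return tf.constant(0.0)
    return tf.square(x), grad

x = tf.Variable(5.0)
with tf.GradientTape() as tape:
    s = custom_square(x)
print(tape.gradient(s, x))  # expected to be a zero gradient, no session.run() needed

This avoids session.run(), but it only changes the gradient of the wrapper, not of tf.square itself wherever it is called, which is what gradient_override_map gave me in 1.x.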
In TensorFlow 1.12.0, the following produces the desired output:
import tensorflow as tf
print(tf.__version__)  # 1.12.0

@tf.RegisterGradient("CustomSquare")
def _custom_square_grad(op, grad):
    return tf.constant(0.0)

x = tf.Variable(5.0)
g = tf.get_default_graph()
with g.gradient_override_map({"Square": "CustomSquare"}):
    s_2 = tf.square(x, name="Square")
grad = tf.gradients(s_2, x)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grad))