Understanding TensorFlow in Depth

Architecture and Design
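
The snippet below uses the TensorFlow 1.x optimizer API: gradients are computed explicitly, each one is clipped to a maximum L2 norm of 5, and only then are the updates applied. It assumes an optimizer and a scalar loss tensor have already been built.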

import tensorflow as tf

# `optimizer` (a tf.train.Optimizer instance) and the scalar `loss`
# tensor are assumed to have been defined earlier.
grads_and_vars = optimizer.compute_gradients(loss)
for i, (g, v) in enumerate(grads_and_vars):
    if g is not None:
        grads_and_vars[i] = (tf.clip_by_norm(g, 5), v)  # clip the gradient to norm 5
train_op = optimizer.apply_gradients(grads_and_vars)
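
Clipping each gradient separately with tf.clip_by_norm keeps any single tensor's update bounded; when all gradients should be rescaled jointly by their combined norm, tf.clip_by_global_norm is the usual alternative. The threshold of 5 here is an empirical choice.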

Key Modules

Data Processing Methods
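
As a minimal sketch of a tf.data input pipeline under TensorFlow 1.x (the toy arrays, batch size of 64, and shuffle buffer size are illustrative assumptions, not values from the text):

import numpy as np
import tensorflow as tf

# Toy in-memory data; in practice this would come from files or TFRecords.
features = np.random.rand(1000, 32).astype(np.float32)
labels = np.random.randint(0, 10, size=1000).astype(np.int64)

dataset = (tf.data.Dataset.from_tensor_slices((features, labels))
           .shuffle(buffer_size=1000)  # shuffle the whole toy set
           .batch(64)                  # mini-batches of 64 examples
           .repeat())                  # loop over the data indefinitely

iterator = dataset.make_one_shot_iterator()  # TF 1.x iterator API
batch_features, batch_labels = iterator.get_next()

with tf.Session() as sess:
    x, y = sess.run([batch_features, batch_labels])
    print(x.shape, y.shape)  # (64, 32) (64,)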

The TensorFlow Programming Framework
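
The core programming model is "define the computation graph first, then execute it in a session". A minimal sketch under TensorFlow 1.x deferred-execution semantics (the placeholder names are arbitrary):

import tensorflow as tf

# Build the graph: two scalar inputs and their sum.
a = tf.placeholder(tf.float32, shape=[], name="a")
b = tf.placeholder(tf.float32, shape=[], name="b")
c = tf.add(a, b, name="c")

# Execute the graph inside a session, feeding concrete values.
with tf.Session() as sess:
    print(sess.run(c, feed_dict={a: 1.0, b: 2.0}))  # 3.0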

TensorBoard
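
TensorBoard visualizes event files written through the tf.summary API. A hedged sketch that logs a fabricated scalar loss to an assumed ./logs directory:

import tensorflow as tf

loss = tf.placeholder(tf.float32, shape=[], name="loss")
tf.summary.scalar("loss", loss)  # record the loss as a scalar summary
merged = tf.summary.merge_all()

with tf.Session() as sess:
    writer = tf.summary.FileWriter("./logs", sess.graph)
    for step in range(100):
        summary = sess.run(merged, feed_dict={loss: 1.0 / (step + 1)})
        writer.add_summary(summary, global_step=step)
    writer.close()
# Then inspect the curves with: tensorboard --logdir ./logs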

Model Hosting with TensorFlow Serving
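
TensorFlow Serving loads models exported in the SavedModel format. A minimal export sketch; the toy linear model, the export path ./export/1, and the model name passed to tensorflow_model_server are illustrative assumptions:

import tensorflow as tf

# A trivial model: y = x * W + b.
x = tf.placeholder(tf.float32, shape=[None, 1], name="x")
W = tf.Variable([[2.0]], name="W")
b = tf.Variable([0.5], name="b")
y = tf.matmul(x, W) + b

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Export as a SavedModel; the trailing "1" is the model version directory.
    tf.saved_model.simple_save(sess, "./export/1",
                               inputs={"x": x}, outputs={"y": y})

# Serve it with, for example:
# tensorflow_model_server --port=8500 --model_name=demo --model_base_path=/absolute/path/to/export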