TensorFlow Study Notes (1): TensorFlow Basics

import tensorflow as tf
import tensorflow.keras as keras
import tensorflow.keras.layers as layers

# physical_devices = tf.config.experimental.list_physical_devices('GPU')
# assert len(physical_devices) > 0, "Not enough GPU hardware devices available"
# tf.config.experimental.set_memory_growth(physical_devices[0], True)

Data Types

Numeric Types

How to create a scalar in TensorFlow

# Create a scalar the Python way
a = 1.2
# Create a scalar the TF way
aa = tf.constant(1.2)

type(a), type(aa), tf.is_tensor(aa)
(float, tensorflow.python.framework.ops.EagerTensor, True)

To use the functions provided by TensorFlow, tensors must be created in the way TensorFlow specifies; ordinary Python variables cannot be used in their place.

x = tf.constant([1, 2., 3.3])
# Print information about the TF tensor
x
<tf.Tensor: id=1, shape=(3,), dtype=float32, numpy=array([1. , 2. , 3.3], dtype=float32)>
# Export the TF tensor's data as a NumPy array
x.numpy()
array([1. , 2. , 3.3], dtype=float32)

Unlike a scalar, a vector is defined by passing a Python list to the tf.constant() function.

Create a vector with one element:

# Create a vector with one element
a = tf.constant([1.2])
a, a.shape
(<tf.Tensor: id=2, shape=(1,), dtype=float32, numpy=array([1.2], dtype=float32)>,
 TensorShape([1]))

Create a vector with three elements:

# Create a vector with three elements
a = tf.constant([1, 2, 3.])
a, a.shape
(<tf.Tensor: id=3, shape=(3,), dtype=float32, numpy=array([1., 2., 3.], dtype=float32)>,
 TensorShape([3]))

Defining a matrix

# Create a 2x2 matrix
a = tf.constant([[1, 2], [3, 4]])
a, a.shape
(<tf.Tensor: id=4, shape=(2, 2), dtype=int32, numpy=
 array([[1, 2],
        [3, 4]], dtype=int32)>,
 TensorShape([2, 2]))

A 3-D tensor can be defined as:

# Create a 3-D tensor
tf.constant([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
<tf.Tensor: id=5, shape=(2, 2, 2), dtype=int32, numpy=
array([[[1, 2],
        [3, 4]],

       [[5, 6],
        [7, 8]]], dtype=int32)>

String Type

A string tensor is created simply by passing in a Python string object:

# Create a string tensor
a = tf.constant('Hello, Deep Learning.')
a
<tf.Tensor: id=7, shape=(), dtype=string, numpy=b'Hello, Deep Learning.'>

The tf.strings module provides common string utility functions, such as lower() for lowercasing, join() for concatenation, length() for string length, and split() for splitting; a brief sketch of the others follows the example below.

# Lowercase the string
tf.strings.lower(a)
<tf.Tensor: id=8, shape=(), dtype=string, numpy=b'hello, deep learning.'>
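
As a quick sketch of the other utilities mentioned above (standard tf.strings calls; outputs omitted):

# Byte length of the string tensor
tf.strings.length(a)
# Split the string on whitespace (returns a RaggedTensor)
tf.strings.split(a, sep=' ')
# Join several strings with a separator
tf.strings.join(['Hello', 'Deep', 'Learning'], separator=' ')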

Boolean Type

To create a Boolean tensor, simply pass in Python Boolean data; it is converted to TensorFlow's internal Boolean type.

# Create a Boolean scalar
tf.constant(True)
<tf.Tensor: id=9, shape=(), dtype=bool, numpy=True>

Create a Boolean vector:

# Create a Boolean vector
tf.constant([True, False])
<tf.Tensor: id=10, shape=(2,), dtype=bool, numpy=array([ True, False])>

Note that TensorFlow's Boolean type is not equivalent to Python's Boolean type, and the two are not interchangeable:

# Create a TF Boolean tensor
a = tf.constant(True)
# Compare the TF Boolean tensor with the Python Boolean object
print(a is True)
# Compare values only
print(a == True)
False
tf.Tensor(True, shape=(), dtype=bool)

Numerical Precision

When creating a tensor, you can specify the precision in which it is stored:

# Create a tensor with a specified precision
tf.constant(123456789, dtype=tf.int16)
<tf.Tensor: id=14, shape=(), dtype=int16, numpy=-13035>
tf.constant(123456789, dtype=tf.int32)
<tf.Tensor: id=15, shape=(), dtype=int32, numpy=123456789>

For floating-point numbers, a higher-precision tensor represents values more accurately. For example, when π is stored with tf.float32 precision, the value actually saved is 3.1415927:

import numpy as np
# Import the pi constant from NumPy
np.pi
# 32-bit precision
tf.constant(np.pi, dtype=tf.float32)
<tf.Tensor: id=16, shape=(), dtype=float32, numpy=3.1415927>

If π is stored with tf.float64 precision, a higher accuracy is obtained:

tf.constant(np.pi, dtype=tf.float64)  # 64-bit precision
<tf.Tensor: id=17, shape=(), dtype=float64, numpy=3.141592653589793>

Reading the Precision

The storage precision of a tensor can be checked through its dtype attribute:

a = tf.constant(np.pi, dtype=tf.float16)

# Read the tensor's current numerical precision
print('before:', a.dtype)
# If the precision does not meet the requirement, convert it
if a.dtype != tf.float32:
    # tf.cast performs the precision conversion
    a = tf.cast(a, tf.float32)
# Print the precision after conversion
print('after :', a.dtype)
before: <dtype: 'float16'>
after : <dtype: 'float32'>

Type Conversion

Different modules of a system may use different data types and numerical precisions. Tensors whose type or precision does not meet the requirements can be converted with the tf.cast function:

# Create a low-precision tf.float16 tensor
a = tf.constant(np.pi, dtype=tf.float16)
# Convert it to a higher-precision tensor
tf.cast(a, tf.double)
<tf.Tensor: id=21, shape=(), dtype=float64, numpy=3.140625>

When converting types, make sure the conversion is valid. For example, converting a high-precision tensor to a low-precision one may cause data overflow:

a = tf.constant(123456789, dtype=tf.int32)
# Convert to a lower-precision integer type
tf.cast(a, tf.int16)
<tf.Tensor: id=23, shape=(), dtype=int16, numpy=-13035>

Converting between Boolean and integer types is also valid and is a fairly common operation:

a = tf.constant([True, False])
# Convert Boolean to integer
tf.cast(a, tf.int32)
<tf.Tensor: id=25, shape=(2,), dtype=int32, numpy=array([1, 0], dtype=int32)>

By convention, 0 represents False and 1 represents True; in TensorFlow, any non-zero number is treated as True:

a = tf.constant([-1, 0, 1, 2])
# Convert integer to Boolean
tf.cast(a, tf.bool)
<tf.Tensor: id=27, shape=(4,), dtype=bool, numpy=array([ True, False,  True,  True])>

Trainable Tensors

TensorFlow adds a dedicated data type to support recording gradient information: tf.Variable. On top of the ordinary tensor type, tf.Variable adds attributes such as name and trainable to support building the computation graph.

# Create a TF tensor
a = tf.constant([-1, 0, 1, 2])
# Convert it to a Variable
aa = tf.Variable(a)
# Attributes of a Variable tensor
aa.name, aa.trainable
('Variable:0', True)

The name attribute names the variable in the computation graph; this naming scheme is maintained internally by TensorFlow, and users generally do not need to care about it. The trainable attribute indicates whether the tensor should be optimized; it is enabled by default when a Variable is created, and can be disabled by setting trainable=False (a brief sketch follows the next example).

# Create a Variable tensor directly
tf.Variable([[1,2],[3,4]])
<tf.Variable 'Variable:0' shape=(2, 2) dtype=int32, numpy=
array([[1, 2],
       [3, 4]], dtype=int32)>
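
As mentioned above, optimization can be disabled when the Variable is created; a minimal sketch (the variable c below is made up for illustration):

# Create a Variable that is excluded from optimization
c = tf.Variable([[1, 2], [3, 4]], trainable=False)
c.trainable  # False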

Creating Tensors

Creating from arrays and lists

The tf.convert_to_tensor function creates a new Tensor and imports the data stored in a Python list or NumPy array into it.

# Create a tensor from a list
tf.convert_to_tensor([1,2.])
<tf.Tensor: id=44, shape=(2,), dtype=float32, numpy=array([1., 2.], dtype=float32)>
# Create a tensor from a NumPy array
tf.convert_to_tensor(np.array([[1,2.],[3,4]]))
<tf.Tensor: id=45, shape=(2, 2), dtype=float64, numpy=
array([[1., 2.],
       [3., 4.]])>
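
For reference, tf.constant accepts the same Python list and NumPy array inputs, so the two creation paths are interchangeable here; a small sketch:

# tf.constant also imports data from lists and NumPy arrays
tf.constant(np.array([[1, 2.], [3, 4]]))  # dtype float64, same as above
tf.constant([1, 2.])                      # dtype float32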

Creating all-zero or all-one tensors

# Create all-zero and all-one scalars
tf.zeros([]),tf.ones([])
(<tf.Tensor: id=46, shape=(), dtype=float32, numpy=0.0>,
 <tf.Tensor: id=47, shape=(), dtype=float32, numpy=1.0>)
# Create all-zero and all-one vectors
tf.zeros([1]),tf.ones([1])
(<tf.Tensor: id=50, shape=(1,), dtype=float32, numpy=array([0.], dtype=float32)>,
 <tf.Tensor: id=53, shape=(1,), dtype=float32, numpy=array([1.], dtype=float32)>)

Create an all-zero matrix:

# Create an all-zero matrix of shape 2x2
tf.zeros([2,2])
<tf.Tensor: id=56, shape=(2, 2), dtype=float32, numpy=
array([[0., 0.],
       [0., 0.]], dtype=float32)>

Create an all-one matrix:

# Create an all-one matrix of shape 3x2
tf.ones([3,2])
<tf.Tensor: id=59, shape=(3, 2), dtype=float32, numpy=
array([[1., 1.],
       [1., 1.],
       [1., 1.]], dtype=float32)>

With tf.zeros_like and tf.ones_like it is easy to create a new tensor with the same shape as an existing tensor, filled with all zeros or all ones.

# Create a matrix
a = tf.ones([2,3])
# Create a new all-zero matrix with the same shape as a
tf.zeros_like(a)
<tf.Tensor: id=63, shape=(2, 3), dtype=float32, numpy=
array([[0., 0., 0.],
       [0., 0., 0.]], dtype=float32)>

Create an all-one tensor with the same shape as tensor a:

# Create a matrix
a = tf.zeros([3,2])
# Create a new all-one matrix with the same shape as a
tf.ones_like(a)
<tf.Tensor: id=69, shape=(3, 2), dtype=float32, numpy=
array([[1., 1.],
       [1., 1.],
       [1., 1.]], dtype=float32)>

Creating tensors with a custom value

tf.fill(shape, value) creates a tensor whose elements are all equal to the custom value value, with the shape given by the shape argument.

# Create a scalar with value -1
tf.fill([], -1)
<tf.Tensor: id=72, shape=(), dtype=int32, numpy=-1>
# Create a vector of -1
tf.fill([1], -1)
<tf.Tensor: id=75, shape=(1,), dtype=int32, numpy=array([-1], dtype=int32)>
# Create a 2x2 matrix with all elements equal to 99
tf.fill([2,2], 99)
<tf.Tensor: id=78, shape=(2, 2), dtype=int32, numpy=
array([[99, 99],
       [99, 99]], dtype=int32)>

Creating tensors from known distributions

tf.random.normal(shape, mean=0.0, stddev=1.0) creates a tensor of shape shape sampled from the normal distribution $\mathcal{N}(mean, stddev^2)$ with mean mean and standard deviation stddev.

# Create a tensor from the standard normal distribution
tf.random.normal([2,2])
<tf.Tensor: id=84, shape=(2, 2), dtype=float32, numpy=
array([[ 0.8372936 , -0.00487547],
       [ 0.5917305 ,  0.9924748 ]], dtype=float32)>
# Create a tensor from a normal distribution with mean 1 and stddev 2
tf.random.normal([2,2], mean=1, stddev=2)
<tf.Tensor: id=90, shape=(2, 2), dtype=float32, numpy=
array([[1.6426632 , 0.9099915 ],
       [1.7133203 , 0.14123482]], dtype=float32)>

tf.random.uniform(shape, minval=0, maxval=None, dtype=tf.float32) creates a tensor sampled from the uniform distribution over [minval, maxval):

# Create a matrix sampled uniformly from [0,1)
tf.random.uniform([3,2])
<tf.Tensor: id=97, shape=(3, 2), dtype=float32, numpy=
array([[0.80524087, 0.5057876 ],
       [0.5653434 , 0.21946168],
       [0.48825264, 0.09415054]], dtype=float32)>
# Create a matrix sampled uniformly from [0,10)
tf.random.uniform([2,2], maxval=10)
<tf.Tensor: id=104, shape=(2, 2), dtype=float32, numpy=
array([[8.02882  , 9.814098 ],
       [5.9886417, 1.3643861]], dtype=float32)>

To sample uniformly distributed integers, the maximum value maxval must be specified, and the data type must be set to a tf.int* type:

# Create an integer matrix sampled uniformly from [0,100)
tf.random.uniform([2,2], maxval=100, dtype=tf.int32)
<tf.Tensor: id=108, shape=(2, 2), dtype=int32, numpy=
array([[ 5, 91],
       [33, 20]], dtype=int32)>

Creating sequences

tf.range(limit, delta=1) creates an integer sequence over [0, limit) with step delta, excluding limit itself.

# 0 to 10, excluding 10
tf.range(10)
<tf.Tensor: id=112, shape=(10,), dtype=int32, numpy=array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=int32)>
# Create an integer sequence from 0 to 10 with step 2
tf.range(10, delta=2)
<tf.Tensor: id=116, shape=(5,), dtype=int32, numpy=array([0, 2, 4, 6, 8], dtype=int32)>
tf.range(1, 10, delta=2)  # 1 to 10, excluding 10
<tf.Tensor: id=120, shape=(5,), dtype=int32, numpy=array([1, 3, 5, 7, 9], dtype=int32)>

Typical Applications of Tensors

Scalar

# Randomly simulate the network output
out = tf.random.uniform([4,10])
# Randomly construct the ground-truth labels of the samples
y = tf.constant([2,3,2,0])
# One-hot encoding
y = tf.one_hot(y, depth=10)
# Compute the MSE for each sample
loss = tf.keras.losses.mse(y, out)
# Average the MSE; loss should be a scalar
loss = tf.reduce_mean(loss)
print(loss)
tf.Tensor(0.26203847, shape=(), dtype=float32)
  • tf.reduce_mean() computes the mean of a tensor along a specified axis (a dimension of the tensor); it is mainly used to reduce dimensions or to compute the overall mean of a tensor (e.g. an image).
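
A brief sketch of the axis argument (the matrix m below is made up for illustration; the commented values follow from the arithmetic):

# Mean over different axes of a 2x2 matrix
m = tf.constant([[1., 2.], [3., 4.]])
tf.reduce_mean(m)          # mean of all elements -> 2.5
tf.reduce_mean(m, axis=0)  # column-wise mean -> [2., 3.]
tf.reduce_mean(m, axis=1)  # row-wise mean -> [1.5, 3.5]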

Vector

Consider a network layer with 2 output nodes. We create a bias vector b of length 2 and add it to each output node (the length-2 bias is broadcast over the 4 samples):

# z = wx; simulate the input z to the activation function
z = tf.random.normal([4,2])
print(z)
# Create the bias vector
b = tf.zeros([2])
print(b)
# Add the bias vector
z = z + b
z
tf.Tensor(
[[ 0.8107377   1.2481661 ]
 [-0.9203342  -0.55204725]
 [ 0.944986    0.00977302]
 [ 0.65324616  0.9092525 ]], shape=(4, 2), dtype=float32)
tf.Tensor([0. 0.], shape=(2,), dtype=float32)





<tf.Tensor: id=432714, shape=(4, 2), dtype=float32, numpy=
array([[ 0.8107377 ,  1.2481661 ],
       [-0.9203342 , -0.55204725],
       [ 0.944986  ,  0.00977302],
       [ 0.65324616,  0.9092525 ]], dtype=float32)>

For a linear layer with 4 input nodes and 3 output nodes, the bias vector b should have length 3:

# Create a Wx+b layer with 3 output nodes
fc = tf.keras.layers.Dense(3)
# The build function creates the W and b tensors; 4 input nodes
fc.build(input_shape=(2,4))
# Inspect the bias vector
fc.bias
<tf.Variable 'bias:0' shape=(3,) dtype=float32, numpy=array([0., 0., 0.], dtype=float32)>

Matrix

# Tensor of 2 samples with feature length 4
x = tf.random.normal([2,4])
# Define the W tensor
w = tf.ones([4,3])
# Define the b tensor
b = tf.zeros([3])
# Compute X@W + b
o = x@w + b
o
<tf.Tensor: id=184, shape=(2, 3), dtype=float32, numpy=
array([[-5.028141  , -5.028141  , -5.028141  ],
       [ 0.67261326,  0.67261326,  0.67261326]], dtype=float32)>
# Define a fully connected layer with 3 output nodes
fc = tf.keras.layers.Dense(3)
# Define 4 input nodes for the layer
fc.build(input_shape=(2,4))
# Inspect the weight matrix W
fc.kernel
<tf.Variable 'kernel:0' shape=(4, 3) dtype=float32, numpy=
array([[ 0.5571135 ,  0.40619254,  0.7768836 ],
       [-0.61082566, -0.13341528, -0.90817606],
       [-0.16371965, -0.00938004,  0.6606846 ],
       [ 0.38958526, -0.87978166, -0.36103284]], dtype=float32)>
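
Since W and b are created internally as tf.Variable objects, the layer also exposes them through trainable_variables, which is what an optimizer iterates over; a minimal sketch:

# Both the kernel and the bias are returned as trainable variables
[v.name for v in fc.trainable_variables]  # e.g. ['kernel:0', 'bias:0']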

3-D Tensors

# Automatically download and load the IMDB movie review dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.imdb.load_data(num_words=10000)
# Pad/truncate the sentences to an equal length of 80 words
x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=80)
print(x_train[0:2])
x_train.shape
/Users/maqi/opt/anaconda3/envs/tf2/lib/python3.7/site-packages/tensorflow_core/python/keras/datasets/imdb.py:129: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  x_train, y_train = np.array(xs[:idx]), np.array(labels[:idx])
/Users/maqi/opt/anaconda3/envs/tf2/lib/python3.7/site-packages/tensorflow_core/python/keras/datasets/imdb.py:130: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  x_test, y_test = np.array(xs[idx:]), np.array(labels[idx:])


[[  15  256    4    2    7 3766    5  723   36   71   43  530  476   26
   400  317   46    7    4    2 1029   13  104   88    4  381   15  297
    98   32 2071   56   26  141    6  194 7486   18    4  226   22   21
   134  476   26  480    5  144   30 5535   18   51   36   28  224   92
    25  104    4  226   65   16   38 1334   88   12   16  283    5   16
  4472  113  103   32   15   16 5345   19  178   32]
 [ 125   68    2 6853   15  349  165 4362   98    5    4  228    9   43
     2 1157   15  299  120    5  120  174   11  220  175  136   50    9
  4373  228 8255    5    2  656  245 2350    5    4 9837  131  152  491
    18    2   32 7464 1212   14    9    6  371   78   22  625   64 1382
     9    8  168  145   23    4 1690   15   16    4 1355    5   28    6
    52  154  462   33   89   78  285   16  145   95]]





(25000, 80)

As we can see, the shape of x_train is [25000, 80]: 25000 is the number of sentences and 80 is the number of words per sentence, with each word represented by a numeric code.

We use a layers.Embedding layer to convert the numerically encoded words into word vectors of length 100:

# Create a word-embedding layer
embedding = tf.keras.layers.Embedding(10000, 100)
# Convert the numerically encoded words into word vectors
out = embedding(x_train)
out.shape
TensorShape([25000, 80, 100])

After the Embedding layer, the shape of the sentence tensor becomes [25000, 80, 100], where 100 means each word is encoded as a vector of length 100.

4-D Tensors

# Create an input of 4 color images of size 32x32
x = tf.random.normal([4,32,32,3])
# Create a convolutional layer
layer = layers.Conv2D(16, kernel_size=3)
# Forward pass
out = layer(x)
# Output size
out.shape
TensorShape([4, 30, 30, 16])
# Access the convolution kernel tensor
layer.kernel.shape
TensorShape([3, 3, 3, 16])

Indexing and Slicing

Indexing

# Create a 4-D tensor
x = tf.random.normal([4,32,32,3])
# Take the data of the 1st image
x[0]
<tf.Tensor: id=265, shape=(32, 32, 3), dtype=float32, numpy=
array([[[ 2.2041936 , -1.9026781 ,  0.8702505 ],
        [-1.2282028 , -0.33232537,  0.40958533],
        [ 0.11558069, -0.95446974, -1.5603778 ],
        ...,
        [ 1.8689036 ,  1.3471965 ,  0.46157768],
        [-0.04014067,  0.8095603 ,  1.0308311 ],
        [-0.2001917 , -1.0876633 , -0.35982683]],

       [[-0.6193978 , -1.1049955 , -0.06628878],
        [ 0.5612249 ,  1.5542006 ,  0.6287516 ],
        [ 0.34846973,  0.44159728,  0.8838649 ],
        ...,
        [-0.7220847 ,  0.67017406,  0.1659171 ],
        [ 0.17958985, -0.65319884,  0.39171842],
        [ 0.8067303 ,  0.43496   ,  0.2798552 ]],

       [[-1.163977  , -0.06057478, -0.4857398 ],
        [ 1.3414443 , -0.6038178 , -0.23302878],
        [-2.0975337 ,  0.94285005, -0.27974698],
        ...,
        [-0.5631729 ,  1.0614241 , -0.3096405 ],
        [-0.9624238 ,  1.3738877 , -1.8948269 ],
        [ 1.132725  , -0.20089822, -1.7373965 ]],

       ...,

       [[-0.14071971, -0.5568062 ,  0.01075767],
        [-1.7140628 ,  1.3289738 , -0.8903278 ],
        [-1.0916421 , -0.3162519 , -1.249703  ],
        ...,
        [ 1.325685  ,  1.5440601 , -0.4913852 ],
        [-1.3840119 ,  0.23958059, -0.20719068],
        [ 0.877472  ,  1.3066201 , -1.4298698 ]],

       [[ 0.3794225 ,  0.8216657 , -0.3639167 ],
        [-1.4976484 , -1.0524081 , -1.302156  ],
        [ 0.26988387,  0.34318095,  0.06246407],
        ...,
        [ 2.7228684 , -0.2831678 , -1.0059422 ],
        [-0.7020755 , -1.4222299 ,  0.9356876 ],
        [ 0.4152088 , -0.04397644, -0.73320246]],

       [[ 0.65700305, -1.7467034 , -1.5898855 ],
        [ 1.1514107 , -1.0907453 , -0.5877316 ],
        [ 0.86260825, -0.59653807,  0.0976033 ],
        ...,
        [-0.04578071, -1.2980894 ,  0.9463795 ],
        [-0.09251038,  0.25678882, -0.1819165 ],
        [-0.36038232, -0.53460985,  1.2337509 ]]], dtype=float32)>
# Take the 2nd row of the 1st image
x[0][1]
<tf.Tensor: id=273, shape=(32, 3), dtype=float32, numpy=
array([[-0.6193978 , -1.1049955 , -0.06628878],
       [ 0.5612249 ,  1.5542006 ,  0.6287516 ],
       [ 0.34846973,  0.44159728,  0.8838649 ],
       [-0.66014725, -0.29447266, -0.8719525 ],
       [-0.53212637,  0.6360704 ,  0.02135803],
       [ 0.40355667,  0.14078747, -0.39829007],
       [-1.3842081 ,  0.04412093, -0.91313547],
       [-0.37355164, -2.0390503 , -0.50824887],
       [-0.7682212 ,  1.4448624 , -0.37302288],
       [ 0.13697726,  0.57252467, -1.0642116 ],
       [-0.17128809,  0.7596571 ,  0.37190843],
       [-0.8967074 , -0.18937345, -0.5372808 ],
       [ 0.33156198, -0.66581064, -0.21653776],
       [-0.11285859, -2.4033732 ,  0.0636418 ],
       [-0.31247538, -0.8419992 ,  0.4025044 ],
       [ 1.2428769 ,  0.34773824,  0.8888833 ],
       [-1.5594406 , -0.0539138 ,  0.7797568 ],
       [-0.5584576 ,  0.44812298, -0.26227227],
       [-0.4017965 , -1.6668578 , -2.0081973 ],
       [ 1.7921695 ,  1.1685921 , -0.537693  ],
       [-0.16341975, -0.42829806,  0.09798718],
       [ 0.49063244, -0.19753823,  0.28310525],
       [ 0.73069364,  0.33411032,  0.06241602],
       [ 0.1417386 ,  0.46909812,  0.90380406],
       [-0.32593566, -0.98549616,  0.36107165],
       [ 1.5818663 , -0.362372  ,  1.0220544 ],
       [ 0.26198712, -1.6119221 ,  0.07946812],
       [ 1.1173558 , -0.677369  ,  0.9825754 ],
       [ 1.2875233 ,  0.2511964 ,  0.9508616 ],
       [-0.7220847 ,  0.67017406,  0.1659171 ],
       [ 0.17958985, -0.65319884,  0.39171842],
       [ 0.8067303 ,  0.43496   ,  0.2798552 ]], dtype=float32)>
# Take the data at row 2, column 3 of the 1st image
x[0][1][2]
<tf.Tensor: id=285, shape=(3,), dtype=float32, numpy=array([0.34846973, 0.44159728, 0.8838649 ], dtype=float32)>
# Take the intensity of the 2nd channel (G) of the pixel at row 2, column 1 of the 3rd image
x[2][1][0][1]
<tf.Tensor: id=301, shape=(), dtype=float32, numpy=-0.39595583>
# Take the data at row 10, column 3 of the 2nd image
x[1,9,2]
<tf.Tensor: id=305, shape=(3,), dtype=float32, numpy=array([ 0.58523804,  0.50835484, -0.7443932 ], dtype=float32)>
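
When the number of dimensions is large, writing x[i][j][k] is cumbersome; the comma form x[i, j, k] is equivalent and more convenient. A quick sketch of the equivalence, using the tensor x above:

# The two indexing styles select the same elements
tf.reduce_all(x[1][9][2] == x[1, 9, 2])  # evaluates to True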

Slicing

# Read the 2nd and 3rd images
x[1:3]
<tf.Tensor: id=309, shape=(2, 32, 32, 3), dtype=float32, numpy=
array([[[[-2.4223676 ,  0.2596306 , -0.5293948 ],
         [-0.3967986 ,  0.6624346 ,  0.41745508],
         [ 1.5329486 ,  0.30801037,  0.54265577],
         ...,
         [-1.2883576 , -0.4979994 , -0.5336313 ],
         [ 1.9402784 , -0.6301418 ,  1.2783034 ],
         [ 0.689839  ,  1.1910218 , -1.9886026 ]],

        [[-0.14839938, -0.34305233,  0.30521095],
         [ 0.4915458 ,  0.29830953, -0.6410243 ],
         [-0.3882759 , -0.1322335 ,  1.2989053 ],
         ...,
         [ 0.52465385, -1.5790194 ,  1.9075392 ],
         [-0.8763953 ,  0.33148092, -1.2615253 ],
         [-2.1037416 , -1.7750245 , -0.8264196 ]],

        [[ 0.42436486, -2.744681  ,  0.68191504],
         [-0.62411004,  1.1706539 ,  0.187509  ],
         [ 0.60655576, -1.426237  ,  0.24151424],
         ...,
         [-1.3997802 ,  0.7346194 , -0.8587046 ],
         [-0.04108864,  2.2934608 ,  0.23547095],
         [ 2.0110242 ,  0.73926306,  0.20124955]],

        ...,

        [[ 1.0731583 , -0.3252651 ,  0.75498104],
         [ 1.177519  , -0.5143665 , -0.90076303],
         [ 0.47401938, -0.43510988, -0.01301517],
         ...,
         [-1.0437206 , -0.66972613, -0.97535443],
         [-0.6570767 , -0.00988437,  0.32322738],
         [-0.4847873 ,  0.40703028,  0.06685828]],

        [[-1.5480559 ,  0.48287508, -1.4049336 ],
         [-0.13378212,  0.5845828 , -0.05725988],
         [ 2.9124444 , -1.2632277 ,  1.6553665 ],
         ...,
         [ 0.9075061 ,  1.5838726 ,  0.01311778],
         [-1.538471  , -0.48859388, -0.18985108],
         [ 0.7335186 , -0.23191583, -0.6732001 ]],

        [[ 0.45795447, -1.0244572 ,  2.6291482 ],
         [-0.11982027, -0.66913885,  0.39017648],
         [-0.46456242, -1.7838262 ,  1.0729996 ],
         ...,
         [ 1.6933389 ,  1.4940627 ,  0.14956625],
         [-1.2214607 , -0.03956367,  0.54512376],
         [ 0.65640074,  1.2754624 , -1.4749504 ]]],


       [[[-0.90663576,  0.15839997,  0.32161254],
         [-0.9101076 , -0.1349041 ,  0.95145386],
         [ 0.378604  , -1.4983795 , -0.48038518],
         ...,
         [ 0.8427316 ,  1.3538293 , -0.21184391],
         [-0.30419785, -2.1156309 ,  0.59961736],
         [-1.1520345 ,  0.7595469 ,  0.30996034]],

        [[-1.1446227 , -0.39595583,  0.05506114],
         [ 1.1072568 , -0.14321956, -0.83200383],
         [-0.12360169, -2.973433  , -0.9375662 ],
         ...,
         [-0.93852717,  0.16133627,  0.45352787],
         [-0.66656876,  0.12624261, -0.7791581 ],
         [ 2.5405667 ,  0.7748032 , -2.2527237 ]],

        [[ 0.01577527,  1.0519909 , -1.3275864 ],
         [ 0.83748966,  1.8404965 , -0.30619964],
         [ 1.6023983 , -1.5017103 , -0.30663648],
         ...,
         [-0.8523438 , -0.3250353 ,  0.9320171 ],
         [ 0.32578966, -0.22678792, -0.13579275],
         [ 1.7109146 , -1.1671449 ,  0.06491743]],

        ...,

        [[ 0.44134948,  0.5566953 , -0.47516817],
         [-1.2281955 , -0.27368283,  1.4019957 ],
         [-0.7539954 , -0.2248977 , -1.0345727 ],
         ...,
         [-1.0997441 , -0.5867889 ,  0.24920598],
         [-1.1366905 , -0.33894378,  1.2943493 ],
         [ 0.866115  ,  0.09259874,  0.5898721 ]],

        [[-1.042004  , -0.42821613,  0.2879594 ],
         [-0.8600638 , -0.4365882 ,  0.82840854],
         [ 0.76567596, -0.46973774, -1.0789526 ],
         ...,
         [-0.19796038,  0.558751  , -0.75277686],
         [-0.60283434, -1.0192461 , -0.12388539],
         [-0.5070267 ,  0.08337619, -1.4103692 ]],

        [[ 0.9950036 , -1.3551532 ,  0.5169268 ],
         [ 0.59422225, -0.87916857,  0.7648795 ],
         [ 0.32365948, -1.6526997 , -1.1206408 ],
         ...,
         [ 0.05121538,  1.2883476 , -0.6445231 ],
         [ 0.86587644,  0.9763926 , -0.08709614],
         [ 1.4661231 , -1.8772072 ,  0.2751547 ]]]], dtype=float32)>
# Read the 1st image
x[0,::]
<tf.Tensor: id=313, shape=(32, 32, 3), dtype=float32, numpy=
array([[[ 2.2041936 , -1.9026781 ,  0.8702505 ],
        [-1.2282028 , -0.33232537,  0.40958533],
        [ 0.11558069, -0.95446974, -1.5603778 ],
        ...,
        [ 1.8689036 ,  1.3471965 ,  0.46157768],
        [-0.04014067,  0.8095603 ,  1.0308311 ],
        [-0.2001917 , -1.0876633 , -0.35982683]],

       [[-0.6193978 , -1.1049955 , -0.06628878],
        [ 0.5612249 ,  1.5542006 ,  0.6287516 ],
        [ 0.34846973,  0.44159728,  0.8838649 ],
        ...,
        [-0.7220847 ,  0.67017406,  0.1659171 ],
        [ 0.17958985, -0.65319884,  0.39171842],
        [ 0.8067303 ,  0.43496   ,  0.2798552 ]],

       [[-1.163977  , -0.06057478, -0.4857398 ],
        [ 1.3414443 , -0.6038178 , -0.23302878],
        [-2.0975337 ,  0.94285005, -0.27974698],
        ...,
        [-0.5631729 ,  1.0614241 , -0.3096405 ],
        [-0.9624238 ,  1.3738877 , -1.8948269 ],
        [ 1.132725  , -0.20089822, -1.7373965 ]],

       ...,

       [[-0.14071971, -0.5568062 ,  0.01075767],
        [-1.7140628 ,  1.3289738 , -0.8903278 ],
        [-1.0916421 , -0.3162519 , -1.249703  ],
        ...,
        [ 1.325685  ,  1.5440601 , -0.4913852 ],
        [-1.3840119 ,  0.23958059, -0.20719068],
        [ 0.877472  ,  1.3066201 , -1.4298698 ]],

       [[ 0.3794225 ,  0.8216657 , -0.3639167 ],
        [-1.4976484 , -1.0524081 , -1.302156  ],
        [ 0.26988387,  0.34318095,  0.06246407],
        ...,
        [ 2.7228684 , -0.2831678 , -1.0059422 ],
        [-0.7020755 , -1.4222299 ,  0.9356876 ],
        [ 0.4152088 , -0.04397644, -0.73320246]],

       [[ 0.65700305, -1.7467034 , -1.5898855 ],
        [ 1.1514107 , -1.0907453 , -0.5877316 ],
        [ 0.86260825, -0.59653807,  0.0976033 ],
        ...,
        [-0.04578071, -1.2980894 ,  0.9463795 ],
        [-0.09251038,  0.25678882, -0.1819165 ],
        [-0.36038232, -0.53460985,  1.2337509 ]]], dtype=float32)>
# Sample rows 0-27 and columns 0-27 of every image with a stride of 2
x[:,0:28:2,0:28:2,:]
<tf.Tensor: id=317, shape=(4, 14, 14, 3), dtype=float32, numpy=
array([[[[ 2.2041936 , -1.9026781 ,  0.8702505 ],
         [ 0.11558069, -0.95446974, -1.5603778 ],
         [-0.10582599,  0.4360513 ,  0.37447408],
         ...,
         [-0.04653996,  1.6447414 ,  0.5684349 ],
         [ 0.9232003 , -0.30295762, -0.33417934],
         [-1.0266304 , -1.0249001 , -0.05951962]],

        [[-1.163977  , -0.06057478, -0.4857398 ],
         [-2.0975337 ,  0.94285005, -0.27974698],
         [ 0.8568684 , -2.3641932 , -2.787721  ],
         ...,
         [-0.02272389,  0.7538776 ,  0.05307977],
         [ 1.3103249 , -2.8305936 , -0.02025553],
         [ 0.72770905, -0.2757186 , -1.2772908 ]],

        [[-0.48045605,  0.7057281 ,  0.767962  ],
         [ 1.4860299 , -1.2072684 , -2.6429942 ],
         [-2.1154718 , -0.4968008 ,  0.40296978],
         ...,
         [ 0.6735097 , -0.37706473,  0.30742761],
         [ 1.5466257 ,  0.01344285,  0.4478075 ],
         [ 0.52647936,  0.3019742 , -0.04138045]],

        ...,

        [[ 0.06652974, -1.310362  ,  0.52491206],
         [ 0.20300347,  0.4878598 ,  1.1967695 ],
         [ 0.26188427, -1.1881219 , -0.8308305 ],
         ...,
         [-0.9027409 ,  0.49990463, -0.31936365],
         [ 0.14605626,  1.6312102 ,  0.5990152 ],
         [-0.22002122,  1.550344  ,  0.8017888 ]],

        [[-1.8214884 ,  0.18888037, -0.7315172 ],
         [ 1.1054498 ,  0.02177003, -0.80032647],
         [ 0.832248  ,  0.30545396, -0.00517098],
         ...,
         [-0.8079335 , -1.0006244 ,  1.7094636 ],
         [ 0.3665858 ,  0.12043276,  1.5349431 ],
         [ 1.451506  ,  1.7146869 ,  1.1798096 ]],

        [[-0.02927143, -0.662752  ,  1.7197117 ],
         [-0.07830945,  0.19495389,  1.0558871 ],
         [ 0.09200678, -2.0492928 , -1.149692  ],
         ...,
         [ 0.84948075, -0.7274614 , -0.6107158 ],
         [-1.04149   , -0.8495479 ,  0.4960098 ],
         [-0.00758181,  1.1287268 , -1.1791425 ]]],


       [[[-2.4223676 ,  0.2596306 , -0.5293948 ],
         [ 1.5329486 ,  0.30801037,  0.54265577],
         [-0.25038302, -1.505699  ,  0.22218615],
         ...,
         [ 1.8112099 , -0.4017005 ,  0.316382  ],
         [-0.18795913,  0.21327318,  0.13639478],
         [ 0.88907754, -1.068848  ,  0.49985337]],

        [[ 0.42436486, -2.744681  ,  0.68191504],
         [ 0.60655576, -1.426237  ,  0.24151424],
         [ 0.83602005,  0.02829585, -0.19792575],
         ...,
         [-0.4921264 ,  0.47025818, -0.20402747],
         [-0.19556889,  0.71231675, -1.1210784 ],
         [-0.50484693,  0.29336897,  0.0850678 ]],

        [[-0.3722062 ,  0.18532671,  1.7206814 ],
         [-0.85221314,  0.557481  ,  1.8532947 ],
         [-0.05675818, -0.56605554, -0.846615  ],
         ...,
         [ 0.0248818 , -1.263318  ,  1.0077718 ],
         [ 1.1570826 ,  0.1613118 ,  0.20786911],
         [-1.0473794 ,  1.0830846 ,  1.0416656 ]],

        ...,

        [[ 0.0331895 ,  1.7457578 , -0.35708535],
         [ 1.0369142 , -0.62837493, -0.5342489 ],
         [ 0.7757275 ,  0.535828  , -2.2308693 ],
         ...,
         [-0.9503758 , -1.3476964 ,  0.17882505],
         [-0.25491032, -0.85506326, -2.003958  ],
         [ 0.92684764, -0.4062368 , -1.5470201 ]],

        [[-0.9265145 , -1.143782  , -0.9362721 ],
         [ 0.9630645 ,  0.65629876,  1.1364145 ],
         [ 2.0485058 , -0.6168327 ,  0.16756117],
         ...,
         [ 1.1698273 ,  2.6709888 , -0.45540768],
         [-0.3581334 ,  1.1361488 ,  1.4096297 ],
         [-0.03351761, -0.9961699 ,  0.81231606]],

        [[ 0.26294824, -0.0122492 , -1.2524768 ],
         [ 0.19943246,  0.7689961 ,  0.2076496 ],
         [ 0.22466388,  0.8513927 , -0.12332796],
         ...,
         [ 0.13668203, -0.14629023, -0.49706447],
         [ 1.6254246 ,  1.1169688 ,  0.69922197],
         [ 0.38690066,  1.3984909 , -0.7125247 ]]],


       [[[-0.90663576,  0.15839997,  0.32161254],
         [ 0.378604  , -1.4983795 , -0.48038518],
         [-0.0130377 , -0.6399751 ,  0.7394333 ],
         ...,
         [-0.6753409 ,  0.01053149, -1.4270033 ],
         [ 1.1157323 , -0.5980183 ,  0.49497938],
         [ 1.4786468 , -0.4598702 , -0.08252096]],

        [[ 0.01577527,  1.0519909 , -1.3275864 ],
         [ 1.6023983 , -1.5017103 , -0.30663648],
         [ 1.065943  ,  1.1778338 , -0.5005816 ],
         ...,
         [-1.4590057 ,  0.95748615,  1.4595517 ],
         [ 0.9277145 ,  0.87606174,  0.69505954],
         [-1.105703  , -0.0888804 , -0.15580973]],

        [[-0.08234025,  1.0907137 , -2.2424757 ],
         [-1.2051404 , -0.03379055,  0.74277437],
         [ 0.24598132, -0.5550462 ,  0.8092795 ],
         ...,
         [-2.91178   ,  0.20674153,  0.40773728],
         [-0.28130236, -1.4947956 ,  0.0447046 ],
         [-1.4446735 , -0.08543364, -1.2267051 ]],

        ...,

        [[ 0.12023102,  1.2192281 ,  1.8644665 ],
         [ 0.71077096, -0.407154  , -0.3728209 ],
         [-1.4906154 ,  1.4894596 ,  2.1380718 ],
         ...,
         [ 0.1265301 , -0.46740493,  0.03761578],
         [-0.7213555 , -0.2611885 ,  2.1900265 ],
         [-0.32233417, -0.7339213 ,  1.4348257 ]],

        [[ 0.15944216,  1.0575757 , -0.32219157],
         [ 1.0994414 ,  0.89874107, -0.74534416],
         [ 0.55564195,  0.22377524,  0.79618496],
         ...,
         [-1.1586384 , -0.5727887 ,  0.0525245 ],
         [ 1.1248014 , -0.3213812 , -0.6321217 ],
         [ 1.1729585 ,  0.6997143 , -1.1535952 ]],

        [[ 0.1488529 , -0.5701219 ,  0.6574311 ],
         [ 0.7145128 , -0.57302225,  0.7365589 ],
         [-1.3955393 ,  0.2823049 , -0.25600722],
         ...,
         [-1.3540319 ,  0.27442855, -0.48966768],
         [ 2.1693397 , -0.41355062, -0.1416041 ],
         [-0.6702472 , -0.21834244,  0.3533043 ]]],


       [[[-1.2110972 , -0.9158722 ,  0.4041985 ],
         [-0.08361922,  0.46396288,  0.6809368 ],
         [-0.3673456 ,  0.902671  , -0.4238117 ],
         ...,
         [-1.4638704 ,  0.10005575,  0.33722964],
         [-0.5335524 , -0.07159513, -0.98311245],
         [ 0.35258508, -0.7577552 ,  0.00567928]],

        [[ 0.8245692 ,  1.0927265 , -0.5207532 ],
         [-0.1369488 , -0.3078722 , -1.3035924 ],
         [-0.45273212, -0.2587627 , -0.85130745],
         ...,
         [ 1.0517457 , -1.6728585 , -0.07226256],
         [ 0.68702376,  1.2428858 ,  0.93717146],
         [-1.006323  , -0.5241735 ,  0.77420044]],

        [[-0.51503855, -0.1137079 ,  0.52393454],
         [-1.1306531 , -0.38302454, -0.16332257],
         [ 1.2486451 ,  0.33851364,  0.2546582 ],
         ...,
         [ 1.7983892 , -1.6029406 ,  0.42837998],
         [-0.44229293, -1.100362  ,  0.43953687],
         [ 0.0773904 ,  0.14096828, -0.69741434]],

        ...,

        [[-1.687785  ,  0.19534737,  0.84400016],
         [ 1.4822593 ,  0.51837   , -0.5977481 ],
         [ 0.72277683, -0.84718037, -1.4383492 ],
         ...,
         [ 0.09861249,  2.7846844 ,  0.06162486],
         [ 0.9868257 ,  0.8325828 , -1.0587668 ],
         [ 1.9446942 , -0.40730464,  0.8500739 ]],

        [[-1.3877448 , -0.56070095,  0.57353336],
         [ 0.23248737, -0.5203832 , -0.26604426],
         [ 0.22834507, -0.02200814,  0.56439346],
         ...,
         [-0.8777562 , -0.42350784, -0.05138672],
         [-0.67386514,  0.6522291 , -0.8428607 ],
         [-0.1801546 , -0.2436022 , -0.32848358]],

        [[ 0.15476868, -1.3199596 , -1.1284592 ],
         [ 2.1280808 ,  0.68520063,  0.5801554 ],
         [ 0.4836316 , -0.4967644 , -0.5746127 ],
         ...,
         [-1.9212904 ,  0.39191443, -2.5196192 ],
         [-0.04232699, -0.31231558, -1.8565068 ],
         [-1.2433186 , -1.3967386 , -3.036623  ]]]], dtype=float32)>
# Consider a simple sequence vector; take the elements in reverse order down to (but not including) index 0
# Create the vector 0~8
x = tf.range(9)
# From 8 down to 1 in reverse order, excluding index 0
x[8:0:-1]
<tf.Tensor: id=325, shape=(8,), dtype=int32, numpy=array([8, 7, 6, 5, 4, 3, 2, 1], dtype=int32)>
# Reverse all elements
x[::-1]
<tf.Tensor: id=329, shape=(9,), dtype=int32, numpy=array([8, 7, 6, 5, 4, 3, 2, 1, 0], dtype=int32)>
# Reverse-order sampling with step 2
x[::-2]
<tf.Tensor: id=333, shape=(5,), dtype=int32, numpy=array([8, 6, 4, 2, 0], dtype=int32)>

Read all channels of every image, sampling the rows in reverse order with a stride of 2 and the columns in reverse order with a stride of 2:

x = tf.random.normal([4,32,32,3])
# Sample rows and columns in reverse order with stride 2
x[0,::-2,::-2]
<tf.Tensor: id=343, shape=(16, 16, 3), dtype=float32, numpy=
array([[[ 1.32819211e+00, -1.52891368e-01, -2.68408567e-01],
        [-1.74235809e+00,  5.97050309e-01, -2.14324856e+00],
        [-1.28296447e+00,  6.17663026e-01,  1.12792604e-01],
        [-2.07204247e+00, -1.18166316e+00, -8.19493711e-01],
        [-1.47719014e+00, -7.35922277e-01, -3.67488146e-01],
        [-3.82268518e-01,  8.88675451e-03,  1.29524207e+00],
        [-3.21091980e-01,  2.21426225e+00,  9.91399765e-01],
        [ 2.05135364e-02,  1.74879110e+00, -2.37907872e-01],
        [-2.91886926e-01, -9.75054145e-01, -8.84131372e-01],
        [-1.99409172e-01, -9.77180898e-02, -6.13150775e-01],
        [-2.09669054e-01, -3.75757724e-01,  9.72125709e-01],
        [ 8.99972498e-01, -1.29678416e+00, -1.20591462e+00],
        [-1.59504545e+00,  1.60751998e+00,  1.36306405e-01],
        [-1.19246662e+00, -1.64794803e+00,  1.45283183e-02],
        [ 4.74597424e-01, -1.27889240e+00,  4.06340212e-02],
        [-1.79539633e+00, -9.81691927e-02, -6.85885489e-01]],

       [[-1.05812716e+00, -1.30784822e+00, -6.80017769e-01],
        [ 3.65186512e-01, -3.48650187e-01, -1.54725778e+00],
        [ 1.51886746e-01, -2.09844962e-01,  1.39984548e+00],
        [-5.62044561e-01, -1.41484439e+00, -4.25017208e-01],
        [-6.71886727e-02,  1.13901690e-01,  1.71582669e-01],
        [ 1.66557586e+00, -9.23811913e-01, -1.95637453e+00],
        [-6.33425772e-01, -2.03683758e+00, -5.52891195e-01],
        [ 4.30578351e-01,  4.01591599e-01,  7.07811356e-01],
        [ 7.40033031e-01,  7.59029865e-01, -4.48047101e-01],
        [-4.86449093e-01, -7.00091779e-01,  5.79828203e-01],
        [ 1.56244147e+00, -7.40261674e-01, -7.41748929e-01],
        [-3.04721802e-01,  3.59575897e-01,  9.25536156e-01],
        [ 9.93468523e-01,  9.88783717e-01,  9.81922805e-01],
        [ 1.08223462e+00,  7.46599495e-01,  5.29822886e-01],
        [ 3.31095785e-01, -4.47714269e-01, -4.05531228e-01],
        [ 1.60369647e+00, -5.92184007e-01,  2.54667439e-02]],

       [[-4.86227632e-01, -1.10030425e+00,  9.10474122e-01],
        [-9.61585999e-01, -1.19987130e+00,  4.75821495e-01],
        [ 2.26800650e-01, -4.53597531e-02,  4.84708756e-01],
        [ 9.83571932e-02, -5.63235462e-01, -7.65108049e-01],
        [-4.45220917e-01,  1.46985579e+00, -3.55396181e-01],
        [-6.69205308e-01, -9.33043242e-01, -9.96201992e-01],
        [ 7.35680684e-02,  5.58141649e-01, -5.32615781e-01],
        [ 6.23787344e-01,  6.98106110e-01,  5.59944332e-01],
        [ 1.89795434e-01,  5.20511985e-01,  3.45360667e-01],
        [-5.39386809e-01,  7.92361617e-01,  7.72233069e-01],
        [-1.37562764e+00, -7.65306532e-01,  1.22537184e+00],
        [-9.93735671e-01, -2.28927445e+00, -3.30761880e-01],
        [ 3.47521663e-01,  1.81813228e+00,  1.49911916e+00],
        [-5.90717047e-03, -3.43079537e-01, -6.15450263e-01],
        [ 6.11240566e-01,  6.44246340e-01, -7.47387826e-01],
        [-3.00381750e-01,  3.15724164e-01,  1.64138222e+00]],

       [[ 1.14825201e+00,  1.13074481e+00, -4.92495179e-01],
        [ 2.25241870e-01, -5.84089123e-02,  9.25830454e-02],
        [-1.82172522e-01, -9.57806230e-01, -3.77334505e-01],
        [ 3.15930390e+00, -5.52801453e-02,  1.61293708e-02],
        [ 4.44656760e-01,  1.07683194e+00,  9.81891006e-02],
        [-1.31772089e+00,  1.09420873e-01,  1.52856982e+00],
        [-2.39866480e-01, -6.98523045e-01, -1.24893987e+00],
        [ 1.29468739e+00,  3.06010634e-01, -5.18583715e-01],
        [-4.67290908e-01,  2.67672628e-01,  7.30149746e-02],
        [-1.74860966e+00,  9.92399633e-01, -7.79615223e-01],
        [ 2.40579620e-01,  2.39096731e-01, -1.05543458e+00],
        [-2.80319154e-01, -3.87402582e+00,  5.01442015e-01],
        [ 8.12131941e-01,  5.19016683e-01, -9.54104364e-01],
        [ 1.14224434e+00,  6.78500652e-01, -1.34504056e+00],
        [ 3.85929286e-01,  9.36694257e-03, -5.74368834e-01],
        [-3.63719165e-01,  2.71460544e-02,  2.09300327e+00]],

       [[-2.35270150e-02, -2.96098262e-01,  8.58490467e-01],
        [-1.98163879e+00, -8.91919672e-01, -4.12080497e-01],
        [ 2.83049166e-01, -3.09135169e-01, -1.37894654e+00],
        [ 9.72408593e-01, -3.07032514e+00,  6.41499221e-01],
        [-5.71825683e-01,  1.70615464e-01,  2.49677584e-01],
        [-2.03208899e+00, -4.59082909e-02, -1.12768102e+00],
        [-5.74081719e-01,  1.36184072e+00, -1.35754287e+00],
        [-7.02018738e-01, -1.22644699e+00,  1.23843646e+00],
        [-1.86847806e+00, -7.55038798e-01, -1.55198109e+00],
        [ 1.59925127e+00, -1.77682626e+00, -4.47454542e-01],
        [-8.89484346e-01,  4.06048335e-02, -2.12907586e-02],
        [ 1.55495811e+00, -9.46091533e-01, -1.12370884e+00],
        [-6.63149476e-01, -1.48054332e-01, -8.66370499e-01],
        [-9.72609699e-01,  8.09224486e-01, -1.08757228e-01],
        [-1.84078431e+00, -1.07596278e+00, -8.74609530e-01],
        [ 9.88747358e-01,  9.55015272e-02, -2.35948014e+00]],

       [[ 1.40270567e+00,  1.50841713e-01, -5.54310754e-02],
        [ 2.03900361e+00, -2.81785190e-01,  4.42986637e-02],
        [ 1.21783614e+00, -1.34693730e+00, -1.44243157e+00],
        [ 5.76931775e-01,  1.62811887e+00,  6.39955223e-01],
        [-1.74793065e+00,  2.07304955e-01, -2.25865468e-01],
        [-4.15330142e-01, -1.55576670e+00, -1.13930893e+00],
        [ 1.11974978e+00, -1.79331243e-01, -9.33242738e-01],
        [-5.40467203e-01, -8.10507298e-01,  7.65565455e-01],
        [-1.25150323e-01,  2.45413408e-01, -8.35556448e-01],
        [-6.55914128e-01, -5.80529928e-01, -1.20343566e-01],
        [-2.26229757e-01, -1.95507139e-01, -1.70554236e-01],
        [-3.00912589e-01, -4.94531870e-01,  1.16584015e+00],
        [ 4.59960520e-01,  6.07771397e-01,  4.26176339e-02],
        [ 7.55990624e-01, -1.91223100e-02, -3.85362864e-01],
        [-1.14951158e+00,  6.91074133e-01,  1.67067599e+00],
        [-3.21438015e-01, -7.53839314e-02, -9.35887218e-01]],

       [[ 5.47035635e-01, -5.23284450e-02,  4.02895719e-01],
        [ 1.32033587e-01, -4.70424891e-01,  1.16757905e+00],
        [ 3.76113653e-01,  1.76386505e-01, -1.63666332e+00],
        [ 6.88591599e-01, -2.48966232e-01,  1.59020257e+00],
        [-4.79439110e-01,  1.28868616e+00,  2.21981835e+00],
        [ 5.40017374e-02,  6.91947281e-01,  1.94959357e-01],
        [-9.36701447e-02,  3.91052485e-01, -4.17478114e-01],
        [-1.12415302e+00,  1.05244577e-01, -8.60867977e-01],
        [-3.53260577e-01,  8.07365239e-01,  1.98053196e-01],
        [ 1.44271660e+00, -4.19594377e-01, -1.77386373e-01],
        [ 1.36769521e+00, -1.38748944e+00,  5.03023248e-03],
        [-2.43702188e-01, -1.36886001e+00,  4.11833525e-01],
        [ 3.02441150e-01, -4.80698109e-01, -1.39226437e+00],
        [ 2.36330613e-01,  1.66690373e+00,  2.00038359e-01],
        [-1.22779334e+00, -1.39988613e+00, -3.50548536e-01],
        [ 2.32266456e-01, -7.95637667e-01,  1.97104156e+00]],

       [[-5.69649875e-01, -2.46080613e+00, -9.10816312e-01],
        [-1.53168082e-01,  2.16495895e+00, -1.27430940e+00],
        [-1.75009024e+00,  5.70950091e-01, -9.35105205e-01],
        [-2.02183932e-01, -7.59766936e-01,  2.29213595e-01],
        [-1.39746463e+00,  2.65763164e-01, -4.06110078e-01],
        [-1.84702861e+00, -6.93249941e-01,  9.25590456e-01],
        [ 1.45949423e-01, -4.35498893e-01,  1.90595949e+00],
        [-3.57079446e-01, -1.51399589e+00, -9.99029800e-02],
        [-9.42782313e-02,  1.21779490e+00,  3.88828933e-01],
        [ 2.00789642e+00,  1.02215707e-02,  3.21455784e-02],
        [ 1.45261729e+00, -8.86097327e-02, -6.89221799e-01],
        [-2.26393327e-01,  6.15001380e-01, -1.28379261e+00],
        [-2.23580487e-02,  9.74746525e-01, -9.66164768e-01],
        [ 3.50023448e-01,  1.82733262e+00, -2.53733128e-01],
        [-1.11022592e+00, -1.86617315e+00, -2.11713147e+00],
        [ 2.80960530e-01, -4.51435268e-01,  1.90480697e+00]],

       [[-2.77264547e+00,  9.57480609e-01,  6.22376800e-01],
        [ 9.52174425e-01, -4.27199155e-01,  1.14266515e+00],
        [ 8.86744082e-01, -6.22356236e-01, -5.81559777e-01],
        [ 5.89285254e-01, -6.01863384e-01,  1.73346370e-01],
        [ 1.54971564e+00, -8.13169956e-01,  1.47795677e+00],
        [-4.01796371e-01, -1.46614432e+00,  1.30820823e+00],
        [-2.98423506e-02,  1.06418443e+00, -4.78232026e-01],
        [ 1.82253325e+00, -3.88808012e-01,  1.80159080e+00],
        [ 1.64312124e-01,  1.27614602e-01, -1.71271533e-01],
        [-1.74178255e+00,  9.71022546e-01,  1.55694091e+00],
        [ 2.64798254e-01, -1.31978318e-01, -1.27089739e-01],
        [-2.90385246e-01, -2.81607056e+00, -2.51615524e-01],
        [ 1.50572884e+00,  1.02218115e+00,  1.16663694e-01],
        [ 3.35120916e-01, -8.72932673e-01, -6.25664711e-01],
        [-3.21538270e-01, -7.99890280e-01, -6.18392229e-01],
        [ 3.06067228e+00, -1.26156688e-01,  1.18348384e+00]],

       [[-5.11001945e-01,  1.37932420e+00, -3.48675430e-01],
        [ 2.76659799e+00, -4.34706032e-01,  1.66739762e-01],
        [-1.10698283e-01, -7.76158631e-01,  1.86271176e-01],
        [ 1.22287059e+00,  8.22692811e-01,  7.54150748e-01],
        [ 4.93106544e-01,  6.56304955e-01,  1.21033490e+00],
        [ 3.89292389e-01,  1.74910271e+00,  4.62190390e-01],
        [ 2.27324545e-01,  5.73735595e-01, -2.48087004e-01],
        [-3.79279375e-01,  3.78067166e-01,  1.46806073e+00],
        [ 2.30334461e-01, -1.67860663e+00, -7.74816453e-01],
        [ 6.61772549e-01,  9.88777637e-01, -2.18693733e+00],
        [ 1.29639733e+00, -2.89914489e-01, -6.09108448e-01],
        [-5.62642634e-01,  1.12929857e+00,  1.78704515e-01],
        [ 1.59194541e+00,  4.59247902e-02,  3.04074079e-01],
        [-9.05971676e-02,  2.23558825e-02,  6.90295696e-01],
        [-1.76028121e+00,  1.30459869e+00,  1.10061681e+00],
        [ 5.74148335e-02,  1.37532806e+00, -5.93708098e-01]],

       [[-5.62136054e-01,  1.11537382e-01,  1.86342442e+00],
        [-7.76148736e-01, -8.18978250e-01,  1.35009933e+00],
        [-4.34110254e-01, -5.29790819e-01, -6.76819623e-01],
        [ 8.09686065e-01,  1.00224167e-01, -1.14079773e+00],
        [ 6.72304094e-01,  8.45222652e-01,  8.68369520e-01],
        [ 1.88847947e+00,  6.60299420e-01, -1.01915455e+00],
        [ 4.52204853e-01, -7.47173548e-01, -1.01478136e+00],
        [ 6.49616838e-01, -3.51152241e-01, -3.22207630e-01],
        [-1.56287539e+00, -8.39486599e-01,  4.92055297e-01],
        [ 1.12434494e+00, -4.15864557e-01,  2.98760504e-01],
        [-4.64543775e-02,  5.32227039e-01,  5.67610443e-01],
        [-2.64979064e-01,  5.11899471e-01, -5.91439664e-01],
        [-1.96026877e-01,  1.25646031e+00, -2.65661448e-01],
        [ 1.50221694e+00, -6.95784390e-01, -4.32838410e-01],
        [-9.25149560e-01,  1.55666733e+00,  7.89082229e-01],
        [ 1.03696835e+00,  1.14898336e+00,  2.37887636e-01]],

       [[ 4.73943323e-01,  4.90469962e-01, -6.27518237e-01],
        [-1.11759044e-01,  1.31907976e+00, -1.92628849e+00],
        [-6.40158474e-01,  1.16672480e+00, -5.82574248e-01],
        [ 1.82465971e-01,  5.98510027e-01, -1.54943538e+00],
        [ 1.23925221e+00,  1.85171413e+00, -4.11800183e-02],
        [ 5.96398175e-01, -5.77813566e-01,  2.84586579e-01],
        [-2.33158922e+00,  4.85183299e-01, -6.45461261e-01],
        [-1.00312984e+00,  3.38520497e-01, -2.68755138e-01],
        [ 3.27760369e-01, -6.05535984e-01,  5.60963929e-01],
        [ 4.49014939e-02,  1.46062136e+00, -2.22097754e+00],
        [ 1.37192681e-01,  2.33995080e-01,  1.73316765e+00],
        [-1.01195645e+00, -1.36518753e+00, -2.85154253e-01],
        [ 7.14541018e-01, -7.50025034e-01,  1.12300861e+00],
        [-9.05730128e-01, -2.49278724e-01,  8.21055114e-01],
        [ 1.36068606e+00,  1.04029274e+00, -4.62492704e-01],
        [-8.05677921e-02,  2.65979171e-01,  2.23054901e-01]],

       [[-1.24165677e-01, -9.52346399e-02,  4.24239188e-01],
        [ 5.64876080e-01, -1.03957675e-01, -5.80752552e-01],
        [-2.28657699e+00, -8.04197013e-01,  4.47991550e-01],
        [-3.88402313e-01,  4.04412657e-01, -1.15122008e+00],
        [-2.26028576e-01,  4.98672754e-01,  1.82685316e-01],
        [-1.02170885e-01, -7.63889849e-01,  2.21727896e+00],
        [-1.21248543e+00, -1.18503594e+00, -1.04385889e+00],
        [-6.25540912e-01,  1.21357477e+00, -1.37694407e+00],
        [-1.06482141e-01,  1.24098301e+00,  4.69377786e-01],
        [ 5.69198370e-01,  1.34320125e-01,  2.64150798e-01],
        [-8.89743328e-01,  7.29027569e-01, -3.96091849e-01],
        [-5.38439631e-01,  7.89792895e-01, -2.41921830e+00],
        [ 1.78635567e-01,  6.26172364e-01,  1.26544416e+00],
        [ 9.92504656e-01,  8.01704347e-01, -1.41732895e+00],
        [ 3.73012841e-01, -3.74639213e-01, -1.93168428e-02],
        [ 6.25086367e-01, -1.16802764e+00, -1.35501802e-01]],

       [[ 7.90464222e-01,  3.84943634e-01,  1.34319830e+00],
        [-1.04067123e+00,  1.20278490e+00, -9.86785233e-01],
        [-2.10872635e-01, -2.94924617e-01,  2.18456030e+00],
        [-1.25211585e+00, -4.26412076e-01,  3.85715276e-01],
        [ 4.80433226e-01, -2.17810750e+00,  1.72025964e-01],
        [-1.36507463e+00, -5.88170290e-01, -1.12746871e+00],
        [ 4.61366147e-01,  8.17801833e-01,  2.02035308e+00],
        [-2.33708096e+00,  2.93909404e-02,  5.49295485e-01],
        [ 5.44144630e-01,  8.78731012e-01,  5.76293409e-01],
        [-3.52834463e-01, -2.13488245e+00, -9.02048647e-02],
        [-2.68439913e+00, -7.18059778e-01,  1.58271170e+00],
        [ 7.84022629e-01,  1.80395007e-01,  9.44528505e-02],
        [ 1.11435282e+00, -7.96168029e-01, -9.54501331e-01],
        [ 1.08020775e-01,  4.66115266e-01,  1.13831210e+00],
        [ 1.40373373e+00, -6.06358469e-01, -6.87408030e-01],
        [-5.19226313e-01,  6.26494050e-01,  7.20157683e-01]],

       [[-1.53032497e-01, -2.31825694e-01,  2.38443211e-01],
        [ 1.17578602e+00,  8.96336198e-01,  1.00056005e+00],
        [-8.06149364e-01,  1.67887485e+00,  1.31185651e+00],
        [-4.14346933e-01,  9.25169349e-01, -8.90269935e-01],
        [ 4.75324124e-01,  6.48465008e-02, -1.17474127e+00],
        [ 8.11189532e-01,  4.58558321e-01, -1.89462161e+00],
        [-8.52047384e-01, -1.95253909e+00, -1.18440950e+00],
        [ 5.65351129e-01,  3.13308030e-01,  1.34290731e+00],
        [-8.23011279e-01,  4.34110940e-01, -1.57076240e-01],
        [-7.66083777e-01,  1.53549409e+00, -6.54839098e-01],
        [ 1.29202247e-01,  6.42492482e-03,  9.93899703e-02],
        [ 2.33376041e-01, -8.38140726e-01,  9.34675157e-01],
        [-9.31392252e-01, -1.87041914e+00, -6.32766664e-01],
        [-3.24939862e-02, -6.08937442e-01,  4.31290448e-01],
        [-9.14271355e-01,  8.74976039e-01,  7.50481248e-01],
        [ 1.18830375e-01,  1.08729470e+00,  1.97146928e+00]],

       [[-9.39358115e-01, -1.37552381e-01,  1.66079611e-01],
        [ 1.00940740e+00, -5.13267696e-01,  1.12969530e+00],
        [ 2.46125221e+00, -1.14048445e+00, -7.83394337e-01],
        [-1.23253345e+00,  1.30130267e+00, -1.51509032e-01],
        [ 1.01419091e+00,  2.42352739e-01,  7.56354570e-01],
        [ 7.67537236e-01,  9.57925975e-01, -1.85001838e+00],
        [ 1.54104221e+00,  9.20635283e-01,  3.35547149e-01],
        [ 8.05710435e-01, -7.26651132e-01,  3.50032095e-03],
        [-2.14763775e-01, -1.70537174e+00,  9.35129881e-01],
        [-5.43601632e-01, -6.72167778e-01, -9.50358927e-01],
        [-8.49173665e-01, -1.43499419e-01, -9.19315442e-02],
        [ 5.55126607e-01, -7.18098879e-01,  1.32945049e+00],
        [-1.82561159e-01,  2.36541009e+00,  4.69969809e-01],
        [ 7.36051142e-01,  8.05300534e-01,  4.18028831e-01],
        [ 1.13606131e+00, -9.57732141e-01,  1.11834717e+00],
        [-4.51551750e-02, -1.14675157e-01, -2.38522816e+00]]],
      dtype=float32)>
# Take the G-channel data
x[:,:,:,1]
<tf.Tensor: id=347, shape=(4, 32, 32), dtype=float32, numpy=
array([[[ 1.14909673e+00, -4.06459235e-02, -5.78127801e-01, ...,
          1.55137196e-01,  7.60451019e-01,  9.85731423e-01],
        [-1.13586128e+00, -1.14675157e-01,  1.63204148e-01, ...,
         -5.13267696e-01, -1.18245673e+00, -1.37552381e-01],
        [ 9.94689882e-01,  5.30135810e-01,  3.25026214e-01, ...,
         -1.92351151e+00, -1.82740724e+00,  1.64374709e-01],
        ...,
        [-1.52717039e-01, -5.92184007e-01, -1.48266196e+00, ...,
         -3.48650187e-01,  4.83299762e-01, -1.30784822e+00],
        [-1.26717579e+00, -5.01342475e-01,  9.09223035e-02, ...,
          2.43971795e-02,  5.13184786e-01,  6.98500574e-01],
        [-8.86834741e-01, -9.81691927e-02,  7.17816591e-01, ...,
          5.97050309e-01,  1.22520790e-01, -1.52891368e-01]],

       [[ 5.27211905e-01,  5.10949850e-01,  6.68793380e-01, ...,
          1.01723397e+00, -9.20990646e-01,  1.55521703e+00],
        [ 5.97045481e-01, -1.12161386e+00,  1.01290178e+00, ...,
         -1.12059198e-01,  1.63329244e+00,  9.03684914e-01],
        [-3.70465130e-01,  1.35258186e+00,  1.91781148e-02, ...,
          7.91784763e-01, -5.38403928e-01,  1.19437456e+00],
        ...,
        [-2.66319364e-02,  4.80187714e-01, -6.81482777e-02, ...,
         -8.33781809e-02, -2.15396023e+00, -1.00828364e-01],
        [-5.73343694e-01,  1.27166235e+00, -5.36300726e-02, ...,
          6.65309191e-01,  1.02147615e+00,  7.86082864e-01],
        [-6.50614142e-01,  8.00769866e-01, -3.60975653e-01, ...,
          3.29060793e-01, -7.21324742e-01, -1.71777833e+00]],

       [[-5.47546387e-01, -1.24894366e-01, -6.13053203e-01, ...,
         -4.76122051e-01, -6.67316198e-01, -5.32188356e-01],
        [ 7.25843370e-01,  1.25086391e+00,  6.61642969e-01, ...,
         -1.11920547e+00,  8.22943971e-02,  8.71762872e-01],
        [ 6.10169657e-02,  8.98746789e-01, -1.89981267e-01, ...,
          1.32393092e-03,  7.66479552e-01,  4.74087834e-01],
        ...,
        [ 1.36904991e+00,  1.88162339e+00, -1.29588962e+00, ...,
          2.02118421e+00,  2.84831226e-01, -8.29148889e-01],
        [ 3.66007835e-01,  5.39520979e-01, -1.21468163e+00, ...,
          1.26315391e+00, -1.57071245e+00,  3.33765388e-01],
        [ 3.69738698e-01, -3.00485075e-01,  3.49693507e-01, ...,
         -1.00170338e+00, -8.53059292e-01, -1.43128681e+00]],

       [[ 9.08448339e-01, -7.05780163e-02, -5.45533061e-01, ...,
          1.39675033e+00, -9.83740449e-01,  4.93973970e-01],
        [-2.29770586e-01, -1.70520768e-01,  4.63991873e-02, ...,
          1.25932079e-02,  2.69956380e-01, -2.21568316e-01],
        [-1.71562707e+00, -2.53337473e-01,  1.14060119e-01, ...,
          1.60762429e+00, -3.74208689e-01,  1.31152779e-01],
        ...,
        [ 6.35229349e-01, -3.34331602e-01, -2.70434052e-01, ...,
         -4.81671304e-01, -1.03246319e+00,  1.72697484e+00],
        [-3.85653168e-01, -3.87742639e-01, -7.38137007e-01, ...,
         -4.67593260e-02,  8.60109150e-01,  4.53103155e-01],
        [-3.68833989e-01,  6.17409274e-02,  2.55871916e+00, ...,
         -7.19225705e-02, -1.25733685e+00,  6.05888307e-01]]],
      dtype=float32)>
# Read the G/B channel data of the 1st and 2nd images
# All height and width positions are taken
x[0:2,...,1:]
<tf.Tensor: id=351, shape=(2, 32, 32, 2), dtype=float32, numpy=
array([[[[ 1.1490967 ,  1.7380066 ],
         [-0.04064592,  0.48029   ],
         [-0.5781278 , -1.291669  ],
         ...,
         [ 0.1551372 ,  0.8534301 ],
         [ 0.760451  ,  0.587535  ],
         [ 0.9857314 ,  0.1369431 ]],

        [[-1.1358613 , -0.06066316],
         [-0.11467516, -2.3852282 ],
         [ 0.16320415,  0.01811434],
         ...,
         [-0.5132677 ,  1.1296953 ],
         [-1.1824567 ,  0.7329599 ],
         [-0.13755238,  0.16607961]],

        [[ 0.9946899 ,  0.48675606],
         [ 0.5301358 , -1.0126823 ],
         [ 0.3250262 , -0.6064818 ],
         ...,
         [-1.9235115 , -0.41639256],
         [-1.8274072 , -0.5375008 ],
         [ 0.16437471, -0.4204572 ]],

        ...,

        [[-0.15271704,  0.02707502],
         [-0.592184  ,  0.02546674],
         [-1.482662  , -1.4665922 ],
         ...,
         [-0.3486502 , -1.5472578 ],
         [ 0.48329976, -2.0207098 ],
         [-1.3078482 , -0.68001777]],

        [[-1.2671758 ,  0.03466341],
         [-0.5013425 ,  0.1263919 ],
         [ 0.0909223 ,  0.29931667],
         ...,
         [ 0.02439718,  0.5069986 ],
         [ 0.5131848 , -0.6002897 ],
         [ 0.6985006 , -1.3119441 ]],

        [[-0.88683474, -0.14877406],
         [-0.09816919, -0.6858855 ],
         [ 0.7178166 ,  0.44352156],
         ...,
         [ 0.5970503 , -2.1432486 ],
         [ 0.12252079, -0.15961307],
         [-0.15289137, -0.26840857]]],


       [[[ 0.5272119 ,  0.6869629 ],
         [ 0.51094985,  0.2770362 ],
         [ 0.6687934 , -1.4204    ],
         ...,
         [ 1.017234  ,  0.35187325],
         [-0.92099065, -0.585941  ],
         [ 1.555217  , -0.6104895 ]],

        [[ 0.5970455 ,  0.7830326 ],
         [-1.1216139 ,  0.16928901],
         [ 1.0129018 ,  0.71436375],
         ...,
         [-0.1120592 ,  0.37095946],
         [ 1.6332924 ,  0.4852164 ],
         [ 0.9036849 ,  0.84450924]],

        [[-0.37046513, -0.4693162 ],
         [ 1.3525819 , -0.66847706],
         [ 0.01917811, -0.40561342],
         ...,
         [ 0.79178476,  1.6169451 ],
         [-0.5384039 , -2.6904156 ],
         [ 1.1943746 ,  0.15126795]],

        ...,

        [[-0.02663194, -0.42372993],
         [ 0.4801877 , -1.6053843 ],
         [-0.06814828,  0.39376357],
         ...,
         [-0.08337818, -0.56289715],
         [-2.1539602 ,  0.7823069 ],
         [-0.10082836,  0.64499325]],

        [[-0.5733437 ,  0.8600085 ],
         [ 1.2716624 ,  1.4874613 ],
         [-0.05363007, -1.5294101 ],
         ...,
         [ 0.6653092 ,  0.31750998],
         [ 1.0214761 ,  0.22179288],
         [ 0.78608286, -1.4824792 ]],

        [[-0.65061414, -0.6899978 ],
         [ 0.80076987, -1.4741213 ],
         [-0.36097565, -0.48046836],
         ...,
         [ 0.3290608 , -1.5610422 ],
         [-0.72132474,  0.18023647],
         [-1.7177783 , -0.53801376]]]], dtype=float32)>
# Read the R/G channel data
# All samples, all heights and widths, first 2 channels
x[...,:2]
<tf.Tensor: id=359, shape=(4, 32, 32, 2), dtype=float32, numpy=
array([[[[-1.30701756e+00,  1.14909673e+00],
         [ 3.61870110e-01, -4.06459235e-02],
         [ 6.26487672e-01, -5.78127801e-01],
         ...,
         [ 1.38576820e-01,  1.55137196e-01],
         [-1.29257798e+00,  7.60451019e-01],
         [-3.25585663e-01,  9.85731423e-01]],

        [[-2.72432625e-01, -1.13586128e+00],
         [-4.51551750e-02, -1.14675157e-01],
         [-4.10570800e-01,  1.63204148e-01],
         ...,
         [ 1.00940740e+00, -5.13267696e-01],
         [-1.04479229e+00, -1.18245673e+00],
         [-9.39358115e-01, -1.37552381e-01]],

        [[-1.14408398e+00,  9.94689882e-01],
         [-2.45610863e-01,  5.30135810e-01],
         [ 3.69893968e-01,  3.25026214e-01],
         ...,
         [ 1.47702467e+00, -1.92351151e+00],
         [-8.77718687e-01, -1.82740724e+00],
         [-1.90951622e+00,  1.64374709e-01]],

        ...,

        [[-8.38505983e-01, -1.52717039e-01],
         [ 1.60369647e+00, -5.92184007e-01],
         [ 2.45542109e-01, -1.48266196e+00],
         ...,
         [ 3.65186512e-01, -3.48650187e-01],
         [-6.50465429e-01,  4.83299762e-01],
         [-1.05812716e+00, -1.30784822e+00]],

        [[ 7.95438468e-01, -1.26717579e+00],
         [ 9.37338114e-01, -5.01342475e-01],
         [-1.69611961e-01,  9.09223035e-02],
         ...,
         [ 1.64364791e+00,  2.43971795e-02],
         [ 1.96424723e-01,  5.13184786e-01],
         [ 8.26264262e-01,  6.98500574e-01]],

        [[ 2.59421289e-01, -8.86834741e-01],
         [-1.79539633e+00, -9.81691927e-02],
         [ 3.78742844e-01,  7.17816591e-01],
         ...,
         [-1.74235809e+00,  5.97050309e-01],
         [ 6.53830469e-01,  1.22520790e-01],
         [ 1.32819211e+00, -1.52891368e-01]]],


       [[[-5.31493947e-02,  5.27211905e-01],
         [-8.06730747e-01,  5.10949850e-01],
         [ 1.84080076e+00,  6.68793380e-01],
         ...,
         [ 1.31973469e+00,  1.01723397e+00],
         [ 4.49128337e-02, -9.20990646e-01],
         [-1.32044387e+00,  1.55521703e+00]],

        [[ 7.68865108e-01,  5.97045481e-01],
         [-7.52771422e-02, -1.12161386e+00],
         [ 1.21640265e+00,  1.01290178e+00],
         ...,
         [ 1.01818316e-01, -1.12059198e-01],
         [ 1.12015426e+00,  1.63329244e+00],
         [ 1.95406660e-01,  9.03684914e-01]],

        [[-3.43454480e-02, -3.70465130e-01],
         [-3.47994983e-01,  1.35258186e+00],
         [ 1.07138467e+00,  1.91781148e-02],
         ...,
         [ 9.40576553e-01,  7.91784763e-01],
         [ 5.54417372e-01, -5.38403928e-01],
         [-1.44541347e+00,  1.19437456e+00]],

        ...,

        [[ 1.11028528e+00, -2.66319364e-02],
         [-1.03816831e+00,  4.80187714e-01],
         [ 5.60190491e-02, -6.81482777e-02],
         ...,
         [ 4.46985304e-01, -8.33781809e-02],
         [-1.76779434e-01, -2.15396023e+00],
         [-1.36233258e+00, -1.00828364e-01]],

        [[ 1.25010625e-01, -5.73343694e-01],
         [ 4.23534930e-01,  1.27166235e+00],
         [ 6.20880544e-01, -5.36300726e-02],
         ...,
         [-4.97313976e-01,  6.65309191e-01],
         [-6.49542287e-02,  1.02147615e+00],
         [ 1.87847123e-01,  7.86082864e-01]],

        [[-8.89460385e-01, -6.50614142e-01],
         [ 6.55708909e-01,  8.00769866e-01],
         [ 1.00335670e+00, -3.60975653e-01],
         ...,
         [-7.29620278e-01,  3.29060793e-01],
         [ 2.53696367e-02, -7.21324742e-01],
         [-4.38493162e-01, -1.71777833e+00]]],


       [[[-5.11693060e-01, -5.47546387e-01],
         [-2.56009412e+00, -1.24894366e-01],
         [-1.66868377e+00, -6.13053203e-01],
         ...,
         [-3.40102255e-01, -4.76122051e-01],
         [-2.68808216e-01, -6.67316198e-01],
         [ 1.95494068e+00, -5.32188356e-01]],

        [[-6.79937303e-01,  7.25843370e-01],
         [ 7.51152635e-01,  1.25086391e+00],
         [ 1.31343961e+00,  6.61642969e-01],
         ...,
         [ 3.19355845e-01, -1.11920547e+00],
         [-4.93650079e-01,  8.22943971e-02],
         [ 1.77995250e-01,  8.71762872e-01]],

        [[ 5.79456747e-01,  6.10169657e-02],
         [-3.90781134e-01,  8.98746789e-01],
         [-1.64386973e-01, -1.89981267e-01],
         ...,
         [ 1.72087538e+00,  1.32393092e-03],
         [-1.03725746e-01,  7.66479552e-01],
         [ 8.60096216e-01,  4.74087834e-01]],

        ...,

        [[ 2.98859119e-01,  1.36904991e+00],
         [-1.31470454e+00,  1.88162339e+00],
         [ 3.38255256e-01, -1.29588962e+00],
         ...,
         [ 4.79147226e-01,  2.02118421e+00],
         [ 3.93357724e-01,  2.84831226e-01],
         [-1.07760859e+00, -8.29148889e-01]],

        [[-7.26247966e-01,  3.66007835e-01],
         [ 6.38583839e-01,  5.39520979e-01],
         [ 2.58788407e-01, -1.21468163e+00],
         ...,
         [ 3.30879092e-01,  1.26315391e+00],
         [-9.85577762e-01, -1.57071245e+00],
         [ 1.34247553e+00,  3.33765388e-01]],

        [[ 3.28157872e-01,  3.69738698e-01],
         [-4.69663978e-01, -3.00485075e-01],
         [ 8.08599889e-01,  3.49693507e-01],
         ...,
         [ 1.20650291e-01, -1.00170338e+00],
         [-1.25450063e+00, -8.53059292e-01],
         [-5.60456105e-02, -1.43128681e+00]]],


       [[[ 6.02736592e-01,  9.08448339e-01],
         [ 1.45205522e+00, -7.05780163e-02],
         [ 1.12210441e+00, -5.45533061e-01],
         ...,
         [-1.61648536e+00,  1.39675033e+00],
         [ 3.89932483e-01, -9.83740449e-01],
         [-3.43187571e-01,  4.93973970e-01]],

        [[-4.33481991e-01, -2.29770586e-01],
         [ 4.20535475e-01, -1.70520768e-01],
         [ 1.40664136e+00,  4.63991873e-02],
         ...,
         [ 3.30364525e-01,  1.25932079e-02],
         [-5.44138372e-01,  2.69956380e-01],
         [ 5.51277101e-01, -2.21568316e-01]],

        [[-2.39359438e-01, -1.71562707e+00],
         [ 8.63479078e-02, -2.53337473e-01],
         [-5.11896372e-01,  1.14060119e-01],
         ...,
         [-7.51873851e-01,  1.60762429e+00],
         [-1.85268188e+00, -3.74208689e-01],
         [-4.49496716e-01,  1.31152779e-01]],

        ...,

        [[-1.24805138e-01,  6.35229349e-01],
         [ 1.62983191e+00, -3.34331602e-01],
         [-3.98483366e-01, -2.70434052e-01],
         ...,
         [ 2.51731694e-01, -4.81671304e-01],
         [ 1.65011346e+00, -1.03246319e+00],
         [-1.56109953e+00,  1.72697484e+00]],

        [[ 3.73855352e-01, -3.85653168e-01],
         [-1.18297446e+00, -3.87742639e-01],
         [-5.74579597e-01, -7.38137007e-01],
         ...,
         [ 5.06586790e-01, -4.67593260e-02],
         [ 4.67046916e-01,  8.60109150e-01],
         [-8.88322115e-01,  4.53103155e-01]],

        [[-1.47322047e+00, -3.68833989e-01],
         [ 3.80937368e-01,  6.17409274e-02],
         [-1.07242978e+00,  2.55871916e+00],
         ...,
         [ 1.02848232e+00, -7.19225705e-02],
         [-1.13464808e+00, -1.25733685e+00],
         [-6.02429748e-01,  6.05888307e-01]]]], dtype=float32)>

Dimension Transformations

Changing the View

We use tf.range() to generate a vector of sample data, and then use the view-changing function tf.reshape to produce different views of the same underlying storage.

# generate a vector
x = tf.range(96)
# change the view of x to a 4D tensor; the underlying storage is unchanged
x = tf.reshape(x, [2,4,4,3])
x
<tf.Tensor: id=365, shape=(2, 4, 4, 3), dtype=int32, numpy=
array([[[[ 0,  1,  2],
         [ 3,  4,  5],
         [ 6,  7,  8],
         [ 9, 10, 11]],

        [[12, 13, 14],
         [15, 16, 17],
         [18, 19, 20],
         [21, 22, 23]],

        [[24, 25, 26],
         [27, 28, 29],
         [30, 31, 32],
         [33, 34, 35]],

        [[36, 37, 38],
         [39, 40, 41],
         [42, 43, 44],
         [45, 46, 47]]],


       [[[48, 49, 50],
         [51, 52, 53],
         [54, 55, 56],
         [57, 58, 59]],

        [[60, 61, 62],
         [63, 64, 65],
         [66, 67, 68],
         [69, 70, 71]],

        [[72, 73, 74],
         [75, 76, 77],
         [78, 79, 80],
         [81, 82, 83]],

        [[84, 85, 86],
         [87, 88, 89],
         [90, 91, 92],
         [93, 94, 95]]]], dtype=int32)>
# get the number of dimensions and the shape of the tensor
x.ndim, x.shape
(4, TensorShape([2, 4, 4, 3]))

With tf.reshape(x, new_shape), the view of a tensor can be changed arbitrarily, as long as the new shape is legal.

tf.reshape(x,[2,-1])
<tf.Tensor: id=373, shape=(2, 48), dtype=int32, numpy=
array([[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11, 12, 13, 14, 15,
        16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31,
        32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47],
       [48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63,
        64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79,
        80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95]],
      dtype=int32)>
tf.reshape(x,[2,4,12])
<tf.Tensor: id=375, shape=(2, 4, 12), dtype=int32, numpy=
array([[[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11],
        [12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23],
        [24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35],
        [36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47]],

       [[48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59],
        [60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71],
        [72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83],
        [84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95]]], dtype=int32)>
tf.reshape(x,[2,-1,3])
<tf.Tensor: id=377, shape=(2, 16, 3), dtype=int32, numpy=
array([[[ 0,  1,  2],
        [ 3,  4,  5],
        [ 6,  7,  8],
        [ 9, 10, 11],
        [12, 13, 14],
        [15, 16, 17],
        [18, 19, 20],
        [21, 22, 23],
        [24, 25, 26],
        [27, 28, 29],
        [30, 31, 32],
        [33, 34, 35],
        [36, 37, 38],
        [39, 40, 41],
        [42, 43, 44],
        [45, 46, 47]],

       [[48, 49, 50],
        [51, 52, 53],
        [54, 55, 56],
        [57, 58, 59],
        [60, 61, 62],
        [63, 64, 65],
        [66, 67, 68],
        [69, 70, 71],
        [72, 73, 74],
        [75, 76, 77],
        [78, 79, 80],
        [81, 82, 83],
        [84, 85, 86],
        [87, 88, 89],
        [90, 91, 92],
        [93, 94, 95]]], dtype=int32)>
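tf.reshape only accepts new shapes whose total element count matches the original; at most one dimension may be written as -1 and is inferred automatically. A minimal sketch of an illegal reshape (not from the original notes, reusing the vector above):

x = tf.range(96)
try:
    # 2*5*7 = 70 does not match the 96 elements of x, so this raises an error
    tf.reshape(x, [2,5,7])
except Exception as e:
    print(e)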

Adding and Removing Dimensions

# generate a matrix
x = tf.random.uniform([28,28], maxval=10, dtype=tf.int32)
x
<tf.Tensor: id=381, shape=(28, 28), dtype=int32, numpy=
array([[4, 1, 1, 5, 4, 1, 3, 5, 4, 0, 8, 8, 0, 1, 8, 6, 4, 9, 5, 1, 8, 0,
        1, 3, 3, 5, 0, 4],
       [5, 9, 2, 1, 8, 6, 3, 8, 6, 3, 6, 4, 7, 7, 5, 9, 2, 8, 4, 6, 6, 4,
        9, 0, 5, 9, 9, 0],
       [8, 0, 0, 1, 4, 2, 5, 8, 9, 3, 5, 7, 0, 1, 3, 6, 2, 0, 4, 7, 7, 5,
        8, 2, 7, 8, 6, 0],
       [9, 6, 4, 8, 5, 5, 7, 1, 2, 8, 6, 9, 5, 3, 3, 6, 5, 9, 4, 4, 1, 0,
        5, 9, 3, 7, 1, 6],
       [5, 8, 7, 4, 6, 5, 4, 5, 7, 5, 1, 3, 2, 2, 9, 0, 9, 5, 3, 3, 4, 9,
        5, 1, 7, 0, 4, 6],
       [9, 2, 6, 7, 5, 7, 9, 3, 1, 8, 2, 0, 0, 8, 2, 7, 2, 2, 1, 1, 7, 1,
        9, 5, 2, 2, 6, 4],
       [6, 4, 2, 2, 7, 2, 8, 0, 1, 5, 9, 5, 0, 8, 0, 3, 8, 6, 3, 7, 0, 5,
        8, 1, 6, 1, 5, 4],
       [3, 9, 2, 4, 1, 8, 1, 5, 7, 0, 0, 2, 9, 0, 5, 0, 5, 1, 7, 0, 5, 0,
        1, 3, 2, 6, 3, 8],
       [2, 9, 2, 6, 0, 4, 8, 7, 7, 4, 0, 3, 0, 9, 1, 6, 1, 8, 5, 2, 0, 6,
        4, 0, 7, 5, 5, 9],
       [4, 6, 8, 6, 5, 5, 8, 8, 2, 5, 1, 7, 0, 7, 7, 2, 3, 2, 5, 3, 3, 4,
        4, 1, 2, 4, 7, 1],
       [8, 3, 0, 5, 0, 4, 4, 0, 2, 1, 3, 0, 8, 8, 3, 0, 5, 8, 6, 4, 3, 2,
        1, 4, 2, 4, 9, 5],
       [4, 3, 1, 4, 7, 0, 4, 9, 3, 2, 5, 9, 2, 4, 1, 5, 5, 8, 0, 5, 0, 7,
        0, 1, 0, 0, 2, 6],
       [2, 4, 9, 9, 4, 2, 0, 0, 2, 5, 6, 0, 0, 9, 7, 3, 6, 2, 7, 3, 8, 8,
        7, 2, 9, 9, 7, 3],
       [2, 8, 8, 8, 5, 7, 7, 9, 1, 8, 6, 5, 4, 8, 4, 4, 4, 5, 6, 5, 8, 2,
        5, 1, 1, 3, 5, 9],
       [2, 3, 8, 5, 2, 1, 6, 9, 5, 9, 0, 5, 7, 5, 7, 8, 8, 0, 9, 9, 3, 0,
        4, 3, 3, 3, 4, 5],
       [9, 6, 3, 8, 8, 3, 6, 0, 3, 4, 1, 1, 2, 9, 8, 0, 5, 3, 0, 7, 0, 9,
        2, 0, 8, 1, 1, 9],
       [4, 8, 7, 0, 3, 6, 1, 7, 7, 9, 0, 1, 4, 6, 7, 0, 9, 5, 2, 2, 6, 5,
        5, 0, 3, 1, 1, 7],
       [9, 2, 4, 6, 0, 5, 8, 2, 2, 7, 7, 9, 1, 1, 9, 5, 5, 8, 0, 3, 8, 4,
        2, 7, 0, 4, 2, 7],
       [9, 2, 3, 7, 7, 6, 3, 7, 4, 6, 4, 8, 4, 9, 3, 3, 2, 4, 8, 4, 7, 6,
        6, 2, 0, 7, 1, 9],
       [6, 1, 2, 0, 2, 0, 0, 0, 5, 8, 7, 6, 9, 7, 9, 0, 6, 6, 6, 5, 3, 1,
        3, 2, 3, 2, 3, 4],
       [7, 4, 8, 9, 8, 3, 4, 0, 8, 0, 5, 2, 0, 3, 9, 8, 3, 8, 4, 2, 5, 3,
        6, 1, 9, 8, 6, 5],
       [6, 3, 1, 6, 4, 1, 8, 1, 6, 7, 2, 7, 1, 2, 8, 1, 4, 5, 0, 0, 4, 9,
        9, 4, 6, 9, 7, 5],
       [4, 0, 1, 1, 8, 7, 0, 8, 1, 2, 8, 6, 2, 1, 4, 6, 9, 2, 6, 9, 4, 0,
        9, 0, 7, 9, 8, 4],
       [2, 8, 3, 1, 9, 6, 1, 0, 0, 4, 5, 7, 1, 2, 3, 5, 9, 4, 7, 9, 5, 5,
        8, 5, 0, 0, 5, 8],
       [4, 6, 7, 6, 4, 1, 6, 8, 4, 2, 4, 5, 6, 1, 6, 6, 4, 2, 1, 1, 2, 6,
        8, 3, 0, 0, 4, 0],
       [6, 3, 3, 6, 8, 4, 6, 3, 6, 3, 8, 9, 7, 2, 2, 9, 0, 5, 7, 7, 2, 6,
        3, 4, 6, 9, 4, 2],
       [2, 7, 0, 8, 7, 0, 7, 8, 2, 2, 8, 3, 9, 6, 3, 0, 0, 5, 5, 7, 3, 9,
        4, 7, 4, 4, 5, 0],
       [3, 5, 7, 5, 4, 6, 8, 5, 9, 4, 7, 1, 6, 8, 0, 3, 1, 5, 2, 0, 3, 5,
        9, 7, 6, 3, 3, 1]], dtype=int32)>

tf.expand_dims(x, axis) inserts a new dimension in front of the specified axis.

# axis=2 means the new dimension is inserted right after the width dimension
x = tf.expand_dims(x, axis=2)
x
<tf.Tensor: id=383, shape=(28, 28, 1), dtype=int32, numpy=
array([[[4],
        [1],
        [1],
        [5],
        [4],
        [1],
        [3],
        [5],
        [4],
        [0],
        [8],
        [8],
        [0],
        [1],
        [8],
        [6],
        [4],
        [9],
        [5],
        [1],
        [8],
        [0],
        [1],
        [3],
        [3],
        [5],
        [0],
        [4]],

       [[5],
        [9],
        [2],
        [1],
        [8],
        [6],
        [3],
        [8],
        [6],
        [3],
        [6],
        [4],
        [7],
        [7],
        [5],
        [9],
        [2],
        [8],
        [4],
        [6],
        [6],
        [4],
        [9],
        [0],
        [5],
        [9],
        [9],
        [0]],

       [[8],
        [0],
        [0],
        [1],
        [4],
        [2],
        [5],
        [8],
        [9],
        [3],
        [5],
        [7],
        [0],
        [1],
        [3],
        [6],
        [2],
        [0],
        [4],
        [7],
        [7],
        [5],
        [8],
        [2],
        [7],
        [8],
        [6],
        [0]],

       [[9],
        [6],
        [4],
        [8],
        [5],
        [5],
        [7],
        [1],
        [2],
        [8],
        [6],
        [9],
        [5],
        [3],
        [3],
        [6],
        [5],
        [9],
        [4],
        [4],
        [1],
        [0],
        [5],
        [9],
        [3],
        [7],
        [1],
        [6]],

       [[5],
        [8],
        [7],
        [4],
        [6],
        [5],
        [4],
        [5],
        [7],
        [5],
        [1],
        [3],
        [2],
        [2],
        [9],
        [0],
        [9],
        [5],
        [3],
        [3],
        [4],
        [9],
        [5],
        [1],
        [7],
        [0],
        [4],
        [6]],

       [[9],
        [2],
        [6],
        [7],
        [5],
        [7],
        [9],
        [3],
        [1],
        [8],
        [2],
        [0],
        [0],
        [8],
        [2],
        [7],
        [2],
        [2],
        [1],
        [1],
        [7],
        [1],
        [9],
        [5],
        [2],
        [2],
        [6],
        [4]],

       [[6],
        [4],
        [2],
        [2],
        [7],
        [2],
        [8],
        [0],
        [1],
        [5],
        [9],
        [5],
        [0],
        [8],
        [0],
        [3],
        [8],
        [6],
        [3],
        [7],
        [0],
        [5],
        [8],
        [1],
        [6],
        [1],
        [5],
        [4]],

       [[3],
        [9],
        [2],
        [4],
        [1],
        [8],
        [1],
        [5],
        [7],
        [0],
        [0],
        [2],
        [9],
        [0],
        [5],
        [0],
        [5],
        [1],
        [7],
        [0],
        [5],
        [0],
        [1],
        [3],
        [2],
        [6],
        [3],
        [8]],

       [[2],
        [9],
        [2],
        [6],
        [0],
        [4],
        [8],
        [7],
        [7],
        [4],
        [0],
        [3],
        [0],
        [9],
        [1],
        [6],
        [1],
        [8],
        [5],
        [2],
        [0],
        [6],
        [4],
        [0],
        [7],
        [5],
        [5],
        [9]],

       [[4],
        [6],
        [8],
        [6],
        [5],
        [5],
        [8],
        [8],
        [2],
        [5],
        [1],
        [7],
        [0],
        [7],
        [7],
        [2],
        [3],
        [2],
        [5],
        [3],
        [3],
        [4],
        [4],
        [1],
        [2],
        [4],
        [7],
        [1]],

       [[8],
        [3],
        [0],
        [5],
        [0],
        [4],
        [4],
        [0],
        [2],
        [1],
        [3],
        [0],
        [8],
        [8],
        [3],
        [0],
        [5],
        [8],
        [6],
        [4],
        [3],
        [2],
        [1],
        [4],
        [2],
        [4],
        [9],
        [5]],

       [[4],
        [3],
        [1],
        [4],
        [7],
        [0],
        [4],
        [9],
        [3],
        [2],
        [5],
        [9],
        [2],
        [4],
        [1],
        [5],
        [5],
        [8],
        [0],
        [5],
        [0],
        [7],
        [0],
        [1],
        [0],
        [0],
        [2],
        [6]],

       [[2],
        [4],
        [9],
        [9],
        [4],
        [2],
        [0],
        [0],
        [2],
        [5],
        [6],
        [0],
        [0],
        [9],
        [7],
        [3],
        [6],
        [2],
        [7],
        [3],
        [8],
        [8],
        [7],
        [2],
        [9],
        [9],
        [7],
        [3]],

       [[2],
        [8],
        [8],
        [8],
        [5],
        [7],
        [7],
        [9],
        [1],
        [8],
        [6],
        [5],
        [4],
        [8],
        [4],
        [4],
        [4],
        [5],
        [6],
        [5],
        [8],
        [2],
        [5],
        [1],
        [1],
        [3],
        [5],
        [9]],

       [[2],
        [3],
        [8],
        [5],
        [2],
        [1],
        [6],
        [9],
        [5],
        [9],
        [0],
        [5],
        [7],
        [5],
        [7],
        [8],
        [8],
        [0],
        [9],
        [9],
        [3],
        [0],
        [4],
        [3],
        [3],
        [3],
        [4],
        [5]],

       [[9],
        [6],
        [3],
        [8],
        [8],
        [3],
        [6],
        [0],
        [3],
        [4],
        [1],
        [1],
        [2],
        [9],
        [8],
        [0],
        [5],
        [3],
        [0],
        [7],
        [0],
        [9],
        [2],
        [0],
        [8],
        [1],
        [1],
        [9]],

       [[4],
        [8],
        [7],
        [0],
        [3],
        [6],
        [1],
        [7],
        [7],
        [9],
        [0],
        [1],
        [4],
        [6],
        [7],
        [0],
        [9],
        [5],
        [2],
        [2],
        [6],
        [5],
        [5],
        [0],
        [3],
        [1],
        [1],
        [7]],

       [[9],
        [2],
        [4],
        [6],
        [0],
        [5],
        [8],
        [2],
        [2],
        [7],
        [7],
        [9],
        [1],
        [1],
        [9],
        [5],
        [5],
        [8],
        [0],
        [3],
        [8],
        [4],
        [2],
        [7],
        [0],
        [4],
        [2],
        [7]],

       [[9],
        [2],
        [3],
        [7],
        [7],
        [6],
        [3],
        [7],
        [4],
        [6],
        [4],
        [8],
        [4],
        [9],
        [3],
        [3],
        [2],
        [4],
        [8],
        [4],
        [7],
        [6],
        [6],
        [2],
        [0],
        [7],
        [1],
        [9]],

       [[6],
        [1],
        [2],
        [0],
        [2],
        [0],
        [0],
        [0],
        [5],
        [8],
        [7],
        [6],
        [9],
        [7],
        [9],
        [0],
        [6],
        [6],
        [6],
        [5],
        [3],
        [1],
        [3],
        [2],
        [3],
        [2],
        [3],
        [4]],

       [[7],
        [4],
        [8],
        [9],
        [8],
        [3],
        [4],
        [0],
        [8],
        [0],
        [5],
        [2],
        [0],
        [3],
        [9],
        [8],
        [3],
        [8],
        [4],
        [2],
        [5],
        [3],
        [6],
        [1],
        [9],
        [8],
        [6],
        [5]],

       [[6],
        [3],
        [1],
        [6],
        [4],
        [1],
        [8],
        [1],
        [6],
        [7],
        [2],
        [7],
        [1],
        [2],
        [8],
        [1],
        [4],
        [5],
        [0],
        [0],
        [4],
        [9],
        [9],
        [4],
        [6],
        [9],
        [7],
        [5]],

       [[4],
        [0],
        [1],
        [1],
        [8],
        [7],
        [0],
        [8],
        [1],
        [2],
        [8],
        [6],
        [2],
        [1],
        [4],
        [6],
        [9],
        [2],
        [6],
        [9],
        [4],
        [0],
        [9],
        [0],
        [7],
        [9],
        [8],
        [4]],

       [[2],
        [8],
        [3],
        [1],
        [9],
        [6],
        [1],
        [0],
        [0],
        [4],
        [5],
        [7],
        [1],
        [2],
        [3],
        [5],
        [9],
        [4],
        [7],
        [9],
        [5],
        [5],
        [8],
        [5],
        [0],
        [0],
        [5],
        [8]],

       [[4],
        [6],
        [7],
        [6],
        [4],
        [1],
        [6],
        [8],
        [4],
        [2],
        [4],
        [5],
        [6],
        [1],
        [6],
        [6],
        [4],
        [2],
        [1],
        [1],
        [2],
        [6],
        [8],
        [3],
        [0],
        [0],
        [4],
        [0]],

       [[6],
        [3],
        [3],
        [6],
        [8],
        [4],
        [6],
        [3],
        [6],
        [3],
        [8],
        [9],
        [7],
        [2],
        [2],
        [9],
        [0],
        [5],
        [7],
        [7],
        [2],
        [6],
        [3],
        [4],
        [6],
        [9],
        [4],
        [2]],

       [[2],
        [7],
        [0],
        [8],
        [7],
        [0],
        [7],
        [8],
        [2],
        [2],
        [8],
        [3],
        [9],
        [6],
        [3],
        [0],
        [0],
        [5],
        [5],
        [7],
        [3],
        [9],
        [4],
        [7],
        [4],
        [4],
        [5],
        [0]],

       [[3],
        [5],
        [7],
        [5],
        [4],
        [6],
        [8],
        [5],
        [9],
        [4],
        [7],
        [1],
        [6],
        [8],
        [0],
        [3],
        [1],
        [5],
        [2],
        [0],
        [3],
        [5],
        [9],
        [7],
        [6],
        [3],
        [3],
        [1]]], dtype=int32)>
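tf.expand_dims also accepts a negative axis, counted from the end. A small sketch under that assumption, using the same 28×28 matrix shape as above:

x = tf.random.uniform([28,28], maxval=10, dtype=tf.int32)
# axis=-1 appends a new length-1 dimension after the last axis
print(tf.expand_dims(x, axis=-1).shape)  # (28, 28, 1)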
# generate a matrix
x = tf.random.uniform([28,28], maxval=10, dtype=tf.int32)
x
<tf.Tensor: id=432726, shape=(28, 28), dtype=int32, numpy=
array([[5, 3, 3, 6, 9, 3, 0, 4, 1, 6, 3, 8, 7, 7, 5, 4, 0, 0, 8, 5, 2, 3,
        6, 4, 4, 8, 8, 3],
       [8, 4, 7, 9, 3, 6, 1, 7, 5, 9, 4, 9, 7, 4, 4, 8, 4, 9, 7, 9, 6, 1,
        3, 5, 0, 2, 4, 2],
       [3, 5, 5, 5, 7, 9, 5, 3, 6, 0, 7, 8, 0, 0, 4, 7, 8, 7, 4, 1, 9, 9,
        2, 6, 6, 2, 0, 3],
       [3, 5, 5, 2, 4, 0, 2, 5, 0, 2, 3, 2, 1, 0, 1, 4, 5, 3, 1, 9, 5, 3,
        0, 5, 8, 1, 9, 4],
       [6, 0, 1, 4, 0, 0, 7, 7, 0, 4, 3, 3, 3, 5, 2, 3, 5, 6, 4, 6, 4, 1,
        3, 2, 8, 5, 8, 4],
       [1, 2, 9, 3, 2, 3, 4, 4, 0, 0, 6, 6, 4, 0, 2, 7, 2, 2, 4, 2, 6, 0,
        4, 0, 5, 5, 4, 5],
       [3, 2, 6, 7, 7, 1, 0, 3, 3, 5, 5, 7, 8, 1, 6, 5, 5, 4, 3, 2, 9, 0,
        0, 3, 7, 0, 0, 0],
       [7, 7, 0, 7, 9, 8, 4, 5, 3, 7, 3, 3, 6, 1, 4, 8, 9, 2, 2, 8, 3, 1,
        0, 9, 3, 3, 2, 4],
       [9, 8, 8, 1, 4, 9, 3, 0, 5, 9, 6, 6, 8, 1, 4, 9, 8, 9, 8, 9, 3, 8,
        7, 5, 8, 0, 3, 6],
       [0, 8, 3, 1, 2, 1, 9, 4, 5, 0, 7, 1, 3, 5, 4, 4, 7, 3, 6, 5, 5, 6,
        2, 0, 6, 1, 4, 8],
       [4, 2, 6, 1, 7, 5, 6, 2, 4, 0, 9, 8, 0, 0, 0, 6, 8, 2, 3, 0, 0, 5,
        6, 6, 9, 8, 4, 9],
       [7, 3, 2, 5, 2, 5, 4, 3, 6, 4, 9, 2, 1, 7, 6, 4, 4, 5, 7, 7, 1, 6,
        4, 0, 3, 6, 4, 8],
       [1, 7, 6, 0, 4, 8, 0, 3, 0, 2, 0, 7, 0, 5, 6, 5, 3, 1, 4, 3, 8, 8,
        8, 6, 7, 3, 1, 7],
       [0, 8, 0, 5, 5, 2, 2, 5, 2, 1, 4, 4, 1, 4, 9, 4, 4, 1, 7, 1, 7, 7,
        4, 1, 8, 2, 3, 2],
       [5, 3, 5, 6, 6, 0, 9, 2, 9, 2, 8, 2, 1, 4, 0, 4, 7, 3, 0, 3, 8, 1,
        5, 7, 5, 4, 6, 6],
       [4, 7, 0, 1, 9, 6, 1, 1, 2, 1, 8, 2, 2, 9, 4, 7, 5, 1, 2, 4, 7, 5,
        7, 6, 6, 4, 1, 5],
       [4, 5, 7, 8, 2, 0, 5, 3, 4, 6, 3, 4, 5, 4, 9, 3, 6, 0, 2, 7, 1, 0,
        1, 7, 2, 4, 4, 0],
       [5, 3, 9, 1, 2, 4, 4, 8, 8, 2, 2, 1, 6, 4, 5, 2, 5, 0, 0, 1, 6, 4,
        5, 9, 5, 8, 9, 5],
       [6, 1, 1, 8, 9, 6, 8, 9, 9, 8, 2, 0, 9, 7, 9, 0, 9, 7, 0, 5, 3, 8,
        0, 9, 1, 8, 9, 4],
       [9, 1, 7, 3, 3, 7, 8, 3, 2, 2, 6, 3, 2, 1, 0, 5, 8, 2, 8, 4, 5, 5,
        2, 9, 9, 6, 8, 3],
       [0, 3, 8, 2, 5, 1, 6, 3, 5, 0, 2, 3, 9, 9, 4, 6, 1, 9, 3, 8, 7, 8,
        8, 7, 2, 4, 4, 8],
       [6, 7, 8, 6, 6, 6, 5, 2, 1, 8, 4, 7, 8, 9, 9, 4, 5, 2, 4, 7, 8, 5,
        9, 3, 0, 0, 9, 1],
       [7, 8, 4, 9, 4, 8, 9, 3, 5, 4, 8, 3, 7, 9, 2, 0, 2, 3, 8, 5, 6, 0,
        4, 3, 6, 1, 6, 3],
       [2, 4, 3, 9, 0, 2, 5, 9, 0, 0, 0, 3, 1, 2, 4, 3, 1, 4, 5, 7, 4, 3,
        9, 6, 9, 3, 6, 6],
       [3, 8, 6, 4, 0, 0, 3, 8, 1, 1, 5, 4, 8, 4, 8, 3, 3, 1, 1, 9, 6, 4,
        9, 7, 6, 7, 3, 7],
       [4, 3, 7, 5, 3, 4, 0, 4, 8, 2, 5, 1, 2, 8, 2, 6, 4, 7, 7, 6, 3, 5,
        9, 6, 4, 2, 5, 9],
       [4, 9, 2, 7, 8, 6, 1, 1, 9, 4, 1, 3, 1, 8, 6, 5, 3, 0, 2, 2, 3, 0,
        3, 6, 9, 1, 2, 4],
       [0, 7, 3, 6, 4, 2, 3, 7, 5, 0, 9, 5, 5, 2, 7, 5, 0, 3, 6, 8, 7, 3,
        3, 6, 1, 2, 3, 6]], dtype=int32)>
x = tf.expand_dims(x, axis=0)  # insert a new dimension in front of the highest dimension
x
<tf.Tensor: id=432728, shape=(1, 28, 28), dtype=int32, numpy=
array([[[5, 3, 3, 6, 9, 3, 0, 4, 1, 6, 3, 8, 7, 7, 5, 4, 0, 0, 8, 5, 2,
         3, 6, 4, 4, 8, 8, 3],
        [8, 4, 7, 9, 3, 6, 1, 7, 5, 9, 4, 9, 7, 4, 4, 8, 4, 9, 7, 9, 6,
         1, 3, 5, 0, 2, 4, 2],
        [3, 5, 5, 5, 7, 9, 5, 3, 6, 0, 7, 8, 0, 0, 4, 7, 8, 7, 4, 1, 9,
         9, 2, 6, 6, 2, 0, 3],
        [3, 5, 5, 2, 4, 0, 2, 5, 0, 2, 3, 2, 1, 0, 1, 4, 5, 3, 1, 9, 5,
         3, 0, 5, 8, 1, 9, 4],
        [6, 0, 1, 4, 0, 0, 7, 7, 0, 4, 3, 3, 3, 5, 2, 3, 5, 6, 4, 6, 4,
         1, 3, 2, 8, 5, 8, 4],
        [1, 2, 9, 3, 2, 3, 4, 4, 0, 0, 6, 6, 4, 0, 2, 7, 2, 2, 4, 2, 6,
         0, 4, 0, 5, 5, 4, 5],
        [3, 2, 6, 7, 7, 1, 0, 3, 3, 5, 5, 7, 8, 1, 6, 5, 5, 4, 3, 2, 9,
         0, 0, 3, 7, 0, 0, 0],
        [7, 7, 0, 7, 9, 8, 4, 5, 3, 7, 3, 3, 6, 1, 4, 8, 9, 2, 2, 8, 3,
         1, 0, 9, 3, 3, 2, 4],
        [9, 8, 8, 1, 4, 9, 3, 0, 5, 9, 6, 6, 8, 1, 4, 9, 8, 9, 8, 9, 3,
         8, 7, 5, 8, 0, 3, 6],
        [0, 8, 3, 1, 2, 1, 9, 4, 5, 0, 7, 1, 3, 5, 4, 4, 7, 3, 6, 5, 5,
         6, 2, 0, 6, 1, 4, 8],
        [4, 2, 6, 1, 7, 5, 6, 2, 4, 0, 9, 8, 0, 0, 0, 6, 8, 2, 3, 0, 0,
         5, 6, 6, 9, 8, 4, 9],
        [7, 3, 2, 5, 2, 5, 4, 3, 6, 4, 9, 2, 1, 7, 6, 4, 4, 5, 7, 7, 1,
         6, 4, 0, 3, 6, 4, 8],
        [1, 7, 6, 0, 4, 8, 0, 3, 0, 2, 0, 7, 0, 5, 6, 5, 3, 1, 4, 3, 8,
         8, 8, 6, 7, 3, 1, 7],
        [0, 8, 0, 5, 5, 2, 2, 5, 2, 1, 4, 4, 1, 4, 9, 4, 4, 1, 7, 1, 7,
         7, 4, 1, 8, 2, 3, 2],
        [5, 3, 5, 6, 6, 0, 9, 2, 9, 2, 8, 2, 1, 4, 0, 4, 7, 3, 0, 3, 8,
         1, 5, 7, 5, 4, 6, 6],
        [4, 7, 0, 1, 9, 6, 1, 1, 2, 1, 8, 2, 2, 9, 4, 7, 5, 1, 2, 4, 7,
         5, 7, 6, 6, 4, 1, 5],
        [4, 5, 7, 8, 2, 0, 5, 3, 4, 6, 3, 4, 5, 4, 9, 3, 6, 0, 2, 7, 1,
         0, 1, 7, 2, 4, 4, 0],
        [5, 3, 9, 1, 2, 4, 4, 8, 8, 2, 2, 1, 6, 4, 5, 2, 5, 0, 0, 1, 6,
         4, 5, 9, 5, 8, 9, 5],
        [6, 1, 1, 8, 9, 6, 8, 9, 9, 8, 2, 0, 9, 7, 9, 0, 9, 7, 0, 5, 3,
         8, 0, 9, 1, 8, 9, 4],
        [9, 1, 7, 3, 3, 7, 8, 3, 2, 2, 6, 3, 2, 1, 0, 5, 8, 2, 8, 4, 5,
         5, 2, 9, 9, 6, 8, 3],
        [0, 3, 8, 2, 5, 1, 6, 3, 5, 0, 2, 3, 9, 9, 4, 6, 1, 9, 3, 8, 7,
         8, 8, 7, 2, 4, 4, 8],
        [6, 7, 8, 6, 6, 6, 5, 2, 1, 8, 4, 7, 8, 9, 9, 4, 5, 2, 4, 7, 8,
         5, 9, 3, 0, 0, 9, 1],
        [7, 8, 4, 9, 4, 8, 9, 3, 5, 4, 8, 3, 7, 9, 2, 0, 2, 3, 8, 5, 6,
         0, 4, 3, 6, 1, 6, 3],
        [2, 4, 3, 9, 0, 2, 5, 9, 0, 0, 0, 3, 1, 2, 4, 3, 1, 4, 5, 7, 4,
         3, 9, 6, 9, 3, 6, 6],
        [3, 8, 6, 4, 0, 0, 3, 8, 1, 1, 5, 4, 8, 4, 8, 3, 3, 1, 1, 9, 6,
         4, 9, 7, 6, 7, 3, 7],
        [4, 3, 7, 5, 3, 4, 0, 4, 8, 2, 5, 1, 2, 8, 2, 6, 4, 7, 7, 6, 3,
         5, 9, 6, 4, 2, 5, 9],
        [4, 9, 2, 7, 8, 6, 1, 1, 9, 4, 1, 3, 1, 8, 6, 5, 3, 0, 2, 2, 3,
         0, 3, 6, 9, 1, 2, 4],
        [0, 7, 3, 6, 4, 2, 3, 7, 5, 0, 9, 5, 5, 2, 7, 5, 0, 3, 6, 8, 7,
         3, 3, 6, 1, 2, 3, 6]]], dtype=int32)>
x = tf.squeeze(x, axis=0)  # remove the image-count (batch) dimension
x
<tf.Tensor: id=432729, shape=(28, 28), dtype=int32, numpy=
array([[5, 3, 3, 6, 9, 3, 0, 4, 1, 6, 3, 8, 7, 7, 5, 4, 0, 0, 8, 5, 2, 3,
        6, 4, 4, 8, 8, 3],
       [8, 4, 7, 9, 3, 6, 1, 7, 5, 9, 4, 9, 7, 4, 4, 8, 4, 9, 7, 9, 6, 1,
        3, 5, 0, 2, 4, 2],
       [3, 5, 5, 5, 7, 9, 5, 3, 6, 0, 7, 8, 0, 0, 4, 7, 8, 7, 4, 1, 9, 9,
        2, 6, 6, 2, 0, 3],
       [3, 5, 5, 2, 4, 0, 2, 5, 0, 2, 3, 2, 1, 0, 1, 4, 5, 3, 1, 9, 5, 3,
        0, 5, 8, 1, 9, 4],
       [6, 0, 1, 4, 0, 0, 7, 7, 0, 4, 3, 3, 3, 5, 2, 3, 5, 6, 4, 6, 4, 1,
        3, 2, 8, 5, 8, 4],
       [1, 2, 9, 3, 2, 3, 4, 4, 0, 0, 6, 6, 4, 0, 2, 7, 2, 2, 4, 2, 6, 0,
        4, 0, 5, 5, 4, 5],
       [3, 2, 6, 7, 7, 1, 0, 3, 3, 5, 5, 7, 8, 1, 6, 5, 5, 4, 3, 2, 9, 0,
        0, 3, 7, 0, 0, 0],
       [7, 7, 0, 7, 9, 8, 4, 5, 3, 7, 3, 3, 6, 1, 4, 8, 9, 2, 2, 8, 3, 1,
        0, 9, 3, 3, 2, 4],
       [9, 8, 8, 1, 4, 9, 3, 0, 5, 9, 6, 6, 8, 1, 4, 9, 8, 9, 8, 9, 3, 8,
        7, 5, 8, 0, 3, 6],
       [0, 8, 3, 1, 2, 1, 9, 4, 5, 0, 7, 1, 3, 5, 4, 4, 7, 3, 6, 5, 5, 6,
        2, 0, 6, 1, 4, 8],
       [4, 2, 6, 1, 7, 5, 6, 2, 4, 0, 9, 8, 0, 0, 0, 6, 8, 2, 3, 0, 0, 5,
        6, 6, 9, 8, 4, 9],
       [7, 3, 2, 5, 2, 5, 4, 3, 6, 4, 9, 2, 1, 7, 6, 4, 4, 5, 7, 7, 1, 6,
        4, 0, 3, 6, 4, 8],
       [1, 7, 6, 0, 4, 8, 0, 3, 0, 2, 0, 7, 0, 5, 6, 5, 3, 1, 4, 3, 8, 8,
        8, 6, 7, 3, 1, 7],
       [0, 8, 0, 5, 5, 2, 2, 5, 2, 1, 4, 4, 1, 4, 9, 4, 4, 1, 7, 1, 7, 7,
        4, 1, 8, 2, 3, 2],
       [5, 3, 5, 6, 6, 0, 9, 2, 9, 2, 8, 2, 1, 4, 0, 4, 7, 3, 0, 3, 8, 1,
        5, 7, 5, 4, 6, 6],
       [4, 7, 0, 1, 9, 6, 1, 1, 2, 1, 8, 2, 2, 9, 4, 7, 5, 1, 2, 4, 7, 5,
        7, 6, 6, 4, 1, 5],
       [4, 5, 7, 8, 2, 0, 5, 3, 4, 6, 3, 4, 5, 4, 9, 3, 6, 0, 2, 7, 1, 0,
        1, 7, 2, 4, 4, 0],
       [5, 3, 9, 1, 2, 4, 4, 8, 8, 2, 2, 1, 6, 4, 5, 2, 5, 0, 0, 1, 6, 4,
        5, 9, 5, 8, 9, 5],
       [6, 1, 1, 8, 9, 6, 8, 9, 9, 8, 2, 0, 9, 7, 9, 0, 9, 7, 0, 5, 3, 8,
        0, 9, 1, 8, 9, 4],
       [9, 1, 7, 3, 3, 7, 8, 3, 2, 2, 6, 3, 2, 1, 0, 5, 8, 2, 8, 4, 5, 5,
        2, 9, 9, 6, 8, 3],
       [0, 3, 8, 2, 5, 1, 6, 3, 5, 0, 2, 3, 9, 9, 4, 6, 1, 9, 3, 8, 7, 8,
        8, 7, 2, 4, 4, 8],
       [6, 7, 8, 6, 6, 6, 5, 2, 1, 8, 4, 7, 8, 9, 9, 4, 5, 2, 4, 7, 8, 5,
        9, 3, 0, 0, 9, 1],
       [7, 8, 4, 9, 4, 8, 9, 3, 5, 4, 8, 3, 7, 9, 2, 0, 2, 3, 8, 5, 6, 0,
        4, 3, 6, 1, 6, 3],
       [2, 4, 3, 9, 0, 2, 5, 9, 0, 0, 0, 3, 1, 2, 4, 3, 1, 4, 5, 7, 4, 3,
        9, 6, 9, 3, 6, 6],
       [3, 8, 6, 4, 0, 0, 3, 8, 1, 1, 5, 4, 8, 4, 8, 3, 3, 1, 1, 9, 6, 4,
        9, 7, 6, 7, 3, 7],
       [4, 3, 7, 5, 3, 4, 0, 4, 8, 2, 5, 1, 2, 8, 2, 6, 4, 7, 7, 6, 3, 5,
        9, 6, 4, 2, 5, 9],
       [4, 9, 2, 7, 8, 6, 1, 1, 9, 4, 1, 3, 1, 8, 6, 5, 3, 0, 2, 2, 3, 0,
        3, 6, 9, 1, 2, 4],
       [0, 7, 3, 6, 4, 2, 3, 7, 5, 0, 9, 5, 5, 2, 7, 5, 0, 3, 6, 8, 7, 3,
        3, 6, 1, 2, 3, 6]], dtype=int32)>
x = tf.random.uniform([1,28,28,1], maxval=10, dtype=tf.int32)
tf.squeeze(x)  # remove all dimensions of length 1
<tf.Tensor: id=391, shape=(28, 28), dtype=int32, numpy=
array([[3, 5, 3, 9, 7, 0, 0, 8, 3, 1, 4, 8, 5, 7, 8, 6, 9, 4, 1, 1, 5, 8,
        6, 2, 8, 3, 5, 3],
       [4, 8, 9, 7, 6, 0, 8, 7, 8, 3, 1, 3, 5, 9, 3, 6, 6, 2, 3, 1, 7, 6,
        9, 6, 2, 7, 4, 2],
       [5, 1, 2, 0, 3, 7, 5, 0, 7, 4, 7, 7, 5, 8, 9, 2, 2, 6, 7, 3, 8, 9,
        4, 1, 6, 5, 4, 7],
       [2, 5, 3, 4, 4, 7, 5, 5, 1, 1, 7, 0, 9, 8, 4, 3, 8, 6, 9, 3, 3, 2,
        1, 2, 4, 4, 4, 7],
       [9, 2, 3, 0, 3, 5, 4, 5, 8, 7, 0, 8, 6, 4, 9, 7, 1, 8, 3, 6, 5, 7,
        0, 4, 4, 2, 6, 9],
       [9, 3, 4, 4, 6, 8, 1, 7, 0, 8, 6, 0, 0, 2, 8, 3, 5, 0, 6, 6, 8, 4,
        8, 9, 4, 0, 9, 4],
       [3, 8, 5, 9, 4, 5, 1, 8, 5, 3, 5, 9, 7, 8, 9, 2, 8, 8, 5, 5, 5, 9,
        1, 9, 3, 4, 4, 8],
       [9, 5, 9, 4, 2, 0, 8, 1, 4, 2, 0, 3, 6, 9, 7, 6, 0, 5, 8, 9, 0, 8,
        0, 0, 3, 1, 1, 7],
       [4, 6, 9, 0, 6, 6, 7, 6, 2, 3, 1, 7, 8, 7, 8, 5, 2, 5, 4, 5, 1, 9,
        9, 6, 6, 4, 4, 8],
       [1, 4, 2, 6, 7, 8, 4, 9, 2, 7, 8, 8, 0, 7, 0, 3, 8, 2, 3, 1, 9, 2,
        7, 9, 1, 1, 6, 7],
       [0, 1, 7, 6, 4, 1, 4, 3, 0, 0, 7, 4, 7, 2, 6, 1, 3, 1, 8, 9, 1, 5,
        7, 3, 4, 3, 4, 6],
       [7, 7, 7, 3, 6, 6, 3, 6, 2, 8, 0, 3, 5, 5, 9, 1, 5, 0, 1, 8, 3, 9,
        7, 6, 7, 8, 0, 9],
       [3, 3, 9, 2, 4, 8, 1, 8, 8, 7, 5, 7, 4, 0, 1, 8, 5, 2, 9, 1, 1, 5,
        7, 5, 4, 0, 5, 5],
       [7, 9, 7, 1, 7, 7, 1, 5, 7, 1, 8, 3, 0, 5, 1, 9, 4, 0, 2, 4, 4, 4,
        5, 1, 8, 0, 2, 8],
       [8, 6, 4, 6, 5, 3, 3, 6, 7, 6, 1, 9, 0, 3, 6, 3, 9, 3, 0, 0, 4, 2,
        5, 5, 7, 1, 2, 0],
       [6, 7, 0, 4, 3, 2, 7, 8, 4, 4, 5, 8, 5, 0, 0, 4, 3, 4, 4, 9, 6, 6,
        8, 8, 4, 9, 8, 7],
       [1, 3, 5, 7, 6, 0, 2, 2, 1, 9, 8, 6, 6, 6, 0, 3, 6, 8, 9, 4, 0, 4,
        4, 0, 8, 0, 8, 9],
       [4, 6, 1, 4, 4, 8, 9, 7, 6, 8, 7, 9, 0, 8, 8, 3, 0, 5, 9, 8, 6, 6,
        9, 6, 5, 1, 0, 9],
       [0, 3, 1, 4, 2, 1, 2, 7, 6, 2, 1, 3, 0, 6, 6, 0, 7, 9, 5, 7, 7, 9,
        7, 6, 9, 9, 2, 7],
       [2, 8, 2, 1, 4, 4, 8, 8, 0, 3, 4, 6, 8, 2, 4, 5, 8, 3, 7, 5, 1, 6,
        7, 5, 6, 3, 1, 2],
       [4, 0, 7, 4, 0, 8, 3, 4, 9, 0, 0, 8, 9, 1, 1, 9, 7, 8, 9, 1, 9, 2,
        0, 7, 3, 6, 6, 2],
       [0, 4, 0, 9, 8, 3, 2, 5, 9, 1, 0, 2, 7, 9, 9, 7, 4, 5, 0, 0, 2, 7,
        7, 2, 1, 7, 5, 3],
       [9, 6, 3, 2, 6, 3, 1, 5, 1, 6, 6, 8, 9, 8, 3, 9, 6, 2, 8, 2, 3, 5,
        9, 6, 8, 0, 9, 5],
       [0, 3, 4, 7, 3, 5, 5, 0, 7, 3, 7, 7, 2, 1, 8, 4, 9, 7, 9, 1, 2, 5,
        9, 7, 7, 7, 8, 0],
       [6, 7, 3, 1, 2, 6, 4, 8, 5, 5, 4, 3, 7, 5, 4, 4, 1, 9, 6, 7, 6, 6,
        5, 2, 4, 0, 3, 3],
       [8, 8, 4, 5, 9, 3, 2, 7, 6, 5, 8, 4, 5, 4, 8, 3, 4, 6, 7, 3, 3, 4,
        9, 8, 0, 4, 1, 2],
       [5, 5, 9, 3, 6, 7, 4, 5, 2, 3, 4, 8, 0, 5, 3, 4, 1, 0, 3, 7, 6, 9,
        3, 8, 9, 4, 9, 8],
       [1, 4, 2, 1, 9, 3, 4, 7, 8, 1, 9, 3, 5, 8, 9, 4, 8, 3, 6, 9, 2, 1,
        7, 7, 4, 4, 9, 3]], dtype=int32)>
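tf.expand_dims and tf.squeeze only change the view, not the stored data, so inserting an axis and then removing it restores the original shape. A brief sketch with assumed shapes:

x = tf.random.uniform([28,28], maxval=10, dtype=tf.int32)
y = tf.expand_dims(x, axis=0)  # shape becomes (1, 28, 28)
z = tf.squeeze(y, axis=0)      # back to (28, 28)
print(x.shape, y.shape, z.shape)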

Swapping Dimensions

x = tf.random.uniform([1,2,3,4])
print(x)
tf.Tensor(
[[[[0.5282526  0.3555627  0.41090894 0.47944117]
   [0.06685734 0.73899055 0.274917   0.786981  ]
   [0.5963073  0.47864938 0.4129647  0.9002305 ]]

  [[0.70865    0.46636987 0.76260746 0.23017025]
   [0.2235589  0.3718114  0.8150687  0.30672145]
   [0.78165174 0.63648796 0.61503696 0.35355854]]]], shape=(1, 2, 3, 4), dtype=float32)
# swap dimensions
tf.transpose(x, perm=[0,3,1,2])
<tf.Tensor: id=432771, shape=(1, 4, 2, 3), dtype=float32, numpy=
array([[[[0.5282526 , 0.06685734, 0.5963073 ],
         [0.70865   , 0.2235589 , 0.78165174]],

        [[0.3555627 , 0.73899055, 0.47864938],
         [0.46636987, 0.3718114 , 0.63648796]],

        [[0.41090894, 0.274917  , 0.4129647 ],
         [0.76260746, 0.8150687 , 0.61503696]],

        [[0.47944117, 0.786981  , 0.9002305 ],
         [0.23017025, 0.30672145, 0.35355854]]]], dtype=float32)>
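The perm argument lists the source axes in their new order, so for an input laid out as [b, h, w, c] the call above yields a [b, c, h, w] tensor; unlike tf.reshape, tf.transpose actually rearranges the stored data. A short shape check (the layout names are assumptions):

x = tf.random.uniform([1,2,3,4])        # assumed layout [b, h, w, c]
y = tf.transpose(x, perm=[0,3,1,2])     # becomes [b, c, h, w]
print(x.shape, y.shape)                 # (1, 2, 3, 4) (1, 4, 2, 3)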

Copying Data

# create vector b
b = tf.constant([1,2])
print(b)
# insert a new dimension to turn it into a matrix
b = tf.expand_dims(b, axis=0)
b
tf.Tensor([1 2], shape=(2,), dtype=int32)

<tf.Tensor: id=432780, shape=(1, 2), dtype=int32, numpy=array([[1, 2]], dtype=int32)>
# duplicate once along the sample (row) dimension
b = tf.tile(b, multiples=[2,1])
b
<tf.Tensor: id=412, shape=(2, 2), dtype=int32, numpy=
array([[1, 2],
       [1, 2]], dtype=int32)>
x = tf.range(4)
# create a 2-row, 2-column matrix
x = tf.reshape(x, [2,2])
x
<tf.Tensor: id=432787, shape=(2, 2), dtype=int32, numpy=
array([[0, 1],
       [2, 3]], dtype=int32)>
# duplicate once along the column dimension
x = tf.tile(x, multiples=[1,2])
x
<tf.Tensor: id=432789, shape=(2, 4), dtype=int32, numpy=
array([[0, 1, 0, 1],
       [2, 3, 2, 3]], dtype=int32)>
# duplicate once along the row dimension
x = tf.tile(x, multiples=[2,1])
x
<tf.Tensor: id=432791, shape=(4, 4), dtype=int32, numpy=
array([[0, 1, 0, 1],
       [2, 3, 2, 3],
       [0, 1, 0, 1],
       [2, 3, 2, 3]], dtype=int32)>
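In tf.tile, multiples gives the number of copies per axis, so the two calls above can also be combined into one. A minimal sketch with the same 2×2 matrix assumed:

x = tf.reshape(tf.range(4), [2,2])
# multiples=[2,2] copies the rows and the columns in a single call
print(tf.tile(x, multiples=[2,2]))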

Broadcasting

# create a matrix
A = tf.random.normal([32,1])
# expand it to a 4D tensor
tf.broadcast_to(A, [2,32,32,3])
<tf.Tensor: id=430, shape=(2, 32, 32, 3), dtype=float32, numpy=
array([[[[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        ...,

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]]],


       [[[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        ...,

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]],

        [[ 0.04447514,  0.04447514,  0.04447514],
         [-0.8540972 , -0.8540972 , -0.8540972 ],
         [ 0.30159432,  0.30159432,  0.30159432],
         ...,
         [-0.84129137, -0.84129137, -0.84129137],
         [ 0.58230823,  0.58230823,  0.58230823],
         [ 0.1573652 ,  0.1573652 ,  0.1573652 ]]]], dtype=float32)>
A = tf.random.normal([32,2])
# shapes that do not satisfy the Broadcasting rules
try:
    tf.broadcast_to(A, [2,32,32,4])
except Exception as e:
    print(e)
Incompatible shapes: [32,2] vs. [2,32,32,4] [Op:BroadcastTo]
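Broadcasting also happens implicitly in arithmetic: shapes are aligned from the right, and every dimension must either match or be 1 (missing leading dimensions are treated as 1). A minimal sketch, not from the original notes:

x = tf.random.normal([2,32,32,3])
b = tf.random.normal([3])   # aligned with the trailing channel dimension
print((x + b).shape)        # broadcasts to (2, 32, 32, 3)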

Mathematical Operations

Addition, Subtraction, Multiplication and Division

a = tf.range(5)
b = tf.constant(2)
# integer (floor) division
a // b
<tf.Tensor: id=443, shape=(5,), dtype=int32, numpy=array([0, 0, 1, 1, 2], dtype=int32)>
# remainder (modulo)
a % b
<tf.Tensor: id=444, shape=(5,), dtype=int32, numpy=array([0, 1, 0, 1, 0], dtype=int32)>
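The +, -, * and / operators are overloaded on tensors and behave the same as the corresponding tf.add, tf.subtract, tf.multiply and tf.divide functions. A small sketch with assumed values:

a = tf.constant([1., 2., 3.])
b = tf.constant([4., 5., 6.])
print(a + b)         # [5. 7. 9.]
print(tf.add(a, b))  # same result as the operator form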

Power Operations

x = tf.range(4)
# raise to a power
tf.pow(x, 3)
<tf.Tensor: id=450, shape=(4,), dtype=int32, numpy=array([ 0,  1,  8, 27], dtype=int32)>
# the power operator
x**2
<tf.Tensor: id=452, shape=(4,), dtype=int32, numpy=array([0, 1, 4, 9], dtype=int32)>
x = tf.constant([1.,4.,9.])
# square root
x**(0.5)
<tf.Tensor: id=455, shape=(3,), dtype=float32, numpy=array([1., 2., 3.], dtype=float32)>
x = tf.range(5)
print(x)
# convert to floating point
x = tf.cast(x, dtype=tf.float32)
print(x)
# square
x = tf.square(x)
print(x)
tf.Tensor([0 1 2 3 4], shape=(5,), dtype=int32)
tf.Tensor([0. 1. 2. 3. 4.], shape=(5,), dtype=float32)
tf.Tensor([ 0.  1.  4.  9. 16.], shape=(5,), dtype=float32)
# square root
tf.sqrt(x)
<tf.Tensor: id=432798, shape=(5,), dtype=float32, numpy=
array([0.        , 0.99999994, 1.9999999 , 2.9999998 , 4.        ],
      dtype=float32)>
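tf.sqrt, like most floating-point math functions, only accepts float tensors, which is why the integer vector above is first converted with tf.cast. A condensed sketch of that pattern:

x = tf.range(5)                    # int32
x = tf.cast(x, dtype=tf.float32)   # tf.sqrt requires a floating-point dtype
print(tf.sqrt(tf.square(x)))       # approximately [0. 1. 2. 3. 4.], up to float32 rounding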

Exponential and Logarithmic Operations

x = tf.constant([1.,2.,3.])
# exponentiation with base 2
2**x
<tf.Tensor: id=465, shape=(3,), dtype=float32, numpy=array([2., 4., 8.], dtype=float32)>
# natural exponential e^x
tf.exp(1.)
<tf.Tensor: id=467, shape=(), dtype=float32, numpy=2.7182817>
x = tf.exp(3.)
# natural logarithm
tf.math.log(x)
<tf.Tensor: id=470, shape=(), dtype=float32, numpy=3.0>
x = tf.constant([1.,2.])
x = 10**x
# change-of-base formula: log10(x) = ln(x) / ln(10)
tf.math.log(x) / tf.math.log(10.)
<tf.Tensor: id=477, shape=(2,), dtype=float32, numpy=array([1., 2.], dtype=float32)>

Matrix Multiplication

a = tf.random.normal([4,3,28,32])
b = tf.random.normal([4,3,32,2])
# batched matrix multiplication
print(a@b)
tf.Tensor(
[[[[ 2.93815994e+00  1.80159616e+00]
   [-4.95495558e+00 -2.65059781e+00]
   [ 6.36351776e+00  8.27180672e+00]
   [-2.50184441e+00 -3.22499895e+00]
   [-6.89737129e+00  6.43845701e+00]
   [ 4.72115231e+00 -7.39528358e-01]
   [-5.87444019e+00 -6.75603390e+00]
   [ 8.61762238e+00  4.49309635e+00]
   [ 3.18081021e+00 -1.84904563e+00]
   [-5.71209073e-01  1.76863492e+00]
   [-8.77548409e+00 -2.09427929e+00]
   [ 6.11399221e+00  3.75506377e+00]
   [-5.72681367e-01 -5.56786919e+00]
   [ 1.03334942e+01 -6.21349716e+00]
   [ 2.05781221e+00 -2.48031449e+00]
   [-1.05474174e-01  1.17951145e+01]
   [-8.32847595e+00  8.04420090e+00]
   [ 1.10347319e+01 -7.15183640e+00]
   [-1.10890408e+01  2.06656051e+00]
   [ 2.04201794e+00 -1.72195137e-01]
   [ 4.16003466e+00  2.92319274e+00]
   [ 1.00829735e+01  3.50188327e+00]
   [ 1.60061455e+01 -3.23914313e+00]
   [-1.34949207e+00 -2.27372718e+00]
   [ 1.16594486e+01 -1.30499089e+00]
   [ 2.90008926e+00  6.59213543e+00]
   [-3.04731274e+00 -1.17982030e-01]
   [-6.25353050e+00 -1.59929824e+00]]

  [[ 2.35922217e+00 -1.44711876e+00]
   [-2.82181549e+00 -4.16362000e+00]
   [-4.13206530e+00  1.96330786e-01]
   [-1.13723636e+00 -1.90036798e+00]
   [-1.42907238e+00  4.24102306e-01]
   [ 1.01430655e+01  2.54081345e+00]
   [-4.05478477e+00 -9.29689407e+00]
   [ 1.36705666e+01  1.40576875e+00]
   [-5.09379244e+00  4.66089153e+00]
   [ 6.25803471e-02 -2.86052656e+00]
   [-1.12946069e+00 -4.28373003e+00]
   [ 5.32545996e+00  1.97562897e+00]
   [ 7.16341162e+00  6.10791969e+00]
   [-4.77171421e+00 -1.75261497e+00]
   [-1.43213348e+01  5.44925928e+00]
   [-2.13357735e+00 -2.74817157e+00]
   [-6.38115454e+00  6.48117113e+00]
   [-1.21313601e+01 -1.16765201e+00]
   [-1.98863697e+00  1.22314978e+01]
   [ 3.63174462e+00  5.20076323e+00]
   [-1.05080090e+01  4.47047186e+00]
   [ 8.52560043e+00 -3.26042938e+00]
   [ 1.86961699e+00  1.04149675e+00]
   [ 3.27967310e+00  4.52322531e+00]
   [-1.08596125e+01  4.40047550e+00]
   [ 3.30025196e+00 -3.57261777e-01]
   [-4.17899323e+00 -5.29293346e+00]
   [-6.29359818e+00  2.55025506e-01]]

  [[ 2.79792500e+00 -1.14968262e+01]
   [ 6.42120302e-01  8.60604167e-01]
   [ 8.26789284e+00  9.11268139e+00]
   [-1.24864876e+00 -1.29506755e+00]
   [ 1.83019781e+00  1.32512970e+01]
   [-4.38226223e+00  2.93613434e+00]
   [-1.01948481e+01 -2.50259852e+00]
   [-6.08818817e+00  5.71516156e-01]
   [ 8.14604282e-01  8.74936581e-01]
   [-9.27050591e-01  1.68381357e+00]
   [-2.39078522e+00 -4.39953446e-01]
   [-1.79722738e+00  2.44799304e+00]
   [-6.19097829e-01  4.20792866e+00]
   [-3.59187007e+00  2.05337834e+00]
   [-2.02478099e+00  3.92844319e+00]
   [-2.78609324e+00 -1.00785866e+01]
   [ 1.35041237e-01  9.82832527e+00]
   [ 3.77985573e+00 -2.92683578e+00]
   [ 2.50951290e+00 -1.10158062e+00]
   [-2.69217896e+00  7.27837420e+00]
   [ 4.59399509e+00 -4.81438732e+00]
   [ 9.01638508e+00  5.12754726e+00]
   [ 5.19506645e+00 -2.35464978e+00]
   [ 2.05791235e-01  3.24537897e+00]
   [-6.17561936e-02  1.22012386e+01]
   [ 8.34735334e-01  2.56306553e+00]
   [-8.42908740e-01 -4.72223663e+00]
   [ 7.59096265e-01 -8.70975971e-01]]]


 [[[-1.25730896e+01  1.73365784e+00]
   [-8.16483736e-01 -3.12521791e+00]
   [ 4.31258678e+00 -3.65629935e+00]
   [ 7.81925964e+00  2.67266393e+00]
   [-1.45902622e+00 -1.69710827e+00]
   [-4.97250271e+00  1.06699669e+00]
   [-1.15320644e+01  3.67050219e+00]
   [ 9.82042491e-01 -8.96060181e+00]
   [ 4.24584293e+00 -2.03312969e+00]
   [-8.90874267e-02 -2.19113445e+00]
   [ 7.03373575e+00  4.82089567e+00]
   [ 2.19787431e+00 -1.35815620e+00]
   [-4.33743429e+00 -2.77082419e+00]
   [-6.55539846e+00 -5.28619862e+00]
   [-4.10456562e+00  1.53431883e+01]
   [-6.97701550e+00  5.58186054e-01]
   [-4.06244993e+00 -1.29598303e+01]
   [ 1.80246496e+00 -2.77987790e+00]
   [-7.30259180e+00 -5.11505365e+00]
   [ 2.12593174e+00 -3.25598717e+00]
   [ 7.80677795e+00  1.99891090e-01]
   [ 8.46464539e+00  2.72348213e+00]
   [-2.32167172e+00  4.69824505e+00]
   [ 3.97749400e+00 -9.19138908e+00]
   [ 2.90814090e+00  1.26416731e+00]
   [-8.01068783e-01  5.13629675e+00]
   [-1.10610142e+01  4.88826132e+00]
   [-1.75804818e+00  1.23052418e+00]]

  [[-5.49224281e+00 -1.18972950e+01]
   [-1.71067417e+00 -7.94510078e+00]
   [ 8.11289787e+00  2.97325039e+00]
   [-6.18529272e+00 -1.07129316e+01]
   [ 5.09897184e+00 -5.09968042e+00]
   [-7.61364222e+00  5.40171003e+00]
   [-6.47705889e+00 -1.68601739e+00]
   [ 5.09246063e+00 -4.75051785e+00]
   [-8.07900906e+00 -1.01351190e+00]
   [-5.56266832e+00 -5.98014545e+00]
   [-1.84528065e+00 -2.55948591e+00]
   [ 1.63680077e-01 -4.52482510e+00]
   [-6.17208385e+00  8.61810780e+00]
   [ 2.54550838e+00  1.04630842e+01]
   [ 6.78235245e+00  2.36592340e+00]
   [ 5.33071947e+00  7.77984023e-01]
   [ 1.59947419e+00  3.40054178e+00]
   [ 9.72853303e-01  1.95867562e+00]
   [-5.11668110e+00  9.07939816e+00]
   [-9.91412735e+00  5.33779049e+00]
   [ 6.93627453e+00  9.98051357e+00]
   [-1.49268985e+00 -1.61656654e+00]
   [ 6.77412367e+00  5.89341736e+00]
   [-9.02515793e+00 -4.86350346e+00]
   [-2.65398359e+00  6.53871584e+00]
   [ 7.60008812e+00  2.23823214e+00]
   [ 3.48978114e+00  7.77210045e+00]
   [-1.89859509e+00  1.36791458e+01]]

  [[ 6.16735172e+00 -3.06368208e+00]
   [-6.22440147e+00  2.49987888e+00]
   [ 1.23477805e+00 -9.58612680e-01]
   [-6.32274055e+00 -3.50495398e-01]
   [-2.37460756e+00  2.89988756e+00]
   [ 6.76133537e+00 -4.97397709e+00]
   [ 4.19915617e-01 -1.44209051e+00]
   [-8.48055720e-01 -1.34412467e+00]
   [ 1.47503817e+00 -4.57327032e+00]
   [-8.83791351e+00  1.30507603e+01]
   [ 5.04546928e+00  2.96709967e+00]
   [ 1.21958218e+01 -3.70147228e-01]
   [-9.32716131e-02  8.25912094e+00]
   [-6.88422298e+00 -2.94276452e+00]
   [-8.92567754e-01  7.48677373e-01]
   [ 1.13859291e+01 -4.54786253e+00]
   [ 4.14542294e+00  3.62407827e+00]
   [-4.84375191e+00  7.60643148e+00]
   [-3.99736214e+00  8.38322639e-01]
   [ 8.50940418e+00 -3.18443274e+00]
   [ 3.02693796e+00 -1.08982430e+01]
   [ 5.94771481e+00  1.90185285e+00]
   [ 1.79021060e-02  2.71560931e+00]
   [ 1.28265166e+00  1.35003614e+00]
   [-5.15541887e+00  3.48176098e+00]
   [ 1.35739307e+01  1.13081062e+00]
   [-1.13344326e+01  2.02814102e+00]
   [-3.78427625e+00 -3.41797924e+00]]]


 [[[ 7.51625490e+00 -8.01126385e+00]
   [ 8.94779682e-01 -1.39191675e+00]
   [-6.38634396e+00 -7.54839706e+00]
   [ 5.90809202e+00 -4.73860931e+00]
   [ 2.89039660e+00 -5.74475765e-01]
   [-3.24619865e+00 -4.33766127e+00]
   [ 7.45817757e+00  8.72869968e+00]
   [ 1.70315895e+01 -1.43109741e+01]
   [ 6.71465302e+00  2.36530209e+00]
   [ 2.37234616e+00  9.54824162e+00]
   [ 1.24315548e+01 -8.32226753e-01]
   [ 1.15248704e+00  6.42775774e+00]
   [-2.05694604e+00 -5.29237223e+00]
   [ 1.06061993e+01  4.43905163e+00]
   [ 3.32259560e+00  2.86404061e+00]
   [-1.26070702e+00 -3.86716032e+00]
   [-7.17960167e+00 -5.47068119e+00]
   [ 6.13063002e+00 -1.27777729e+01]
   [-4.77525711e+00 -2.89896202e+00]
   [ 5.04258776e+00  1.05476036e+01]
   [-4.72102404e+00  4.53035545e+00]
   [ 8.30504322e+00 -6.72617435e+00]
   [ 1.51879632e+00 -7.57512569e+00]
   [ 5.25161076e+00 -6.00039482e+00]
   [-2.66712689e+00 -3.15567350e+00]
   [ 6.98167515e+00  1.21508999e+01]
   [-3.58145714e+00  1.01358452e+01]
   [-7.68432474e+00  6.27517796e+00]]

  [[-3.85787821e+00  4.13319540e+00]
   [ 4.08870316e+00  8.98323441e+00]
   [ 2.88623333e-01  2.08936238e+00]
   [-1.27303381e+01 -3.72204494e+00]
   [-1.63620949e-01  4.14640725e-01]
   [-9.77903843e+00 -9.83979321e+00]
   [ 1.20987940e+01 -1.00569272e+00]
   [-9.52555597e-01 -7.21974373e-01]
   [-7.53700542e+00 -1.03328714e+01]
   [ 3.82278275e+00  7.92873979e-01]
   [ 1.62820339e+01  9.25348282e-01]
   [-2.99300981e+00 -4.05044317e+00]
   [-1.71284425e+00  3.32610369e-01]
   [ 8.45957184e+00  3.01560092e+00]
   [ 9.61618781e-01  4.84845543e+00]
   [-5.04883003e+00 -4.64504576e+00]
   [-4.88548994e-01  4.22385454e+00]
   [-3.06538558e+00 -2.68467999e+00]
   [ 1.44536438e+01 -1.67332339e+00]
   [-1.20380235e+00 -3.01969767e-01]
   [-1.25808067e+01 -1.83691287e+00]
   [ 7.00172246e-01 -5.44080067e+00]
   [ 3.71728969e+00 -6.50164127e+00]
   [ 3.44825792e+00  2.27989483e+00]
   [-1.16813726e+01  3.55064964e+00]
   [-6.76289463e+00 -1.25415869e+01]
   [ 1.86627662e+00  4.91928959e+00]
   [ 2.37216616e+00 -3.23613596e+00]]

  [[ 5.60678053e+00  6.07894707e+00]
   [-6.23391962e+00 -1.82450306e+00]
   [ 6.90571690e+00 -2.60890079e+00]
   [ 3.98191905e+00 -2.60109711e+00]
   [ 3.79411244e+00 -7.30271769e+00]
   [-8.19082737e+00 -4.81762362e+00]
   [ 1.01562176e+01 -9.18346643e-02]
   [ 5.15106916e-01 -1.88746595e+00]
   [-7.10526466e-01  4.75524092e+00]
   [ 4.28777647e+00 -4.28609967e-01]
   [ 2.94255161e+00 -2.76411390e+00]
   [-5.01211119e+00 -1.35121047e-01]
   [ 4.88255644e+00  7.48982000e+00]
   [-2.94339252e+00 -1.49728453e+00]
   [-5.28226614e-01  1.13798523e+01]
   [-3.26653433e+00 -1.12711830e+01]
   [ 6.92280245e+00  4.46824360e+00]
   [-1.07686818e-02  5.99187469e+00]
   [-2.30055904e+00 -2.35181737e+00]
   [-1.86744165e+00 -9.12775040e-01]
   [ 5.70386982e+00  2.56417489e+00]
   [-4.37073708e-01 -4.62391090e+00]
   [-8.43499756e+00  9.08772826e-01]
   [-5.64418888e+00 -5.02650261e+00]
   [ 3.92685270e+00 -5.31071186e+00]
   [ 6.36297584e-01  2.63665223e+00]
   [-7.71557522e+00  4.19800425e+00]
   [ 3.64932895e+00  2.46329069e+00]]]


 [[[ 6.65752888e-01  3.40558529e-01]
   [ 4.08683634e+00  6.27357101e+00]
   [-2.77316380e+00  5.83889532e+00]
   [-2.01864777e+01 -8.63007069e+00]
   [ 4.85101509e+00  1.56102419e-01]
   [ 5.13551521e+00  7.00347781e-01]
   [ 4.66926765e+00  1.13918304e+01]
   [ 1.17937775e+01 -5.75443983e+00]
   [ 5.18499660e+00  2.47753906e+01]
   [-4.94616604e+00  1.09324312e+00]
   [ 7.08940148e-01  5.36628440e-02]
   [-6.22777748e+00 -6.08889389e+00]
   [ 8.21062326e-01  5.73018026e+00]
   [-1.01816578e+01 -5.96292210e+00]
   [-3.45601702e+00 -5.80823088e+00]
   [-7.81425619e+00 -1.54714165e+01]
   [ 6.15157843e+00  4.41321850e+00]
   [ 2.28190422e-02 -1.40392697e+00]
   [ 5.86180115e+00  2.66614532e+00]
   [-1.21994901e+00  6.87365246e+00]
   [ 7.62740707e+00 -1.52388859e+00]
   [-8.03575134e+00 -1.35383148e+01]
   [-1.75186968e+00 -1.95710063e+00]
   [-8.72407794e-01 -8.31413174e+00]
   [-1.38678074e+01 -3.35018563e+00]
   [ 1.02961273e+01  5.95636034e+00]
   [ 7.15158939e+00  9.47603941e-01]
   [ 3.59655428e+00 -3.57616353e+00]]

  [[-7.15814590e+00 -1.73663855e-01]
   [-5.33630848e+00  2.23019302e-01]
   [-3.60880065e+00  1.16919529e+00]
   [ 1.65422618e+00  3.21728516e+00]
   [ 1.86843979e+00  1.13296022e+01]
   [-6.71664524e+00  8.06290245e+00]
   [-3.82262254e+00  4.57042742e+00]
   [-7.61132431e+00  7.53255653e+00]
   [ 1.63969231e+00 -1.19336343e+00]
   [ 2.03410006e+00  5.48414516e+00]
   [ 7.98875904e+00 -6.00354958e+00]
   [ 5.37972260e+00 -3.13939238e+00]
   [ 6.52196217e+00  5.99524212e+00]
   [-3.65084100e+00  5.70605898e+00]
   [ 5.66238022e+00 -4.25603628e-01]
   [ 1.31335664e+00  3.34762931e-01]
   [ 4.95460320e+00 -7.73174858e+00]
   [-6.06322289e-02  7.14966822e+00]
   [ 4.30868864e+00 -4.49330187e+00]
   [ 3.00062609e+00 -3.45171928e+00]
   [-8.88646841e-01  4.49364281e+00]
   [-1.37166762e+00 -9.60632420e+00]
   [ 2.72169065e+00 -2.02102685e+00]
   [ 4.06615162e+00  2.21987987e+00]
   [ 4.58932543e+00 -6.33985901e+00]
   [-7.59764194e+00 -8.69492054e-01]
   [ 6.72914386e-01  3.37907672e-02]
   [-9.57373238e+00  4.29612064e+00]]

  [[ 2.07057667e+00  2.49500203e+00]
   [-2.39765930e+00  6.45140171e-01]
   [ 9.70951462e+00  1.52998376e+00]
   [ 9.77593803e+00  8.06670094e+00]
   [ 8.35551929e+00  7.26291513e+00]
   [-9.06231880e-01 -9.31769133e-01]
   [-6.77584314e+00  3.27285552e+00]
   [ 5.13162661e+00 -5.17782736e+00]
   [-3.71608639e+00 -5.12819290e-01]
   [ 1.48577709e+01 -2.64512122e-01]
   [ 6.44747496e-01 -9.95941162e-02]
   [ 1.04961805e+01 -3.98670554e+00]
   [ 2.51394081e+00  1.80438447e+00]
   [-5.59201813e+00  1.18733444e+01]
   [-2.31048003e-01 -1.12039871e+01]
   [-1.62683907e+01  2.02177715e+00]
   [ 1.14540329e+01  2.30115056e-02]
   [-1.10159683e+01 -3.24261713e+00]
   [-1.33181334e+01 -8.00105953e+00]
   [ 6.21838617e+00  8.89258957e+00]
   [ 1.58339548e+00 -2.27107620e+00]
   [-4.17989254e-01 -2.85755348e+00]
   [-2.48508906e+00  9.64674568e+00]
   [-1.08764257e+01  2.08483315e+00]
   [ 9.77210236e+00  2.50418329e+00]
   [-1.62253022e+00  1.67334347e+01]
   [ 6.42501354e-01  2.53464675e+00]
   [-1.12935200e+01  3.39891338e+00]]]], shape=(4, 3, 28, 2), dtype=float32)
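The @ operator is shorthand for tf.matmul, as the equivalent call below shows. tf.matmul also broadcasts the batch dimensions, so only the trailing matrix dimensions of the two operands need to be compatible; a brief sketch under that assumption (the variable c is hypothetical):

a = tf.random.normal([4,3,28,32])
c = tf.random.normal([32,2])   # batch dimensions are broadcast against [4,3]
print((a @ c).shape)           # (4, 3, 28, 2)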
print(tf.matmul(a,b))
tf.Tensor(
[[[[ 2.93815994e+00  1.80159616e+00]
   [-4.95495558e+00 -2.65059781e+00]
   [ 6.36351776e+00  8.27180672e+00]
   [-2.50184441e+00 -3.22499895e+00]
   [-6.89737129e+00  6.43845701e+00]
   [ 4.72115231e+00 -7.39528358e-01]
   [-5.87444019e+00 -6.75603390e+00]
   [ 8.61762238e+00  4.49309635e+00]
   [ 3.18081021e+00 -1.84904563e+00]
   [-5.71209073e-01  1.76863492e+00]
   [-8.77548409e+00 -2.09427929e+00]
   [ 6.11399221e+00  3.75506377e+00]
   [-5.72681367e-01 -5.56786919e+00]
   [ 1.03334942e+01 -6.21349716e+00]
   [ 2.05781221e+00 -2.48031449e+00]
   [-1.05474174e-01  1.17951145e+01]
   [-8.32847595e+00  8.04420090e+00]
   [ 1.10347319e+01 -7.15183640e+00]
   [-1.10890408e+01  2.06656051e+00]
   [ 2.04201794e+00 -1.72195137e-01]
   [ 4.16003466e+00  2.92319274e+00]
   [ 1.00829735e+01  3.50188327e+00]
   [ 1.60061455e+01 -3.23914313e+00]
   [-1.34949207e+00 -2.27372718e+00]
   [ 1.16594486e+01 -1.30499089e+00]
   [ 2.90008926e+00  6.59213543e+00]
   [-3.04731274e+00 -1.17982030e-01]
   [-6.25353050e+00 -1.59929824e+00]]

  [[ 2.35922217e+00 -1.44711876e+00]
   [-2.82181549e+00 -4.16362000e+00]
   [-4.13206530e+00  1.96330786e-01]
   [-1.13723636e+00 -1.90036798e+00]
   [-1.42907238e+00  4.24102306e-01]
   [ 1.01430655e+01  2.54081345e+00]
   [-4.05478477e+00 -9.29689407e+00]
   [ 1.36705666e+01  1.40576875e+00]
   [-5.09379244e+00  4.66089153e+00]
   [ 6.25803471e-02 -2.86052656e+00]
   [-1.12946069e+00 -4.28373003e+00]
   [ 5.32545996e+00  1.97562897e+00]
   [ 7.16341162e+00  6.10791969e+00]
   [-4.77171421e+00 -1.75261497e+00]
   [-1.43213348e+01  5.44925928e+00]
   [-2.13357735e+00 -2.74817157e+00]
   [-6.38115454e+00  6.48117113e+00]
   [-1.21313601e+01 -1.16765201e+00]
   [-1.98863697e+00  1.22314978e+01]
   [ 3.63174462e+00  5.20076323e+00]
   [-1.05080090e+01  4.47047186e+00]
   [ 8.52560043e+00 -3.26042938e+00]
   [ 1.86961699e+00  1.04149675e+00]
   [ 3.27967310e+00  4.52322531e+00]
   [-1.08596125e+01  4.40047550e+00]
   [ 3.30025196e+00 -3.57261777e-01]
   [-4.17899323e+00 -5.29293346e+00]
   [-6.29359818e+00  2.55025506e-01]]

  [[ 2.79792500e+00 -1.14968262e+01]
   [ 6.42120302e-01  8.60604167e-01]
   [ 8.26789284e+00  9.11268139e+00]
   [-1.24864876e+00 -1.29506755e+00]
   [ 1.83019781e+00  1.32512970e+01]
   [-4.38226223e+00  2.93613434e+00]
   [-1.01948481e+01 -2.50259852e+00]
   [-6.08818817e+00  5.71516156e-01]
   [ 8.14604282e-01  8.74936581e-01]
   [-9.27050591e-01  1.68381357e+00]
   [-2.39078522e+00 -4.39953446e-01]
   [-1.79722738e+00  2.44799304e+00]
   [-6.19097829e-01  4.20792866e+00]
   [-3.59187007e+00  2.05337834e+00]
   [-2.02478099e+00  3.92844319e+00]
   [-2.78609324e+00 -1.00785866e+01]
   [ 1.35041237e-01  9.82832527e+00]
   [ 3.77985573e+00 -2.92683578e+00]
   [ 2.50951290e+00 -1.10158062e+00]
   [-2.69217896e+00  7.27837420e+00]
   [ 4.59399509e+00 -4.81438732e+00]
   [ 9.01638508e+00  5.12754726e+00]
   [ 5.19506645e+00 -2.35464978e+00]
   [ 2.05791235e-01  3.24537897e+00]
   [-6.17561936e-02  1.22012386e+01]
   [ 8.34735334e-01  2.56306553e+00]
   [-8.42908740e-01 -4.72223663e+00]
   [ 7.59096265e-01 -8.70975971e-01]]]


 [[[-1.25730896e+01  1.73365784e+00]
   [-8.16483736e-01 -3.12521791e+00]
   [ 4.31258678e+00 -3.65629935e+00]
   [ 7.81925964e+00  2.67266393e+00]
   [-1.45902622e+00 -1.69710827e+00]
   [-4.97250271e+00  1.06699669e+00]
   [-1.15320644e+01  3.67050219e+00]
   [ 9.82042491e-01 -8.96060181e+00]
   [ 4.24584293e+00 -2.03312969e+00]
   [-8.90874267e-02 -2.19113445e+00]
   [ 7.03373575e+00  4.82089567e+00]
   [ 2.19787431e+00 -1.35815620e+00]
   [-4.33743429e+00 -2.77082419e+00]
   [-6.55539846e+00 -5.28619862e+00]
   [-4.10456562e+00  1.53431883e+01]
   [-6.97701550e+00  5.58186054e-01]
   [-4.06244993e+00 -1.29598303e+01]
   [ 1.80246496e+00 -2.77987790e+00]
   [-7.30259180e+00 -5.11505365e+00]
   [ 2.12593174e+00 -3.25598717e+00]
   [ 7.80677795e+00  1.99891090e-01]
   [ 8.46464539e+00  2.72348213e+00]
   [-2.32167172e+00  4.69824505e+00]
   [ 3.97749400e+00 -9.19138908e+00]
   [ 2.90814090e+00  1.26416731e+00]
   [-8.01068783e-01  5.13629675e+00]
   [-1.10610142e+01  4.88826132e+00]
   [-1.75804818e+00  1.23052418e+00]]

  [[-5.49224281e+00 -1.18972950e+01]
   [-1.71067417e+00 -7.94510078e+00]
   [ 8.11289787e+00  2.97325039e+00]
   [-6.18529272e+00 -1.07129316e+01]
   [ 5.09897184e+00 -5.09968042e+00]
   [-7.61364222e+00  5.40171003e+00]
   [-6.47705889e+00 -1.68601739e+00]
   [ 5.09246063e+00 -4.75051785e+00]
   [-8.07900906e+00 -1.01351190e+00]
   [-5.56266832e+00 -5.98014545e+00]
   [-1.84528065e+00 -2.55948591e+00]
   [ 1.63680077e-01 -4.52482510e+00]
   [-6.17208385e+00  8.61810780e+00]
   [ 2.54550838e+00  1.04630842e+01]
   [ 6.78235245e+00  2.36592340e+00]
   [ 5.33071947e+00  7.77984023e-01]
   [ 1.59947419e+00  3.40054178e+00]
   [ 9.72853303e-01  1.95867562e+00]
   [-5.11668110e+00  9.07939816e+00]
   [-9.91412735e+00  5.33779049e+00]
   [ 6.93627453e+00  9.98051357e+00]
   [-1.49268985e+00 -1.61656654e+00]
   [ 6.77412367e+00  5.89341736e+00]
   [-9.02515793e+00 -4.86350346e+00]
   [-2.65398359e+00  6.53871584e+00]
   [ 7.60008812e+00  2.23823214e+00]
   [ 3.48978114e+00  7.77210045e+00]
   [-1.89859509e+00  1.36791458e+01]]

  [[ 6.16735172e+00 -3.06368208e+00]
   [-6.22440147e+00  2.49987888e+00]
   [ 1.23477805e+00 -9.58612680e-01]
   [-6.32274055e+00 -3.50495398e-01]
   [-2.37460756e+00  2.89988756e+00]
   [ 6.76133537e+00 -4.97397709e+00]
   [ 4.19915617e-01 -1.44209051e+00]
   [-8.48055720e-01 -1.34412467e+00]
   [ 1.47503817e+00 -4.57327032e+00]
   [-8.83791351e+00  1.30507603e+01]
   [ 5.04546928e+00  2.96709967e+00]
   [ 1.21958218e+01 -3.70147228e-01]
   [-9.32716131e-02  8.25912094e+00]
   [-6.88422298e+00 -2.94276452e+00]
   [-8.92567754e-01  7.48677373e-01]
   [ 1.13859291e+01 -4.54786253e+00]
   [ 4.14542294e+00  3.62407827e+00]
   [-4.84375191e+00  7.60643148e+00]
   [-3.99736214e+00  8.38322639e-01]
   [ 8.50940418e+00 -3.18443274e+00]
   [ 3.02693796e+00 -1.08982430e+01]
   [ 5.94771481e+00  1.90185285e+00]
   [ 1.79021060e-02  2.71560931e+00]
   [ 1.28265166e+00  1.35003614e+00]
   [-5.15541887e+00  3.48176098e+00]
   [ 1.35739307e+01  1.13081062e+00]
   [-1.13344326e+01  2.02814102e+00]
   [-3.78427625e+00 -3.41797924e+00]]]


[[[ 7.51625490e+00 -8.01126385e+00]
[ 8.94779682e-01 -1.39191675e+00]
[-6.38634396e+00 -7.54839706e+00]
[ 5.90809202e+00 -4.73860931e+00]
[ 2.89039660e+00 -5.74475765e-01]
[-3.24619865e+00 -4.33766127e+00]
[ 7.45817757e+00 8.72869968e+00]
[ 1.70315895e+01 -1.43109741e+01]
[ 6.71465302e+00 2.36530209e+00]
[ 2.37234616e+00 9.54824162e+00]
[ 1.24315548e+01 -8.32226753e-01]
[ 1.15248704e+00 6.42775774e+00]
[-2.05694604e+00 -5.29237223e+00]
[ 1.06061993e+01 4.43905163e+00]
[ 3.32259560e+00 2.86404061e+00]
[-1.26070702e+00 -3.86716032e+00]
[-7.17960167e+00 -5.47068119e+00]
[ 6.13063002e+00 -1.27777729e+01]
[-4.77525711e+00 -2.89896202e+00]
[ 5.04258776e+00 1.05476036e+01]
[-4.72102404e+00 4.53035545e+00]
[ 8.30504322e+00 -6.72617435e+00]
[ 1.51879632e+00 -7.57512569e+00]
[ 5.25161076e+00 -6.00039482e+00]
[-2.66712689e+00 -3.15567350e+00]
[ 6.98167515e+00 1.21508999e+01]
[-3.58145714e+00 1.01358452e+01]
[-7.68432474e+00 6.27517796e+00]]

  [[-3.85787821e+00  4.13319540e+00]
   [ 4.08870316e+00  8.98323441e+00]
   [ 2.88623333e-01  2.08936238e+00]
   [-1.27303381e+01 -3.72204494e+00]
   [-1.63620949e-01  4.14640725e-01]
   [-9.77903843e+00 -9.83979321e+00]
   [ 1.20987940e+01 -1.00569272e+00]
   [-9.52555597e-01 -7.21974373e-01]
   [-7.53700542e+00 -1.03328714e+01]
   [ 3.82278275e+00  7.92873979e-01]
   [ 1.62820339e+01  9.25348282e-01]
   [-2.99300981e+00 -4.05044317e+00]
   [-1.71284425e+00  3.32610369e-01]
   [ 8.45957184e+00  3.01560092e+00]
   [ 9.61618781e-01  4.84845543e+00]
   [-5.04883003e+00 -4.64504576e+00]
   [-4.88548994e-01  4.22385454e+00]
   [-3.06538558e+00 -2.68467999e+00]
   [ 1.44536438e+01 -1.67332339e+00]
   [-1.20380235e+00 -3.01969767e-01]
   [-1.25808067e+01 -1.83691287e+00]
   [ 7.00172246e-01 -5.44080067e+00]
   [ 3.71728969e+00 -6.50164127e+00]
   [ 3.44825792e+00  2.27989483e+00]
   [-1.16813726e+01  3.55064964e+00]
   [-6.76289463e+00 -1.25415869e+01]
   [ 1.86627662e+00  4.91928959e+00]
   [ 2.37216616e+00 -3.23613596e+00]]

  [[ 5.60678053e+00  6.07894707e+00]
   [-6.23391962e+00 -1.82450306e+00]
   [ 6.90571690e+00 -2.60890079e+00]
   [ 3.98191905e+00 -2.60109711e+00]
   [ 3.79411244e+00 -7.30271769e+00]
   [-8.19082737e+00 -4.81762362e+00]
   [ 1.01562176e+01 -9.18346643e-02]
   [ 5.15106916e-01 -1.88746595e+00]
   [-7.10526466e-01  4.75524092e+00]
   [ 4.28777647e+00 -4.28609967e-01]
   [ 2.94255161e+00 -2.76411390e+00]
   [-5.01211119e+00 -1.35121047e-01]
   [ 4.88255644e+00  7.48982000e+00]
   [-2.94339252e+00 -1.49728453e+00]
   [-5.28226614e-01  1.13798523e+01]
   [-3.26653433e+00 -1.12711830e+01]
   [ 6.92280245e+00  4.46824360e+00]
   [-1.07686818e-02  5.99187469e+00]
   [-2.30055904e+00 -2.35181737e+00]
   [-1.86744165e+00 -9.12775040e-01]
   [ 5.70386982e+00  2.56417489e+00]
   [-4.37073708e-01 -4.62391090e+00]
   [-8.43499756e+00  9.08772826e-01]
   [-5.64418888e+00 -5.02650261e+00]
   [ 3.92685270e+00 -5.31071186e+00]
   [ 6.36297584e-01  2.63665223e+00]
   [-7.71557522e+00  4.19800425e+00]
   [ 3.64932895e+00  2.46329069e+00]]]


[[[ 6.65752888e-01 3.40558529e-01]
[ 4.08683634e+00 6.27357101e+00]
[-2.77316380e+00 5.83889532e+00]
[-2.01864777e+01 -8.63007069e+00]
[ 4.85101509e+00 1.56102419e-01]
[ 5.13551521e+00 7.00347781e-01]
[ 4.66926765e+00 1.13918304e+01]
[ 1.17937775e+01 -5.75443983e+00]
[ 5.18499660e+00 2.47753906e+01]
[-4.94616604e+00 1.09324312e+00]
[ 7.08940148e-01 5.36628440e-02]
[-6.22777748e+00 -6.08889389e+00]
[ 8.21062326e-01 5.73018026e+00]
[-1.01816578e+01 -5.96292210e+00]
[-3.45601702e+00 -5.80823088e+00]
[-7.81425619e+00 -1.54714165e+01]
[ 6.15157843e+00 4.41321850e+00]
[ 2.28190422e-02 -1.40392697e+00]
[ 5.86180115e+00 2.66614532e+00]
[-1.21994901e+00 6.87365246e+00]
[ 7.62740707e+00 -1.52388859e+00]
[-8.03575134e+00 -1.35383148e+01]
[-1.75186968e+00 -1.95710063e+00]
[-8.72407794e-01 -8.31413174e+00]
[-1.38678074e+01 -3.35018563e+00]
[ 1.02961273e+01 5.95636034e+00]
[ 7.15158939e+00 9.47603941e-01]
[ 3.59655428e+00 -3.57616353e+00]]

  [[-7.15814590e+00 -1.73663855e-01]
   [-5.33630848e+00  2.23019302e-01]
   [-3.60880065e+00  1.16919529e+00]
   [ 1.65422618e+00  3.21728516e+00]
   [ 1.86843979e+00  1.13296022e+01]
   [-6.71664524e+00  8.06290245e+00]
   [-3.82262254e+00  4.57042742e+00]
   [-7.61132431e+00  7.53255653e+00]
   [ 1.63969231e+00 -1.19336343e+00]
   [ 2.03410006e+00  5.48414516e+00]
   [ 7.98875904e+00 -6.00354958e+00]
   [ 5.37972260e+00 -3.13939238e+00]
   [ 6.52196217e+00  5.99524212e+00]
   [-3.65084100e+00  5.70605898e+00]
   [ 5.66238022e+00 -4.25603628e-01]
   [ 1.31335664e+00  3.34762931e-01]
   [ 4.95460320e+00 -7.73174858e+00]
   [-6.06322289e-02  7.14966822e+00]
   [ 4.30868864e+00 -4.49330187e+00]
   [ 3.00062609e+00 -3.45171928e+00]
   [-8.88646841e-01  4.49364281e+00]
   [-1.37166762e+00 -9.60632420e+00]
   [ 2.72169065e+00 -2.02102685e+00]
   [ 4.06615162e+00  2.21987987e+00]
   [ 4.58932543e+00 -6.33985901e+00]
   [-7.59764194e+00 -8.69492054e-01]
   [ 6.72914386e-01  3.37907672e-02]
   [-9.57373238e+00  4.29612064e+00]]

  [[ 2.07057667e+00  2.49500203e+00]
   [-2.39765930e+00  6.45140171e-01]
   [ 9.70951462e+00  1.52998376e+00]
   [ 9.77593803e+00  8.06670094e+00]
   [ 8.35551929e+00  7.26291513e+00]
   [-9.06231880e-01 -9.31769133e-01]
   [-6.77584314e+00  3.27285552e+00]
   [ 5.13162661e+00 -5.17782736e+00]
   [-3.71608639e+00 -5.12819290e-01]
   [ 1.48577709e+01 -2.64512122e-01]
   [ 6.44747496e-01 -9.95941162e-02]
   [ 1.04961805e+01 -3.98670554e+00]
   [ 2.51394081e+00  1.80438447e+00]
   [-5.59201813e+00  1.18733444e+01]
   [-2.31048003e-01 -1.12039871e+01]
   [-1.62683907e+01  2.02177715e+00]
   [ 1.14540329e+01  2.30115056e-02]
   [-1.10159683e+01 -3.24261713e+00]
   [-1.33181334e+01 -8.00105953e+00]
   [ 6.21838617e+00  8.89258957e+00]
   [ 1.58339548e+00 -2.27107620e+00]
   [-4.17989254e-01 -2.85755348e+00]
   [-2.48508906e+00  9.64674568e+00]
   [-1.08764257e+01  2.08483315e+00]
   [ 9.77210236e+00  2.50418329e+00]
   [-1.62253022e+00  1.67334347e+01]
   [ 6.42501354e-01  2.53464675e+00]
   [-1.12935200e+01  3.39891338e+00]]]], shape=(4, 3, 28, 2), dtype=float32)
a = tf.random.normal([4,28,32])
b = tf.random.normal([32,16])
# b is broadcast automatically, then the batch matrix multiplication is performed
tf.matmul(a,b)
<tf.Tensor: id=503, shape=(4, 28, 16), dtype=float32, numpy=
array([[[ -2.7598646 ,   7.0569715 ,  -2.0019226 , ...,  -1.2552259 ,
          -5.3215303 ,  -1.2467324 ],
        [ -3.5755327 ,  -3.8002384 , -12.492091  , ...,   6.249779  ,
          -3.7607257 ,  -2.4373896 ],
        [  1.3191882 ,  -4.4746413 ,   2.4289536 , ...,  -1.8787553 ,
          -0.10033526,  -1.4797553 ],
        ...,
        [ -2.0344322 ,   5.086643  ,  -7.6664243 , ...,  -3.0846074 ,
          -8.448284  ,  -1.7322202 ],
        [  0.57995903,  -3.7676647 ,  -3.6173913 , ...,   7.0568666 ,
           2.3793366 ,   3.498049  ],
        [  0.5311534 ,   3.0278618 ,   3.9090858 , ...,   4.8734083 ,
          -6.3130484 ,  -3.7237642 ]],

       [[  4.89592   ,  -2.8002422 ,  -0.7761507 , ...,   9.516954  ,
          11.723758  ,  -2.5442433 ],
        [ -0.47682646, -13.358232  , -11.200428  , ...,  -0.46236166,
          -2.9554985 ,  -1.4849511 ],
        [ 11.57021   ,  -8.973025  ,   2.9124722 , ...,  -5.2608457 ,
           1.2784045 ,  -5.1128254 ],
        ...,
        [  2.318095  ,   2.843607  ,   4.602457  , ...,   5.6242056 ,
           6.1018414 ,  -5.076501  ],
        [ -4.3738413 ,  -5.2155914 ,   8.190216  , ...,  -3.7748199 ,
          -4.86178   ,   2.7263112 ],
        [ -1.4741284 ,   0.5153975 ,  -2.7228315 , ...,  -0.1337083 ,
          -8.092061  ,  -3.1821835 ]],

       [[  0.28089905,   9.784105  ,   2.9840403 , ...,   0.33226973,
          -0.6826554 ,  -4.040504  ],
        [ -5.8831253 ,  12.158736  ,  -7.0445533 , ...,   2.2380865 ,
          -8.451615  ,   3.1144416 ],
        [-12.613248  , -10.317265  ,  -5.9143896 , ...,  -2.8576682 ,
         -10.0681925 ,   6.5913053 ],
        ...,
        [  5.002187  ,   0.69802207,   4.616313  , ...,   1.8524637 ,
           1.6469531 ,   1.4813223 ],
        [ -1.0592954 ,  -1.9839575 ,   6.1675334 , ...,  -3.4000485 ,
           9.097794  ,   3.3264492 ],
        [ -2.3593228 ,  -6.8569756 , -14.06582   , ...,  -9.968381  ,
           9.856624  ,   5.7211127 ]],

       [[  0.79352164, -12.076045  ,   4.5146046 , ...,  -0.5590708 ,
          -0.44884235,   4.5407653 ],
        [  3.3152225 ,  -1.2491262 ,   8.590666  , ...,   0.24038552,
          12.144938  ,  11.659479  ],
        [ -0.8445607 ,   6.594575  ,  -2.8742118 , ...,  -2.4811752 ,
           3.3992496 ,   4.638756  ],
        ...,
        [ -8.9465065 ,  -6.0752807 ,  -4.039912  , ...,  -0.8335391 ,
          -3.9777448 ,   3.659201  ],
        [  4.1183553 ,   3.9281585 ,  -4.132287  , ...,  -7.197991  ,
           5.790247  ,  -8.167656  ],
        [ -4.423063  , -20.040358  ,  -9.854829  , ...,  -3.0150864 ,
          -5.763957  ,   4.594075  ]]], dtype=float32)>
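
Here tf.matmul broadcasts b from shape [32,16] to [4,32,16] before the batch matrix product. The following sketch (illustrative, not part of the original notes) makes that broadcast explicit and verifies both paths give the same result:

a = tf.random.normal([4,28,32])
b = tf.random.normal([32,16])
# Explicitly expand b into a batch of identical [32,16] matrices
b_tiled = tf.broadcast_to(b, [4,32,16])
out_auto = tf.matmul(a, b)            # implicit broadcasting
out_explicit = tf.matmul(a, b_tiled)  # explicit broadcasting
# Maximum absolute difference should be 0 (identical results)
print(tf.reduce_max(tf.abs(out_auto - out_explicit)).numpy())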

Forward Propagation in Practice

import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow.keras.datasets as datasets

plt.rcParams['font.size'] = 16
plt.rcParams['font.family'] = ['STKaiti']    # Chinese font (available on macOS)
plt.rcParams['axes.unicode_minus'] = False   # render minus signs correctly with a CJK font
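
STKaiti is a macOS font; on systems where it is missing, matplotlib falls back to DejaVu Sans and the Chinese legend text later produces the missing-glyph warnings shown in the output below. A minimal fallback sketch, assuming the candidate font names listed here (they are not part of the original notes):

from matplotlib import font_manager
# Pick the first Chinese-capable font that is actually installed
candidates = ['STKaiti', 'SimHei', 'Microsoft YaHei', 'Noto Sans CJK SC', 'Arial Unicode MS']
installed = {f.name for f in font_manager.fontManager.ttflist}
available = [name for name in candidates if name in installed]
if available:
    plt.rcParams['font.family'] = [available[0]]
else:
    print('No CJK font found; Chinese labels will render as boxes')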
def load_data():
    # Load the MNIST dataset
    (x, y), (x_val, y_val) = datasets.mnist.load_data()
    # Convert to float tensors and scale to 0~1
    x = tf.convert_to_tensor(x, dtype=tf.float32) / 255.
    # Convert labels to integer tensors
    y = tf.convert_to_tensor(y, dtype=tf.int32)
    # One-hot encoding
    y = tf.one_hot(y, depth=10)

    # Reshape the view, [b, 28, 28] => [b, 28*28]
    x = tf.reshape(x, (-1, 28 * 28))

    # Build the Dataset object
    train_dataset = tf.data.Dataset.from_tensor_slices((x, y))
    # Train in batches of 200
    train_dataset = train_dataset.batch(200)
    return train_dataset
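
As a quick sanity check (an illustrative snippet, not in the original notes), pull one batch from the returned dataset and confirm the shapes match the flattened images and one-hot labels:

train_dataset = load_data()
x_batch, y_batch = next(iter(train_dataset))
# Expect (200, 784) float32 images and (200, 10) float32 one-hot labels
print(x_batch.shape, x_batch.dtype)
print(y_batch.shape, y_batch.dtype)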
def init_parameters():
    # Every layer's tensors need to be optimized, so use tf.Variable,
    # initialized from a truncated normal distribution; bias vectors start at 0
    # Parameters of the first layer
    w1 = tf.Variable(tf.random.truncated_normal([784, 256], stddev=0.1))
    b1 = tf.Variable(tf.zeros([256]))
    # Parameters of the second layer
    w2 = tf.Variable(tf.random.truncated_normal([256, 128], stddev=0.1))
    b2 = tf.Variable(tf.zeros([128]))
    # Parameters of the third (output) layer
    w3 = tf.Variable(tf.random.truncated_normal([128, 10], stddev=0.1))
    b3 = tf.Variable(tf.zeros([10]))
    return w1, b1, w2, b2, w3, b3
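
A quick check of the parameter shapes (illustrative, not from the original notes); the layers should chain 784 → 256 → 128 → 10:

w1, b1, w2, b2, w3, b3 = init_parameters()
for name, p in [('w1', w1), ('b1', b1), ('w2', w2), ('b2', b2), ('w3', w3), ('b3', b3)]:
    # Each parameter is a tf.Variable; .shape reports its dimensions
    print(name, p.shape)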
def train_epoch(epoch, train_dataset, w1, b1, w2, b2, w3, b3, lr=0.001):
    for step, (x, y) in enumerate(train_dataset):
        with tf.GradientTape() as tape:
            # First layer: [b, 784]@[784, 256] + [256] => [b, 256] + [256] => [b, 256] + [b, 256]
            h1 = x @ w1 + tf.broadcast_to(b1, (x.shape[0], 256))
            h1 = tf.nn.relu(h1)  # activation function

            # Second layer: [b, 256] => [b, 128]
            h2 = h1 @ w2 + b2
            h2 = tf.nn.relu(h2)
            # Output layer: [b, 128] => [b, 10]
            out = h2 @ w3 + b3

            # Mean squared error between the network output and the label,
            # mse = mean(sum((y-out)^2)), shape [b, 10]
            loss = tf.square(y - out)
            # Reduce to a scalar error
            loss = tf.reduce_mean(loss)

        # Automatic differentiation w.r.t. [w1, b1, w2, b2, w3, b3]
        grads = tape.gradient(loss, [w1, b1, w2, b2, w3, b3])

        # Gradient update; assign_sub subtracts in place
        w1.assign_sub(lr * grads[0])
        b1.assign_sub(lr * grads[1])
        w2.assign_sub(lr * grads[2])
        b2.assign_sub(lr * grads[3])
        w3.assign_sub(lr * grads[4])
        b3.assign_sub(lr * grads[5])

    # Return the loss of the last batch as this epoch's loss
    return loss.numpy()
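
The six assign_sub calls implement plain gradient descent by hand. As an alternative (a hedged sketch, not part of the original notes), the same update can be delegated to a Keras SGD optimizer:

def train_epoch_with_optimizer(train_dataset, w1, b1, w2, b2, w3, b3, lr=0.001):
    # Same forward pass as train_epoch, but the update step uses apply_gradients
    optimizer = tf.optimizers.SGD(learning_rate=lr)
    variables = [w1, b1, w2, b2, w3, b3]
    for step, (x, y) in enumerate(train_dataset):
        with tf.GradientTape() as tape:
            h1 = tf.nn.relu(x @ w1 + b1)
            h2 = tf.nn.relu(h1 @ w2 + b2)
            out = h2 @ w3 + b3
            loss = tf.reduce_mean(tf.square(y - out))
        grads = tape.gradient(loss, variables)
        # Equivalent to the manual updates: v <- v - lr * grad
        optimizer.apply_gradients(zip(grads, variables))
    return loss.numpy()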
def train(epochs):
    losses = []
    train_dataset = load_data()
    w1, b1, w2, b2, w3, b3 = init_parameters()
    for epoch in range(epochs):
        loss = train_epoch(epoch, train_dataset, w1, b1, w2, b2, w3, b3, lr=0.001)
        print('epoch:', epoch, 'loss:', loss)
        losses.append(loss)

    x = [i for i in range(0, epochs)]
    # Plot the training curve
    plt.plot(x, losses, color='blue', marker='s', label='训练')  # legend label: "train"
    plt.xlabel('Epoch')
    plt.ylabel('MSE')
    plt.legend()
    plt.show()
train(epochs=20)
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz
11493376/11490434 [==============================] - 2s 0us/step
epoch: 0 loss: 0.16654462
epoch: 1 loss: 0.14800379
epoch: 2 loss: 0.13541555
epoch: 3 loss: 0.12577298
epoch: 4 loss: 0.11817748
epoch: 5 loss: 0.11203371
epoch: 6 loss: 0.1069127
epoch: 7 loss: 0.10258315
epoch: 8 loss: 0.09884895
epoch: 9 loss: 0.095569395
epoch: 10 loss: 0.092678
epoch: 11 loss: 0.09010928
epoch: 12 loss: 0.0878074
epoch: 13 loss: 0.08572935
epoch: 14 loss: 0.08384038
epoch: 15 loss: 0.0821046
epoch: 16 loss: 0.08050328
epoch: 17 loss: 0.079019025
epoch: 18 loss: 0.07763501


findfont: Font family ['STKaiti'] not found. Falling back to DejaVu Sans.


epoch: 19 loss: 0.07634819


/Users/maqi/opt/anaconda3/envs/tf2/lib/python3.7/site-packages/matplotlib/backends/backend_agg.py:240: RuntimeWarning: Glyph 35757 missing from current font.
  font.set_text(s, 0.0, flags=flags)
/Users/maqi/opt/anaconda3/envs/tf2/lib/python3.7/site-packages/matplotlib/backends/backend_agg.py:240: RuntimeWarning: Glyph 32451 missing from current font.
  font.set_text(s, 0.0, flags=flags)
/Users/maqi/opt/anaconda3/envs/tf2/lib/python3.7/site-packages/matplotlib/backends/backend_agg.py:203: RuntimeWarning: Glyph 35757 missing from current font.
  font.set_text(s, 0, flags=flags)
/Users/maqi/opt/anaconda3/envs/tf2/lib/python3.7/site-packages/matplotlib/backends/backend_agg.py:203: RuntimeWarning: Glyph 32451 missing from current font.
  font.set_text(s, 0, flags=flags)

[Figure: training MSE loss curve, decreasing steadily over the 20 epochs]