Newbie help request

import tensorflow as tf

logits = tf.constant([[1.0, 2.0, 3.0], [1.0, 2.0, 3.0], [1.0, 2.0, 3.0]])
y = tf.nn.softmax(logits)
y_ = tf.constant([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
# Manual cross-entropy; the clip keeps log() away from zero
cross_entropy = -tf.reduce_sum(y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0)))
# Built-in version, applied directly to the logits
cross_entropy2 = tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
with tf.Session() as sess:
    softmax = sess.run(y)
    ce = sess.run(cross_entropy)
    ce2 = sess.run(cross_entropy2)
    print("cross_entropy result", ce)
    print("softmax_cross_entropy_with_logits result=", ce2)
    print("softmax result=", softmax)

Why doesn't the last print statement produce any output? This is what I get:
WARNING: Logging before flag parsing goes to stderr.
W0524 17:57:02.851681  3600 deprecation.py:323] From D:/untitled/demp.py:156: softmax_cross_entropy_with_logits (from tensorflow.python.ops.nn_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Future major versions of TensorFlow will allow gradients to flow
into the labels input on backprop by default.
See `tf.nn.softmax_cross_entropy_with_logits_v2`.
2019-05-24 17:57:02.879936: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2
cross_entropy result 5.2228174
softmax_cross_entropy_with_logits result= 5.2228174


Reply

祝家鑫 (TF荚荚), posted 2019-7-12 18:15:34:
cross_entropy result 5.2228174
softmax_cross_entropy_with_logits result= 5.2228174
softmax result= [[0.09003057 0.24472848 0.66524094]
[0.09003057 0.24472848 0.66524094]
[0.09003057 0.24472848 0.66524094]]
I ran the same code, and this is the result I get.
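For reference, the numbers above can be checked without TensorFlow. This is a minimal NumPy sketch of the same computation; the logits and labels are taken from the thread, everything else (variable names, the max-subtraction trick) is just one way to write it:

import numpy as np

logits = np.array([[1.0, 2.0, 3.0]] * 3)
y_ = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])

# Row-wise softmax; subtracting the row max keeps exp() numerically stable
e = np.exp(logits - logits.max(axis=1, keepdims=True))
softmax = e / e.sum(axis=1, keepdims=True)

# Cross-entropy summed over all rows, with the same clip as the original code
cross_entropy = -np.sum(y_ * np.log(np.clip(softmax, 1e-10, 1.0)))

print(softmax)        # each row ≈ [0.09003057 0.24472848 0.66524094]
print(cross_entropy)  # ≈ 5.2228174

This matches both the manual clip-and-log version and tf.nn.softmax_cross_entropy_with_logits, since with these labels neither clipping nor the log-sum-exp shift changes the result.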