Troubleshooting "Could not find valid device for node. Node:{{node Elu}} All kernels registered for op Elu"

When calling the tf.keras.activations.elu activation function, the following error appears:

Could not find valid device for node. Node:{{node Elu}} All kernels registered for op Elu
The cause is that the input x passed to the activation function is an integer. Converting the input to a floating-point type resolves the error.
Using numpy, cast the activation function's input to float64 and it runs normally. The same applies to other activation functions such as relu, softmax, tanh, selu, and sigmoid: always make sure the input is a float.
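A minimal sketch of the fix, assuming TensorFlow 2.x with eager execution; the array values are illustrative:

```python
import numpy as np
import tensorflow as tf

x_int = np.array([-2, -1, 0, 1, 2])  # integer dtype: no Elu kernel is registered for it
# tf.keras.activations.elu(x_int)    # raises: Could not find valid device for node Elu

# Cast the input to float64 with numpy (float32 also works), then apply elu normally.
x_float = x_int.astype(np.float64)
y = tf.keras.activations.elu(x_float)
```

The same `astype` cast fixes the equivalent error for relu, softmax, tanh, selu, and sigmoid.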