
I want to make neural network training reproducible using RStudio's Keras interface. Setting the seed in the R script (set.seed(42)) does not seem to work. Is it possible to pass a seed as an argument to layer_dense()? I can choose RandomUniform as the initializer, but I am having trouble passing it a seed argument. The following line throws an error:

model %>% layer_dense(units = 12, activation = 'relu', input_shape = c(8), kernel_initializer = "RandomUniform(seed=1)")

However, the layer can be added without error if I do not try to pass a seed argument:

model %>% layer_dense(units = 12, activation = 'relu', input_shape = c(8), kernel_initializer = "RandomUniform")

RandomUniform is supposed to accept a seed argument, according to the Keras initializer documentation.
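
From those docs I would guess the seed has to be passed to an initializer function rather than written inside the string, roughly like the line below, but I have not been able to confirm whether this is the correct R syntax:

# Untested guess: pass the seed to an initializer function instead of a string
model %>% layer_dense(units = 12, activation = 'relu', input_shape = c(8), kernel_initializer = initializer_random_uniform(seed = 1))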


I use Keras in Python, and it seems that when I do set.seed(42) and import tensorflow, tensorflow.set_seed(42) works. Could you explicitly import tensorflow in R and try that? Also, it only works when using the CPU, not the GPU. –
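
In R, the equivalent of that suggestion would presumably look something like the sketch below, using tf$set_random_seed() from the tensorflow package's TF 1.x API (the same call used in the answer below); treat it as an untested sketch of the commenter's idea, not a confirmed fix:

# Sketch of the comment's suggestion in R (TF 1.x API assumed):
# set the R seed, load tensorflow explicitly, and set the graph-level TF seed
set.seed(42)
library(tensorflow)
tf$set_random_seed(42)
library(keras)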


I think I should try the R tensorflow package instead of the R keras package, since Keras is integrated into TensorFlow 1.2. –

Answer


The syntax for the kernel initializer argument should look like this: kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104)

Try these steps:

1) Set the seed for the R environment before importing keras/tensorflow.

2) Configure the TensorFlow session to use a single thread.

3) Set the TensorFlow random seed.

4) Create the TensorFlow session with this configuration and seed, and assign it to the Keras backend.

5) Finally, in your model layers, if you are using a random initializer such as random_uniform (the default) or random_normal, set its seed argument to some integer. A full example is below:

# Set R random seed 
set.seed(104) 
library(keras) 
library(tensorflow) 

# TensorFlow session configuration that uses only a single thread. Multiple threads are a 
# potential source of non-reproducible results, see: https://stackoverflow.com/questions/42022950/which-seeds-have-to-be-set-where-to-realize-100-reproducibility-of-training-res 
session_conf <- tf$ConfigProto(intra_op_parallelism_threads = 1L, 
                               inter_op_parallelism_threads = 1L) 

# Set TF random seed (see: https://www.tensorflow.org/api_docs/python/tf/set_random_seed) 
tf$set_random_seed(104) 

# Create the session using the custom configuration 
sess <- tf$Session(graph = tf$get_default_graph(), config = session_conf) 

# Instruct Keras to use this session 
K <- backend() 
K$set_session(sess) 


# Then, in your model architecture, set the seed for all random initializers. 

model <- keras_model_sequential() 
model %>% 
  layer_dense(units = n_neurons, activation = 'relu', input_shape = c(100), 
              kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104)) %>% 
  layer_dense(units = n_neurons, activation = 'relu', 
              kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104)) %>% 
  layer_dense(units = 100, 
              kernel_initializer = initializer_random_uniform(minval = -0.05, maxval = 0.05, seed = 104))
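
To check that this setup actually gives reproducible training, one simple sanity check (not part of the original answer) is to compile and fit the model, note the final loss, restart R, rerun the whole script, and confirm the number matches. A minimal sketch, assuming n_neurons was set to some integer (e.g. 64) before building the model above, and using made-up random data shaped to match the 100-dimensional input and output:

# Made-up data matching input_shape = c(100) and the 100-unit output layer
x_train <- matrix(runif(1000 * 100), nrow = 1000, ncol = 100)
y_train <- matrix(runif(1000 * 100), nrow = 1000, ncol = 100)

model %>% compile(loss = 'mse', optimizer = 'adam')
history <- model %>% fit(x_train, y_train, epochs = 5, batch_size = 32, verbose = 0)

# With the seeds and single-threaded session above, rerunning the full script
# should reproduce this final loss exactly (on CPU)
tail(history$metrics$loss, 1)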

References: 
https://rstudio.github.io/keras/articles/faq.html#how-can-i-obtain-reproducible-results-using-keras-during-development 
https://rstudio.github.io/keras/reference/initializer_random_normal.html#arguments