- R Deep Learning Cookbook
- Dr. PKS Prakash, Achyutuni Sri Krishna Rao
How to do it...
This section covers how to visualize TensorFlow models and outputs in TensorBoard.
- To visualize summaries and graphs, data from TensorFlow can be exported using the FileWriter command from the summary module. A default session graph can be added using the following command:
# Create Writer Obj for log
log_writer = tf$summary$FileWriter('c:/log', sess$graph)
The graph for logistic regression developed using the preceding code is shown in the following screenshot:

Visualization of the logistic regression graph in TensorBoard
Details about symbol descriptions on TensorBoard can be found at https://www.tensorflow.org/get_started/graph_viz.
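To view these logs, TensorBoard can be launched against the log directory. A minimal sketch, assuming the logs were written to c:/log as above:
# Launch TensorBoard against the log directory (tensorflow R package helper)
library(tensorflow)
tensorboard(log_dir = "c:/log")
# Equivalently, from a shell:
# tensorboard --logdir=c:/log
Then open the URL that TensorBoard prints (http://localhost:6006 by default) in a browser.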
- Similarly, other variable summaries can be added to TensorBoard using the appropriate summary operations, as shown in the following code:
# Adding histogram summaries for the weight and bias variables
w_hist = tf$summary$histogram("weights", W)
b_hist = tf$summary$histogram("biases", b)
The summaries are a very useful way to determine how the model is performing. For example, in the preceding case, the cost functions for training and test can be monitored to understand optimization performance and convergence.
- Create a cross-entropy evaluation for the test data. An example script to generate the cross-entropy cost function for the test dataset is shown in the following code:
# Set up cross-entropy for the test data
nRowt <- nrow(occupancy_test)
xt <- tf$constant(unlist(occupancy_test[, xFeatures]), shape=c(nRowt, nFeatures), dtype=tf$float32)
# sigmoid_cross_entropy_with_logits expects raw logits, so keep the
# pre-sigmoid scores separate from the predicted probabilities
logitst <- tf$matmul(xt, W) + b
ypredt <- tf$nn$sigmoid(logitst)
yt_ <- tf$constant(unlist(occupancy_test[, yFeatures]), shape=c(nRowt, 1L), dtype=tf$float32)
cross_entropy_tst <- tf$reduce_mean(tf$nn$sigmoid_cross_entropy_with_logits(labels=yt_, logits=logitst, name="cross_entropy_tst"))
The preceding code mirrors the training cross-entropy calculation, applied to a different dataset. The duplicated effort can be minimized by setting up a function that returns the tensor objects, as sketched below.
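As a minimal sketch, a hypothetical helper such as build_eval_tensors (the name and interface are assumptions, not from the recipe) can build the input, prediction, and cross-entropy tensors for any dataset, so the training and test definitions share one code path:
# Hypothetical helper: builds the input, prediction, and cross-entropy
# tensors for a data frame; assumes W, b, xFeatures, yFeatures, and
# nFeatures are already defined as in the recipe
build_eval_tensors <- function(dataset, name) {
  nRow <- nrow(dataset)
  x <- tf$constant(unlist(dataset[, xFeatures]), shape=c(nRow, nFeatures), dtype=tf$float32)
  y_ <- tf$constant(unlist(dataset[, yFeatures]), shape=c(nRow, 1L), dtype=tf$float32)
  logits <- tf$matmul(x, W) + b  # pre-sigmoid scores
  cross_entropy <- tf$reduce_mean(tf$nn$sigmoid_cross_entropy_with_logits(labels=y_, logits=logits, name=name))
  list(x=x, ypred=tf$nn$sigmoid(logits), cross_entropy=cross_entropy)
}
# Usage: build the test-set tensors in one call
tst <- build_eval_tensors(occupancy_test, "cross_entropy_tst")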
- Add summary variables to be collected:
# Add summary ops to collect data
w_hist = tf$summary$histogram("weights", W)
b_hist = tf$summary$histogram("biases", b)
crossEntropySummary<-tf$summary$scalar("costFunction", cross_entropy)
crossEntropyTstSummary<-tf$summary$scalar("costFunction_test", cross_entropy_tst)
The script defines the summary events to be logged in the file.
- Open the writer object, log_writer, which writes the default graph to the location c:/log:
# Create Writer Obj for log
log_writer = tf$summary$FileWriter('c:/log', sess$graph)
- Run the optimization and collect the summaries:
library(pROC)  # provides roc() and auc()
for (step in 1:2500) {
  sess$run(optimizer)
  # Evaluate performance on training and test data every 50 iterations
  if (step %% 50 == 0) {
    ### Performance on train
    ypred <- sess$run(tf$nn$sigmoid(tf$matmul(x, W) + b))
    roc_obj <- roc(occupancy_train[, yFeatures], as.numeric(ypred))
    ### Performance on test
    ypredt <- sess$run(tf$nn$sigmoid(tf$matmul(xt, W) + b))
    roc_objt <- roc(occupancy_test[, yFeatures], as.numeric(ypredt))
    cat("train AUC: ", auc(roc_obj), " Test AUC: ", auc(roc_objt), "\n")
    # Save summaries of the bias and weight histograms and the cost functions
    log_writer$add_summary(sess$run(b_hist), global_step=step)
    log_writer$add_summary(sess$run(w_hist), global_step=step)
    log_writer$add_summary(sess$run(crossEntropySummary), global_step=step)
    log_writer$add_summary(sess$run(crossEntropyTstSummary), global_step=step)
  }
}
- Collect all the summaries into a single tensor using the merge_all command from the summary module:
summary = tf$summary$merge_all()
- Write the summaries to the log file using the log_writer object:
log_writer = tf$summary$FileWriter('c:/log', sess$graph)
summary_str = sess$run(summary)
log_writer$add_summary(summary_str, step)
log_writer$close()
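As a usage note, once the merged summary tensor is available, a single run of it inside the training loop can replace the four separate add_summary calls shown earlier; a minimal sketch under that assumption:
# Inside the training loop: one merged-summary run replaces the
# individual histogram and scalar writes
if (step %% 50 == 0) {
  summary_str <- sess$run(summary)
  log_writer$add_summary(summary_str, global_step=step)
}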