
Playing Tic-Tac-Toe | 189
Example 8-10. The _add_layer method adds a new Layer object

```python
def _add_layer(self, layer):
  if layer.name is None:
    layer.name = "%s_%s" % (layer.__class__.__name__, len(self.layers) + 1)
  if layer.name in self.layers:
    return
  if isinstance(layer, Input):
    self.features.append(layer)
  self.layers[layer.name] = layer
  for in_layer in layer.in_layers:
    self._add_layer(in_layer)
```

The layers in a TensorGraph must form a directed acyclic graph (there can be no loops in the graph). As a result, we can topologically sort these layers. Intuitively, a topological sort "orders" the layers in the graph so that each Layer object's in_layers precede it in the ordered list. This topological sort is necessary to make sure all input layers to a given layer are added to the graph before the layer itself (Example 8-11).

Example 8-11. The topsort method orders the layers in the TensorGraph

```python
def topsort(self):

  def add_layers_to_list(layer, sorted_layers):
    if layer in sorted_layers:
      return
    for in_layer in layer.in_layers:
      add_layers_to_list(in_layer, sorted_layers)
    sorted_layers.append(layer)

  sorted_layers = []
  for l in self.features + self.labels + self.task_weights + self.outputs:
    add_layers_to_list(l, sorted_layers)
  add_layers_to_list(self.loss, sorted_layers)
  return sorted_layers
```

The build() method takes the responsibility of populating the tf.Graph instance by calling layer.create_tensor for each layer in topological order (Example 8-12).

Example 8-12. The build method populates the underlying TensorFlow graph

```python
def build(self):
  if self.built:
    return
  with self._get_tf("Graph").as_default():
    self._training_placeholder = tf.placeholder(dtype=tf.float32, shape=())
    if self.random_seed is not None:
      tf.set_random_seed(self.random_seed)
    for layer in self.topsort():
      with tf.name_scope(layer.name):
        layer.create_tensor(training=self._training_placeholder)
    self.session = tf.Session()
    self.built = True
```

The method set_loss() adds a loss for training to the graph. add_output() specifies that the layer in question might be fetched from the graph. set_optimizer() specifies the optimizer used for training (Example 8-13).

Example 8-13. These methods add necessary losses, outputs, and optimizers to the computation graph

```python
def set_loss(self, layer):
  self._add_layer(layer)
  self.loss = layer

def add_output(self, layer):
  self._add_layer(layer)
  self.outputs.append(layer)

def set_optimizer(self, optimizer):
  """Set the optimizer to use for fitting."""
  self.optimizer = optimizer
```

The method get_layer_variables() is used to fetch the learnable tf.Variable objects created by a layer. The private method _get_tf is used to fetch the tf.Graph and optimizer instances underpinning the TensorGraph. get_global_step is a convenience method for fetching the current step in the training process (starting from 0 at construction). See Example 8-14.

Example 8-14. Fetch the learnable variables associated with each layer

```python
def get_layer_variables(self, layer):
  """Get the list of trainable variables in a layer of the graph."""
  if not self.built:
    self.build()
  with self._get_tf("Graph").as_default():
    if layer.variable_scope == "":
      return []
    return tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                             scope=layer.variable_scope)
```

190 | Chapter 8: Reinforcement Learning
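To see the ordering guarantee concretely, here is a small illustrative sketch (not from the book): toy stand-in layer objects wired into a diamond-shaped graph, ordered with the same recursive depth-first traversal that topsort() uses. The ToyLayer class and the layer names are invented for this example.

```python
class ToyLayer:
  """Minimal stand-in for a Layer: a name plus the layers feeding into it."""

  def __init__(self, name, in_layers=()):
    self.name = name
    self.in_layers = list(in_layers)


def topsort(outputs):
  """Order layers so that every layer's in_layers precede it in the list."""
  sorted_layers = []

  def add_layers_to_list(layer):
    if layer in sorted_layers:
      return
    # Recurse into inputs first, so they are appended before this layer.
    for in_layer in layer.in_layers:
      add_layers_to_list(in_layer)
    sorted_layers.append(layer)

  for layer in outputs:
    add_layers_to_list(layer)
  return sorted_layers


# A diamond-shaped DAG: feature feeds two dense layers, which feed the loss.
feature = ToyLayer("feature")
dense1 = ToyLayer("dense1", [feature])
dense2 = ToyLayer("dense2", [feature])
loss = ToyLayer("loss", [dense1, dense2])

order = [layer.name for layer in topsort([loss])]
print(order)  # ['feature', 'dense1', 'dense2', 'loss']
```

Note that feature is visited twice (once through each dense layer) but appended only once, and it lands before both of its consumers, which is exactly the property build() relies on when it calls create_tensor on each layer in turn.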
