TensorFlow is a computational framework that allows users to represent computations as a graph. Each node in this computational graph represents an operation to be performed, called an "op" in TensorFlow terminology. An op can take zero or more Tensors as input and produce zero or more Tensors as output. A Tensor can be thought of as a multi-dimensional array.
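As a quick illustration (a minimal sketch assuming the TF 1.x API; the variable names here are mine), tensors of different ranks correspond to arrays of different dimensionality:

import tensorflow as tf

scalar = tf.constant(3.0)               # rank 0: a single number
vector = tf.constant([1.0, 2.0])        # rank 1: shape (2,)
matrix = tf.constant([[1.0], [2.0]])    # rank 2: shape (2, 1)
print(matrix.shape)                     # (2, 1)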
In this post I’ll be focusing on Computational graphs,
Sessions, Tensors, and Variables.
A typical TensorFlow program has three main phases: a construction phase, a graph assembly phase, and an execution phase. By default, TensorFlow provides a computational graph to which users can add the ops that need to be executed, without assembling a new graph.
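Here is a minimal sketch of the construction/execution split, assuming the TF 1.x API (the values and names are illustrative):

import tensorflow as tf

# Construction phase: ops are added to the default graph; nothing runs yet.
a = tf.constant(2.0)
b = tf.constant(3.0)
total = tf.add(a, b)

# Execution phase: launch the graph in a session and run the op.
with tf.Session() as sess:
    print(sess.run(total))  # 5.0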
Operations and Tensors
A Tensor represents the data flowing through ops, in other words a value consumed or produced by an op. A Tensor does not hold the value of that operation; instead it acts as a symbolic handle to an output of an op. From this perspective, a Tensor plays two roles: it can be passed as an input to an op, which is how we build the graph and its data flow, and it lets us describe the computations to be performed on the values that will flow through it. In the example below, matrix_1, matrix_2, and product are Tensors.
import tensorflow as tf

matrix_1 = tf.constant([[3., 3.]])
matrix_2 = tf.constant([[2.], [2.]])
product = tf.matmul(matrix_1, matrix_2)
As mentioned previously, an operation or op is a node in the TensorFlow graph that can consume and produce zero or more Tensors. Ops perform the actual computations on Tensors. For example,
product = tf.matmul(matrix_1, matrix_2)
creates an operation of type "MatMul" that takes the matrix_1 and matrix_2 tensors as inputs and produces the product tensor as output.
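Because a Tensor is just a symbolic handle to an op's output, we can walk from the tensor back to the op that produced it. A small hedged illustration using the TF 1.x graph API, continuing from the snippet above:

# The tensor's .op attribute is the operation that produces it.
print(product.op.type)                      # MatMul
print([t.name for t in product.op.inputs])  # tensor names of matrix_1 and matrix_2
print(product.op.outputs[0] is product)     # True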
Graphs
A graph in TensorFlow can be thought of as a collection of ops linked by Tensors. A graph typically starts with ops that take no input, such as constants, and pass their output to other ops that perform computations. In the example below we add three ops to the TensorFlow default graph: two constants and one matmul op. To launch the graph and get the result we must execute it in a TensorFlow session (more on sessions later).
matrix_1 = tf.constant([[3., 3.]])
matrix_2 = tf.constant([[2.], [2.]])
product = tf.matmul(matrix_1, matrix_2)
sess = tf.Session()
result = sess.run(product)
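We can verify that these ops landed on the default graph. A quick hedged check, assuming the TF 1.x API and continuing from the snippet above:

# Every op and tensor belongs to a graph; here it is the default one.
print(product.graph is tf.get_default_graph())        # True
print(len(tf.get_default_graph().get_operations()))   # number of ops added so far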
Sessions
A session holds all the operations and tensors of a computational model; it executes ops and evaluates tensors. If no graph argument is given when creating a session, it launches the default graph. If we need to run different graphs in different sessions, we have to specify which graph each session we create should execute.
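Here is a minimal sketch of that second case, assuming the TF 1.x API (the graph and constant names are mine): each graph is built under its own as_default() scope and then passed explicitly to its session.

import tensorflow as tf

# Build two independent graphs.
g1 = tf.Graph()
with g1.as_default():
    c1 = tf.constant(1.0)

g2 = tf.Graph()
with g2.as_default():
    c2 = tf.constant(2.0)

# Tell each session which graph to execute.
with tf.Session(graph=g1) as sess:
    print(sess.run(c1))  # 1.0
with tf.Session(graph=g2) as sess:
    print(sess.run(c2))  # 2.0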
A TensorFlow session represents a connection to the underlying native runtime, so it holds on to all allocated resources until we close the session. Alternatively, we can use the session as a context manager.
matrix_1 = tf.constant([[3., 3.]])
matrix_2 = tf.constant([[2.], [2.]])
product = tf.matmul(matrix_1, matrix_2)
sess = tf.Session()
result = sess.run(product)
print(result)
sess.close()
To execute the above example with the session as a context manager:
matrix_1 = tf.constant([[3., 3.]])
matrix_2 = tf.constant([[2.], [2.]])
product = tf.matmul(matrix_1, matrix_2)
with tf.Session() as sess:
    result = sess.run(product)
    print(result)
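The context manager form guarantees that the session is closed, and its resources released, even if an exception is raised inside the with block, so it is generally the safer choice.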
Variables
Variables keep state across TensorFlow graph executions and can be considered in-memory buffers holding Tensors. Variables must be explicitly initialized, and they can be saved to disk for later use.
To create a variable, we pass a Tensor as its initial value to the constructor.
x = tf.Variable([1.0, 2.0])
a = tf.constant([3.0, 3.0])

# Create a variable that will be initialized to the scalar value 0.
state = tf.Variable(0, name="counter")

# Variables must be initialized by running an 'init' op after the graph
# has been launched. First add the init op to the graph.
# (In later TF 1.x releases this op is created with tf.global_variables_initializer().)
init_op = tf.initialize_all_variables()
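To complete the picture, here is a minimal sketch (TF 1.x API; the update ops below are my addition, not part of the original snippet) that launches the graph, runs the init op, and then increments the counter:

# An op that adds one to 'state' and an op that writes the result back.
one = tf.constant(1)
new_value = tf.add(state, one)
update = tf.assign(state, new_value)

with tf.Session() as sess:
    sess.run(init_op)           # run the init op first
    print(sess.run(state))      # 0
    for _ in range(3):
        sess.run(update)
        print(sess.run(state))  # 1, then 2, then 3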
In another post, we will dive deeper into data flow graphs for a better understanding of how computations are executed.
For source code: https://github.com/isurusiri/tensorflowbasics