Basics of TensorFlow

Subarna Rana
4 min read · Mar 26, 2018


Over the last couple of years, Google has made deep learning actionable through a great framework called TensorFlow. In this article, we are going to explore TensorFlow basics. In a follow-up article, we will discuss how to build a Logistic Regression model using TensorFlow.

Introduction to TensorFlow:

Let's start by defining what TensorFlow is. In layman's terms, “Tensor” means a multidimensional array and “Flow” means a graph of operations. A multidimensional array could be a vector, a matrix, a 3D array, or an N-dimensional array. Below are some examples of tensors:

  • [4., 5., 6.] # rank 1 tensor; it has shape [3] (a vector)
  • [[3., 4., 5.], [10., 20., 30.]] # rank 2 tensor; it has shape [2, 3] (a matrix)
  • [[[66., 77., 88.]], [[11., 12., 13.]]] # rank 3 tensor; it has shape [2, 1, 3]
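The rank and shape of these examples can be checked directly. As a quick illustration (a sketch using NumPy, whose arrays follow the same rank/shape convention that TensorFlow uses):

```python
import numpy as np

# Rank is the number of dimensions; shape lists the size of each dimension.
vector = np.array([4., 5., 6.])                          # rank 1
matrix = np.array([[3., 4., 5.], [10., 20., 30.]])       # rank 2
cube = np.array([[[66., 77., 88.]], [[11., 12., 13.]]])  # rank 3

print(vector.shape)  # (3,)
print(matrix.shape)  # (2, 3)
print(cube.shape)    # (2, 1, 3)
```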

A graph of operations represents a data flow graph; in other words, it can be called a computation graph.

A basic computation graph: two input vectors, a and b, pass through an “add” operation to produce another vector, c
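That small graph can be built in a few lines. A minimal sketch, written against the `tf.compat.v1` API so it also runs under TensorFlow 2 (on 1.x, plain `tf.Session` works the same way):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # use the 1.x graph-and-session model

a = tf.constant([1.0, 2.0, 3.0])  # input vector a
b = tf.constant([4.0, 5.0, 6.0])  # input vector b
c = tf.add(a, b)                  # the "add" operation node, producing c

with tf.compat.v1.Session() as sess:
    result = sess.run(c)
    print(result)  # [5. 7. 9.]
```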

Installing TensorFlow:

If you want basic installation without all the fuss, just do this:

pip install tensorflow

Or install it with GPU support:

pip install tensorflow-gpu

For detailed guide on installing TensorFlow, follow: https://www.tensorflow.org/install/

Let's code…

Constants:

To declare a node with a constant value in TensorFlow, we use tf.constant(value, dtype). The first parameter of the tf.constant() function is the value you want to put in the node, and the second parameter defines the data type of the constant. TensorFlow can also infer the type of a constant implicitly, so it is fine to leave out the second parameter. For example:

firstNode = tf.constant(7.0, dtype = tf.float32)
secondNode = tf.constant(14.0)
print(firstNode, secondNode)
Tensor("Const:0", shape=(), dtype=float32) Tensor("Const_1:0", shape=(), dtype=float32)

Why can't we see the desired output in the print statement? Because we get the values 7.0 and 14.0 only when we run the nodes. Also, remember that the output of a node is itself a tensor object.

Session:

To run them, we need to create a TensorFlow session:

sess = tf.Session()
print(sess.run([firstNode, secondNode]))
[7.0, 14.0]

We can even perform operations on these two nodes:

thirdNode = tf.add(firstNode, secondNode)
print("Sum value is: ", sess.run(thirdNode))
Sum value is: 21.0

Placeholders:

Placeholders are like variables with no value assigned to them; we supply the value later. tf.placeholder is used to feed actual training data. In the example below, note that we supply the values when running the session.

x = tf.placeholder(tf.float32)
y = tf.placeholder(tf.float32)
z = x+y
print(sess.run(z,{x: 10.0, y:20.0}))
30.0
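Because no shape was given to these placeholders, the same graph also accepts vectors. A hedged sketch of that (the `tf.compat.v1` prefix is only needed on TensorFlow 2; on 1.x, plain `tf.placeholder` and `tf.Session` behave identically):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# With no shape argument, a placeholder accepts a tensor of any shape.
x = tf.compat.v1.placeholder(tf.float32)
y = tf.compat.v1.placeholder(tf.float32)
z = x + y

with tf.compat.v1.Session() as sess:
    # Feed scalars, then vectors, through the very same graph.
    scalar_sum = sess.run(z, {x: 10.0, y: 20.0})
    vector_sum = sess.run(z, {x: [1.0, 2.0], y: [3.0, 4.0]})
    print(scalar_sum)  # 30.0
    print(vector_sum)  # [4. 6.]
```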

Variables:

A TensorFlow Variable (tf.Variable) is used to store the state of data. We use tf.Variable for trainable parameters such as the weights (W) and biases (b) of our model. tf.global_variables_initializer() does the job of initializing all the variables.

W = tf.Variable(3.5, dtype=tf.float32, name="W")
b = tf.Variable(4.6, dtype=tf.float32, name="b")
addVar = W + b
with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    print(sess.run(addVar))
8.1
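Unlike a constant, a variable's value can be updated after initialization, which is exactly what training does with weights. A minimal sketch using tf.assign (the `tf.compat.v1` prefix is shown for TensorFlow 2 compatibility; on 1.x, plain `tf` works):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

W = tf.compat.v1.Variable(3.5, dtype=tf.float32, name="W")
update = tf.compat.v1.assign(W, W + 1.0)  # an op that adds 1.0 to W when run

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    before = sess.run(W)  # 3.5
    sess.run(update)      # running the assign op changes the stored state
    after = sess.run(W)   # 4.5
    print(before, after)
```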

TensorBoard:

TensorBoard is a suite of web applications for inspecting and understanding your TensorFlow runs and graphs. The computations you build in TensorFlow, for things such as training a massive deep neural network, can be fairly complex and confusing; TensorBoard makes it much easier to understand, debug, and optimize your TensorFlow programs. Let's look at a simple example:

a = tf.add(1, 2)
b = tf.multiply(a, 3)
with tf.Session() as sess:
    print(sess.run(b))

In the above example, if we wish to see the computation graph in TensorBoard, we add a tf.summary.FileWriter to our code. This creates a folder in the given directory containing the information TensorBoard needs to build the graph.

with tf.Session() as sess:
    writer = tf.summary.FileWriter("path/to/logs/directory", sess.graph)
    print(sess.run(b))
    writer.close()

To run TensorBoard, open command prompt/terminal and run:

tensorboard --logdir=path/to/logs/directory

Now go to: http://localhost:6006/

You will be able to view the computation graph in the GRAPHS tab of TensorBoard.

Computation graph of the example

Scoping in TensorBoard:

TensorFlow also allows you to create scopes within your code, which helps in grouping related calculations.

# First we need to reset the earlier computation graph
tf.reset_default_graph()
# Here we define the named scopes: Scope_A, Scope_B and Scope_C.
with tf.name_scope("MyOperationGroup"):
    with tf.name_scope("Scope_A"):
        a = tf.add(1, 2, name="Add_these_numbers")
        b = tf.multiply(a, 3)
    with tf.name_scope("Scope_B"):
        c = tf.add(4, 5, name="And_These_ones")
        d = tf.multiply(c, 6, name="Multiply_these_numbers")
    with tf.name_scope("Scope_C"):
        e = tf.multiply(4, 5, name="B_add")
        f = tf.div(c, 6, name="B_mul")
g = tf.add(b, d)
h = tf.multiply(g, f)
with tf.Session() as sess:
    writer = tf.summary.FileWriter("path/to/logs/directory", sess.graph)
    print(sess.run(h))
    writer.close()
63
Example of a more complex computation graph with scoping

So that covers the basics of TensorFlow. In the next article, we will look at how we can leverage TensorFlow to understand the workings of Logistic Regression.

You can view the code at: https://github.com/subarnab219/Logistic-Regression-using-Tensorflow

For a deep dive on TensorBoard, follow: https://neptune.ai/blog/tensorboard-tutorial

Thanks
