This TensorFlow tutorial is for someone who has a basic idea about machine learning and is trying to get started with TensorFlow. You would need to have TensorFlow installed. You can follow this tutorial to install TensorFlow. This TensorFlow tutorial is divided into two parts: in the first part we explain the basics with an example, and in the second part we build a linear regression model.

Part-1: Basics of TensorFlow:

TensorFlow is a library for numerical computation where data flows through a graph. Data in TensorFlow is represented by n-dimensional arrays called Tensors. The graph is made of data (Tensors) and mathematical operations.
  • Nodes on the graph: represent mathematical operations. 
  • Edges on the graph: represent the Tensors that flow between operations. 
There is one more aspect in which TensorFlow is very different from most programming languages. In TensorFlow, you first need to create a blueprint of whatever you want to build. While you are creating the graph, variables don't have any value. Later, when you have created the complete graph, you have to run it inside a session; only then do the variables have any values. More on this later.
Let’s get started learning by doing. Run python and import tensorflow:
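Assuming TensorFlow 1.x (the graph/session API used throughout this tutorial), the import looks like:

import tensorflow as tf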

(i) Graph in TensorFlow:

The graph is the backbone of TensorFlow, and every computation, operation and variable resides on the graph. Everything that happens in the code resides on a default graph provided by TensorFlow. You can access this graph by:
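With the TensorFlow 1.x API used in this tutorial, that looks like:

graph = tf.get_default_graph()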

You can get the list of all the operations by typing this:

graph.get_operations()
Currently, the output is empty as shown by [], as there is nothing in the graph.
If you want to print the names of the operations in the graph, do this:
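One way to do this (a small sketch, assuming graph was obtained with tf.get_default_graph() as above):

for op in graph.get_operations():
    print(op.name)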

This will again be empty. We shall use this to print names of operations after we have added ops to the graph.

Also, it's possible to create multiple graphs, but let's worry about that later.

(ii) TensorFlow Session:

A graph is used to define operations, but the operations are only run within a session. Graphs and sessions are created independently of each other. You can imagine the graph to be similar to a blueprint, and the session to be similar to a construction site.

The graph only defines the computations (it builds the blueprint). However, there are no values for any variable unless we run the graph, or a part of it, within a session.

You can create a session like this:
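A minimal sketch:

sess = tf.Session()
# ... execute parts of the graph with sess.run(...) ...
sess.close()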

Whenever you open a session, you need to remember to close it. Or you can use a 'with' block like this:
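A minimal skeleton of the pattern (the body here is only a placeholder):

with tf.Session() as sess:
    # run your operations here with sess.run(...)
    pass  # the session is closed automatically when the block ends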

The advantage of the with block is that the session closes automatically at the end of the block. We use the with block in most of our code and recommend that you do so too.

(iii) Tensors in TensorFlow:

 
TensorFlow holds data in Tensors, which are similar to NumPy multi-dimensional arrays (although they are not the same as NumPy arrays):

a) Constants:

are Tensors whose value can't be changed. You can declare a constant like this:
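A minimal sketch (the value 1.0 is chosen to match the output shown below):

a = tf.constant(1.0)
print(a)   # prints something like Tensor("Const:0", shape=(), dtype=float32), not the value 1.0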

As you can see, this is different from other programming languages like Python: you can't print/access the value of the constant a unless you run it inside a session. Let's do it:
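For example:

with tf.Session() as sess:
    print(sess.run(a))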


This will produce 1.0 as output.

b) Variables:

are again Tensors which are like variables in any other language. 
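For example, you can declare a variable like this (the name test_var and the value 2.0 are what produce the graph operations and the output shown below):

b = tf.Variable(2.0, name="test_var")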

Variables (as you can guess by the name) can hold different values, as opposed to constants. However, they need to be separately initialized by an init op. It could be taxing to initialize all the variables individually. However, TensorFlow provides a mechanism to initialize all the variables in one go. Let's see how to do that:

For tf version 0.11 and earlier, use initialize_all_variables() 
>>>init_op = tf.initialize_all_variables()
Use global_variables_initializer() for tf version 0.12 and later.
>>>init_op = tf.global_variables_initializer()
This will add init_op to our TensorFlow default graph.
Now run this init_op before you try to access your variable:
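A minimal sketch:

with tf.Session() as sess:
    sess.run(init_op)     # initialize all variables first
    print(sess.run(b))    # now b has a value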

This will output 2.0

Now, try to print the operations on the graph: 
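Using the same loop as before:

for op in graph.get_operations():
    print(op.name)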

This will now output:

Const
test_var/initial_value
test_var
test_var/Assign
test_var/read
init

As you can see, since we declared 'a' as a constant, a Const operation has been added to the graph. Similarly, for the variable b, several 'test_var' operations have been added to the TensorFlow graph, such as test_var/initial_value, test_var/read, etc. You can visualize the complete network using TensorBoard, which is a tool for visualizing a TensorFlow graph and the training process.

c) Placeholders:

are Tensors which are waiting to be initialized/fed. Placeholders are used for training data, which is only fed when the code is actually run inside a session. What is fed to a placeholder is passed via a feed_dict. A feed_dict is a set of key-value pairs for holding the data:
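A minimal sketch (the two placeholders are fed the values 2 and 3, which gives the output 6 shown below; tf.multiply was called tf.mul in very old TensorFlow versions):

a = tf.placeholder("float")
b = tf.placeholder("float")
y = tf.multiply(a, b)   # multiply the two placeholders

with tf.Session() as sess:
    print(sess.run(y, feed_dict={a: 2, b: 3}))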

Output will be 6.

(iv) Device in TensorFlow:

TensorFlow has very strong built-in capabilities to run your code on a GPU, a CPU, or a cluster of GPUs. It provides options to select the device on which you want to run your code. However, this is not something that you need to worry about when you are just getting started. We shall write a separate tutorial on this later. So, here is the complete picture:

[Figure: TensorFlow tutorial overview]

Part-2: TensorFlow tutorial with a simple example:

In this part, we shall examine the code for running linear regression. Before that, let's look at some of the basic TensorFlow functions that we shall use in the code.

Create a random normal distribution:

Use random_normal to create random values drawn from a normal distribution. In this example, w is a variable of size 784×10 filled with random values with standard deviation 0.01.

w=tf.Variable(tf.random_normal([784, 10], stddev=0.01))

Reduce_mean:

calculates the mean of an array.
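A minimal sketch (the array values below are chosen so that their mean is the 35 shown in the output):

b = tf.Variable([10, 20, 30, 40, 50, 60], name='t')
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(tf.reduce_mean(b)))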

Output will be 35

ArgMax:

Very similar to NumPy's argmax. Gets you the index of the maximum value from a tensor along the specified axis.
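A minimal sketch (the matrix a below is an assumed example whose row-wise argmax matches the output shown):

a = [[0.1, 0.2, 0.3],
     [20, 2, 3]]
b = tf.Variable(a, name='b')
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(tf.argmax(b, 1)))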

Output: array([2, 0]), which shows the index of the maximum value in each row of a.

Linear Regression Exercise:

Problem statement: In linear regression, you get a lot of data points and try to fit a straight line to them. For this example, we will create 100 data points and try to fit a line to them.

a) Creating training data:

trainX has values between -1 and 1, and trainY is 3 times trainX plus some randomness.
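A minimal sketch (100 evenly spaced points and Gaussian noise with scale 0.33 are assumptions made for illustration):

import numpy as np

trainX = np.linspace(-1, 1, 100)                              # 100 values between -1 and 1
trainY = 3 * trainX + np.random.randn(*trainX.shape) * 0.33   # 3 * x plus some noise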

b) Placeholders:
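We need placeholders through which trainX and trainY will be fed at run time; a minimal sketch:

X = tf.placeholder("float")
Y = tf.placeholder("float")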

c) Modeling:

The linear regression model is y_model = w * x, and we have to calculate the value of w through our model. Let's initialize w to 0 and create a model to solve this problem. We define the cost as the square of (Y - y_model). TensorFlow comes with many optimizers that calculate and update the gradients after each iteration while trying to minimize the specified cost. We define the training operation as updating the value of w using a GradientDescentOptimizer to minimize the cost, with a learning rate of 0.01. Later we will run this training operation in a loop.
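A minimal sketch of the model, cost, and training operation described above:

w = tf.Variable(0.0, name="weights")     # initialize w to 0
y_model = tf.multiply(X, w)              # linear model: y_model = w * x

cost = tf.square(Y - y_model)            # cost: square of (Y - y_model)
train_op = tf.train.GradientDescentOptimizer(0.01).minimize(cost)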

  d) Training:

Till this point, we have only defined the graph. No computation has happened.

None of the TensorFlow variables have any value. In order to run this graph, we need to create a Session and run it. Before that, we need to create the init op to initialize all variables:
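A sketch of the training loop (the 100 passes over the data are an assumption; any reasonably large number of iterations works):

init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)                                    # initialize all variables first
    for i in range(100):                              # number of training epochs (assumed)
        for (x, y) in zip(trainX, trainY):
            sess.run(train_op, feed_dict={X: x, Y: y})
    print(sess.run(w))                                # should print a value close to 3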

 

Please note that the first thing done is to initialize the variables by calling init inside sess.run(). Later we run train_op, feeding the data via feed_dict. Finally, we print the value of w (again inside sess.run()), which should be around 3.

e) Exercise:

If you create a new session block after this code and try to print w, what will be the output?

Yes, you got it right: it will be 0.0. That's the idea of symbolic computation. Once we have come out of the session created earlier, the values computed there no longer exist; a new session has to initialize the variables again, which resets w to its initial value.
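For example, a new session block like this (re-running the initializer resets w to its starting value):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())   # re-initializes w to 0.0
    print(sess.run(w))                             # prints 0.0, not the trained value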

Hope this tutorial gives you a good start with TensorFlow. Please feel free to ask your questions in the comments. The complete code can be downloaded from here.

You can continue learning TensorFlow in the second tutorial here.