CM10228 / Programming II:   Lecture 3


Trees & Logs

"In teoria, non c'e' differenza tra teoria e pratica. Ma in pratica c'e'" 

(In theory, there is no difference between theory and practice. But, in practice, there is.)

  -- Jan L.A. van de Snepscheut


-I.  Housekeeping:

  1. Complain when you are confused!!! Sue couldn't believe she didn't know that you didn't know where to be for tutorials.
  2. No required text for this course.
    1. There may be no Java books that are also good CS texts.
    2. The on-line notes may link to more resources.
  3. Recommended texts: If the lecture notes and the lectures aren't making things clear enough for you, or don't go into as much detail as you'd like, books are a way to fill in the gaps or to get alternative perspectives that might suit the way you individually learn better.
    1. These books cover only GUIs, threads, events, applets, networking (the hard book) and basic programming (the easy book).
    2. The only book I recommend covering algorithms, complexity and data structures is actually an advanced text that, after the first chapter, goes way beyond what you'll learn in this course.  But it is awesome.  There are also many excellent web pages on this topic, including these excellent notes on data structures, sorting (with animations!), searching and complexity.
    3. Books:
      1. easier:
        1.  Java Programming Today, Barbara Johnston.  Um, it's hard to buy anymore, but it's in the library & it's a good book for basic learning.
        2. Thinking in Java, by Bruce Eckel.  Notice that he has the old version of the book on-line, and you can buy the newer version (2006).
      2. harder:
        1.  Learning Java by Niemeyer & Knudsen (O'Reilly, the one with the tiger mum & cubs on the cover).  Note that all O'Reilly Java books have tigers on the cover, including the general reference book, but this is the one about learning Java.  This is available on-line for free if you are on campus or VPNing to Bath.
      3. advanced:
        1. You don't need these books for this course, I'm just mentioning them in case you are keen and want to go way beyond this course.
        2. Algorithms by Neapolitan, the first chapter is a more formal & clear version of the complexity material I present in the first three weeks.  From October 2016 we expect to be using this book in the second year.
        3. Contemporary Artificial Intelligence, also by Neapolitan.  This goes entirely beyond the material in this course, and again will probably be in the second year from October 2016.
      4. You may also want to look at Object-Oriented Programming with Java by David Barnes.  It has nice networking examples too.
      5. An Introduction to Network Programming with Java, by Jan Graba (new Sept 2006), also looks good.  It doesn't only talk about networking; it talks about file handling, threads, servlets, CORBA (middleware) etc.  I haven't really read it thoroughly, but it looked good enough I've ordered a few copies for the library.  This is available on-line for free if you are on campus or on VPN to Bath.
  4. Reverse
    ;; Helper: move items one at a time from oldlist onto newlist.
    ;; cons pushes onto the front, so newlist accumulates the items
    ;; in reverse order.
    (defun reverse-help (oldlist newlist)
      (cond ((null oldlist) newlist)
            (t (reverse-help (cdr oldlist)
                             (cons (car oldlist) newlist)))))

    ;; Reverse a list by starting the helper with an empty accumulator.
    (defun my-reverse (list)
      (reverse-help list nil))
    1. draw memory for them too -- leave up for tree stuff below.
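  5. Since this is a Java course, here is a rough Java translation of the same accumulator trick.  It is only a sketch under my own assumptions: the Node class and all the names are mine, not from any library, and it is written as a loop because Java does not optimise tail recursion.

    // A minimal hand-rolled singly-linked list node (illustrative only).
    class Node {
      int value;
      Node next;
      Node(int value, Node next) { this.value = value; this.next = next; }

      // Build the reversed list by moving one item at a time onto an
      // accumulator, exactly like reverse-help above.
      static Node reverse(Node oldList) {
        Node newList = null;                           // nil: the empty accumulator
        while (oldList != null) {
          newList = new Node(oldList.value, newList);  // (cons (car oldlist) newlist)
          oldList = oldList.next;                      // (cdr oldlist)
        }
        return newList;
      }
    }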

I. Criteria for Evaluating an Algorithm

  1. Main Criteria
    1. Speed
    2. Size
    3. Risk of failure.
    4. Ease of maintenance.
  2. Speed is the main criterion that we'll talk about, but we could equally talk about size.
  3. Speed and Size are the two things Computer Scientists might be talking about when they talk about complexity formally.
  4. Of course, the conventional meaning of complexity (how complicated it is to understand the algorithm) affects both risk and maintenance.
    1. Often worth going with something slightly slower if it will be easier to maintain.
    2. Software development is often more of a bottleneck than processor speed.
      1. Good programmers are more expensive than fast computers.
      2. Brooks' Law:  Adding manpower to a late software project makes it later.
        1. This law is so old it's not even gender neutral (in fact, it's from The Mythical Man-Month, 1975), but it's still true.
        2. Slashdot review of The Mythical Man-Month 20th Anniversary Edition.
      3. But sometimes, time really matters.
          1. Graphics, games engines.
          2. Database engines.
          3. Simulations:
            1. Weather simulations.
            2. Social or political simulations of millions of agents, 
            3. Science: the evolution of life, evolution of culture, the big bang, hydrogen atoms, brain cells etc.
              1. If you are a good programmer with spare time & are interested in modelling the evolution of culture, come visit me during my office hours (see the AmonI group).
              2. Right after the first time I gave this lecture (2004) I got a talk announcement on the importance of algorithms for molecular biology / drug discovery from Bruce R. Donald.  Whether you care about helping humanity or making money (not an xor) that's an important research field.
  5. How do you measure Speed?
    1. A stopwatch usually isn't recommended (though see the quote above!)
    2. Speed of one instance of an algorithm depends on:
      1.  the processor it's run on
      2. other components (e.g. graphics card, bus)
      3. What else is happening on the computer.
      4. Amount of RAM available.
        1. This is only true because read & write operations take time, even to memory.
        2. But they take more time if they have to go to disk.
        3. If a computer runs out of space in RAM, it swaps memory onto disk.
        4. Very bad thing:  if most of the time is spent swapping, little is spent on the computation.
        5. This can happen if you are working with very large data sets and not processing the data efficiently.
      5. But this isn't what most computer scientists are talking about when they talk about time complexity.
  6. Algorithms are normally analyzed in terms of:
    1. The number of operations they perform.
    2. The types of operations they perform.
    3. How the number of operations they perform changes if parameters change.  The key point!!
      1. These criteria are the same for both time and space.
      2. Usually ignore most of the operations and focus on a few that are most significant for time or space (whichever is being measured)
        1. e.g. for time: disk reads, expensive arithmetic.
        2. e.g. for space: `new' commands (things that allocate more memory).
  7. How does the number of operations they perform change as parameters change?
    1. This question is referred to as  scaling.
    2. Scaling happens with respect to some parameter.
    3. Example:  As an animal gets taller, its weight typically scales as height³.
      1. This is because weight is directly related to volume, not height.
      2. volume = height x width x depth.  If you assume width & depth are also correlated to height, then volume is correlated to height³.
      3. Bone strength can grow as height² but exoskeletons can't, so vertebrates can grow bigger than insects.
  8. Example in algorithms:  finding the length of an array:  How does this scale with the number of items in a collection?  (There is a code sketch of all three cases after the graph below.)
    1. Just look up a variable in the collection object that tells you its length.
      1. Always takes the same number of steps however many items there are.
      2. This is called a constant time algorithm.
    2. Start at the beginning and count until you reach the last item (which must be marked somehow, like in the lists.)
      1. The number of steps is dependent on the number of objects.
      2. This is a linear algorithm.
    3. If you are checking how many unique items are in the collection, then for each item you have to check whether any of the other items are the same, so you go through the list once for each item in the list.
      1. The number of steps is the square of the number of items in the collection.
      2. This is said to scale quadratically.
[drawing of constant, linear & quadratic time plotted with respect to # of objects]

Notice that an algorithm may look very good at a low N, but then turn out to be a nightmare at higher N!  On the other hand, if you know for certain that you will only have low N for an application, you may still want to consider that algorithm.
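
Here is a minimal Java sketch of all three cases.  It is illustrative only: the class, method and variable names are mine, and the list node is a bare hand-rolled one, not a library class.

    class Scaling {
      // A minimal list node; the end of the list is marked by next == null.
      static class Node { int value; Node next; }

      // Constant time: the array object stores its length, so this takes
      // the same single step however many items there are.
      static int constantLength(int[] items) {
        return items.length;
      }

      // Linear time: start at the beginning and count until the end marker,
      // so the number of steps grows directly with the number of items.
      static int linearLength(Node list) {
        int count = 0;
        for (Node n = list; n != null; n = n.next) count++;
        return count;
      }

      // Quadratic time: for each of the N items, scan the earlier items for
      // a duplicate -- roughly N passes of up to N checks, so N² steps.
      static int uniqueCount(int[] items) {
        int unique = 0;
        for (int i = 0; i < items.length; i++) {
          boolean seenBefore = false;
          for (int j = 0; j < i; j++)
            if (items[j] == items[i]) seenBefore = true;
          if (!seenBefore) unique++;
        }
        return unique;
      }
    }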

II. Logarithms & Logarithmic Complexity

  1. You need to remember what logarithms are to understand complexity analysis.
    1. More review about logarithms.  
    2. A slightly heavier logarithm page.
    3. But don't worry  -- if you understand this page, that's all you need for this unit.
  2. The default base for log is 10.  So log(100) = 2 is just another way to say 10² = 100.
  3. In computer science, we normally use base 2.  E.g. log₂(8) = 3.
    1. We like base 2 because of bits.  For example, the number of bits you need to represent an integer is logarithmic (base 2) in its value, as the table below shows (there is also a code sketch at the end of this section) --
      value   binary   power of 2
        0        0
        1        1        2⁰
        2       10        2¹
        3       11
        4      100        2²
        5      101
        6      110
        7      111
        8     1000        2³
  4. If you think about it, a logarithmic complexity would be a good thing for an algorithm -- not as good as constant time, but much better than linear.
  5. In fact, a log₂ complexity would be exactly as much better than linear as quadratic is worse.  [draw on graph]
  6. But can we write an algorithm of logarithmic complexity?
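
Before moving on, here is a small Java sketch (mine, not from the lecture) to make the bits intuition concrete: counting how many times you can halve a number is the same as asking for floor(log₂ n) + 1, the number of binary digits it needs.

    class Bits {
      // Count the binary digits needed to write n (for n >= 0).
      static int bitsNeeded(int n) {
        int bits = 1;          // even 0 takes one digit to write
        while (n > 1) {
          n = n / 2;           // dropping the lowest bit removes one digit
          bits++;
        }
        return bits;
      }

      public static void main(String[] args) {
        // Reproduces the table above: the count only goes up at powers of 2.
        for (int n = 0; n <= 8; n++)
          System.out.println(n + " needs " + bitsNeeded(n) + " bit(s)");
      }
    }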

III. Trees for Sorting and Searching

  1. A tree is like a list, except that instead of a next reference, each node has two references, left and right.
    1. the objects at left & right are called children.
    2. the object that points directly to another object is called its parent.
    3. the top parent is called the root (back to the tree metaphor.)
  2. An algorithm for sorting a list using a tree (there is a code sketch at the end of this section).
    1. Take the first number in the list, make it the root.
    2. For each number in the list, start at the root object
      1. If the new number is less than the current object, go to its left child.
      2. If the new number is larger, go to its right child.
      3. (assume you can ignore duplicates.)
      4. If the child you've gone to is empty, insert your number in a new object.
      5. Otherwise, go back to step 1 (of these five steps), with the child you moved to as your new current object.
  3. Someone asked in class one year about the duplicate case (I'd actually skipped step 3 above on the board.)  If you are only looking for the existence of an object (see our search below) duplicates can be ignored / skipped.  If you do care about how many items you have of each type, you can have your node objects contain a variable that counts how many instances you found of that object.  You could even imagine keeping a list of objects at each node in the tree if you like!  You'll learn more about complex data structures in Programming II, but if you are keen now, have a look at  b-trees.
  4. Example:  15, 23, 14, 27, 3, 18, 5 gives:

             15
            /  \
          14    23
          /    /  \
         3   18    27
          \
           5
  5. The depth of this tree grows roughly at log₂ relative to the number of items.
    1. Notice that no number will ever come to the right of the 14 (& not just because I ran out of space): anything there would have to fall between 14 and 15, and we are inserting integers.
    2. So log₂ is the best case; it's unlikely to be the actual case.
    3. There are algorithms for rebalancing trees though, so if the number of items you have happens to be exactly one less than a power of 2 (enough to fill every level), you could get the optimal case.
  6. To find out whether a number is in the tree, almost the same algorithm works:
    1. Start at the root.
    2. If you find your number, return #true; if you find a null node, return #false.
    3. Assuming you didn't just finish, if the number you are looking for is less than your current node, move to the left, otherwise to the right.
    4. Go back to step 2.
  7. If the tree is perfectly balanced, then you will only have to look at a maximum of log₂ N items (where N is the length of the original list, the Number of items).  For example, a balanced tree of about a million items is only around 20 levels deep, since 2²⁰ ≈ 10⁶.
  8. Obviously the base of the logarithm here is dependent on the number of children, so you could do even better if you had more children.
    1. But the algorithm gets more complicated if you can't just use </left vs >/right.
      1. You'll learn such an algorithm next year in association with databases, e.g.  b-trees
    2. Another reason computer scientists like log₂.
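  9. Here is a minimal Java sketch of the insert and search algorithms above.  It is only an illustration under my own assumptions: the TreeNode class and its names are mine, and duplicates are simply ignored, as step 3 of the insertion algorithm allows.

    // A bare-bones binary search tree node, following the notes above.
    class TreeNode {
      int value;
      TreeNode left, right;   // the two children; null means empty
      TreeNode(int value) { this.value = value; }

      // Walk down from this node: smaller goes left, larger goes right;
      // when the child we need is empty, the new number goes there.
      void insert(int number) {
        if (number == value) return;                   // ignore duplicates
        if (number < value) {
          if (left == null) left = new TreeNode(number);
          else left.insert(number);
        } else {
          if (right == null) right = new TreeNode(number);
          else right.insert(number);
        }
      }

      // Almost the same walk: found your number -> true, null node -> false.
      boolean contains(int number) {
        if (number == value) return true;
        TreeNode child = (number < value) ? left : right;
        return child != null && child.contains(number);
      }
    }

    Making the first number (15) the root and then inserting 23, 14, 27, 3, 18 and 5 in order builds exactly the example tree drawn above.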

IV. What have we learned?

  1. Algorithmic complexity.
  2. Logarithms.
  3. Trees for sorting and searching.
  4. Next lecture will actually be longer than an hour, so turn up around 3:00
  5. Quick sort now if time.


page author: Joanna Bryson
8 February 2015