Viacheslav
Level 3

Grokking Algorithms, or a painless introduction to algorithms

Published in the Random EN group
A review of the book "Grokking Algorithms": a bit of personal opinion and a few examples. I hope this review helps you decide whether you want to read this book or whether it would just gather dust on your shelf. WARNING: a lot of text )

"Grokking Algorithms" or a Painless Introduction to Algorithms

Introduction

Almost any Junior-level vacancy lists requirements like "knowledge of data structures and algorithms". For those with a specialized education, algorithms are part of the general curriculum and there shouldn't be any problems. But what if you drifted into development from another field? The only option left is to learn them yourself. There is an answer to the question "who is to blame", but the answer to "what to do" still has to be found. Let's look for it in books. And I want to tell you about one of them.
"Groaking algorithms" or a painless introduction to algorithms - 1

Grokking Algorithms

Among everything out there I stumbled upon the book "Grokking Algorithms". More details can be found here: "The book "Grokking Algorithms. An Illustrated Guide for Programmers and Curious People"". I had noticed the book a long time ago, but on Ozon it cost 680 rubles. Expensive or cheap, everyone decides for themselves. This is already the second book I'm buying on Avito for half the price, so I found a copy in St. Petersburg, bought it and set off to grok it. And I decided to share the result with you. Yes, there is no Java code in the book, but there is... other code, more on that later.

Introduction to Algorithms (Selection Sort)

So, in an easy narrative style, we arrive at the first sort: selection sort. Its essence is that we walk through the elements from left to right (from element 0 to the last one) and, among the remaining elements, look for the smallest one. When we find it, we swap the element we are currently standing on with that smallest element. The easiest way to get a feel for it is with an array like [5, 3, 6, 2, 10]. Take a piece of paper and a pen (the simplest and most budget-friendly way) and picture how we have a left boundary (left), a current index (or right boundary) and the index of the minimum element, and how we work with them. For example:
"Groaking algorithms" or a painless introduction to algorithms - 2
Algorithms are often described in pseudocode, for example on Wikipedia. The book doesn't really use pseudocode, but more on that later. For now, let's look at the example:

def selectionSort(array):
    for left in range(0, len(array)):
        minIndex = left
        for right in range (left+1, len(array)):
            if array[right] < array[minIndex]:
                minIndex = right
        if minIndex != left:
            temp = array[left]
            array[left] = array[minIndex]
            array[minIndex] = temp
    return array

print(selectionSort([5, 3, 6, 2, 10]))
And now we will present it in the form of Java code:
public static void selectionSort(int[] array) {
        for (int left = 0; left < array.length; left++) {
            int minIndex = left;
            for (int right = left+1; right < array.length; right++) {
                if (array[right] < array[minIndex]) {
                    minIndex = right;
                }
            }
            if (minIndex != left) {
                int temp = array[left];
                array[left] = array[minIndex];
                array[minIndex] = temp;
            }
        }
}
As you can see, the code is almost the same. The first listing is the example from the book; the second is my own rendition in Java.

Recursion

Next, we're told that there is such a thing as recursion. First we're given a problem about a farmer who has a field of size A×B that needs to be divided into equal "square" plots, and right after that Euclid's algorithm is mentioned. What I didn't like is that the book doesn't try to turn it into code. But Euclid's algorithm is simple and effective:
"Groaking algorithms" or a painless introduction to algorithms - 3
To be honest, I missed some details in the book that are covered in this video: "Informatics. Theory of algorithms. Euclid's algorithm". For example, the fact that if a is less than b, then on the first call a and b simply swap places, and on the next call the larger number is divided by the smaller. So the order of the arguments doesn't matter. As usual, we can first "feel" the algorithm on a piece of paper:
"Groaking algorithms" or a painless introduction to algorithms - 4
Now let's look at the code:

def euclidean(a, b):
    if a == 0 : return b
    if b == 0 : return a
    return euclidean (b, a % b)
Let's write the same code in Java. If desired, we can use an online compiler:
public static int euclid(int a, int b) {
        if (a == 0) return b;
        if (b == 0) return a;
        return euclid(b, a%b);
}
Factorial was also mentioned at the beginning of the book. The factorial of a number n (written n!) is the product of the numbers from 1 to n. Why would you need it? There is a practical application: if we have n objects (for example, n cities), we can arrange them in n! different orders. You can read more about recursion here: "Recursion. Training tasks". A comparison of iterative and recursive approaches: "Recursion".
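The book doesn't give factorial code at this point, but a minimal recursive sketch in Java might look like this:

public static long factorial(int n) {
    if (n <= 1) return 1;          // base case: 0! and 1! are both 1
    return n * factorial(n - 1);   // recursive case: n! = n * (n-1)!
}

Calling factorial(5) returns 120.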

Quick sort

Quicksort is a rather interesting algorithm. It doesn't get much attention in the book. Moreover, the code is given only for the most unfortunate case, when the first element is chosen as the pivot. Still, for a first acquaintance this example is probably easier to remember, and it's better to write a bad quicksort than not to write one at all. Here's the example from the book:

def quicksort(array):
    if len(array) < 2:
        return array
    else:
        pivot = array[0]
        less = [i for i in array[1:] if i <= pivot]
        greater = [i for i in array[1:] if i > pivot]
    return quicksort(less) + [pivot] + quicksort(greater)
Everything is super simple here. If the array has 0 or 1 elements, there is nothing to sort. Otherwise we take the first element of the array and treat it as the pivot. Then we build two new arrays: one with the elements less than or equal to the pivot, the other with the elements greater than it. And we repeat recursively. Not the best option, but, again, it's easier to remember. Let's implement this algorithm in Java, but do it a bit more properly. The material from the review "Computer Science in JavaScript: Quicksort" will help us with this. And before writing the code, let's draw on a piece of paper again to get a feel for the algorithm:
"Groaking algorithms" or a painless introduction to algorithms - 5
It seems to me that one of the most dangerous things is trying to solve the whole problem at once. So let's build the implementation in several small steps:
  • We need to be able to swap elements in an array:

    private static void swap(int[] array, int firstIndex, int secondIndex) {
            int temp = array[firstIndex];
            array[firstIndex] = array[secondIndex];
            array[secondIndex] = temp;
    }

  • We need a method that partitions the array, within the specified bounds, around a pivot element:


    private static int partition(int[] array, int left, int right) {
            int pivot = array[(right + left) / 2];
            while (left <= right) {
                while (array[left] < pivot) {
                    left++;
                }
                while (array[right] > pivot) {
                    right--;
                }
                if (left <= right) {
                    swap(array, left, right);
                    left++;
                    right--;
                }
            }
            return left;
    }

    Details are at the link above. In short: we move the left cursor while its element is less than the pivot, and similarly move the right cursor from the other end. If the cursors haven't crossed, we swap the elements. We continue until the cursors meet, and return the index that splits further processing into two parts.

  • Partitioning is in place; now we need the sort itself:

    public static void quickSort(int[] array, int left, int right) {
            int index = 0;
            if (array.length > 1) {
                index = partition(array, left, right);
                if (left < index - 1) {
                    quickSort(array, left, index - 1);
                }
                if (index < right) {
                    quickSort(array, index, right);
                }
            }
    }

    That is, if the array consists of at least two elements, it can be sorted. First we split the whole range into two parts: elements smaller than the pivot and elements larger than it. Then we perform the same actions on each of the resulting parts.

    And for the test:

    public static void main(String []args){
            int[] array = {8,9,3,7,6,7,1};
            quickSort(array, 0, array.length-1);
            System.out.println(Arrays.toString(array));
    }
The book notes that this algorithm belongs to the so-called Divide and Conquer algorithms, in which the data set being processed is split into smaller parts at every step. Algorithm complexity: O(n log n).
"Grokking algorithms" or a painless introduction to algorithms - 6
What's bad (i.e., what I didn't like) is that the book mentions merge sort only in passing and gives no example or code for it. More details can be found here: "Informatics. Search and sorting algorithms: Merge sort". So, for the sake of completeness, let's do it ourselves. The algorithm itself is inherently simple:
public static void mergeSort(int[] source, int left, int right) {
    if ((right - left) > 1) {
        int middle = (right + left) / 2;
        mergeSort(source, left, middle);
        mergeSort(source, middle + 1, right);
    }
    merge(source, left, right);
}
We find the middle and split the array in half; we do the same for each half, and so on. The stopping condition (base case) is that the range must contain more than one element, since we can't split a single element in two. Now we need to implement the merging itself, the merge method:
public static void merge(int[] array, int from, int to) {
    int middle = ((from + to) / 2) + 1;
    int left = from;
    int right = middle;
    int cursor = 0;

    int[] tmp = new int[to - from + 1];
    while (left < middle || right <= to) {
        if (left >= middle) {
            tmp[cursor] = array[right];
            System.out.println("Остаток справа: " + array[right]);
            right++;
        } else if (right > to) {
            tmp[cursor] = array[left];
            System.out.println("Остаток слева: " + array[left]);
            left++;
        } else if (array[left] <= array[right]) {
            tmp[cursor] = array[left];
            System.out.println("Слева меньше: " + array[left]);
            left++;
        } else if (array[right] < array[left]) {
            tmp[cursor] = array[right];
            System.out.println("Справа меньше: " + array[right]);
            right++;
        }
        cursor++;
    }
    System.arraycopy(tmp, 0, array, from, tmp.length);
}
There isn't much to comment on here: the variable names and the println output make everything clear. Well, to check:
int array[] = {1, 7, 3, 6, 7, 9, 8, 4};
mergeSort(array, 0, array.length - 1);
System.out.println(Arrays.toString(array));

Hash tables

The book also touches on hash tables. You don't have to implement one yourself, and the essence of hash tables is quite simple. After all, Java has its own implementations, for example java.util.Hashtable. If we look at how Hashtable works, we'll see that an array of Entry objects lives inside. An Entry is a record, a key-value pair. Hashtable has an initialCapacity, that is, an initial size, and a loadFactor, the load factor (0.75 by default). This number says at what load of the array (number of elements / total size) its size needs to grow; in Java it is doubled. Hash tables are called hash tables because the array cell (bucket) into which an Entry goes is computed from the hash function of the key. You can also read more here: Data structures in pictures. HashMap and LinkedHashMap. And in books, for example here: "HashTable basics".
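To make the bucket idea concrete, here is a tiny sketch of how a cell index might be derived from a key's hash. This is only an illustration, not the actual java.util source:

public static int bucketFor(Object key, int capacity) {
    int hash = (key == null) ? 0 : key.hashCode(); // hash code of the key
    return Math.abs(hash % capacity);              // map the hash onto one of the capacity cells (buckets)
}

With an initialCapacity of 16 and a loadFactor of 0.75, the table grows (doubles in Java) once it holds more than 16 * 0.75 = 12 entries.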

Graphs, Breadth First Search (Shortest Path Search)

Probably one of the most interesting topics is graphs. And here, credit where it's due, the book gives them a lot of attention; it might be worth reading it for that alone. Although, perhaps, it could have been explained a little more clearly.) But we have the Internet, and in addition to the book you can watch this theory playlist for those "hearing about graphs for the first time". And, of course, at the very beginning the book introduces the breadth-first search algorithm, aka BFS. The book contains the following graph:
"Groaking algorithms" or a painless introduction to algorithms - 7
The book says that a queue will help us here, one that lets us add elements to the end and process them from the front. Such queues are known as double-ended queues, or Deque. The book suggests storing the data in a hash table in order to map a name to its neighbors; with numbered vertices you could simply use an array. This way of storing vertices is called an adjacency list, which the book doesn't mention by name, a small minus from me. Let's implement it in Java:
private Map<String, String[]> getGraph() {
    Map<String, String[]> map = new HashMap<>();
    map.put("you", new String[]{"alice", "bob", "claire"});
    map.put("bob", new String[]{"anuj", "peggy"});
    map.put("alice", new String[]{"peggy"});
    map.put("claire", new String[]{"thom", "jonny"});
    map.put("annuj", null);
    map.put("peggy", null);
    map.put("thom", null);
    map.put("johny", null);
    return map;
}
Now the search itself, built on this data:
private String search() {
    Map<String, String[]> graph = getGraph();
    Set<String> searched = new HashSet<>();
    Deque<String> searchQue = new ArrayDeque<>();
    searchQue.add("you");
    while (!searchQue.isEmpty()) {
        String person = searchQue.pollFirst();
        if (!searched.add(person)) continue; // skip people we have already checked
        System.out.println(person);
        if (personIsSeller(person)) {
            return person;
        } else {
            String[] friends = graph.get(person);
            if (friends == null) continue;
            for (String friend : friends) {
                if (friend != null && !searched.contains(friend)) {
                    searchQue.addLast(friend);
                }
            }
        }
    }
    return null;
}
As you can see, nothing complicated. If you compare it with the code from the book, it's almost the same.
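One thing the snippet above takes for granted is the personIsSeller check. In the book, the toy rule for spotting a mango seller is simply that the name ends with the letter "m", so a matching helper might look like this:

private boolean personIsSeller(String name) {
    // the book's toy rule: mango sellers are people whose names end in 'm'
    return name.endsWith("m");
}

With the graph above, the search prints the people it visits and returns "thom".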

Graphs, Dijkstra's Algorithm

Having more or less dealt with BFS, the author of the book invites us to look at Dijkstra's algorithm and weighted graphs. The following graph is proposed for the solution:
"Groaking algorithms" or a painless introduction to algorithms - 8
First, we need to understand how to represent our graph. We could represent it as a matrix; an article on Habr will help us here: Dijkstra's Algorithm. Search for optimal routes on the graph. Let's use an adjacency matrix:
public Integer[][] getGraphMatrix(int size) {
    Integer[][] matrix = new Integer[size][size];
    matrix[0][1] = 6;
    matrix[0][2] = 2;
    matrix[2][1] = 3;
    matrix[1][3] = 1;
    matrix[2][3] = 5;
    return matrix;
}
And now the logic itself:
@Test
public void dijkstra() {
    Integer[][] graph = getGraphMatrix(4);          // Graph data (4 vertices)
    Integer[] costs = new Integer[graph.length];    // Cost of reaching each vertex
    Integer[] parents = new Integer[graph.length];  // Parent vertex
    BitSet visited = new BitSet(graph.length);      // Markers for visited vertices

    Integer w = 0;
    do {
        System.out.println("-> Considering vertex: " + w);
        Integer min = null;
        for (int i = 0; i < graph.length; i++) {    // Process every edge
            if (graph[w][i] == null) continue;      // No edge - move on
            if (min == null || (!visited.get(i) && graph[w][min] > graph[w][i])) {
                min = i;
            }
            if (costs[i] == null || costs[i] > costs[w] + graph[w][i]) {
                System.out.print("Changing the cost from " + costs[i]);
                costs[i] = (costs[w] != null ? costs[w] : 0) + graph[w][i];
                System.out.println(" to " + costs[i] + " for vertex " + i);
                parents[i] = w;
            }
        }
        System.out.println("Vertex with the minimum weight: " + min);
        visited.set(w);
        w = min;
    } while (w != null);

    System.out.println(Arrays.toString(costs));
    printPath(parents, 3);
}

public void printPath(Integer[] parents, int target) {
    Integer parent = target;
    do {
        System.out.print(parent + " <- ");
        parent = parents[parent];
    } while (parent != null);
}
The book walks through it quite step by step. If you add the Habr article and study the code, it sticks in your memory. I found the step-by-step breakdown somewhat cluttered, but the very fact that it is step-by-step is a plus. Overall it's fine, although it could be better.

Greedy Algorithms

The next section is devoted to "greedy algorithms". This section is interesting because it uses sets (java.util.Set in the Java version), and you can finally see why they might be needed. As input we take the list of states that need coverage:
Set<String> statesNeeded = new HashSet();
statesNeeded.addAll(Arrays.asList("mt", "wa", "or", "id", "nv", "ut", "ca", "az" ));
As well as a list of radio stations covering some of these states:
Map<String, Set<String>> stations = new HashMap<>();
stations.put("kone", new HashSet(Arrays.asList("id", "nv", "ut")));
stations.put("ktwo", new HashSet(Arrays.asList("wa", "id", "mt")));
stations.put("kthree", new HashSet(Arrays.asList("or", "nv", "ca")));
stations.put("kfour", new HashSet(Arrays.asList("nv", "ut")));
stations.put("kfive", new HashSet(Arrays.asList("ca", "az")));
Further in the book, the algorithm itself is indicated and explained:
Set<String> finalStations = new HashSet();
while (!statesNeeded.isEmpty()) {
    String bestStation = null;
    Set<String> statesCovered = new HashSet();
    for (String station: stations.keySet()) {
        Set covered = new HashSet(statesNeeded);
        covered.retainAll(stations.get(station));
        if (covered.size() > statesCovered.size()) {
           bestStation = station;
           statesCovered = covered;
        }
    }
    statesNeeded.removeAll(statesCovered);
    finalStations.add(bestStation);
}
System.out.println(finalStations);

Dynamic programming

The book also describes problems for which an approach called "dynamic programming" is used. We are given the following task:
"Groaking algorithms" or a painless introduction to algorithms - 9
We have a knapsack with a capacity of 4 pounds. We need to find the most valuable set of items that fits within that weight. First, let's make a list of the items:
List<Thing> things = new ArrayList<>();
things.add(new Thing("guitar", 1, 1500));
things.add(new Thing("tape recorder", 4, 3000));
things.add(new Thing("notebook", 3, 2000));
Now the algorithm itself:
int bagSize = 4;
int[][] cell = new int[things.size()][bagSize];
// Fill in the first row without extra conditions (the first item, the guitar, weighs 1 lb and fits everywhere)
for (int i = 0; i < bagSize; i++) {
    cell[0][i] = things.get(0).cost;
}
// Fill in the remaining rows (column j corresponds to a capacity of j+1 pounds)
for (int i = 1; i < cell.length; i++) {
    for (int j = 0; j < cell[i].length; j++) {
        // If the item doesn't fit - take the previous maximum
        if (things.get(i).weight > j + 1) {
            cell[i][j] = cell[i - 1][j];
        } else {
            // Otherwise: the current item's cost + the previous maximum for the remaining capacity,
            // if that beats the previous maximum without this item
            int withCurrent = things.get(i).cost;
            if (j - things.get(i).weight >= 0) {
                withCurrent += cell[i - 1][j - things.get(i).weight];
            }
            cell[i][j] = Math.max(cell[i - 1][j], withCurrent);
        }
    }
}
System.out.println(Arrays.deepToString(cell));
The book also gives an interesting problem: finding the most "similar" words, i.e. the longest common subsequence. Interesting, isn't it? Read more here: LongestCommonSubsequence.java
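The book only sketches the idea in a table; here is my own compact dynamic-programming sketch (not the book's code) of computing the length of the longest common subsequence:

public static int lcsLength(String a, String b) {
    // dp[i][j] is the LCS length of the first i characters of a and the first j characters of b
    int[][] dp = new int[a.length() + 1][b.length() + 1];
    for (int i = 1; i <= a.length(); i++) {
        for (int j = 1; j <= b.length(); j++) {
            if (a.charAt(i - 1) == b.charAt(j - 1)) {
                dp[i][j] = dp[i - 1][j - 1] + 1;                 // characters match: extend the subsequence
            } else {
                dp[i][j] = Math.max(dp[i - 1][j], dp[i][j - 1]); // otherwise take the better of skipping a character
            }
        }
    }
    return dp[a.length()][b.length()];
}

For example, lcsLength("fish", "fosh") returns 3.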

K-Nearest Neighbors Search

The book also explains the k-nearest neighbors algorithm very clearly:
"Groaking algorithms" or a painless introduction to algorithms - 10
And here is the formula for the calculation:
"Groaking algorithms" or a painless introduction to algorithms - 11

Conclusion

The book ends with an interesting chapter, "What next?", which gives a quick overview of other interesting algorithms and briefly explains what trees and the rest are about. On the whole, I liked the book. It shouldn't be taken as an exhaustive reference; you will still have to search and learn things on your own. But as an introduction that sparks interest and gives an initial idea, it's quite good. Yes, the code in the book is written in Python, which is why I rewrote the examples here as compilable Java.) I hope this review helps you form an idea of the book's contents and decide whether it's worth buying.

Additionally

You can also check out the following resources on this topic:
  1. EdX - Introduction to Java Programming: Fundamental Data Structures and Algorithms
  2. LinkedIn - Introduction to Data Structures & Algorithms in Java (paid)
  3. 27 sites with puzzles to hone your programming skills
  4. Java Coding Bat
  5. Tasks for programmers, answers to tasks of varying complexity
#Viacheslav