“Grokking Algorithms”, or a Painless Introduction to Algorithms
Introduction
Almost every junior-level vacancy lists requirements like “knowledge of data structures and algorithms”. For those with a specialized education, algorithms are part of the standard curriculum, so there should be no problems. But what if you came to development from an entirely different field? The only thing left to do is study on your own. The question “who is to blame” has an answer, but the answer to “what to do” still has to be found. Let's look to books. I want to tell you about one of them.
Grokking Algorithms
Among all the works, I stumbled upon the book “Grokking Algorithms. An Illustrated Guide for Programmers and Curious People”. I had noticed the book a long time ago, but on Ozon it cost 680 rubles. Expensive or cheap, everyone decides for themselves. I was already buying my second book on Avito for half the price, so I found this one in St. Petersburg, bought it, and started working through it, which I decided to share with you. Yes, there is no Java code in the book, but there is... other code. More on that later.
Introduction to Algorithms (Selection Sort)
So, in an easy narrative style, we arrive at the first sort: selection sort. Its essence is that we walk through the elements from left to right (from element 0 to the last one) and look for the smallest element among the remaining ones. If we find one, we swap the element we are currently on with that smallest element. The easiest way is to picture an array first: [5, 3, 6, 2, 10]. Take a piece of paper and a pen (the simplest and most budget-friendly way) and picture the left boundary (left), the current index (or right boundary), and the index of the minimum element, and how we work with them. For example:
def selectionSort(array):
    for left in range(0, len(array)):
        minIndex = left
        for right in range(left + 1, len(array)):
            if array[right] < array[minIndex]:
                minIndex = right
        if minIndex != left:
            temp = array[left]
            array[left] = array[minIndex]
            array[minIndex] = temp
    return array

print(selectionSort([5, 3, 6, 2, 10]))
And now we will present it in the form of Java code:
public static void selectionSort(int[] array) {
    for (int left = 0; left < array.length; left++) {
        int minIndex = left;
        for (int right = left + 1; right < array.length; right++) {
            if (array[right] < array[minIndex]) {
                minIndex = right;
            }
        }
        if (minIndex != left) {
            int temp = array[left];
            array[left] = array[minIndex];
            array[minIndex] = temp;
        }
    }
}
As you can see, the code is almost the same. The first listing is the example from the book; the second is my own rendition in Java.
Recursion
Next, we are told about such a thing as recursion. First, a problem is posed about a farmer who has a field of size A x B. The field must be divided into equal square plots, and after that Euclid's algorithm is mentioned. What I don't like is that the book doesn't try to turn it into code. But Euclid's algorithm is simple and effective:
def euclidean(a, b):
    if a == 0: return b
    if b == 0: return a
    return euclidean(b, a % b)
Let's write the same code in Java (if desired, you can run it in an online compiler):
public static int euclid(int a, int b) {
if (a == 0) return b;
if (b == 0) return a;
return euclid(b, a%b);
}
Factorial was also mentioned at the beginning of the book. The factorial of a number n (written n!) is the product of the numbers from 1 to n. Why do this? There is one practical application: if we have n objects (for example, n cities), we can form n! combinations (orderings) of them. You can read more about recursion here: "Recursion. Training tasks". For a comparison of the iterative and recursive approaches, see "Recursion".
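Since the book only describes factorial in words, here is a small Java sketch of my own (not from the book) showing the recursive definition with its base case:

```java
public class Factorial {
    // Recursive factorial: the base case is n <= 1,
    // otherwise n! = n * (n - 1)!
    public static long factorial(int n) {
        if (n <= 1) return 1;
        return n * factorial(n - 1);
    }

    public static void main(String[] args) {
        // 5 objects (e.g. 5 cities) can be ordered in 5! = 120 ways
        System.out.println(factorial(5)); // 120
    }
}
```

Note that the result grows very quickly, so even `long` overflows past 20!.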
Quick sort
Quicksort is a rather interesting algorithm. It doesn't get much attention in the book. Moreover, the code is given only for the worst pivot choice, when the first element is selected. However, for a first acquaintance this example is perhaps easier to remember, and it's better to write a bad quicksort than none at all. Here's the example from the book:
def quicksort(array):
    if len(array) < 2:
        return array
    else:
        pivot = array[0]
        less = [i for i in array[1:] if i <= pivot]
        greater = [i for i in array[1:] if i > pivot]
        return quicksort(less) + [pivot] + quicksort(greater)
Everything is super simple here. If the array has 0 or 1 elements, it doesn't need sorting. If it is larger, we take the first element of the array and treat it as the pivot. We build two new arrays: one holds the elements greater than the pivot, the other the elements smaller than it. Then we recurse. Not the best variant, but, again, it is easier to remember. Let's implement this algorithm in Java, but more correctly. The material from the article "Computer Science in JavaScript: Quicksort" will help us here. First, let's draw on a piece of paper again to get a feel for the algorithm:
We need to be able to swap elements in an array:

private static void swap(int[] array, int firstIndex, int secondIndex) {
    int temp = array[firstIndex];
    array[firstIndex] = array[secondIndex];
    array[secondIndex] = temp;
}
We need a method that splits the array segment in the specified interval into two parts around a pivot:

private static int partition(int[] array, int left, int right) {
    int pivot = array[(right + left) / 2];
    while (left <= right) {
        while (array[left] < pivot) {
            left++;
        }
        while (array[right] > pivot) {
            right--;
        }
        if (left <= right) {
            swap(array, left, right);
            left++;
            right--;
        }
    }
    return left;
}
Details are at the link above. In short: we move the left cursor while its element is less than the pivot, and likewise move the right cursor from the other end. We swap whenever the cursors haven't crossed, and continue until they do. Finally, we return the index that splits further processing into two parts.
Now that we have partitioning, we need the sort itself:

public static void quickSort(int[] array, int left, int right) {
    if (array.length > 1) {
        int index = partition(array, left, right);
        if (left < index - 1) {
            quickSort(array, left, index - 1);
        }
        if (index < right) {
            quickSort(array, index, right);
        }
    }
}
That is, if the array consists of at least two elements, there is something to sort. First we split the whole range into two parts, elements smaller than the pivot and larger ones, then perform the same steps on each of the resulting parts.
And for a test:

public static void main(String[] args) {
    int[] array = {8, 9, 3, 7, 6, 7, 1};
    quickSort(array, 0, array.length - 1);
    System.out.println(Arrays.toString(array));
}
Merge sort

The next sort to look at is merge sort:

public static void mergeSort(int[] source, int left, int right) {
    if ((right - left) > 1) {
        int middle = (right + left) / 2;
        mergeSort(source, left, middle);
        mergeSort(source, middle + 1, right);
    }
    merge(source, left, right);
}
We find the middle and split the array in half; for each half we do the same, and so on. The stopping condition, or base case, is having no more than one element, since a single element cannot be split in two. Now we need to implement the merging, that is, the merge method:
public static void merge(int[] array, int from, int to) {
    int middle = ((from + to) / 2) + 1;
    int left = from;
    int right = middle;
    int cursor = 0;
    int[] tmp = new int[to - from + 1];
    while (left < middle || right <= to) {
        if (left >= middle) {
            tmp[cursor] = array[right];
            System.out.println("Remainder on the right: " + array[right]);
            right++;
        } else if (right > to) {
            tmp[cursor] = array[left];
            System.out.println("Remainder on the left: " + array[left]);
            left++;
        } else if (array[left] <= array[right]) {
            tmp[cursor] = array[left];
            System.out.println("Left is smaller: " + array[left]);
            left++;
        } else {
            tmp[cursor] = array[right];
            System.out.println("Right is smaller: " + array[right]);
            right++;
        }
        cursor++;
    }
    System.arraycopy(tmp, 0, array, from, tmp.length);
}
There isn't much to comment on here: the variable names and the println output make everything clear. Well, to check:
int array[] = {1, 7, 3, 6, 7, 9, 8, 4};
mergeSort(array, 0, array.length - 1);
System.out.println(Arrays.toString(array));
Hash tables
The book also touches on hash tables. You don't have to implement one yourself, and the essence of hash tables is quite simple, since Java already has an implementation: java.util.Hashtable. If we look inside Hashtable, we will see that an array of Entry objects lives there. An Entry is a record binding a Key to a Value. Hashtable has an initialCapacity, that is, an initial size, and a loadFactor, a load factor (0.75 by default). This number tells at what load of the array (number of elements / total size) it needs to grow; in Java it grows by a factor of 2. Hash tables are called hash tables because the array cell (bucket) into which an Entry goes is computed from a hash function. You can read more here: "Data structures in pictures. HashMap and LinkedHashMap". And in books, for example: "HashTable basics".
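To make the "hash function picks the bucket" idea concrete, here is a tiny sketch of my own; the bucketFor helper is a simplification for illustration, not the real Hashtable internals:

```java
import java.util.HashMap;
import java.util.Map;

public class HashDemo {
    // Simplified version of what a hash table does internally:
    // the bucket index is derived from the key's hashCode,
    // reduced modulo the current array capacity.
    static int bucketFor(String key, int capacity) {
        return (key.hashCode() & 0x7fffffff) % capacity;
    }

    public static void main(String[] args) {
        // initialCapacity = 16, loadFactor = 0.75: the table grows
        // once more than 16 * 0.75 = 12 entries are stored.
        Map<String, Integer> prices = new HashMap<>(16, 0.75f);
        prices.put("apple", 30);
        prices.put("milk", 70);
        System.out.println("\"apple\" goes to bucket " + bucketFor("apple", 16));
        System.out.println("price of apple: " + prices.get("apple"));
    }
}
```

The key point: lookup doesn't scan the whole array; the hash jumps straight to one bucket, which is what makes get and put fast on average.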
Graphs, Breadth First Search (Shortest Path Search)
Probably one of the most interesting topics is graphs. And here, credit where it's due, the book gives them a lot of attention; it might be worth reading the book for that alone, although it could perhaps be presented a little more clearly. But we have the Internet, and in addition to the book you can watch this theory playlist for those "hearing about graphs for the first time". Naturally, at the very beginning the book introduces breadth-first search, aka BFS. The book contains the following graph:
private Map<String, String[]> getGraph() {
    Map<String, String[]> map = new HashMap<>();
    map.put("you", new String[]{"alice", "bob", "claire"});
    map.put("bob", new String[]{"anuj", "peggy"});
    map.put("alice", new String[]{"peggy"});
    map.put("claire", new String[]{"thom", "jonny"});
    map.put("anuj", null);
    map.put("peggy", null);
    map.put("thom", null);
    map.put("jonny", null);
    return map;
}
Now the search itself, built on this data:
private String search() {
    Map<String, String[]> graph = getGraph();
    Set<String> searched = new HashSet<>();
    Deque<String> searchQue = new ArrayDeque<>();
    searchQue.add("you");
    while (!searchQue.isEmpty()) {
        String person = searchQue.pollFirst();
        searched.add(person); // mark as processed so we never revisit
        System.out.println(person);
        if (personIsSeller(person)) {
            return person;
        } else {
            String[] friends = graph.get(person);
            if (friends == null) continue;
            for (String friend : friends) {
                if (friend != null && !searched.contains(friend)) {
                    searchQue.addLast(friend);
                }
            }
        }
    }
    return null;
}

// As in the book, a "seller" is anyone whose name ends with 'm':
private boolean personIsSeller(String person) {
    return person.endsWith("m");
}
As you can see, nothing complicated. If you compare it with the code from the book, it's almost the same.
Graphs, Dijkstra's Algorithm
Having dealt more or less with BFS, the author of the book invites us to tackle Dijkstra's algorithm and weighted graphs. The following graph is proposed for the solution:

public Integer[][] getGraphMatrix(int size) {
    Integer[][] matrix = new Integer[size][size];
    matrix[0][1] = 6;
    matrix[0][2] = 2;
    matrix[2][1] = 3;
    matrix[1][3] = 1;
    matrix[2][3] = 5;
    return matrix;
}
And now the logic itself:
@Test
public void dijkstra() {
    Integer[][] graph = getGraphMatrix(4); // Graph data
    Integer[] costs = new Integer[graph.length]; // Cost of reaching each node
    Integer[] parents = new Integer[graph.length]; // Parent node
    BitSet visited = new BitSet(graph.length); // Visited markers
    Integer w = 0;
    do {
        System.out.println("-> Considering node: " + w);
        Integer min = null;
        for (int i = 0; i < graph.length; i++) { // Process every edge
            if (graph[w][i] == null) continue; // No edge, move on
            if (min == null || (!visited.get(i) && graph[w][min] > graph[w][i])) {
                min = i;
            }
            if (costs[i] == null || costs[i] > costs[w] + graph[w][i]) {
                System.out.print("Changing the weight from " + costs[i]);
                costs[i] = (costs[w] != null ? costs[w] : 0) + graph[w][i];
                System.out.println(" to " + costs[i] + " for node " + i);
                parents[i] = w;
            }
        }
        System.out.println("Node with the minimum weight: " + min);
        visited.set(w);
        w = min;
    } while (w != null);
    System.out.println(Arrays.toString(costs));
    printPath(parents, 3);
}
public void printPath(Integer[] parents, int target) {
    Integer parent = target;
    do {
        System.out.print(parent + " <- ");
        parent = parents[parent];
    } while (parent != null);
}
The book explains it quite step by step. If you supplement it with an article from Habr and study the code, you can commit it to memory. I found the step-by-step walkthrough somewhat cluttered, but the step-by-step approach itself is a plus. Overall it's fine, though it could be better.
Greedy Algorithms
The next section is devoted to "greedy algorithms". This section is interesting because it uses sets (java.util.Set); finally, you can see what they might be needed for. As input we take the list of states:

Set<String> statesNeeded = new HashSet<>();
statesNeeded.addAll(Arrays.asList("mt", "wa", "or", "id", "nv", "ut", "ca", "az"));
As well as a list of radio stations covering some of these states:
Map<String, Set<String>> stations = new HashMap<>();
stations.put("kone", new HashSet<>(Arrays.asList("id", "nv", "ut")));
stations.put("ktwo", new HashSet<>(Arrays.asList("wa", "id", "mt")));
stations.put("kthree", new HashSet<>(Arrays.asList("or", "nv", "ca")));
stations.put("kfour", new HashSet<>(Arrays.asList("nv", "ut")));
stations.put("kfive", new HashSet<>(Arrays.asList("ca", "az")));
Further in the book, the algorithm itself is indicated and explained:
Set<String> finalStations = new HashSet<>();
while (!statesNeeded.isEmpty()) {
    String bestStation = null;
    Set<String> statesCovered = new HashSet<>();
    for (String station : stations.keySet()) {
        Set<String> covered = new HashSet<>(statesNeeded);
        covered.retainAll(stations.get(station));
        if (covered.size() > statesCovered.size()) {
            bestStation = station;
            statesCovered = covered;
        }
    }
    statesNeeded.removeAll(statesCovered);
    finalStations.add(bestStation);
}
System.out.println(finalStations);
System.out.println(finalStations);
Dynamic programming
The book also describes problems to which an approach called "dynamic programming" is applied, using the knapsack problem:

List<Thing> things = new ArrayList<>();
things.add(new Thing("guitar", 1, 1500));
things.add(new Thing("tape recorder", 4, 3000));
things.add(new Thing("notebook", 3, 2000));
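The Thing class itself isn't shown above; here is a minimal version of my own, with the field names weight and cost inferred from how the algorithm below uses them:

```java
public class Thing {
    final String name;   // item name, e.g. "guitar"
    final int weight;    // how much knapsack capacity the item takes
    final int cost;      // the item's value

    public Thing(String name, int weight, int cost) {
        this.name = name;
        this.weight = weight;
        this.cost = cost;
    }
}
```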
Now the algorithm itself:
int bagSize = 4;
int[][] cell = new int[things.size()][bagSize];
// Fill the first row without any conditions
for (int i = 0; i < bagSize; i++) {
    cell[0][i] = things.get(0).cost;
}
// Fill the remaining rows
for (int i = 1; i < cell.length; i++) {
    for (int j = 0; j < cell[i].length; j++) {
        // If the item doesn't fit, take the previous maximum
        if (things.get(i).weight > j + 1) {
            cell[i][j] = cell[i - 1][j];
        } else {
            // Otherwise, the current cost + the previous maximum for the remaining capacity
            cell[i][j] = things.get(i).cost;
            if (j + 1 - things.get(i).weight > 0) {
                cell[i][j] += cell[i - 1][j + 1 - things.get(i).weight];
            }
            // Keep whichever is better: taking the item or skipping it
            cell[i][j] = Math.max(cell[i][j], cell[i - 1][j]);
        }
    }
}
System.out.println(Arrays.deepToString(cell));
An interesting problem is also given: finding the most similar words. Interesting, isn't it? Read more here: LongestCommonSubsequence.java
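The article doesn't show the code for that problem, so here is a minimal dynamic-programming sketch of my own (not taken from the linked source) computing the length of the longest common subsequence, using the same table-filling idea as the knapsack:

```java
public class Lcs {
    // cell[i][j] = LCS length of the first i chars of a
    //              and the first j chars of b
    public static int lcsLength(String a, String b) {
        int[][] cell = new int[a.length() + 1][b.length() + 1];
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                if (a.charAt(i - 1) == b.charAt(j - 1)) {
                    // Matching characters extend the previous subsequence
                    cell[i][j] = cell[i - 1][j - 1] + 1;
                } else {
                    // Otherwise take the better of dropping a char from a or from b
                    cell[i][j] = Math.max(cell[i - 1][j], cell[i][j - 1]);
                }
            }
        }
        return cell[a.length()][b.length()];
    }

    public static void main(String[] args) {
        // "fish" and "fosh" share the subsequence "fsh"
        System.out.println(lcsLength("fish", "fosh")); // 3
    }
}
```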