
A Star

Recently I found a nice piece of pseudo code for implementing A Star, after searching through a few lesser or incorrect pseudo code passages:
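The gist of it looks something like the following rough, self-contained C sketch of my own (not the exact passage; names such as AStarGrid, Node, and Pop are hypothetical, and Pop is defined in the next snippet):

    #include <string.h>

    #define GRID_W 20
    #define GRID_H 15
    #define MAX_NODES (GRID_W * GRID_H)

    typedef struct { int x, y, g, f, parent; } Node;

    static Node nodes[MAX_NODES];
    static int open_list[MAX_NODES];
    static int open_count;
    static int state[MAX_NODES]; // 0 = untouched, 1 = open, 2 = closed (just a flag)

    static int Pop(void);        // defined in the next snippet

    static int Heuristic(int x0, int y0, int x1, int y1)
    {
        int dx = x0 > x1 ? x0 - x1 : x1 - x0;
        int dy = y0 > y1 ? y0 - y1 : y1 - y0;
        return dx + dy;          // Manhattan distance
    }

    // Nonzero grid cells are walls. Returns the goal's node index (walk the
    // parent indices backwards to recover the path), or -1 if no path exists.
    int AStarGrid(const int* grid, int sx, int sy, int gx, int gy)
    {
        open_count = 0;
        memset(state, 0, sizeof(state));

        int start = sy * GRID_W + sx;
        Node start_node = { sx, sy, 0, Heuristic(sx, sy, gx, gy), -1 };
        nodes[start] = start_node;
        state[start] = 1;
        open_list[open_count++] = start;

        while (open_count)
        {
            int index = Pop();
            Node* n = nodes + index;
            if (n->x == gx && n->y == gy) return index; // reached the goal
            state[index] = 2;                           // "close" this node with a flag

            static const int dirs[4][2] = { { 1, 0 }, { -1, 0 }, { 0, 1 }, { 0, -1 } };
            for (int i = 0; i < 4; ++i)
            {
                int nx = n->x + dirs[i][0];
                int ny = n->y + dirs[i][1];
                if (nx < 0 || ny < 0 || nx >= GRID_W || ny >= GRID_H) continue;

                int ni = ny * GRID_W + nx;
                if (grid[ni] || state[ni] == 2) continue; // wall, or already closed

                int g = n->g + 1;
                if (state[ni] == 0)
                {
                    // First visit: record the node and push it onto the open list.
                    Node node = { nx, ny, g, g + Heuristic(nx, ny, gx, gy), index };
                    nodes[ni] = node;
                    state[ni] = 1;
                    open_list[open_count++] = ni;
                }
                else if (g < nodes[ni].g)
                {
                    // Found a cheaper route to a node already on the open list.
                    nodes[ni].g = g;
                    nodes[ni].f = g + Heuristic(nx, ny, gx, gy);
                    nodes[ni].parent = index;
                }
            }
        }
        return -1; // open list exhausted, no path
    }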

And here is the Pop function pseudo code (written by me, so it probably has a small error here or there):
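In the same spirit, here is a sketch of a Pop that scans an unsorted open list for the lowest f cost entry and swap-removes it (pairs with the sketch above):

    // Scan the unsorted open list for the entry with the lowest f cost,
    // swap-remove it, and hand back its node index.
    static int Pop(void)
    {
        int best = 0;
        for (int i = 1; i < open_count; ++i)
        {
            if (nodes[open_list[i]].f < nodes[open_list[best]].f)
                best = i;
        }

        int index = open_list[best];
        open_list[best] = open_list[--open_count]; // swap with the last entry
        return index;
    }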

My favorite thing about the pseudo code is that the closed list can be implemented with just a flag. The open list becomes the most complicated part of the algorithm. Should the open list be a sorted array, an unsorted array, or a binary heap? The answer largely depends on how much memory you need to traverse.

If only a small portion of memory needs to be searched, all dynamic memory can be allocated up-front in one shot. Otherwise bits of memory should probably be allocated up-front, and more as necessary during the algorithm’s run.

Just yesterday I implemented AStar in C where my full header file looked like:
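I won’t reproduce it verbatim here, but a minimal header in that style might look roughly like this (the names and signature are hypothetical):

    #ifndef ASTAR_H
    #define ASTAR_H

    // Nonzero cells in grid are treated as walls. Writes the path into
    // out_path as (x, y) pairs and returns its length, or 0 if no path exists.
    int AStar(const int* grid, int width, int height,
              int start_x, int start_y, int goal_x, int goal_y,
              int* out_path, int max_path_length);

    #endif // ASTAR_H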

In my internal C file I only have < 150 lines of code, including some file-scope variables and math functions. The implementation was nearly a one to one transcription of the above pseudo code (so those who are recursion impaired like myself shouldn’t have any problems). This style may not be thread-safe, but hey, most AI related code can only be run in serial anyway. My grid size was maxed out at 20×15, so pre-allocating memory for your entire search area may not be as practical for you as it was for me.

Still, I hope this post can provide some bits of intuition that are useful to someone.


The Concept of Polymorphism

For a while I thought of many computer science terms as useless hype. For example, as Jon Blow said in one of his recent videos, some terms can be vague or hard to recollect. Buzzwords.

Polymorphism, semantics, encapsulation, instantiate, interface, API, functional, abstraction, indirection, blah blah blah. Reading too many of these words makes my eyes glaze over.

Compilers: Principles, Techniques, and Tools by Aho, Sethi, and Ullman has a keen description of polymorphism; at least it really stuck with me. Here with me I have the 1988 reprint, so it actually contains C code (unlike the new “shiny” version), and in it the authors state:

“An ordinary procedure allows the statements in its body to be executed with arguments of fixed types; each time a polymorphic procedure is called, the statements in its body can be executed with arguments of different types. The term “polymorphic” can also be applied to any piece of code that can be executed with arguments of different types, so we can talk of polymorphic functions and operators as well.

Built-in operators for indexing arrays, applying functions, and manipulating pointers are usually polymorphic because they are not restricted to a particular kind of array, function, or pointer. For example, the C reference manual states about the pointer operator &: “If the type of the operand is ‘…’, the type of the result is ‘pointer to …’.” Since any type can be substituted for “…” the operator & in C is polymorphic.”

So, some of these buzzwords are just misrepresented.

It seems the idea of polymorphism is useful for expressing, in code, algorithms that manipulate data structures. If the thing being transformed is the structure itself, then the data stored within the structure can sometimes be irrelevant to the algorithm.

For example, perhaps we have a math library that runs matrix operations on 32-bit floats. The library is used extensively in a large project. One day some strange numeric bug creeps into a complex algorithm. It would be helpful to be able to swap all 32-bit floats (the data within the “structure”, where the structure is the project) to 64-bit floats to see if more precision affects the numerically-related bug. If polymorphism is supported, perhaps through function overloads or compiler-supported binary operators, this task might be trivial.
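C proper has no function overloads, so this is not polymorphism in the textbook sense, but the spirit of the precision-swap example can be sketched with a single typedef that the whole library is written against (every name below is hypothetical):

    #include <stdio.h>

    //typedef float real;  // the everyday 32-bit build
    typedef double real;   // flip this single line to rerun everything at 64 bits

    typedef struct { real m[4][4]; } Mat4;

    // A stand-in for the rest of the math library: everything is written
    // against "real" instead of a hard-coded float or double.
    real Mat4Trace(const Mat4* a)
    {
        real sum = 0;
        for (int i = 0; i < 4; ++i) sum += a->m[i][i];
        return sum;
    }

    int main(void)
    {
        Mat4 identity = {{{ 1, 0, 0, 0 }, { 0, 1, 0, 0 }, { 0, 0, 1, 0 }, { 0, 0, 0, 1 }}};
        printf("trace = %f\n", (double)Mat4Trace(&identity));
        return 0;
    }

Flipping that one typedef rebuilds the entire library at the other precision, which is about as close as plain C gets to making the swap trivial.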

Another example is the need to compute the height of a tree.
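A height computation only walks the links and never inspects what a node stores, so the same code works for any payload (a small sketch with hypothetical names):

    typedef struct TreeNode
    {
        void* data;              // payload the height computation never inspects
        struct TreeNode* left;
        struct TreeNode* right;
    } TreeNode;

    // Walks only the structure; works no matter what the nodes actually store.
    int TreeHeight(const TreeNode* node)
    {
        if (!node) return 0;
        int left = TreeHeight(node->left);
        int right = TreeHeight(node->right);
        return 1 + (left > right ? left : right);
    }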

Another example is to find a circular dependency in a dependency graph.

Freelist Concept

A freelist is a way of retrieving some kind of resource in an efficient manner. Usually a freelist is used when a memory allocation is needed, but searching for a free block should be fast. Freelists can be used inside of general purpose allocators, or embedded directly into an optimized algorithm.

Let’s say we have an array of elements, where each element is 16 bytes of memory. Our array has 32 elements. The program that the array resides in needs to request 16 byte elements, use them, and later give them back; we have allocation of elements and deallocation of elements.

The order the elements are allocated in is not related in any way to the order of deallocation. In order for deallocation to be a fast operation, the 16 byte element needs to be put into a state such that a future allocation can be handed this element for reuse.

A singly linked list can be used to hold onto all unused and free elements. Since the elements are 16 bytes each this is more than enough memory to store a pointer, or integer index, which points to the next block in the free list. We can use the null pointer, or a -1 index to signify the end of the freelist.

Allocating and deallocating can now look like:
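Something along these lines (a sketch with hypothetical names; the 16 bytes of user data double as the link while an element sits on the freelist):

    #include <stddef.h>

    // Each element is 16 bytes; while an element is free, those same bytes
    // store the link to the next free element.
    typedef union Element
    {
        char data[16];         // the payload handed out to the user
        union Element* next;   // valid only while the element is on the freelist
    } Element;

    static Element* free_head; // head of the singly linked freelist (NULL = empty)

    Element* FreelistAlloc(void)
    {
        if (!free_head) return NULL;   // out of elements (or grow a new page here)
        Element* element = free_head;  // pop the front of the freelist
        free_head = element->next;
        return element;
    }

    void FreelistFree(Element* element)
    {
        element->next = free_head;     // push the element back onto the freelist
        free_head = element;
    }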

Setting up the memory will take some work. Each element needs to be linked together somehow, like through a pointer or integer index. If no more elements are available then more arrays of size 32 can be allocated; this means our memory is being managed in the style of a “paged allocator”, where each array can be thought of as a page.
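A sketch of that setup, reusing the hypothetical Element and free_head from above: link a fresh page of 32 elements together and push the whole page onto the freelist.

    #include <stdlib.h>

    #define ELEMENTS_PER_PAGE 32

    // Allocate one page of 32 elements, chain them together, and push the
    // whole page onto the freelist. Returns 0 if the page allocation fails.
    // (A real allocator would also remember the page pointer to free it later.)
    int FreelistAddPage(void)
    {
        Element* page = (Element*)malloc(sizeof(Element) * ELEMENTS_PER_PAGE);
        if (!page) return 0;

        // Chain each element to the next one in the page.
        for (int i = 0; i < ELEMENTS_PER_PAGE - 1; ++i)
            page[i].next = page + i + 1;

        // The last element points at whatever was free before this page.
        page[ELEMENTS_PER_PAGE - 1].next = free_head;
        free_head = page;
        return 1;
    }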

The freelist is an important concept that can be embedded ad hoc into more complex algorithms. Oftentimes it is important for little pieces of software to expose a very tiny C-like interface, usually just a function or two. Having each piece of software self-contain its own internal freelist is one way to achieve such a simple interface.

Example of Hiding the Freelist

For example, say we are computing the convex hull of a point set with the Quick Hull algorithm. The hypothetical algorithm exposes an interface like this:
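Something like this; QHull and ComputeMemoryBound are the two functions in question, while the exact parameters below are just my guess:

    #include <stddef.h>

    typedef struct { float x, y, z; } Vec3;

    // Worst-case number of bytes QHull needs for point_count input points.
    size_t ComputeMemoryBound(int point_count);

    // Computes the convex hull of the point set. memory must be at least
    // ComputeMemoryBound(point_count) bytes; QHull allocates nothing itself.
    int QHull(const Vec3* points, int point_count, void* memory, size_t memory_size);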

This QHull function does no explicit memory allocation and forces the user to allocate an appropriate amount of memory to work with. The bound on this memory (how big it needs to be for the algorithm’s worst case scenario) is calculated by the ComputeMemoryBound function.

Inside of QHull the hull is repeatedly expanded and many new faces are allocated. These faces are held on a freelist. Once new faces are made, old ones are deleted, and these deleted faces are pushed back onto the freelist. This continues until the algorithm concludes, and the user does not need to know about the details of the embedded memory management of the freelist.

Convex hull about to expand to point P. The white faces will be deleted. The see-through faces will be allocated.

A convex hull fully expanded to point P. All old faces were deleted.

The above images were found at this address: http://www.eecs.tufts.edu/~mhorn01/comp163/algorithm.html