Chaos – Complexity in simple systems (1/3)

Chaos theory, the mysterious staple of pop science, is a fascinating topic. It has the ability to both enlighten and bewilder. It's one in a long line of human endeavors that show us, truly, our limitations in the face of the ever-mystery that is nature.

I first learned about chaos theory in any detail two years ago, while contemplating the limitations of mathematics. Many phenomena can be reduced to a mathematical model which can be reasoned about. The model helps us answer questions and make predictions about how the system will evolve. Questions like where a ball will land if we throw it with a certain velocity, or if and when the moon might disappear from the sky. But other questions have a measure of "difficulty" to them that has so far defied all mathematical reduction. And it appears, though unproven in most cases, that this is a fundamental limitation, not just a result of our ignorance. I'm not convinced of that, but it's an open question, and most informed opinion holds that some questions are just not amenable to the simplifications we can get away with for a ball moving through the air.

And that's where chaos theory comes in. What makes predicting the weather harder than predicting the change in velocity of a thrown ball? What makes the economy intractable even for people who claim it as their expertise, while even a kid can program a computer to "solve" a tic-tac-toe game? Those questions and others led to the eventual development of a field of mathematics called chaos theory.

Chaos theory, in short, is the study of dynamical systems which are sensitive to initial conditions. This definition might seem like a mouthful, but the basic insight that led to the development of the field is the realization that in most* examples where predictability was lost, it was lost because small changes in the way the system was set up led to vastly different results after some time.

* By "most" I allude to the fact that in some instances a problem can be easy to solve in theory, but not in practice. Consider multiplication: it's an easy problem for both humans and computers, but if the numbers involved are large enough, even the most powerful computers couldn't multiply them in any reasonable time, or at all, due to physical limitations.

To illustrate this point we can use a simplified version of the weather prediction example from earlier. Imagine a big box with 10 balloons filled with lighter-than-air gas. Let's put all the balloons in one corner of the box, evenly spaced one centimeter from each other.

When we set the system in motion, the balloons will want to move up, but due to temperature variations in the box and their shape they will have a small amount of horizontal motion as well. They will start colliding, and after 10 seconds each balloon will find itself in a different location in the box. If we run the experiment again, changing nothing*, the outcome will be the same.

If we change the distance between one of the balloons and its neighbor by one tenth of a centimeter, keeping everything else the same, then after 10 seconds the locations of the balloons will be very different than in our first experiment. This seems obvious to most of our intuitions. But it's important to emphasize that there's nothing random about this system. The chaos, the loss of predictability, was not due to not having enough information; it's due to the nature of the system. More specifically, the balloons interacted with each other, each interaction depended on the previous interactions, and small changes built upon each other, until after some time the system was in a very different state.

I chose this example for a reason. It might seem reasonable to assume that we can't predict what the atmosphere will do because it's made up of countless particles, but as the balloon example shows, that's not the case. In a sense the complexity is built in due to the dynamics of the system.

* In the real world changing nothing is rarely possible in systems like the balloon example. But fortunately mathematics does not face this limitation.
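To make this concrete with something smaller than a box of balloons, here is a minimal sketch. It doesn't simulate balloons; instead it uses the logistic map, a classic textbook chaotic system, to show two nearly identical starting points drifting completely apart after a few dozen steps:

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x).
# Two starting points that differ by one part in a billion end up in
# completely different places within about 40 iterations.

def logistic_orbit(x0, r=4.0, steps=40):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.400000000)
b = logistic_orbit(0.400000001)

for step in (0, 10, 20, 30, 40):
    print(step, abs(a[step] - b[step]))
```

The initial difference roughly doubles with every step, which is exactly the "small changes building upon each other" behavior described above: no randomness anywhere, yet no long-term predictability either.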

Chaotic systems have some very interesting properties. One of those is attractors, which I hope to tackle in the next installment of this series.

Using strace to spy on a terminal in real time

I administrate Linux servers from time to time, and a common need in those situations is monitoring what a different terminal is doing, either because of a lost connection to an SSH session that is running a command, or because I want to know what an unrecognized terminal is doing. To that end I use a very useful tool called strace.

strace is a system utility on Linux that can trace all the system calls and signals that a process uses. To monitor a terminal, we can make strace print all occurrences of the read and write system calls that the terminal process makes.

Note: This tutorial assumes you have root privileges; otherwise you won't be able to use strace to attach to and monitor a running process.

To see who’s connected to the machine we can list /dev/pts. The directory contains special character files for each connected pts.

To use strace for the task at hand, we first need to identify the terminal we want to connect to. We can use ps, which lists all the running processes, like so:

alpha@milkyway:~$ ps aux | grep pts
alpha      943  0.0  0.0  44472  3344 pts/4    R+   09:23   0:00 ps aux
alpha      944  0.0  0.0  21536  1048 pts/4    S+   09:23   0:00 grep --color=auto pts
alpha     7978  0.0  0.0  29812  5172 pts/3    Ss+  09:16 0:00 bash
alpha    22659  0.0  0.0  29812  5192 pts/0    Ss   00:13   0:00 bash
root     22877  0.0  0.0  72716  4304 pts/0    S    00:24   0:00 sudo su alchemist
root     22878  0.0  0.0  72248  3884 pts/0    S    00:24   0:00 su alchemist
alchemi+ 22891  0.0  0.0  29808  5100 pts/0    S+   00:24   0:00 bash
alpha    23022  0.0  0.0  29812  5104 pts/1    Ss   00:28   0:00 bash
root     23034  0.0  0.0  72956  4424 pts/1    S    00:28   0:00 sudo su
root     23035  0.0  0.0  72248  3780 pts/1    S    00:28   0:00 su
root     23049  0.0  0.0  28712  3924 pts/1    S+   00:28   0:00 bash
alpha    24095  0.0  0.0  29812  5144 pts/2    Ss+  01:15   0:00 bash

To see all the input and output of the pts we’re interested in we can call strace with the following arguments:

strace -f -s999 -e trace=write -p 7978

-f: This option tells strace to follow the system calls of child processes as well (commands run in the terminal).

-s: By default strace truncates strings; setting the truncate limit higher lets us see more of the output.

-e trace=write: the system calls we want to monitor. For our task we only want to see the output of the terminal. If you want to log what the user types, you can add the read call as well. The format is trace=syscall1,syscall2,..

-p: The pid of the process we’re monitoring.

 

The output is useful, but it would be more useful still if it were human-readable. You're in luck: I wrote a simple wrapper to make the output appear like a normal terminal. You can download it from here.

My First Electronics Project


I wanted to document my introduction to electronics and hardware, and as I've yet to put anything useful on this domain, this seems like an appropriate place to document such a journey. Since the primary purpose of the coming posts is documentation rather than education, the posts will likely contain many errors and amateur design decisions. At the same time I do hope someone might find interesting and/or useful information in my attempts to learn about the subject.

The first project I've decided to undertake is the development of a sonar video camera. Using sound to generate graphical representations of objects seems cool. The basic idea is to create a single-frequency sound wave and use a microphone to listen for the echoes of that sound. Since sound travels at a constant speed, we can use the echo to find the distance to objects in the environment. A single microphone will only give us distance information, so there needs to be at least one more microphone to get the orientation of an object with respect to the sound source. An echo will hit one of the microphones first, which we can use to get its angle.
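As a rough sketch of that idea (the microphone spacing and the timing numbers here are hypothetical, chosen purely for illustration): the distance comes from the round-trip echo time, and the angle comes from the arrival-time difference between the two microphones.

```python
import math

SPEED_OF_SOUND = 340.29  # m/s

def echo_to_distance_angle(t_left, t_right, mic_spacing=0.1):
    """Convert echo arrival times at two microphones (seconds after
    the ping) into a rough (distance, angle) estimate.

    distance: half the round-trip path, using the earlier arrival.
    angle:    from the arrival-time difference; positive means the
              object is closer to the right microphone.
    """
    t_first = min(t_left, t_right)
    distance = SPEED_OF_SOUND * t_first / 2  # echo travels out and back

    # Path-length difference between the mics, clamped to the
    # physical maximum (the mic spacing itself).
    delta = SPEED_OF_SOUND * (t_right - t_left)
    ratio = max(-1.0, min(1.0, delta / mic_spacing))
    angle = math.asin(ratio)
    return distance, angle

# Hypothetical echo: 59.0 ms on the left mic, 59.1 ms on the right.
d, a = echo_to_distance_angle(0.059, 0.0591)
print(d, a)
```

With mics 10 cm apart, a 0.1 ms arrival difference already corresponds to a sizable angle, which hints at how precise the timing hardware will need to be.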

Diagram showing the way the system should work

To generate the images from the echoes, we'll convert the distance/angle data into a single 3-dimensional position vector. We can then take those position vectors and translate them into a 2D image using a projection matrix.
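A minimal sketch of that conversion, assuming a simple pinhole-style projection (one possible choice; the real system may use a full projection matrix):

```python
import math

def to_position(distance, azimuth, elevation):
    """Convert (distance, azimuth, elevation) into a Cartesian
    position vector (x, y, z), with z pointing away from the camera."""
    x = distance * math.cos(elevation) * math.sin(azimuth)
    y = distance * math.sin(elevation)
    z = distance * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)

def project(point, focal_length=1.0):
    """Pinhole projection of a 3D point onto the 2D image plane:
    scale x and y by focal_length / z."""
    x, y, z = point
    return (focal_length * x / z, focal_length * y / z)

# An object 10 m away, 30 degrees to the right, level with the camera.
p = to_position(10.0, math.radians(30), 0.0)
u, v = project(p)
print(p, (u, v))
```

The division by z is what makes distant objects appear closer to the center of the image, the same effect a camera lens produces.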

As I come from a software background, the last part is where I'll begin. It will also make things easier to debug. In some sense the hardware part is putting two microphones and a single-frequency generator in a nice-looking small box, plus a board that can do those calculations fast enough.

Of course that impression might be, and probably is, naive. One issue I did run into when it comes to generating a single-frequency sound is that if this sonar camera is ever to be useful, it shouldn't create an annoying sound when used. Human hearing range is roughly 31 Hz to 19 kHz. A simple solution is to use a frequency above or below that range. The issue stems from animals like cats and bats that live in urban environments, some of which are sensitive to sounds of up to 200 kHz. I've looked into sub-10 Hz frequencies, as they are below the limit for all animals, but they require bigger equipment that will probably not fit within a small case. For now I've decided to concentrate on higher frequencies, probably above 80 kHz. Hopefully there are appropriate microphones that are sensitive to those frequencies. One helpful aspect of this dilemma is that, at least in the first stages of development, the frequency doesn't matter much, and most of the other parts of the system are agnostic as to the frequency used.

Animal Hearing Range Table

In the beginning I want to implement the software in Python. It will be composed of three systems:

1. A debug framework that will take a 3D image and convert it into distance/angle data, the opposite of what the entire system is supposed to do. This will allow for effective testing of the system.

2. A system that simulates hardware events coming from the microphones, converts them into distance/angle data, and then into position vectors.

3. An image generator that is fed the position vectors and generates the final image.
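A sketch of what the debug framework (component 1) might look like: taking a 3D point and working backwards to the distance/angle data the hardware would produce. The function name and the azimuth/elevation convention are my own placeholders, not settled design decisions.

```python
import math

def position_to_distance_angles(point):
    """Run the pipeline in reverse: turn a known 3D point into the
    (distance, azimuth, elevation) data the microphones would yield.
    Feeding this into the rest of the system should reproduce the
    original point, which makes it a convenient end-to-end test."""
    x, y, z = point
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.atan2(x, z)            # angle left/right of center
    elevation = math.asin(y / distance)   # angle above/below center
    return distance, azimuth, elevation

# A point 10 m away, about 30 degrees to the right, level with the camera.
dist, az, el = position_to_distance_angles((5.0, 0.0, 8.66))
print(dist, az, el)
```

Round-tripping synthetic scenes through this inverse and then through the real pipeline gives a test that exercises every stage without any hardware.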

One aspect that I wanted to leave for a later date, but that I think might be useful now, is that of the "loop". The system works by generating a single sound, waiting for the multiple echoes coming from the environment, and converting them into position vectors. Since different objects are at different distances, their echoes take different amounts of time to return. One issue is that we might detect echoes from a previous signal. To prevent that scenario we need to choose a window of time that is dedicated to a single "run" of the signal→detection loop. The length of the loop is limited by the frames per second of the video we want to create: if we wait half a second in the loop, we can only create two video frames. To get a realistic wait time for the loop, we need to choose the maximum distance the camera should be able to detect. Since the echo has to travel out to the object and back, the formula is:

$$loop\ time = {\frac{2 \times distance}{speed\ of\ sound}}$$

For 10 meters, for instance, it comes out as:

$$loop\ time = {\frac{2 \times 10m}{340.29{\frac{m}{s}}}} = 0.059s$$

If we divide 1 by the loop time we get the FPS, which in this case comes out at about 17.
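A quick sketch of that budget in code, for a few candidate maximum distances (the factor of two is there because the sound travels out to the object and back):

```python
SPEED_OF_SOUND = 340.29  # m/s

def loop_time(max_distance):
    """Time to wait for echoes from objects up to max_distance away.
    The echo path is twice the object distance, hence the 2x."""
    return 2 * max_distance / SPEED_OF_SOUND

def max_fps(max_distance):
    """Highest frame rate achievable with one ping per frame."""
    return 1 / loop_time(max_distance)

for d in (1, 5, 10, 20):
    print(f"{d:>2} m -> loop {loop_time(d) * 1000:5.1f} ms, "
          f"{max_fps(d):5.1f} FPS max")
```

The trade-off is stark: halving the maximum detection range doubles the achievable frame rate, so the range is one of the first parameters worth pinning down.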

Sonar Camera Distance Vs FPS Graph

 

This was a general overview of the project. In the interest of keeping this series somewhat self-contained, I'll delve into the different aspects of the project in more detail. The first order of business is explaining some general properties of sound waves and computer sound, to be better positioned to understand how they are useful for this task.