Cats

It’s interesting what you can see when you go out. I had been staying in for the last few days, mostly in an effort to refine and deepen my understanding of the C language (with a little C++ on the side). But today was sunny and quite warm by Pac NW standards (58 F), so I decided that I would head down to the waterfront.

Only before I could get there, I met a cat. Not a hipster cat – like, a real cat, who was calling me. With that, the cat and I commenced to have a conversation.

Now, how exactly does one go about conversing with a cat? I suppose that with the help of psychedelics, it’s probably pretty easy. But under ordinary conditions, communicating with a cat can be subtle. The vocabulary that we share is limited, and there’s still a lot to be worked out. But I can say that as one spends time with cats, it’s possible to become more and more intimately familiar with their minds and thoughts. And, eventually, you can have a pretty good idea, most of the time, of what they’re trying to tell you.

In this case, it was clear that the cat was sad, lonely, and hungry. I know this, in part, because we both share some context. Unfortunately, the person who was probably the cat’s best human friend had died a few days before. I knew him also, but I don’t think we were nearly as close as he and the cat were. Furthermore, it’s not at all clear if anyone else was feeding this cat, or the rest of his/her entourage that appeared to live in and around the house.

I won’t say if I actually fed the cat, as I’m not sure how the neighbors might feel about that. In our community there’s a division between what I call the cat people and the bunny people. Maybe not quite as bad as the Natives and the Dead Rabbits in “Gangs of New York,” but there’s still some disagreement over the proper response to the presence and activities of cats, especially the ones who seem like they may be trying to eke out a living in the yards, streets and alleyways of our neighborhood.

As for me, I feel like I can appreciate both sides. On the one hand, nobody likes to see little birds or bunnies turned into cat food. On the other hand, I also understand that, nature having made them this way, cats have no choice but to kill and consume what they can. Cats don’t eat tofu. Humans and cats have been coexisting for possibly around 10,000 years; but the fashion of keeping cats as pets and feeding them has probably only been going on for less than 100 years. And they know this is not something they can depend on forever.

端的只今の一念より外はこれなく候

This is a line from Hagakure by Yamamoto Tsunetomo. The Hagakure is usually referred to as The Way of the Samurai, or something like that, but a literal translation would be more like, “In the Shadow of Leaves”, which I think is more beautiful.

端的只今の
一念より外は
これなく候

Tanteki tadaima no
ichinen yori hoka wa
kore naku sōrō

Which means, more or less:

There is surely nothing
other than the single purpose
of the present moment.

(Note that the words have been rearranged for grammatical reasons.)

Even though it’s not 5-7-5, it still has the feeling and rhythm of a haiku for me, and it really distills very nicely, I think, the spirit of zen.

The Hagakure, BTW, is the book that Ghost Dog reads (and gives to the little girl) in Jim Jarmusch’s film of the same name.

Wine

I don’t drink wine anymore; having finished cat-assisted rehab, I now just drink lots of coffee. But I’m still interested in the subject and was particularly curious about a data set that I happened to encounter in my efforts to learn more about Python and statistics.

I’ll try to provide more details later on, but I was so excited about these results that I decided to go ahead and put up a rough version ASAP.

Turns out that with a little Pythonification and linear regression, you can find out some interesting things about wine.

This is what you get when you look at something like 130k(!) wines:

[Figure: distribution of wine scores (left) and a linear fit of price vs. score (right)]

The wines are scored by a professional wine-taster, or sommelier, or something. The first graph shows the distribution of scores, which, to my untrained eye, looks suspiciously like a so-called “normal” (or Gaussian) curve. The graph next to it is an attempt to find the best fit for a linear relationship (if there is one) between the scores and the prices of these fine products. Turns out the lowest price is $4 (like what you might get at Trader Joe’s) and the highest is in excess of $3000 (Yikes!).

But look at that graph – the numbers are all over the place, and there doesn’t seem to be a well-defined relationship at all, at least not at first. After looking at it a couple of times, I thought to myself, “well, what about a logarithmic transformation – why not!”

[Figure: linear fit of log(price) vs. score]

So there you have it. There’s a bit of spread, but to me it certainly does look like the prices go up exponentially as the wine-taster’s scores increase. Something to think about, I guess.

(Note: If you check out the links above, be advised that the code provided on the web pages doesn’t work the way it’s written, or at least it didn’t work for me. Maybe because I’m still using Python 2.7. I had to do a little tweaking, as well as learn a little more about the use of the numerical and graphing modules to get it going.)
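To give a flavor of what’s involved, here’s a minimal sketch of the kind of analysis, assuming the reviews live in a CSV with ‘points’ and ‘price’ columns (the file name below is just a placeholder, and your column names may differ):

#!/usr/bin/env python

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Load the wine reviews and drop rows missing a score or a price.
# 'winemag-data-130k.csv' is a placeholder name; point it at your copy.
df = pd.read_csv('winemag-data-130k.csv').dropna(subset=['points', 'price'])

# Distribution of scores.
plt.hist(df['points'], bins=20)
plt.xlabel('score')
plt.ylabel('count')
plt.show()

# Least-squares (linear regression) fit of log(price) against score.
slope, intercept = np.polyfit(df['points'], np.log(df['price']), 1)
xs = np.linspace(df['points'].min(), df['points'].max(), 100)
plt.scatter(df['points'], np.log(df['price']), s=2)
plt.plot(xs, slope * xs + intercept, color='red')
plt.xlabel('score')
plt.ylabel('log(price)')
plt.show()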

Cheers, everyone.

Seattle Weather, revisited

Following up a little on the previous post about Seattle weather conditions and Markov chain modeling, I decided to write a quick Python script that calculates the probability vectors for each day’s weather.

Here it is:


#!/usr/bin/env python

import numpy as np

# Transition probability matrix: column j holds the probabilities of
# moving from state j (Cloudy, Rainy, Sunny) to each state tomorrow.
M = np.array([[0.80, 0.65, 0.60],
              [0.10, 0.25, 0.10],
              [0.10, 0.10, 0.30]])

# Initial state vector x_1: today is definitely cloudy.
x = np.array([[1.0], [0.0], [0.0]])

# Print x_t for each day, then step the system forward: x_{t+1} = M x_t.
for t in range(1, 15):
    print('t =', t)
    print('x(%d) =' % t)
    print(x[0])
    print(x[1])
    print(x[2])
    print()
    x = np.dot(M, x)

As you can see, not much to it, really.

Recall that x_t is a vector that contains the current (at time t) probabilities of the weather being cloudy (x(1)), rainy (x(2)), or sunny (x(3)). As each day passes, x_{t + 1} = Mx_t, where M is the probability transition matrix for the three states (Cloudy/State 1, Rainy/State 2, Sunny/State 3).

If you iterate this equation multiple times, you get some interesting results:

[Script output: probability vectors for the first few days (MarkovPage1)]

These are the same results that we got by hand before. Now let's do a few more iterations and see what we get:

[Script output: the next several iterations (MarkovPage2)]

And finally:

[Script output: the final iterations, where the vector settles down (MarkovPage3)]

This is really very exciting; notice how after about 12 iterations, we have a vector that represents stable, long-term probabilities of approximately 76% cloudy days, 12% rainy days, and 12% sunny days. I don't know if that's true or not, but it certainly feels about right! (Of course, with climate change, this just might all be a thing of the past, which would be sad; I like it cloudy.)
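If you’d rather not iterate at all, one quick way to double-check that long-run vector (this isn’t part of the script above, just a sanity check) is to note that it should satisfy Mx = x, i.e. it’s the eigenvector of M for eigenvalue 1, rescaled so its entries sum to 1:

import numpy as np

M = np.array([[0.80, 0.65, 0.60],
              [0.10, 0.25, 0.10],
              [0.10, 0.10, 0.30]])

# The long-run (stationary) vector pi satisfies M pi = pi, so it is the
# eigenvector of M for eigenvalue 1, normalized so its entries sum to 1.
vals, vecs = np.linalg.eig(M)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()
print(pi)  # roughly [0.757 0.118 0.125]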

Nozarashi o

野ざらしを
心に風の
しむ身かな

Nozarashi o
Kokoro ni kaze no
Shimu mi kana.

– Bashō

My son suggested that I translate this haiku by Bashō. I think he thinks I know more Japanese than I do. But having spent a lot of time with this poem, I’ll give it a try.

It is contained in a book that was published in or around 1685, 野ざらし紀行 (Nozarashi kikō), the title of which is translated in various ways, but my favorite is Journal of Bleached Bones in a Field, a pdf of which can be downloaded here for your reading pleasure.

Basho himself was a very interesting fellow. Born 松尾 金作 (Matsuo Kinsaku), he later changed his name to 松尾 芭蕉 (Matsuo Bashō). Apparently he called himself “Bashō,” which means “banana tree,” after a tree that grew in the yard of his hut, where he lived more-or-less as a hermit at times. He was a restless person and traveled a lot. At one point his hut burned down; so for a while, at least, he was effectively homeless. He was a practitioner of Zen, and lived a monk-like existence, but as far as I know he was never actually formally a Buddhist monk.

Here’s a picture of Bashō, presumably doing what he did best:

[Image: Bashō, as drawn by Hokusai (basho_by_hokusai-small.jpg)]

There are already a number of existing translations of the haiku above, and I am by no means a scholar of Japanese. But I like the way these particular words are arranged, and I find them intriguing.

野ざらしを
Nozarashi o

“Nozarashi” is often translated as “Bleached bones in a field,” because the kanji 野 refers to a “field,” and ざらし or 晒 refers to something “bleached” (presumably by the sun) or exposed. The particle を (pronounced “o”) indicates direction, in a way similar to the preposition “at” or “to.” This is the place where something is happening, the place Basho is seeing or thinking of. So far we have a field and a condition of having been bleached. Where are the bones? As a Zen teacher might say, “they are in your mind.”

心に
Kokoro ni

“Kokoro” is “heart.” に, pronounced “ni,” is another particle indicating direction; in this case it can be thought of as meaning “in” or perhaps also “to” (the way を can). According to one particular resource, you can think of を as indicating the direction of action, whereas に is more for the direction of motion. I think を may have more to do with location in this case. Note that a particle in Japanese comes after the word it relates to, where the equivalent in English would typically come before it.

風の
kaze no

“Kaze” means “spirit” or “wind” (same thing), as in 神風, pronounced “kamikaze,” meaning, “divine wind.” Most people probably associate the kamikaze with suicide pilots during WWII, but this expression was previously used to refer to the typhoons which destroyed Mongol fleets in 1274, and again in 1281, on both occasions saving Japan from invasion. The Japanese were truly under the protection of the gods!

I should mention that I have puzzled over the particle の, pronounced “no,” which usually acts like an apostrophe “s,” indicating possession, but can be very flexible. It can refer to other things, like location, and sometimes it just seems to connect things together. I’m honestly not sure what it means in this case, or if it really means anything. I have suspected that it might be there for rhythm, or just to provide a syllable.

しむ身かな
Shimu mi kana

しむ, “shimu” is a verb that means “pierce.” 身, pronounced “mi” in this instance (Japanese words can often be pronounced a lot of different ways, depending on how they are used) means “body.” The expression かな, “kana,” is something that, as I understand it, is used to represent what we would use an exclamation point for in English. (Kind of like the way よ, “yo,” is used at the end of a Japanese sentence for emphasis.) It also conveniently adds 2 syllables, which helps with the 5-7-5 structure of the poem.

I feel like if I were going to translate this in a way that was closest, perhaps, to the original intent of the author, I might say something like this:

Bleached bones in a field;
The wind pierces me to the heart.

Or, if I were going to expand it a little and try to make it more like a real 5-7-5 haiku in English:

Bleached bones in a field;
The wind pierces my body,
And it chills my heart.

This is a pretty minimalist interpretation, but I think of Haiku as a pretty minimalist form. So there you have it.

A Model of Seattle Weather

[Image: TeXmarkov1 (states and transitions of the weather model)]

My discussion of Markov chains is based largely on what I was able to learn by studying two resources, a paper by Michael Zhang (http://www.eecs70.org/static/resources/markov-chains-michael.pdf) and a book by Stephen Boyd and Lieven Vandenberghe, Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares (http://vmls-book.stanford.edu/vmls.pdf), both of which are freely available on the web. What I found most helpful about the Boyd and Vandenberghe book was the way it explained Markov chains as an example of a linear dynamical system, which hadn’t occurred to me before.

Let’s assume a simple (but maybe reasonably accurate) model of Seattle weather. (I’m trying to emphasize how cloudy it is in Seattle because I really like it here, and I don’t want anyone else to come. Please stay away; by the way, it gets cold here, too!) If it’s cloudy (which it is most of the time), then there’s an 80% (0.80) probability that it will be cloudy again tomorrow, a 0.10 probability that it will rain, and a 0.10 probability that it will be sunny. Note that we are considering probabilities for weather events the following day – today is considered a state, we’ll call it x_1, and the new state tomorrow we’ll write as x_2.

Here is a summary of the probabilities of the various possible transitions between states:

From Cloudy: 0.80 cloudy, 0.10 rainy, 0.10 sunny
From Rainy:  0.65 cloudy, 0.25 rainy, 0.10 sunny
From Sunny:  0.60 cloudy, 0.10 rainy, 0.30 sunny

As you can see, it’s very helpful to have this in the form of a graph, with nodes representing each of the possible states, and directed edges representing the transitions between states. If you go back and take a look at the image above, as well as the list of probabilities, you’ll notice that each state can do one of 3 things: it can remain in the same state the following day, or it can go to one of the other two states. Note that the probabilities of these three transitions for a state sum to 1 – in other words, it has to do something, and, in this model, it can only do one of three things.

We can arrange these probability values more concisely in the form of a matrix, which we will refer to as a transition probability matrix:

P = \begin{pmatrix} 0.80 & 0.65 & 0.60 \\ 0.10 & 0.25 & 0.10 \\ 0.10 & 0.10 & 0.30 \end{pmatrix}

This matrix is multiplied by the state vector x_t, where t = 1 initially, followed by t = 2, 3, ... on each successive day. The vector x_t contains the current probabilities of the 3 possible states of the system, x(1), x(2), x(3):

x_t = \begin{pmatrix} x(1) \\ x(2) \\ x(3) \end{pmatrix}

Notice that each column of P represents the probabilities of making the transition from a particular state to each of the possible new states: column 1 gives the probabilities of transitioning out of state 1 (the Cloudy state, x(1)), column 2 out of state 2 (Rainy, x(2)), and column 3 out of state 3 (Sunny, x(3)).
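A quick way to check that claim about the columns (just a sanity check in Python, not part of the derivation) is to sum each column of P and confirm that every column adds up to 1:

import numpy as np

P = np.array([[0.80, 0.65, 0.60],
              [0.10, 0.25, 0.10],
              [0.10, 0.10, 0.30]])

# Each column holds the probabilities of leaving one state, so each
# column should sum to exactly 1.
print(P.sum(axis=0))  # expected: [1. 1. 1.]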

Let’s multiply them together, using x_1 = (1, 0, 0)^T as our initial state probability vector – notice that in the initial state we know what the conditions are – the weather is cloudy, and so x(1) = 1 and the other states have probability 0. (We are using the transposed row vector in the line above for x_1 for convenience of display, but mathematically we are using the column vector for calculations. It could actually have been done another way, but that would have required that the matrix be transposed as well, and the transposed matrix would need to be multiplied on the right side of the vector.)
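Just to make that parenthetical concrete (this is my own illustration of the convention, not something from the references above), the same first step with a row vector would look like:

x_2^T = x_1^T P^T = \begin{pmatrix} 1 & 0 & 0 \end{pmatrix} \begin{pmatrix} 0.80 & 0.10 & 0.10 \\ 0.65 & 0.25 & 0.10 \\ 0.60 & 0.10 & 0.30 \end{pmatrix} = \begin{pmatrix} 0.80 & 0.10 & 0.10 \end{pmatrix}

Sticking with the column-vector convention, the calculation looks like this: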

x_2 = Px_1 = \begin{pmatrix} 0.80 & 0.65 & 0.60 \\ 0.10 & 0.25 & 0.10 \\ 0.10 & 0.10 & 0.30 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0.80 \\ 0.10 \\ 0.10 \end{pmatrix}

In this case the result is just the probabilities of column 1. But let’s make it more interesting: let’s multiply the transition matrix by this new state vector, which we’ll call x_2, to get x_3, the collection of the probabilities of the various states on the third day.

x_3 = Px_2 = \begin{pmatrix} 0.80 & 0.65 & 0.60 \\ 0.10 & 0.25 & 0.10 \\ 0.10 & 0.10 & 0.30 \end{pmatrix} \begin{pmatrix} 0.80 \\ 0.10 \\ 0.10 \end{pmatrix} = \begin{pmatrix} 0.80 \times 0.80 + 0.65 \times 0.10 + 0.60 \times 0.10 \\ 0.10 \times 0.80 + 0.25 \times 0.10 + 0.10 \times 0.10 \\ 0.10 \times 0.80 + 0.10 \times 0.10 + 0.30 \times 0.10 \end{pmatrix} = \begin{pmatrix} 0.765 \\ 0.115 \\ 0.120 \end{pmatrix}

x_4 = Px_3 = \begin{pmatrix} 0.80 & 0.65 & 0.60 \\ 0.10 & 0.25 & 0.10 \\ 0.10 & 0.10 & 0.30 \end{pmatrix} \begin{pmatrix} 0.765 \\ 0.115 \\ 0.120 \end{pmatrix} = \begin{pmatrix} 0.800 \times 0.765 + 0.650 \times 0.115 + 0.600 \times 0.120 \\ 0.100 \times 0.765 + 0.250 \times 0.115 + 0.100 \times 0.120 \\ 0.100 \times 0.765 + 0.100 \times 0.115 + 0.300 \times 0.120 \end{pmatrix} = \begin{pmatrix} 0.75875 \\ 0.11725 \\ 0.12400 \end{pmatrix}

x_5 = Px_4 = \begin{pmatrix} 0.80 & 0.65 & 0.60 \\ 0.10 & 0.25 & 0.10 \\ 0.10 & 0.10 & 0.30 \end{pmatrix} \begin{pmatrix} 0.75875 \\ 0.11725 \\ 0.12400 \end{pmatrix} = \begin{pmatrix} 0.800 \times 0.75875 + 0.650 \times 0.11725 + 0.600 \times 0.12400 \\ 0.100 \times 0.75875 + 0.250 \times 0.11725 + 0.100 \times 0.12400 \\ 0.100 \times 0.75875 + 0.100 \times 0.11725 + 0.300 \times 0.12400 \end{pmatrix} = \begin{pmatrix} 0.7576125 \\ 0.1175875 \\ 0.1248000 \end{pmatrix}

Notice that even by the second or third day, it still looks like it’s going to be cloudy. (Also notice that all the probabilities in the vector x_5 sum to a total of 1, as they should, which is an indicator that we’ve probably done everything right.)

There are more interesting things we can do with this (like using R or Python to program a simulation that does multiple iterations), and I may try to do some of that for you in a follow-up post.
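For instance, here’s a rough sketch (just an illustration, not the polished follow-up) of how you might let numpy do the repeated multiplication, using x_t = P^(t-1) x_1:

import numpy as np

P = np.array([[0.80, 0.65, 0.60],
              [0.10, 0.25, 0.10],
              [0.10, 0.10, 0.30]])

x1 = np.array([1.0, 0.0, 0.0])  # day 1: we know it's cloudy

# x_t = P^(t-1) x_1; matrix_power handles the repeated multiplication.
for t in (2, 3, 4, 5, 13):
    print(t, np.linalg.matrix_power(P, t - 1).dot(x1))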

Here’s a pdf of this post:
TeXmarkov2