Fortune on Hand was a personal project created at the end of 2016 as the final for both Physical Computing and Computational Media.
Here is a detailed blog of how I made it.
When I was 15 I read one of the greatest Chinese novels, Dream of the Red Mansion (红楼梦). It was published in 1791, during the Qing dynasty, and has a history of more than 200 years; it is a long book with 120 chapters in total.
It had a great impact on my life. I love the ancient, very poetic Chinese writing style as well as the story. The one chapter that kept me thinking over and over again is Chapter 5, in which the main character Baoyu has a dream in a fancy palace in the sky. There he sees a book with the names of all the main girl characters, and next to each name there is a poem. The poems are very vague, like many other Asian poems, talking about rivers and fallen flowers, but they are actually indications of each girl's fate. To people who don't try to understand them, they are just simple poems.
I also really like the idea of using poems for fate telling, because they are vague and people can interpret them in their own way.
I was also interested in turning some unique human input into generated output.
I thought about fingerprints and face reading, but those would feel a little interrupting and invasive to users.
What is more unique than our palm lines? After all, there is a rich history of palm reading.
After I had this idea I was so excited I couldn’t sleep.
So I had this idea in mind: I wanted to make a palm reader that prints out a poem as the output of a "fate reading".
Here is a simple sketch I drew to help me understand the logic flow behind this installation.
There needs to be a webcam that captures the palm, and then the code maps the capture to a poem.
All the poems came from Sensō-ji's Chinese poems. I categorized them as "good" (吉) or "bad" (凶), and mapped them to different palm lines according to a Chinese palmistry book.
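The two poem pools can be sketched like this. This is a minimal illustration in Python; the poem texts below are placeholders I made up, not the actual Sensō-ji poems:

```python
import random

# Poems grouped by fortune category, following the 吉 (good) / 凶 (bad) split.
# The texts here are placeholders, not the real Sensō-ji poems.
POEMS = {
    "good": [  # 吉
        "Spring rivers carry the petals gently home.",
        "The moon rises full above a quiet garden.",
    ],
    "bad": [  # 凶
        "Fallen flowers drift far from the branch.",
        "Mist hides the road; the traveler waits.",
    ],
}

def pick_poem(category):
    """Return a random poem from the given fortune category."""
    return random.choice(POEMS[category])

print(pick_poem("good"))
```

The vagueness of the poems does the rest of the work: any poem from the right pool reads as a plausible fortune.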
Thanks to a GitHub page by Kyle McDonald, I tried working with edge detection, and it is very good at detecting edges, of course (code here). Since it was extremely hard to identify individual palm lines, the logic behind the code is to divide the canvas into 3 parts and count the palm pixels in each area: the more pixels there are, the "better" the poem.
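The counting logic above can be sketched like this. Assume the edge detector has already produced a binary image (1 = detected palm-line pixel), represented here as a nested list; the threshold value is an illustrative guess, not the one I actually used:

```python
def score_palm(edge_image):
    """Split the canvas into 3 vertical thirds and count the
    edge pixels in each; returns a list of 3 counts."""
    width = len(edge_image[0])
    third = width // 3
    counts = []
    for part in range(3):
        x0 = part * third
        x1 = width if part == 2 else x0 + third  # last third takes the remainder
        counts.append(sum(row[x] for row in edge_image for x in range(x0, x1)))
    return counts

def fortune_category(counts, threshold=50):
    """More detected palm-line pixels -> a 'better' poem."""
    return "good" if sum(counts) >= threshold else "bad"

# Tiny 4x6 binary image: 1 marks a detected palm-line pixel.
img = [
    [0, 1, 0, 0, 1, 0],
    [1, 1, 0, 1, 1, 0],
    [0, 0, 1, 0, 0, 1],
    [0, 1, 0, 0, 1, 1],
]
counts = score_palm(img)
print(counts)                               # -> [4, 2, 5]
print(fortune_category(counts, threshold=6))  # -> good
```

In the real installation the counts come from the edge-detected webcam frame instead of a hand-written list, but the mapping from pixel count to poem category is the same idea.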
(That’s how well the palm lines can be captured with the right amount and angle of light.)
(Yes, I have simian lines on both my hands.)
However, with the light I got from the ER, I can somewhat detect the palm lines, but the results vary a lot. I have a very unusual palm line that is really deep and runs straight across the whole palm. Only 5%-10% of people have this, and luckily it makes the lines somewhat easier for the cam to capture.
What I will do next is get a cam with an LED (from the ER), try to capture images, and then find the right lighting.
I got the size of the hands by having 6 people draw their hands on paper.
I built a box to test the interaction.
Here is a really rough mockup.
My design idea came from Daoism and ancient Chinese culture.
For the printer hole's graphic design, I used the Buddha's hand.
For the lid design, I wanted it to look mysterious and ancient, yet high tech.
At the same time, I hacked a webcam to put inside my box. Here are more details of making the prototypes and mounting the printer.