I did a PComp and ICM combined final project called "Fortune on Hand". The idea came from Chinese Daoist palmistry, as well as the book "Dream of the Red Chamber".
I first tried to combine two OpenCV techniques, blob detection and edge detection, but it didn't really work out and was very difficult to debug. I had no real idea of how to actually get the palm lines analyzed, and I also had to consider issues like hand size, skin tone, etc. That's really hard.
So I came up with a backup plan.
Here is a layout of the page.
I fake the palm reading somewhat: it analyzes pixels rather than the real shape and lines on the palm, so the pictures above are actually analyzing my selfie. The type of each specific zone is determined by the number of pixels in it.
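A minimal sketch of that pixel-counting trick might look like this. This is plain JavaScript with made-up zone bounds, thresholds, and function names for illustration only; the real sketch would read p5.js's `pixels[]` array after `loadPixels()`:

```javascript
// Hypothetical pixel-counting fortune logic.
// `pixels` is a flat RGBA array (like p5.js's pixels[]);
// the zone bounds and thresholds here are invented for illustration.

function countBrightPixels(pixels, imgWidth, zone, threshold) {
  let count = 0;
  for (let y = zone.y; y < zone.y + zone.h; y++) {
    for (let x = zone.x; x < zone.x + zone.w; x++) {
      const i = 4 * (y * imgWidth + x);
      // Simple brightness: average of the R, G, B channels.
      const brightness = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
      if (brightness > threshold) count++;
    }
  }
  return count;
}

// Map a zone's pixel count to one of a few fortune "types".
function zoneType(count, bucketSize, numTypes) {
  return Math.floor(count / bucketSize) % numTypes;
}
```

So whether the zone gets a lucky or unlucky poem depends only on how many bright pixels happen to land in it, which is why a selfie works just as well as a palm.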
All the poem lines are from Sensō-ji omikuji, which translate ancient Chinese poems into English. I generated the images of the poem lines from the sketch and organized them into different zones:
Here are the poem arrays I made in the code.
Since I'm dealing with a huge number of poem lines, naming them carefully was very important so I wouldn't mix them up. I truly learned a lot from that.
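A rough illustration of how arrays like that could be organized by zone. The zone names and poem lines below are placeholders I made up, not the actual omikuji text or the project's real variable names:

```javascript
// Hypothetical organization of poem lines by palm zone.
// Zone names and lines are placeholders, not the real omikuji text.
const poems = {
  lifeZone: [
    "placeholder life line 1",
    "placeholder life line 2",
  ],
  heartZone: [
    "placeholder heart line 1",
    "placeholder heart line 2",
  ],
};

// Pick one line for a zone, indexed by the zone's pixel-derived type.
function pickLine(zoneName, typeIndex) {
  const lines = poems[zoneName];
  return lines[typeIndex % lines.length];
}
```

Grouping the lines under one named object per zone like this keeps each zone's poems together, so a consistent naming scheme does most of the bookkeeping.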
Then I also built the physical box for the palm reading.
This is so far the most interesting thing I've ever done. I would never have seen myself doing this before ITP. As someone who had never coded before ITP, thank you so much Allison for taking me into the world of coding!