---
layout: post
title: AA100 Project Proposal
artists: Chelly Jin, Jack Turpin, Youjin Chung
date: 2017-05-01 01:52:15 -0700
tags:
categories:
---
Exploring the recognition, translation, and conception of body language, with particular interest in the "sigh."
- Video data from movie scenes / internet
- Audio
- Personal data
- Air pressure
- What is the machine version of “sighing”? (the "sigh" is the impetus of the project; a rough audio sketch follows this list)
- Conception of language: Is there a difference between non-verbal and verbal communication?
- Can technology recognize the nuances in body language that we feel make us particularly human?
- Translation of human body language
- All the methods of expression we have are a result of our bodies, yet communication between software requires no haptic sense.
- Technology has sensory capabilities for humans, but not for ‘itself’
- Reflecting on a modern age of communication that leans on emotional and bodily notation (text, emojis, etc.)
- The interpersonal relationship between human and AI
- A technology/bot that would communicate back to you using the patterns of body language
- Would the user feel that they have garnered some sort of meaningful interaction?
- Or would the interactions feel random or arbitrary?
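
To make the "machine version of sighing" question slightly more concrete, here is a minimal sketch, assuming only NumPy and a synthetic breath-shaped test signal rather than any real movie, audio, or air-pressure data from the project: it tracks the short-time loudness of a waveform and flags a peak that fades slowly as "sigh-like." The signal shape, thresholds, and function names are all illustrative assumptions, not a fixed design.

```python
import numpy as np

# Toy stand-in for a recorded sigh (an assumption, not project data):
# breath-like noise whose loudness rises quickly and fades slowly.
sr = 16000                                   # sample rate in Hz
t = np.linspace(0, 3, 3 * sr)
envelope = np.exp(-np.abs(t - 0.5) / np.where(t < 0.5, 0.05, 0.6))
signal = envelope * np.random.randn(len(t)) * 0.1

def frame_energy(x, frame=1024, hop=512):
    """Short-time RMS loudness of the signal, one value per frame."""
    return np.array([np.sqrt(np.mean(x[i:i + frame] ** 2))
                     for i in range(0, len(x) - frame, hop)])

def looks_like_sigh(energy, hop_s=512 / 16000, min_decay_s=1.0):
    """Crude heuristic: a loudness peak that fades gradually over at
    least `min_decay_s` seconds is flagged as 'sigh-like'."""
    peak = int(np.argmax(energy))
    decay_frames = int(min_decay_s / hop_s)
    if peak + decay_frames >= len(energy):
        return False
    tail = energy[peak:peak + decay_frames]
    faded = tail[-1] < 0.5 * tail[0]          # clearly quieter at the end
    gradual = np.all(tail > 0.1 * tail[0])    # but no abrupt cut-off
    return bool(faded and gradual)

print("sigh-like event detected:", looks_like_sigh(frame_energy(signal)))
```

A heuristic this crude would miss most of the nuance the proposal cares about, which is partly the point: the gap between what a threshold on loudness can detect and what a human reads in a sigh is the space the project wants to explore.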