---
layout: post
title: AA100 Project Proposal
artists: Chelly Jin, Jack Turpin, Youjin Chung
date: 2017-05-01 01:52:15 -0700
tags: [Github, AI]
categories: projects
---

# Artificial Intelligence and Non-verbal Communication: Reconfiguring Body Language

## Overall Concept

Exploring the recognition, translation, and conception of body language, with particular interest in the "sigh."

## Data (Still Under Consideration)

- Video data from movie scenes / the internet
- Audio (a minimal probing sketch follows this section)
- Personal data
- Air pressure

We do not yet know what data we would need for disembodied body language.
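Since audio is one candidate data source, here is a minimal sketch of how a recording could be probed for sigh-like moments, assuming a sigh can be crudely approximated as a quiet but breathy (noisy) stretch of sound. The heuristic, the threshold values, and the file name are illustrative assumptions, not part of the proposal.

```python
import numpy as np
import librosa

def sigh_like_frames(path, rms_ceiling=0.05, zcr_floor=0.10):
    """Return timestamps of frames that look sigh-like under a crude heuristic."""
    y, sr = librosa.load(path, sr=22050)
    rms = librosa.feature.rms(y=y)[0]                # per-frame loudness
    zcr = librosa.feature.zero_crossing_rate(y)[0]   # per-frame noisiness
    n = min(len(rms), len(zcr))                      # align frame counts
    # A breathy exhalation tends to be quiet (low RMS) yet noisy (high ZCR).
    mask = (rms[:n] < rms_ceiling) & (zcr[:n] > zcr_floor)
    return librosa.frames_to_time(np.flatnonzero(mask), sr=sr)

# e.g. sigh_like_frames("movie_scene.wav") -> array of candidate timestamps
```

Any real detector would need labeled examples to tune against; this only marks where a sigh could plausibly be.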

## Questions and Considerations

- What is the machine version of “sighing”? (The “sigh” is the impetus of the project.)
- Conception of language: is there a difference between non-verbal and verbal communication?
- Can technology recognize the nuances in body language that we feel make us particularly human?

## Points of Interest

- Translation of human body language
- All methods of expression we have are a result of our bodies, but communication between software does not require haptic feeling.
- Technology has sensory capabilities for humans, not for ‘itself’

## Bigger Picture

- Reflecting on a modern age of communication that leans on emotion and body notation (text, emoji, etc.)
- The interpersonal relationship between humans and AI
- A technology / bot that would communicate back to you with the patterns of body language (a toy sketch follows this list)
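To make the bot idea concrete, here is a toy sketch of the response loop, assuming cues have already been detected on the human side: each cue maps to the bot's own body-language vocabulary of text and emoji, echoing the first bullet above. The cue names and responses are hypothetical placeholders.

```python
import random

# Hypothetical cue-to-gesture vocabulary, purely for illustration.
BODY_LANGUAGE = {
    "sigh":    ["*sighs back*", "😮‍💨"],
    "shrug":   ["*shrugs*", "🤷"],
    "lean_in": ["*leans closer*", "👀"],
}

def respond(cue: str) -> str:
    # Unknown cues fall back to a neutral gesture.
    return random.choice(BODY_LANGUAGE.get(cue, ["*tilts head*"]))

print(respond("sigh"))  # either "*sighs back*" or the emoji variant
```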

## Outcomes

### Success

- The user feels that they have garnered some sort of meaningful interaction

### Failure

- Random or arbitrary interactions