Yelin Kim

Amazon Lab126


Multimodal Emotion Recognition

The first set of studies lays the foundation for, and provides the central motivation of, our research. We find that modeling the complex non-linear interactions between audio and visual emotion expressions is crucial, and that dynamic emotion patterns can be exploited for emotion recognition.


Contact

Email: lynnyelin(last name) (at) gmail.com

Talks

Prof. Kim presented our work at Advanced Data Analytics Lightning Talk! https://t.co/diTqgXkvTx

— Inspire Lab (@LabYelinkim) February 20, 2017


Upcoming Travel

University of Rochester HCI, Dec 4, 2018, Rochester, NY (invited speaker)
RIT AI Seminar Series, Dec 3, 2018, Rochester, NY (invited speaker)
ACM ICMI, Oct 16-20, 2018, Boulder, CO (doctoral consortium co-chair)
USC EE Seminar, Oct 11, 2018, Los Angeles, CA (invited speaker)
