
Research

Multimodal Datasets

Eligible: Undergraduate and Master's students

Mentor: Amir Zadeh

Description: We are interested in building novel multimodal datasets, including but not limited to multimodal question-answering datasets and multimodal language datasets. We are also interested in advancing the CMU Multimodal SDK, a software toolkit for multimodal machine learning research. Students working in this area will have a strong chance of co-authoring and publishing their work at top machine learning conferences.
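To give a rough sense of the kind of tooling involved, the sketch below shows how a dataset might be fetched and aligned with the CMU Multimodal SDK. The mmdatasdk module and the cmu_mosei recipe names follow the SDK's public README; this is an illustrative assumption, and exact identifiers may differ in the current release.

    # Minimal sketch of loading a dataset with the CMU Multimodal SDK (mmsdk).
    # Assumes the mmdatasdk module and the cmu_mosei standard recipe described
    # in the SDK's README; names may change between releases.
    from mmsdk import mmdatasdk

    # The recipe is a dict mapping modality names to computational-sequence URLs.
    print("Modalities in recipe:", list(mmdatasdk.cmu_mosei.highlevel.keys()))

    # Download the high-level CMU-MOSEI computational sequences into ./cmumosei/
    dataset = mmdatasdk.mmdataset(mmdatasdk.cmu_mosei.highlevel, 'cmumosei/')

    # Align all modalities to the word-level GloVe vector sequence
    dataset.align('glove_vectors')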

Skills/Experience: This position is open to Master's and undergraduate students with knowledge of web programming in Python (e.g., Django). Prior exposure to machine learning is not required but would be helpful.

Contact: Interested students should send an email to Amir Zadeh with their CV and a brief description of their relevant experience.