<div dir="ltr"><font size="2"><span style="font-size:11pt"><b>ACM Transactions on <span class="gmail-m_425170959332432269gmail-il">Human</span>-<span class="gmail-m_425170959332432269gmail-il">Robot</span> Interaction (T-HRI) </b><br></span></font><div><font size="2"><span style="font-size:11pt"><br clear="all"></span></font><font size="2"><span style="font-size:11pt">CALL FOR PAPERS <br>
<br>
<font size="2">**<span style="font-size:10pt">Apologies for cross-posting ** <br>
<br>
</span></font></span></font></div><div><font size="2"><span style="font-size:11pt"><font size="2"><span style="font-size:10pt">We are pleased to invite submissions to the journal <span class="gmail-m_425170959332432269gmail-il">special</span> <span class="gmail-m_425170959332432269gmail-il">issue</span>:<br>
</span></font>
</span></font></div><div><div><span style="color:rgb(0,0,0)"><b><h3 id="gmail-m_425170959332432269gmail-m_6213550506968730950gmail-m_8349110421055893240gmail-m_-3032401725268901814gmail--special-issue-on-representation-learning-for-human-and-robot-cognition">"<span class="gmail-m_425170959332432269gmail-il">Representation</span> <span class="gmail-m_425170959332432269gmail-il">Learning</span> for <span class="gmail-m_425170959332432269gmail-il">Human</span> and <span class="gmail-m_425170959332432269gmail-il">Robot</span> <span class="gmail-m_425170959332432269gmail-il">Cognition</span>"</h3></b></span></div><div><b>Webpage: </b><b><b><span style="color:rgb(0,0,255)"><a href="https://thri.acm.org/CFP-RLHRC.cfm" target="_blank">https://thri.acm.org/CFP-RLHRC<wbr>.cfm</a></span></b> </b></div>
<br>
<b><font size="2">I. Aim and Scope </font><br><br></b>Intelligent robots
are rapidly moving to the center of human environments, where they collaborate
with human users in applications that demand high-level cognitive
functions for understanding and learning from human behavior within
different Human-Robot Interaction (HRI) contexts. A persistent challenge
attracting much attention in artificial intelligence is representation
learning, which refers to learning representations of data that allow
relevant features to be extracted efficiently for probabilistic,
non-probabilistic, or connectionist classifiers. This active area of
research spans different fields and applications, including speech
recognition, object recognition, emotion recognition, natural language
processing, and language emergence and development, in addition to
mirroring different human cognitive
processes through appropriate computational modeling.<br><br><span class="gmail-m_425170959332432269gmail-il">Learning</span>
constitutes a basic operation in the human cognitive system and
developmental process: through interaction with the environment,
perceptual information enhances the ability of the sensory system to
respond to external stimuli. This learning process depends on the
optimality of features (representations of data), which allows humans to
make sense of everything they feel, hear, touch, and see in the
environment. Intelligent robots could open the door to uncovering the
underlying mechanisms of representation learning and its associated
cognitive processes, taking a step closer towards robots that
collaborate better with human users in shared spaces.<br><br>This
special issue aims to highlight cutting-edge lines of
interdisciplinary research in artificial intelligence, cognitive
science, neuroscience, cognitive robotics, and human-robot interaction,
focusing on representation learning with the objective of creating
natural and intelligent interaction between humans and robots. Recent
advances and future research lines in representation learning will be
discussed in detail in this journal special issue. <strong><br><br>II. Potential Topics</strong></div><div>
<p>
Topics relevant to this <span class="gmail-m_425170959332432269gmail-il">special</span> <span class="gmail-m_425170959332432269gmail-il">issue</span> include, but are not limited to: </p><ul style="list-style-type:disc">
<li>Language <span class="gmail-m_425170959332432269gmail-il">learning</span>, embodiment, and social intelligence</li>
<li><span class="gmail-m_425170959332432269gmail-il">Human</span> symbol system and symbol emergence in robotics</li>
<li>Computational modeling for high-level <span class="gmail-m_425170959332432269gmail-il">human</span> cognitive functions</li>
<li>Predictive <span class="gmail-m_425170959332432269gmail-il">learning</span> from sensorimotor information</li>
<li>Multimodal interaction and concept formation</li>
<li>Language and action development</li>
<li><span class="gmail-m_425170959332432269gmail-il">Learning</span>, reasoning, and adaptation in collaborative <span class="gmail-m_425170959332432269gmail-il">human</span>-<span class="gmail-m_425170959332432269gmail-il">robot</span> tasks</li>
<li>Affordance <span class="gmail-m_425170959332432269gmail-il">learning</span></li>
<li>Cross-situational <span class="gmail-m_425170959332432269gmail-il">learning</span></li>
<li><span class="gmail-m_425170959332432269gmail-il">Learning</span> by demonstration and imitation</li>
<li>Language and grammar induction in robots</li>
</ul><p></p></div><div><p>
<strong>III. Submission</strong></p><p><strong></strong><strong> </strong></p>
<p></p><p>ACM Transactions on <span class="gmail-m_425170959332432269gmail-il">Human</span>-<span class="gmail-m_425170959332432269gmail-il">Robot</span> Interaction is a
peer-reviewed, interdisciplinary, open-access journal using an online
submission and manuscript tracking system. To submit your paper, please:<br></p><ul><li>Go to <a href="https://mc.manuscriptcentral.com/thri" target="_blank">https://mc.manuscriptcentral.c<wbr>om/thri</a> and login or follow the "Create an account" link to register.</li><li>After logging in, click the "Author" tab.</li><li>Follow the instructions to "Start New Submission".</li><li>Choose the submission category “<b>SI: <span class="gmail-m_425170959332432269gmail-il">Representation</span> <span class="gmail-m_425170959332432269gmail-il">Learning</span> for <span class="gmail-m_425170959332432269gmail-il">Human</span> and <span class="gmail-m_425170959332432269gmail-il">Robot</span> <span class="gmail-m_425170959332432269gmail-il">Cognition</span></b>”.<br></li></ul></div><div><p>
<strong>IV. Timeline<br></strong></p>
<ul><li>Deadline for paper submission: July 1, 2018</li><li>First notification for authors: September 15, 2018</li><li>Deadline for revised papers submission: November 15, 2018</li><li>Final notification for authors: January 15, 2019</li><li>Deadline for submission of camera-ready manuscripts: March 1, 2019</li><li>Expected publication date: May 2019<br></li></ul><p></p></div><p>
<strong>V. Guest Editors</strong></p>Dr. Takato Horii, The University of Electro-Communications, Japan
<span></span><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2">(</font></font></font><font color="#0070c0"><font face="Arial, serif"><font style="font-size:11pt" size="2"><a href="mailto:takato@uec.ac.jp" target="_blank">takato@uec.ac.jp</a></font></font></font><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2">).</font></font></font>
<br>Dr. Amir Aly, Ritsumeikan University, Japan<font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2"> (</font></font></font><font color="#0070c0"><font face="Arial, serif"><font style="font-size:11pt" size="2"><a href="mailto:amir.aly@em.ci.ritsumei.ac.jp" target="_blank">amir.aly@em.ci.ritsumei.ac.jp</a></font></font></font><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2"><wbr>).</font></font></font>
<br>Dr. Yukie Nagai, National Institute of Information and Communications Technology (NICT), Japan
<span></span><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2">(</font></font></font><font color="#0070c0"><font face="Arial, serif"><font style="font-size:11pt" size="2"><a href="mailto:yukie@nict.go.jp" target="_blank">yukie@nict.go.jp</a></font></font></font><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2">).</font></font></font>
<br>Prof. Takayuki Nagai, The University of Electro-Communications, Japan
<span></span><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2">(</font></font></font><font color="#0070c0"><font face="Arial, serif"><font style="font-size:11pt" size="2"><a href="mailto:nagai@ee.uet.at.jp" target="_blank">nagai@ee.uet.at.jp</a></font></font></font><font color="#000000"><font face="EAAAAA+Carlito, serif"><font style="font-size:11pt" size="2">).</font></font></font><br clear="all"><br>---------------------- <br><div class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><font size="2"><span><b>Amir Aly, Ph.D.</b><br></span></font><font size="2"><span>Senior Researcher</span></font><font size="2"><span><span><font size="2"><span><span><font size="2"><span><span><font size="2"><span><span><font size="2"><span><br></span></font></span></span></font></span></span></font></span></span></font></span>Emergent Systems Laboratory</span></font><br><font size="2"><span><span><font size="2"><span><span><font size="2"><span></span></font></span></span></font></span>College of Information Science and Engineering</span></font><br><font size="2"><span><span><font size="2"><span><span><font size="2"><span><span><font size="2"><span><span><font size="2"><span><span><font size="2"><span>Ritsumeikan University<br></span></font></span></span></font></span></span></font></span></span></font></span></span></font></span>1-1-1 Noji Higashi, Kusatsu, Shiga 525-8577<br>Japan</span></font></div></div></div></div></div></div></div></div>
</div>