Abstract:
Neural machine translation, built on an encoder-decoder framework, is considered a feasible approach for future machine translation. Nevertheless, with the fusion of multiple languages and the continuous emergence of new words, most current neural machine translation systems based on the von Neumann architecture require a substantially increasing number of devices for the decoder, resulting in a high energy consumption rate. Here, a multilevel photosensitive blending semiconductor optoelectronic synaptic transistor (MOST) with two different trapping mechanisms is demonstrated for the first time. It exhibits eight stable and well-distinguishable states, and synaptic behaviors such as excitatory postsynaptic current, short-term memory, and long-term memory are successfully mimicked under illumination in the wavelength range of 480–800 nm. More importantly, an optical decoder model based on the MOST is successfully fabricated; as the first application of a neuromorphic device in the field of neural machine translation, it significantly simplifies the structure of traditional neural machine translation systems. Moreover, as a multilevel synaptic device, the MOST can further reduce the number of components and simplify the structure of the codec model under light illumination. This work is the first to apply a neuromorphic device to neural machine translation and proposes a multilevel synaptic transistor as the basic cell of the decoding module, laying a foundation for breaking the bottleneck of machine translation. © 2022, Science China Press and Springer-Verlag GmbH Germany, part of Springer Nature.
Source:
Science China Materials
ISSN: 2095-8226
Year: 2022
Issue: 5
Volume: 65
Page: 1383-1390
Impact Factor: 8.1 (JCR 2022); 6.800 (JCR 2023)
ESI HC Threshold: 91
JCR Journal Grade: 1
CAS Journal Grade: 2
ESI Highly Cited Papers on the List: 0