Jiawei Xu (徐佳唯)

Postdoctoral Researcher

Email: xujiawei@gdiist.cn

Biography:

Jiawei Xu received his B.S. in Electronic Information Science and Technology from the School of Information Science and Technology, Fudan University, in 2016, and his Ph.D. in Microelectronics and Solid-State Electronics from Fudan University in 2022. His doctoral research centered on brain-inspired chips and intelligent systems. He participated in multiple research projects, including grants from the National Natural Science Foundation of China and the Shanghai Municipal Major Project on Next-Generation Artificial Intelligence. His specific research topics included compute-in-memory brain-inspired computing simulation systems and low-power dedicated neural-network accelerator chips, and his work produced a highly reliable fault-tolerant SoC system prototype, an FPGA prototype of a neural network processor, and a dedicated ASIC chip. He has published more than 10 academic papers and filed 4 invention patent applications.

Selected Publications:

1.     J. Xu, Y. Huan, B. Huang, H. Chu, Y. Jin, L.R. Zheng and Z. Zou, “A Memory-Efficient CNN Accelerator Using Segmented Logarithmic Quantization and Multi-Cluster Architecture,” in IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 68, no. 6, pp. 2142-2146, June 2021.

2.     J. Xu, Y. Huan, Y. Jin, H. Chu, L. Zheng and Z. Zou, “Base-Reconfigurable Segmented Logarithmic Quantization and Hardware Design for Deep Neural Networks,” in Journal of Signal Processing Systems, vol. 92, no. 11, pp. 1263-1276, 2020.

3.     J. Xu, D. Wang, F. Li, L. Zhang, D. Stathis, Y. Yang, Y. Jin, A. Lansner, A. Hemani, Z. Zou and L.R. Zheng, “A Memristor Model with Concise Window Function for Spiking Brain-Inspired Computation,” 2021 IEEE 3rd International Conference on Artificial Intelligence Circuits and Systems (AICAS), 2021, pp. 1-4.

4.     J. Xu, Y. Huan, L. Zheng, and Z. Zou, “A Low-Power Arithmetic Element for Multi-Base Logarithmic Computation on Deep Neural Networks,” 2018 31st IEEE International System-on-Chip Conference (SOCC), 2018, pp. 43-48.

5.     D. Wang#, J. Xu#, D. Stathis, L. Zhang, F. Li, A. Lansner, A. Hemani, Y. Yang, P. Herman and Z. Zou, “Mapping the BCPNN Learning Rule to a Memristor Model,” in Frontiers in Neuroscience, vol. 15, p. 1656, 2021. (co-first authors)


Lirong Zheng Research Group