Hardware Design
Hardware design research currently focuses on improving efficiency and automation across domains ranging from AI accelerators to general-purpose processors. Key directions include novel architectures for neuro-symbolic AI, machine learning (particularly graph neural networks and large language models) applied to automated design and optimization, and efficient approximate computing techniques. These efforts target challenges in speed, power consumption, and design complexity, with the goal of enabling faster, more energy-efficient, and more reliable hardware for diverse applications.
Papers
Towards Efficient Neuro-Symbolic AI: From Workload Characterization to Hardware Architecture
Zishen Wan, Che-Kai Liu, Hanchen Yang, Ritik Raj, Chaojian Li, Haoran You, Yonggan Fu, Cheng Wan, Sixu Li, Youbin Kim, Ananda Samajdar, Yingyan (Celine) Lin, Mohamed Ibrahim, Jan M. Rabaey, Tushar Krishna, Arijit Raychowdhury
Learning to Compare Hardware Designs for High-Level Synthesis
Yunsheng Bai, Atefeh Sohrabizadeh, Zijian Ding, Rongjian Liang, Weikai Li, Ding Wang, Haoxing Ren, Yizhou Sun, Jason Cong