Yanzhi Wang
Latest
Graph Convolutional Network Acceleration Using Adiabatic Superconductor Josephson Devices
Late Breaking Result: AQFP-aware Binary Neural Network Architecture Search
SupeRBNN: A Randomized Binary Neural Network Using Adiabatic Superconductor Josephson Devices
A Life-Cycle Energy and Inventory Analysis of Adiabatic Quantum-Flux-Parametron Circuits
Algorithm-software-hardware co-design for deep learning acceleration
Design and Implementation of an FFT-Based Neural Network Accelerator Using Rapid Single-Flux-Quantum Technology
Extremely energy-efficient non-linear function approximation framework using stochastic superconductor devices
Performance assessment of an extremely energy-efficient binary neural network using adiabatic superconductor devices
Design and Implementation of Stochastic Neural Networks Using Superconductor Quantum-Flux-Parametron Devices (TAAS)
Demonstration of a 47.8 GHz high-speed FFT processor using single-flux-quantum technology
Towards AQFP-Capable Physical Design Automation
ASAP: An analytical strategy for AQFP placement
Design of an SFQ convolutional computation processor for convolutional neural network
A buffer and splitter insertion framework for adiabatic quantum-flux-parametron superconducting circuits
A majority logic synthesis framework for adiabatic quantum-flux-parametron superconducting circuits
A stochastic-computing based deep learning framework using adiabatic quantum-flux-parametron superconducting technology
A Study on Majority Synthesis for Adiabatic Superconducting Logic
Adiabatic quantum-flux-parametron: Towards building extremely energy-efficient circuits and systems
IDE development, logic synthesis and buffer/splitter insertion framework for adiabatic quantum-flux-parametron superconducting circuits
Design and Evaluation of Deep Learning Accelerators Using Superconductor Logic Families