Highlights: Developed FAST, a full-stack accelerator search framework with a broad design search space spanning the hardware datapath, software scheduling, and compiler passes such as operation fusion. When evaluated on vision and NLP benchmarks, custom designs generated by FAST show up to a 6x improvement in Perf/TDP over TPU-v3. We also demonstrated that accelerators with 2x higher Perf/TCO can become ROI-positive at moderate datacenter deployment scales (ASPLOS'22). Developed lossless accelerators using approxima
Publications: For a full list of publications, please visit my Google Scholar page.
Highlights: Constitutional AI (CAI) is a novel approach for training Large Language Models (LLMs) to be helpful, harmless, and honest. CAI introduces a new strategy that relies on AI feedback for self-improvement: the LLM critiques its own outputs against a set of principles (the constitution) and revises them to comply with it. This technique is used to fine-tune the LLM, first with supervised training and later with reinforcement learning (based on the preference model trained on AI