Added arbitrary context free grammar constraints to llama.cpp
— Grant Slatton (@GrantSlatton) May 14, 2023
Can now plug in any llama.cpp compatible model and give an exact grammar spec: JSON, etc
Excited to use with more powerful local models as they are released
Thanks @ggerganov & friends for such a wonderful project.
Restricting the next predicted token to adhere to a specific context-free grammar seems like a big step forward in weaving language models into applications.
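The core idea can be sketched quite simply: at each decoding step, compute which tokens could legally extend the output under the grammar, and restrict sampling to that set. The snippet below is a minimal illustration, not llama.cpp's actual implementation (which parses a GBNF grammar incrementally); it uses a toy character-level grammar of digit lists like `[1,2]`, enumerates the language by brute force, and greedily picks the highest-scoring legal character. All names (`allowed_next`, `constrained_decode`, the toy `logits_fn`) are made up for this sketch.

```python
import itertools

# Toy character-level "vocabulary" and language:
#   list  -> "[" "]" | "[" items "]"
#   items -> digit | digit "," items
def language(max_items=3):
    yield "[]"
    digits = "0123456789"
    for n in range(1, max_items + 1):
        for combo in itertools.product(digits, repeat=n):
            yield "[" + ",".join(combo) + "]"

STRINGS = list(language())

def allowed_next(prefix):
    # Characters that keep the prefix extendable to a full
    # sentence of the grammar. Empty set => prefix is complete.
    return {s[len(prefix)] for s in STRINGS
            if s.startswith(prefix) and len(s) > len(prefix)}

def constrained_decode(logits_fn, max_len=10):
    # Greedy decoding, but the argmax is taken only over
    # grammar-legal next characters, never the full vocabulary.
    out = ""
    while len(out) < max_len:
        legal = allowed_next(out)
        if not legal:
            break
        scores = logits_fn(out)
        out += max(legal, key=lambda t: scores.get(t, float("-inf")))
    return out

# A dummy "model" that always prefers "7", then "]", then ",":
print(constrained_decode(lambda p: {"7": 2.0, "]": 1.0, ",": 0.5}))
# prints "[7]" -- the opening "[" is forced by the grammar even
# though the model never scored it.
```

A real implementation avoids enumerating the language by walking the grammar's parse state incrementally, but the interface is the same: the grammar supplies a legality mask, and the model's logits decide among the survivors.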