# PLDI2023

Number of papers: 2

## Prompting Is Programming: A Query Language for Large Language Models

- Authors: Beurer-Kellner, Luca and Fischer, Marc and Vechev, Martin
- Abstract: Large language models have demonstrated outstanding performance on a wide range of tasks such as question answering and code generation. At a high level, given an input, a language model can be used to automatically complete the sequence in a statistically likely way. Based on this, users prompt these models with language instructions or examples to implement a variety of downstream tasks. Advanced prompting methods can even imply interaction between the language model, a user, and external to...
- Link: Read Paper
- Labels: PL design for LLMs
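The abstract's core idea, a language model as a statistically likely sequence completer that a query language can steer with decoding-time constraints, can be sketched with a toy next-token model. This is an illustrative stand-in, not the paper's actual system: the table `NEXT`, the function `complete`, and the `allowed` parameter are all hypothetical.

```python
# Toy "language model": a table of next-token continuations with weights.
# A real LLM exposes the same interface in spirit: given a prefix, score
# possible next tokens. A prompting language adds constraints on top.
NEXT = {
    "the": [("cat", 0.6), ("dog", 0.4)],
    "cat": [("sat", 0.7), ("ran", 0.3)],
    "dog": [("ran", 1.0)],
    "sat": [("down", 1.0)],
    "ran": [("away", 1.0)],
}

def complete(prompt, max_tokens=5, allowed=None):
    """Greedily complete `prompt`, optionally constraining the vocabulary.

    `allowed` models the kind of decoding-time constraint a prompting
    language can enforce during generation.
    """
    tokens = prompt.split()
    for _ in range(max_tokens):
        options = NEXT.get(tokens[-1], [])
        if allowed is not None:
            options = [(t, w) for t, w in options if t in allowed]
        if not options:
            break
        # Pick the highest-weight continuation (greedy decoding).
        tokens.append(max(options, key=lambda tw: tw[1])[0])
    return " ".join(tokens)
```

Unconstrained, `complete("the")` follows the highest-weight path; passing an `allowed` set redirects generation, which is the essence of constraint-guided decoding.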
## Scallop: A Language for Neurosymbolic Programming

- Authors: Li, Ziyang and Huang, Jiani and Naik, Mayur
- Abstract: We present Scallop, a language which combines the benefits of deep learning and logical reasoning. Scallop enables users to write a wide range of neurosymbolic applications and train them in a data- and compute-efficient manner. It achieves these goals through three key features: 1) a flexible symbolic representation that is based on the relational data model; 2) a declarative logic programming language that is based on Datalog and supports recursion, aggregation, and negation; and 3) a framewor...
- Link: Read Paper
- Labels: PL design for LLMs
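The Datalog features the abstract names, recursion over a relational data model, can be illustrated with a minimal bottom-up fixpoint evaluation. This is a sketch of the general Datalog technique in Python, not Scallop's actual API; the classic example is transitive closure of an edge relation.

```python
def transitive_closure(edges):
    """Naive bottom-up evaluation of the recursive Datalog program:

        path(x, y) :- edge(x, y).
        path(x, z) :- path(x, y), edge(y, z).

    Relations are sets of tuples; we apply the rules until no new
    facts are derived (a fixpoint), as a Datalog engine would.
    """
    path = set(edges)
    while True:
        # Join path(x, y) with edge(y, z) to derive path(x, z).
        new = {(x, z) for (x, y) in path for (y2, z) in edges if y == y2}
        if new <= path:
            return path  # fixpoint reached: no new facts
        path |= new
```

For example, from edges `{(1, 2), (2, 3)}` the fixpoint derives the additional fact `(1, 3)`. Neurosymbolic systems in this style replace the boolean "fact is in the set" with weights supplied by a neural model, but the relational fixpoint skeleton is the same.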