Unraveling the Graph of Thoughts Concept

Source: Medium, August 27, 2023
Curated on August 29, 2023

The article introduces and explains Graph-of-Thoughts, a recent addition to the family of prompt-engineering techniques used with large language models (LLMs) such as ChatGPT, Claude, and LLaMA. A prompt is the text prefix given to a language model, and what it contains strongly influences the model's choice of the next token sequence. Techniques like Chain-of-Thoughts and Tree-of-Thoughts have refined the way we prompt and steer LLMs; Graph-of-Thoughts evolves them further by representing thoughts as a directed acyclic graph, which permits looping over a thought or aggregating multiple thoughts into a better one.

In Graph-of-Thoughts, each vertex is a thought generated from the input prompt. An edge (t1, t2) means that thought t2 was generated by prompting the LLM starting from thought t1. Graph transformations add new vertices and edges (new thoughts): Aggregation combines several thoughts into one, Refinement improves a single thought, and Generation creates new thoughts from existing ones. The system also scores and ranks thoughts to decide which to aggregate, refine, or drop.

The overall architecture of Graph-of-Thoughts comprises a Prompter, a Parser, a Graph of Operations, a Graph Reasoning State, and a Controller, which work together to generate, aggregate, and score thoughts.
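To make the graph mechanics concrete, here is a minimal Python sketch of thoughts as vertices, edges as generation relationships, and the Generation, Aggregation, and Refinement transformations plus a scoring-and-pruning step. This is an illustrative assumption of how such a structure could look, not the paper's reference implementation; every name here (Thought, ThoughtGraph, the llm callable, the scorer) is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Thought:
    """A vertex in the graph: one piece of LLM-generated reasoning."""
    text: str
    score: float = 0.0


@dataclass
class ThoughtGraph:
    """Graph of thoughts; an edge (t1, t2) records that t2 was produced
    by prompting the LLM starting from t1."""
    vertices: list[Thought] = field(default_factory=list)
    edges: list[tuple[Thought, Thought]] = field(default_factory=list)

    def generate(self, parent: Thought, llm: Callable[[str], str], k: int) -> list[Thought]:
        """Generation: create k new thoughts from a single parent thought."""
        children = []
        for _ in range(k):
            child = Thought(text=llm(f"Continue this reasoning:\n{parent.text}"))
            self.vertices.append(child)
            self.edges.append((parent, child))
            children.append(child)
        return children

    def aggregate(self, parents: list[Thought], llm: Callable[[str], str]) -> Thought:
        """Aggregation: combine several thoughts into one, hopefully better, thought."""
        joined = "\n---\n".join(p.text for p in parents)
        merged = Thought(text=llm(f"Combine these partial solutions into one:\n{joined}"))
        self.vertices.append(merged)
        self.edges.extend((p, merged) for p in parents)
        return merged

    def refine(self, thought: Thought, llm: Callable[[str], str]) -> Thought:
        """Refinement: improve a single thought (modelled here as a new vertex)."""
        improved = Thought(text=llm(f"Improve this solution:\n{thought.text}"))
        self.vertices.append(improved)
        self.edges.append((thought, improved))
        return improved

    def score_and_prune(self, scorer: Callable[[Thought], float], keep: int) -> None:
        """Score every thought, keep the best `keep`, and drop the rest."""
        for t in self.vertices:
            t.score = scorer(t)
        self.vertices.sort(key=lambda t: t.score, reverse=True)
        dropped = {id(t) for t in self.vertices[keep:]}
        self.vertices = self.vertices[:keep]
        self.edges = [(a, b) for a, b in self.edges
                      if id(a) not in dropped and id(b) not in dropped]
```

With a real llm callable (for example, a thin wrapper around a chat-completion API) and a task-specific scorer, one pass would typically generate several candidate thoughts, aggregate the strongest ones, refine the result, and prune low-scoring branches.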

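Building on the summary above, the sketch below suggests how those architectural pieces might interact: a Controller walks the Graph of Operations (modelled here as a simple linear plan), the Prompter builds each LLM call, the Parser turns responses back into thoughts, and the Graph Reasoning State accumulates everything produced so far. All class and method names are assumptions for illustration, not the reference implementation.

```python
from typing import Callable


class Prompter:
    """Builds the concrete prompt sent to the LLM for a given operation."""
    def build(self, operation: str, thoughts: list[str]) -> str:
        return f"Operation: {operation}\n" + "\n".join(thoughts)


class Parser:
    """Extracts new thoughts from the raw LLM response."""
    def parse(self, response: str) -> list[str]:
        return [line.strip() for line in response.splitlines() if line.strip()]


class Controller:
    """Walks the Graph of Operations, using the Prompter to build each LLM call,
    the Parser to read the response, and the Graph Reasoning State to hold
    every thought produced so far."""

    def __init__(self, llm: Callable[[str], str], prompter: Prompter,
                 parser: Parser, graph_of_operations: list[str]):
        self.llm = llm
        self.prompter = prompter
        self.parser = parser
        # The plan of transformations; a real Graph of Operations can branch,
        # but a linear sequence keeps this sketch small.
        self.graph_of_operations = graph_of_operations
        self.reasoning_state: list[str] = []  # Graph Reasoning State

    def run(self, task: str) -> list[str]:
        self.reasoning_state = [task]
        for operation in self.graph_of_operations:  # e.g. ["generate", "aggregate", "refine"]
            prompt = self.prompter.build(operation, self.reasoning_state)
            response = self.llm(prompt)
            self.reasoning_state.extend(self.parser.parse(response))
        return self.reasoning_state
```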