How can we best encode structured data into sequential form for use in large language models (LLMs)? Join Bryan Perozzi for a live demonstration today at 12:30 PM at the NeurIPS 2024 Google Research booth (#133) to learn about Graph Tokens that significantly boost an LLM's ability to solve graph reasoning tasks. Read the paper: https://lnkd.in/g9gf_ziC https://lnkd.in/ekungdTV
The clever use of "soft tokens" to bridge graphs and LLMs suggests a promising direction for handling other kinds of structured data, not just graphs. The key insight is that we can adapt the data's representation to a frozen LLM rather than retraining the LLM for each new data type.
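To make the soft-token idea concrete, here is a minimal, illustrative sketch (not the paper's implementation): a small graph encoder emits a few continuous embedding vectors, which are prepended to the LLM's text-token embeddings just as in prompt tuning. The encoder, weights, and dimensions below are all hypothetical stand-ins for the learned GNN encoder described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16  # hypothetical LLM embedding width

def graph_to_soft_tokens(adj, num_tokens=4, dim=D):
    """Encode a graph's adjacency matrix into `num_tokens` continuous
    "soft token" vectors via a tiny mean-pooled linear readout.
    (A toy stand-in for a learned GNN encoder.)"""
    node_feats = adj.mean(axis=1, keepdims=True)  # degree-like feature per node
    W = rng.standard_normal((1, num_tokens * dim)) * 0.1  # hypothetical weights
    pooled = node_feats.mean(axis=0) @ W  # graph-level readout
    return pooled.reshape(num_tokens, dim)

def build_llm_input(soft_tokens, text_token_embeddings):
    # Prepend graph soft tokens to the text embeddings,
    # so the frozen LLM sees them as extra "prompt" positions.
    return np.vstack([soft_tokens, text_token_embeddings])

adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)  # path graph on 3 nodes
soft = graph_to_soft_tokens(adj)
text_emb = rng.standard_normal((5, D))  # 5 text tokens, hypothetical
inp = build_llm_input(soft, text_emb)
print(inp.shape)  # (9, 16): 4 graph tokens + 5 text tokens
```

In the real system, only the graph encoder's parameters are trained while the LLM stays frozen, which is what makes the approach cheap to extend to new data types.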