Google Research’s Post

How can we best encode structured data into sequential form for use in large language models (LLMs)? Join Bryan Perozzi for a live demonstration today at 12:30PM at the NeurIPS 2024 Google Research booth (#133) to learn about Graph Tokens that significantly boost an LLM's ability to solve graph reasoning tasks. Read the paper: https://lnkd.in/g9gf_ziC https://lnkd.in/ekungdTV

The clever use of "soft tokens" to bridge graphs and LLMs offers a promising direction for handling any structured data, not just graphs. The real breakthrough is showing we can adapt data to LLMs rather than retraining LLMs for each new data type.
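As a rough illustration of the soft-token idea, here is a minimal sketch (my own toy construction, not the paper's implementation): a small graph encoder pools node features into a few continuous "soft token" embeddings, which are then prepended to the text-token embeddings fed to a frozen LLM. The encoder, dimensions, and pooling scheme below are all hypothetical stand-ins for the learned components described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

EMBED_DIM = 8  # hypothetical LLM embedding width (real models use thousands)


def encode_graph(adj: np.ndarray, num_soft_tokens: int = 2) -> np.ndarray:
    """Toy graph encoder: one round of neighbour averaging, then a
    projection into a fixed number of soft-token embeddings.

    In the actual approach, a graph neural network read-out like this
    would be *trained* while the LLM's weights stay frozen; here the
    weights are random placeholders just to show the shapes involved.
    """
    n = adj.shape[0]
    node_feats = rng.normal(size=(n, EMBED_DIM))  # stand-in node features
    # degree-normalized message passing (one step)
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    node_feats = adj @ node_feats / deg
    # mean-pool the graph, then project into k soft-token vectors
    pooled = node_feats.mean(axis=0)                       # (EMBED_DIM,)
    proj = rng.normal(size=(num_soft_tokens, EMBED_DIM, EMBED_DIM))
    return np.einsum("d,kde->ke", pooled, proj)            # (k, EMBED_DIM)


def build_llm_input(graph_tokens: np.ndarray,
                    text_embeddings: np.ndarray) -> np.ndarray:
    """Prepend graph soft tokens to the text-token embedding sequence.

    The LLM then attends over graph and text positions alike, so no
    retraining of the LLM itself is needed for the new data type.
    """
    return np.concatenate([graph_tokens, text_embeddings], axis=0)
```

For example, encoding a 3-node triangle graph into 2 soft tokens and prepending them to a 5-token prompt yields a 7-position input sequence for the LLM.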

Shayan Shokri

Founder @ Earthian AI | Ex-Harvard AI Researcher

2d

Nice work

Olha Petsiukh

Engagement Manager | Strategic Client Partner

1d

Interesting event

Leandro Di Bella

PhD Student at ETRO VUB | Electrical Engineer

1d