Google claims one of its AI models is the first of its kind to spot a memory safety vulnerability in the wild – specifically an exploitable stack buffer underflow in SQLite – which was then fixed before the buggy code’s official release.

The Chocolate Factory’s LLM-based bug-hunting tool, dubbed Big Sleep, is a collaboration between Google’s Project Zero and DeepMind. The software is said to be an evolution of the earlier Project Naptime, announced in June.

SQLite is an open source database engine. The stack buffer underflow vulnerability could have allowed an attacker to cause a crash or perhaps even achieve arbitrary code execution. More specifically, the crash or code execution would happen in the SQLite executable (not the library) due to a magic value of -1 accidentally being […]
Original web page at www.theregister.com