CVE-2024-32878
llama.cpp provides LLM inference in C/C++. gguf_init_from_file contains a use of an uninitialized heap variable: on certain parse paths, the code later frees this variable without it ever having been assigned. A simple proof of concept directly causes a crash; with a carefully constructed file, it may be possible to control this un...
CVE-2024-42477
llama.cpp provides LLM inference in C/C++. An unvalidated type member in the rpc_tensor structure can cause a global buffer overflow, which may lead to memory data leakage. The vulnerability is fixed in release b3561.