The first comment is correct. "Don't do that" is the correct answer, and everything else is inconvenient tripe. If you must give an LLM control of your computer, confine it to a virtual machine or container; even then, no amount of sandboxing can ever make this truly safe. Case closed.
Plus, many people are already using isolation tech like Bubblewrap when they run AI apps from a Flatpak. Presumably not to avert anything as disastrous as an LLM controlling their PC, but sandboxed all the same.
People asking this might as well just upload all their files to an AI and let it manage them.