I can’t imagine the use case for AI in an ebook reader
I could see a hypothetical machine translation suite integrated directly into the reader being a useful tool, especially if it allowed interrogation, correction, and revision by the user in a way that an LLM could actually almost sort of do well enough for a casual context. It would still be frustrating and error-prone, but for a book without an existing translation it could potentially beat bulk-processing the text through a separate translation tool.
Although that’s not what they added. If I’m reading this right, what they added is the ability to make API calls to LM Studio, a framework (I believe open source too) for running text models locally with (also open source) model weights. The current integration features amount to being able to “discuss selected books” with that local chatbot or ask it for recommendations, though I have no idea how any of that is supposed to work in practice. Since it adds backend compatibility with local models, the machine translation angle I mentioned is at least a feasible addition that a plugin could provide.
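For anyone curious what “API calls to LM Studio” actually means mechanically: LM Studio runs a local HTTP server (by default on port 1234) that speaks an OpenAI-compatible chat-completions protocol, so a reader integration or plugin is basically just POSTing JSON at localhost. Here’s a rough sketch of what such a call could look like; the prompt wording, the helper name, and the idea of stuffing a selected passage into the message are all my own invention, not how the actual integration works:

```python
# Hedged sketch: a plugin-style call to a locally running LM Studio server.
# LM Studio exposes an OpenAI-compatible API at http://localhost:1234/v1 by default.
# Everything below (function name, prompt structure) is illustrative, not the real integration.
import json
import urllib.request

def build_request(passage: str, question: str,
                  base_url: str = "http://localhost:1234/v1") -> urllib.request.Request:
    """Build a chat-completions request asking the local model about a book passage."""
    payload = {
        # LM Studio generally answers with whatever model is currently loaded,
        # so this field is mostly a placeholder.
        "model": "local-model",
        "messages": [
            {"role": "system",
             "content": "You are discussing a book with the reader."},
            {"role": "user",
             "content": f"Passage:\n{passage}\n\nQuestion: {question}"},
        ],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Actually sending it requires LM Studio running with a model loaded:
# with urllib.request.urlopen(build_request("Call me Ishmael.", "What's the tone?")) as resp:
#     reply = json.load(resp)["choices"][0]["message"]["content"]
```

The point is that nothing here phones home: the whole round trip stays on your machine, which is why the privacy panic seems misplaced.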
The whole thing’s silly and has extremely limited actual use cases, but anyone getting up in arms over it allowing compatibility with another, entirely locally-run program is being even sillier. It’s not like they’re replacing existing functionality with ChatGPT API calls or some nonsense, just enabling hobbyists who go to the trouble of setting up this entire other suite of unrelated shit, and manage to get it working, to then do something sort of silly with it.