Would you like to see some plugins that integrate with local/self-hosted AI instead of sending your data to ChatGPT? Or do you not care about privacy there as long as the results are good?

You might be interested in GPT4All (https://gpt4all.io/index.html), which can easily be downloaded as a desktop GUI. Simply download a model (like Nous Hermes, about 7.5 GB) and run it even without a GPU, right on your CPU (albeit somewhat slowly).
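Besides the desktop GUI, GPT4All also ships Python bindings, so the same kind of local, on-CPU inference can be scripted. A minimal sketch follows; the model filename is an example (any GGUF model GPT4All supports would do), and the first call downloads the model file if it isn't already cached locally:

```python
# Hedged sketch: local inference via the gpt4all Python bindings
# (pip install gpt4all). Nothing is sent to a remote API; generation
# runs on the local CPU. The model name below is an assumption.
from typing import Optional


def run_local(prompt: str,
              model_name: str = "Nous-Hermes-2-Mistral-7B-DPO.Q4_0.gguf",
              max_tokens: int = 200) -> Optional[str]:
    """Generate a completion entirely on this machine, or None if the
    gpt4all package isn't installed."""
    try:
        # Imported lazily so the sketch can be read/loaded without the package.
        from gpt4all import GPT4All
    except ImportError:
        return None
    model = GPT4All(model_name, device="cpu")  # triggers download on first use
    with model.chat_session():
        return model.generate(prompt, max_tokens=max_tokens)
```

Note the function is only defined here, not called, since actually running it pulls several gigabytes of weights.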

It’s amazing what’s already possible with local AI instead of relying on large-scale, expensive, and corporate-dependent AIs such as ChatGPT.

  • @greysemanticist
    link
    1 year ago

    There are some smaller-parameter models (7B) that you can kind of get by running on a MacBook Air. It’s been a wild west of innovation as they figure out how to encode the weights in progressively fewer bits.

    One model I’ve seen averages about 2.56 bits/weight.
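    To see why the bits-per-weight number matters, here is the back-of-the-envelope arithmetic for a 7B-parameter model at a few quantization levels (the 2.56 bits/weight figure is from the comment above; the rest are common reference points):

    ```python
    # Approximate in-memory size of a model given parameter count and
    # average bits per weight (ignoring runtime overhead like the KV cache).
    def model_size_gb(n_params: float, bits_per_weight: float) -> float:
        return n_params * bits_per_weight / 8 / 1e9  # bits -> bytes -> GB

    seven_b = 7e9
    print(model_size_gb(seven_b, 16))    # fp16 baseline: 14.0 GB
    print(model_size_gb(seven_b, 4))     # 4-bit quantized: 3.5 GB
    print(model_size_gb(seven_b, 2.56))  # ~2.56 bits/weight: 2.24 GB
    ```

    So an aggressively quantized 7B model fits comfortably in the 8 GB of RAM a base MacBook Air ships with, where the fp16 original would not.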

    I’m not sure how it will fit into my particular use of Obsidian. One thing I keep in mind is I want to be very sure to segregate the AI-generated text from my own.
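    One way to keep that separation visible in Obsidian would be a dedicated callout for machine-generated passages. This is purely an illustrative convention, not a built-in feature; Obsidian renders unknown callout types like `[!ai]` with default styling:

    ```markdown
    > [!ai] Generated by local model (model name, date)
    > The model's summary or draft text goes here,
    > visually boxed off from the rest of the note.

    My own commentary continues below, outside the callout.
    ```

    Because the marker lives in the note's plain text, it also survives export and is easy to search for later.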