I use it all the time. It is a good partner to challenge me when I am looking for other points of view: “I believe x due to y. Challenge my point of view.”
It helps me explore a topic fast, so that I know the lingo to search for it myself. I use it for making low stakes decisions where it often succeeds, such as shopping and research for shopping. I validate the results every time.
Is it a net negative for society? Not sure, maybe. Will it go away? No. So we should embrace it, though not big tech AI but rather smaller LLMs.


I’ve used LLMs to have conversations about technical topics I’m not familiar with. I ask it how something works, it answers, and then I ask several follow-up questions to clarify various things I’m interested in.
Usually, I have some ideas about how to implement a particular theory or technology, and I bounce those ideas off the LLM. Sometimes my ideas were already invented about 100 years ago, sometimes my ideas are impractical, and the LLM tells me exactly why they would or wouldn’t work.
I’m also using a custom agent that has been specifically tailored for this purpose. Normal LLMs are far too supportive, lack critical thinking, don’t challenge my ideas, and so on, which is why I had to write my own agent prompt.
Anyway, I think this system works well for me. This way I’ve been able to dive deep into all sorts of random topics, such as why cocoa powder doesn’t mix with milk, why a battery bank shows confusing state-of-charge readings, how fluid coupling is used in heavy machinery, etc. Fascinating stuff. It’s a bit like watching a custom documentary made just for my odd interests.
If I had to read about these things in magazines or books, I would not have been able to dive as deep as fast. On the other hand, books also give you a general overview, and they include details that I may not be interested in, so I would either end up reading stuff I don’t care about or just skimming those parts. In the latter case, I might end up spending hours looking for the information I care about, not finding it, and walking away with less.