Streaming

The Chatnode team is happy to bring you this new feature. We realized that OpenAI GPT models can take anywhere from a few seconds to around ten seconds to generate long-form answers, and we didn't want to restrict the length of the answer just to make the chatbot feel faster.

Solution:

We now stream the answer into the chatbot token by token, the same way the ChatGPT UI does. The first words appear almost immediately, cutting perceived response time by roughly 50%. It improves the user experience significantly! Please see it in action below:
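For the curious, here is a minimal sketch of the idea behind streaming (the function names are illustrative, not our actual implementation): instead of waiting for the model to finish the whole answer, each token is forwarded to the client the moment it is generated.

```python
import time

def generate_tokens(answer, delay=0.0):
    """Simulate an LLM emitting an answer one token at a time."""
    for token in answer.split(" "):
        time.sleep(delay)  # stand-in for per-token generation latency
        yield token

def stream_answer(answer):
    """Forward each token to the UI as soon as it arrives,
    rather than buffering the full answer first."""
    chunks = []
    for token in generate_tokens(answer):
        chunks.append(token)  # in production: push this chunk over SSE / WebSocket
    return " ".join(chunks)

print(stream_answer("Streaming shows the answer while it is still being generated"))
```

The key point: the user starts reading while the model is still writing, so the wait before the first visible word shrinks from the full generation time to roughly the time of the first token.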

Note: we always keep speed in mind without sacrificing the accuracy of chat answers.