OpenAI APIs – How to provide website content as context to OpenAI language models?

Hi, my team and I are working on semi-automated chatbots for clients, and we want to incorporate OpenAI language models that will interact with human visitors to specific sites. The models should be able to respond to various questions asked by visitors, but only using the content of that specific website for their answers.
For example: if the website in question is a landing page for a business, containing multiple pages presenting that business and the services it offers, we would like to provide a solution that answers all of a visitor's questions about that business based on those introductory pages, while questions outside that scope are redirected back to the scope of the website.
The first thing that came to mind when we started figuring this out was to somehow tell the models to parse the content of the websites and then answer the questions. But, as you can imagine, this might not be the best solution (if it is possible to implement at all), since it would create a lot of overhead on every request.
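To make the first idea concrete: the model itself cannot browse, so our server would have to fetch each page and reduce its HTML to plain text before passing it along as context. A minimal sketch of that extraction step in Node.js is below; `extractText` is an illustrative helper name we made up, and a real HTML parser (e.g. cheerio) would be more robust than regexes:

```javascript
// Crude plain-text extraction from a page's HTML, so the text can be
// used as model context. A proper parser is preferable in production.
function extractText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, " ") // drop inline scripts
    .replace(/<style[\s\S]*?<\/style>/gi, " ")   // drop inline styles
    .replace(/<[^>]+>/g, " ")                    // strip remaining tags
    .replace(/\s+/g, " ")                        // collapse whitespace
    .trim();
}

// In Node 18+ the page itself could be fetched with the built-in fetch()
// and its body passed through extractText before prompting the model.
const sample =
  "<html><body><h1>Acme Ltd</h1><p>We offer plumbing services.</p></body></html>";
console.log(extractText(sample));
```

Doing this on every visitor question is exactly the overhead concern above, which is why the pre-stored-context approach looks more attractive.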
A second solution would be to prompt the site owners ahead of time to provide context, store that context somewhere, and send it along with each request. This solution also has some downsides, the main one being the length of the context: the prompt in total should not be larger than 1500 words (and should, in most cases, be significantly shorter than that).
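The second approach could be sketched as follows, assuming the Chat Completions endpoint (`https://api.openai.com/v1/chat/completions`); `buildPayload` is a hypothetical helper, and the system message is one possible way to keep answers inside the stored site context:

```javascript
// Compose a request payload that pins the model's answers to the stored
// website context and redirects out-of-scope questions back to the site.
function buildPayload(siteContext, userQuestion) {
  return {
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "system",
        content:
          "Answer only using the website context below. If the question is " +
          "outside this scope, say so and steer the visitor back to the " +
          "site's topics.\n\nContext:\n" + siteContext,
      },
      { role: "user", content: userQuestion },
    ],
  };
}

// The payload would then be POSTed with the site owner's API key, e.g.:
// fetch("https://api.openai.com/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
//   },
//   body: JSON.stringify(buildPayload(storedContext, question)),
// });
```

The length concern above applies here too: the stored context counts against the prompt budget on every single request.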
We tried searching the web and YouTube for answers, but found few good solutions for this particular case. So we're asking: if you had to implement this, how would you do it, keeping in mind that we are web developers with no ML experience, and that we use the MERN stack?
For feedback or comments, reach us at hello@newswire.ae