Forum Discussion

6 Replies

  • This is a very nice working proof of concept.

    BUT: never use this or a similar pure-JavaScript solution on a production LMS server.

    Every request to ChatGPT sends along a "secret" API key (comparable to a credit card number).

    This key can be extracted without any special tools — any normal browser will reveal it in about four simple clicks.

    Every request costs money, and you pay for every request, no matter where it comes from.

    For a real solution you need an additional server:

    LMS -> dedicated server holding the API key -> ChatGPT -> dedicated server -> LMS
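
    The proxy flow above can be sketched with Node's built-in modules alone. This is a minimal illustration under stated assumptions, not a production server: the `/chat` route, the `OPENAI_API_KEY` environment variable, and the upstream path are chosen for the example, and real deployments need authentication, rate limiting, and input validation on top.

    ```javascript
    // Minimal sketch of the proxy pattern: the browser talks only to this
    // server; the secret API key never leaves it.
    const http = require("http");
    const https = require("https");

    // The key lives only in the server's environment, never in frontend code.
    const OPENAI_API_KEY = process.env.OPENAI_API_KEY || "sk-placeholder";

    // Build the outbound request; the Authorization header is attached here,
    // on the server, so the browser never sees the key.
    function buildUpstreamOptions(apiKey) {
      return {
        hostname: "api.openai.com",
        path: "/v1/chat/completions", // assumed endpoint for the example
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": `Bearer ${apiKey}`,
        },
      };
    }

    const server = http.createServer((req, res) => {
      if (req.method !== "POST" || req.url !== "/chat") {
        res.writeHead(404).end();
        return;
      }
      let body = "";
      req.on("data", (chunk) => (body += chunk));
      req.on("end", () => {
        const upstream = https.request(
          buildUpstreamOptions(OPENAI_API_KEY),
          (up) => {
            res.writeHead(up.statusCode, { "Content-Type": "application/json" });
            up.pipe(res); // relay ChatGPT's answer back to the LMS page
          }
        );
        upstream.on("error", () => res.writeHead(502).end());
        upstream.end(body);
      });
    });

    // server.listen(3000); // uncomment to run behind your LMS
    ```

    The design choice worth noting: the frontend only ever calls `/chat` on your own server, so inspecting network traffic in the browser reveals nothing billable.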

  • Any questions you have about this example, you'll need to direct to NRZ Malik via LinkedIn. There's a link to his profile in the original post.

  • barry (Community Member)

    I completely agree with Jürgen that using a public API key directly in frontend code is dangerous and can quickly become costly. For those who are seriously considering chatbot integration, we recently developed a secure setup for K-Electric that powers a K-Electric AI chatbot with an internal server and private document sources such as SharePoint. This kept everything safe and scalable while streamlining employee inquiries.