Integration series: How to install Gecholog.ai with Elastic/Kibana bundle

Updated: 2024-05-28

Explore integrating the Gecholog.ai LLM Gateway with the Elastic/Kibana bundle. Learn how to run the multi-container application, make LLM requests, and visualize the logs in a Kibana dashboard.


Video description

The video is a tutorial on setting up an LLM gateway with the Elastic/Kibana bundle, specifically using the Gecholog.ai container application. It guides viewers through downloading the Docker Compose file and starting the multi-container Gecholog.ai application together with the Elastic/Kibana bundle.
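For readers who prefer to follow along in a terminal, the commands below are a minimal sketch of that step. The download URL and filename are placeholders for illustration; use the Docker Compose file referenced in the video or in the Gecholog.ai documentation.

```bash
# Fetch the Docker Compose file for the Gecholog.ai + Elastic/Kibana bundle.
# NOTE: this URL is a placeholder, not the official download location.
curl -L -o docker-compose.yml \
  https://example.com/gecholog/elastic-kibana/docker-compose.yml

# Start the multi-container application (Gecholog.ai, Elasticsearch, Kibana)
# in detached mode.
docker compose up -d

# Confirm that all containers in the bundle are running.
docker compose ps
```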

The video also covers verifying the Docker installation, using Docker commands to run the Gecholog.ai container, and making an LLM request with the curl command and Python code.
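As a rough sketch of those steps, the shell session below first verifies the Docker installation and then sends a test request through the gateway with curl. The gateway port, router path, model name, and authorization header are assumptions made for illustration; substitute the values from your own Gecholog.ai configuration and LLM provider.

```bash
# Verify that Docker is installed and the daemon is reachable.
docker --version
docker info

# Send a test LLM request through the gateway.
# NOTE: the port, router path, model name, and auth header below are
# illustrative assumptions; adjust them to your Gecholog.ai setup.
curl -X POST "http://localhost:5380/service/standard/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello from Gecholog.ai"}]
      }'
```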

Additionally, it explains how to log in to Elastic on localhost and use a Kibana dashboard to visualize the LLM logs. It covers creating the Elastic password and viewing analytics for the API calls that pass through the Gecholog.ai LLM Gateway.
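If you need to set or recover the password outside of what the video shows, the snippet below is one hedged way to do it with the standard Elasticsearch 8.x reset tool; the container name "elasticsearch" is an assumption, so check `docker compose ps` for the actual service name in the bundle. Kibana itself is served on port 5601 by default.

```bash
# Generate a new password for the built-in "elastic" user inside the
# Elasticsearch container (container name is an assumption; verify it
# with `docker compose ps`).
docker exec -it elasticsearch bin/elasticsearch-reset-password -u elastic

# Then browse to Kibana (default port 5601), log in as "elastic" with the
# generated password, and open the dashboard showing the Gecholog.ai LLM logs.
# http://localhost:5601
```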

Video: How to install Gecholog.ai with Elastic/Kibana Bundle. Watch on YouTube.


Tags: Gecholog.ai, LLM Gateway, Elastic, Kibana, LLM Dashboard

Visualizing LLM logs with Gecholog.ai: Seamless Integration with Elastic and Kibana Dashboard

Are you ready to take your LLM application development efficiency to the next level? Gecholog.ai offers sophisticated data traffic monitoring that can streamline your data analysis and provide valuable operational insights. Don’t miss out on this opportunity to enhance your development workflow!