Unverified Commit 63360f0f authored by jinyuan sun's avatar jinyuan sun Committed by GitHub

Merge pull request #40 from ChatMol/main

update dev branch
parents f47ea28f 0685660e
+14 −0
@@ -128,3 +128,17 @@ dmypy.json

# Pyre type checker
.pyre/

test*
01d70f7c997c6a7d73e8fc592865b84f7371642b7afdba535726ba70f020183e*
ce73fb8a1b802e6746c58ac3bf915d79506e2b5edc36e83f1cbfa3f6071a9a92*
e4f51ecb92f1387f12e6a8f00025990b7c16198bf3643601dc66c55144f0b6f8*
9d58e5ce6d8c9ed839c7cb7d836581dbd48e4ba916403cbd9644f886fd070d2b*
1ff60fa0556049ec1ec285a278566575baf11095f6a7d45162b50415cf998f46*
25eea57f04ad4d86473860fa613061d7544bfa338ef5328253de9c47672c0684*
532130db56afd00f19311341a636a52c7413c013afbb850f0ead705599140c0c*
3387394d26f8fc4bddad1ca2b35bc5a57b8e7f27136bfeb0a2c97764951b273f*
cf08d4763cf8b0309f2aa182c253388d712a1b736d4eb34f226c74588995faa0*
e9bbca7f41ac6c03b3a6c3193115e8ac8a4c4fd572a4230577a81c432b2cacfe*
06377e8a560dca176f9a7d7ac9e3184c6c67e13aa1e9a6a3972e2fde8952375b*
copilot_public/Project-*
+18 −1
@@ -5,6 +5,8 @@ Started from a PyMol plugin.
- [ChatMol](#chatmol)
  - [Table of contents](#table-of-contents)
  - [Overview](#overview)
  - [ChatMol + Streamlit](#chatmol--streamlit)
  - [ChatMol python package](#chatmol-python-package)
  - [Copilot](#copilot)
  - [ChatMol Website](#chatmol-website)
  - [Requirements \& Installation](#requirements--installation)
@@ -18,7 +20,19 @@ Started from a PyMol plugin.
  - [License](#license)

## Overview
The PyMOL ChatGPT Plugin seamlessly integrates OpenAI's large language models (GPT-3.5-turbo and text-davinci-003) into PyMOL, enabling users to interact with PyMOL using natural language instructions. This robust tool simplifies PyMOL tasks and offers suggestions, explanations, and guidance on a wide range of PyMOL-related topics. ChatMol provides various interaction modes with PyMOL, including the PyMOL command line, miniGUI chatbot, and web browsers.
The PyMOL ChatGPT Plugin seamlessly integrates LLMs into PyMOL, enabling users to interact with PyMOL using natural language instructions. This tool simplifies PyMOL tasks and offers suggestions, explanations, and guidance on a wide range of PyMOL-related topics. ChatMol provides several ways to interact with PyMOL, including the PyMOL command line, the miniGUI chatbot, the Python package, and web browsers.

## ChatMol + Streamlit

We also provide a Streamlit app for ChatMol, which can be used as a task-execution agent or a Q&A chatbot. It retains your entire conversation history with ChatMol, and you can modify the execution plan suggested by ChatMol. See more details [here](./chatmol-streamlit/README.md).

## ChatMol python package

See this [README](./chatmol_pkg/README.md) for more details.

```bash
pip install chatmol
```

## Copilot
This is ChatMol Copilot. Like other copilots, it is designed to assist with your work.  
@@ -121,6 +135,9 @@ python miniGUI.py
Here is a screenshot of the miniGUI:
![img](./assets/chatmol_lite.png)

## Acknowledgements

As an open-source project, we are grateful for the support from [ChemXAI](https://www.chemxai.com/), [WeComput](https://www.wecomput.com/) and [levinthal](https://www.levinthal.bio/).

## License
This project is released under the MIT License.
+18 −0
# chatmol-streamlit
Streamlit app for chatmol

## Installation
You will need the PyMOL and chatmol packages installed:

```bash
conda install -c conda-forge pymol-open-source
pip install streamlit==1.35.0
pip install openai anthropic
pip install chatmol
```

## Usage
Before running the app, make sure PyMOL is correctly installed.
```bash
streamlit run chatmol-streamlit.py
```
+83 −0
import streamlit as st
import chatmol as cm

st.sidebar.title("ChatMol")
st.sidebar.markdown("Welcome to ChatMol! ChatMol is a tool that allows you to interact with PyMOL using natural language.")

openai_llms = ['gpt-4o', 'gpt-4-turbo', 'gpt-3.5-turbo']
claude_llms = ['claude-3-5-sonnet-20240620', 'claude-3-sonnet-20240229', 'claude-3-haiku-20240307', 'claude-3-opus-20240229']
chatmol_llms = ['chatlite']

introduction_of_models = {
    'gpt-4o': "GPT-4o (“o” for “omni”) is OpenAI's most advanced model. It has the same high intelligence as GPT-4 Turbo but is much more efficient: it generates text 2x faster and is 50% cheaper.",
    'gpt-4-turbo': "GPT-4 can solve difficult problems with greater accuracy than any of OpenAI's previous models, thanks to its broader general knowledge and advanced reasoning capabilities.",
    'gpt-3.5-turbo': "GPT-3.5 Turbo models can understand and generate natural language or code and have been optimized for chat.",
    'chatlite': "A model provided by ChatMol, freely available to all. It is optimized for PyMOL command generation but not well suited to general chat.",
    'claude-3-5-sonnet-20240620': "Anthropic's most intelligent model, combining top-tier performance with improved speed. Currently the only model in the Claude 3.5 family.\n - Advanced research and analysis\n - Complex problem-solving\n - Sophisticated language understanding and generation\n - High-level strategic planning",
    'claude-3-sonnet-20240229': "Balances intelligence and speed for high-throughput tasks.\n - Data processing over vast amounts of knowledge\n - Sales forecasting and targeted marketing\n - Code generation and quality control",
    'claude-3-haiku-20240307': "Near-instant responsiveness that can mimic human interactions.\n - Live support chat\n - Translations\n - Content moderation\n - Extracting knowledge from unstructured data",
    'claude-3-opus-20240229': "Strong performance on highly complex tasks, such as math and coding.\n - Task automation across APIs and databases, and powerful coding tasks\n - R&D, brainstorming and hypothesis generation, and drug discovery\n - Strategy, advanced analysis of charts and graphs, financials and market trends, and forecasting"
}

if "ps" not in st.session_state:
    st.session_state["cm"] = cm
    if st.button("Start PyMOL"):
        st.session_state["ps"] = cm.start_pymol_gui()

if "available_llms" not in st.session_state:
    st.session_state["available_llms"] = []
    if st.session_state["cm"].defaul_client.client is not None:
        st.session_state["available_llms"].extend(openai_llms)
    st.session_state["available_llms"].extend(chatmol_llms)
    if st.session_state["cm"].defaul_client.client_anthropic is not None:
        st.session_state["available_llms"].extend(claude_llms)

if "llm" not in st.session_state:
    st.session_state["llm"] = ''
    
st.session_state["llm"] = st.sidebar.selectbox("Select LLM", st.session_state["available_llms"])

st.sidebar.write(introduction_of_models.get(st.session_state["llm"], "No introduction available"))

if st.session_state["llm"] in openai_llms+claude_llms:
    if st.sidebar.button("check api availability"):
        with st.spinner("Checking..."):
            results = st.session_state["cm"].defaul_client.test_api_access()
        for k, v in results.items():
            if v:
                st.sidebar.info(v)
            else:
                st.sidebar.info(f"{k.split('_')[0]} is available")

if "messages" not in st.session_state:
    st.session_state['messages'] = []

for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.write(message["content"])

if prompt := st.chat_input("What is up?"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    with st.spinner("Thinking..."):
        pymol_console = st.session_state["ps"].pymol_console
        if prompt.endswith("?"):
            if st.session_state["llm"] in openai_llms:
                response = st.session_state["cm"].chat_with_gpt(f"This is the log: \n\n{st.session_state['ps'].pymol_console}\n\n. This is my question: \n\n{prompt}")
            elif st.session_state["llm"] in claude_llms:
                response = st.session_state["cm"].chat_with_claude(f"This is the log: \n\n{st.session_state['ps'].pymol_console}\n\n. This is my question: \n\n{prompt}")
            elif st.session_state["llm"] in chatmol_llms:
                response = st.session_state["cm"].chatlite(f"Instruction: {prompt}")
        else:
            if st.session_state["llm"] in openai_llms:
                response = st.session_state["ps"].chatgpt(f"This is the log: \n\n{st.session_state['ps'].pymol_console}\n\n. This is my instruction: \n\n{prompt}")
            elif st.session_state["llm"] in claude_llms:
                response = st.session_state["ps"].claude(f"This is the log: \n\n{st.session_state['ps'].pymol_console}\n\n. This is my instruction: \n\n{prompt}")
            elif st.session_state["llm"] in chatmol_llms:
                response = st.session_state["ps"].chatlite(f"Instruction: {prompt}")
    
        st.session_state.messages.append({"role": "assistant", "content": response})
    with st.chat_message("assistant"):
        st.write(response)
+7 −6
@@ -7,6 +7,7 @@ from pymol import cmd
import http.server

class PyMOLCommandHandler(http.server.BaseHTTPRequestHandler):

    def __init__(self):
        
        from http import HTTPStatus
@@ -74,7 +75,7 @@ stashed_commands = []
# Save API Key in ~/.PyMOL/apikey.txt
API_KEY_FILE = os.path.expanduser('~')+"/.PyMOL/apikey.txt"
OPENAI_KEY_ENV = "OPENAI_API_KEY"
GPT_MODEL = "gpt-3.5-turbo-1106"
GPT_MODEL = "gpt-4o"
client = None

def set_api_key(api_key):
@@ -107,9 +108,9 @@ def load_api_key():
        print("API key loaded from environment variable.")
    return client
    
def update_model(mdoel_name):
def update_model(model_name):
    global GPT_MODEL
    GPT_MODEL = mdoel_name
    GPT_MODEL = model_name
    print("Model updated to: ", GPT_MODEL)
    return "Model updated to: " + GPT_MODEL

@@ -120,7 +121,7 @@ def chat_with_gpt(message, max_history=10):

    try:
        messages = [
            {"role": "system", "content": "You are an AI language model specialized in providing command line code solutions related to PyMOL. Generate clear and effective solutions in a continuous manner. When providing demos or examples, try to use 'fetch' if object name is not provided. Prefer academic style visualizations. Code within triple backticks, comment and code should not be in the same line."}
            {"role": "system", "content": "You are an AI language model specialized in providing command line code solutions related to PyMOL. Generate clear and effective solutions in a continuous manner. You think step-by-step before you conclude correctly. When providing demos or examples, try to use 'fetch' if object name is not provided. Prefer academic style visualizations. Code within triple backticks, comment and code should not be in the same line."}
        ]

        # Keep only the max_history latest exchanges to avoid making the conversation too long
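The trimming step described by the comment above can be sketched in plain Python (a hypothetical helper, not the plugin's actual implementation): keep the system prompt and only the most recent `max_history` exchanges, where one exchange is a user message plus an assistant reply.

```python
def trim_history(system_messages, history, max_history=10):
    # One exchange = user message + assistant reply, i.e. two entries,
    # so slice off all but the last 2 * max_history history entries.
    return system_messages + history[-2 * max_history:]
```

With `max_history=10`, a 30-message history is cut down to its last 20 entries, so the final messages list stays short enough for the model's context window while the system prompt is always preserved at the front.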