ML&DL and LLM

Try it

이반&핫버드 2025. 1. 17. 10:56

Goal:

- Run an LLM-based chatbot

- Use a local LLM and a vector store

- Run everything in Docker containers


Steps

 

1. Install Linux (Ubuntu) on Windows
Windows + WSL
- Linux environment on Windows: Ubuntu
  * (Search keyword) windows wsl ubuntu

2. Connect to Linux
SSH/telnet client
- e.g. PuTTY, MobaXterm
  * (Search keyword) putty or mobaxterm

3. Install Docker
- Install Docker on the Ubuntu (WSL) environment
  * (Search keyword) install docker on ubuntu

4. Prepare Docker containers
- Download (pull) the container images
  * (Search keyword) docker container download

5. Set up containers
5-1. for application -> download & run an Ubuntu container [Container-Dev]
  - set up the Python environment
  * (Search keyword) download docker images for ubuntu with python

5-2. for Ollama      -> download & run the Ollama container [Container-LLM] ***
  * (Search keyword) docker ollama

5-3. for Chroma      -> download & run the Chroma container [Container-VDB] ***
  * (Search keyword) docker chroma
  (A quick connectivity check from [Container-Dev] to both containers is sketched right after this step.)
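
Once the three containers are up, a short Python script run inside [Container-Dev] can confirm that the Ollama and Chroma services are reachable. This is only a sketch, not part of the original steps: the host names container-llm / container-vdb and the ports 11434 / 8000 are assumptions and must match your actual Docker network names or published ports.

```python
# Minimal connectivity check, run inside [Container-Dev].
# Assumptions: `pip install ollama chromadb`, and the Ollama / Chroma
# containers are reachable as "container-llm:11434" and "container-vdb:8000"
# (hypothetical names -- replace with your docker network names or host ports).
import sys

import chromadb
import ollama

print("Python:", sys.version.split()[0])

# Ollama server on [Container-LLM]: list the models it currently holds.
llm_client = ollama.Client(host="http://container-llm:11434")
print("Ollama models:", llm_client.list())

# Chroma server on [Container-VDB]: heartbeat returns a value if the server is alive.
vdb_client = chromadb.HttpClient(host="container-vdb", port=8000)
print("Chroma heartbeat:", vdb_client.heartbeat())
```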

6. Set up the dev environment (5-1)
  - VS Code setup
  - Remote SSH connection to [Container-Dev]
  * (Search keyword) vscode remote ssh

7. Prepare the LLM (5-2)
On [Container-LLM]
  - Choose an LLM model (llama3.2)
  - Pull the model & do a test run
  * (Search keyword) ollama pull, langchain**, llamaindex
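
For the test run, the starred langchain route can be exercised from [Container-Dev] in a few lines. A minimal sketch, assuming `pip install langchain-ollama`, that llama3.2 has already been pulled on [Container-LLM], and that the server is reachable at the hypothetical address http://container-llm:11434:

```python
# Minimal test run against the Ollama server via LangChain.
# Assumptions: `pip install langchain-ollama`, llama3.2 already pulled
# (e.g. `ollama pull llama3.2` on [Container-LLM]), and base_url matching
# how [Container-LLM] is exposed to [Container-Dev].
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.2",
    base_url="http://container-llm:11434",  # hypothetical host name
)

reply = llm.invoke("Say hello in one short sentence.")
print(reply.content)
```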

8. Streamlit sample (5-1)
On [Container-Dev]
  - Install Streamlit
    * (Search keyword) streamlit -> Documentation -> Installation / Develop / Deploy
  - Download the sample code
  - Run it
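
A downloaded sample can be as small as the sketch below: a Streamlit chat UI that forwards each prompt to the Ollama container. This is only an illustrative sketch, not the official sample; it assumes `pip install streamlit langchain-ollama` and the same hypothetical base_url for [Container-LLM] as above.

```python
# app.py -- minimal Streamlit chat UI backed by the Ollama container.
# Sketch only; base_url is an assumed address for [Container-LLM].
import streamlit as st
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3.2", base_url="http://container-llm:11434")

st.title("Local LLM Chatbot")

# Keep the conversation across Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the history so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# Handle a new user prompt.
if prompt := st.chat_input("Ask the local LLM something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    answer = llm.invoke(prompt).content
    st.session_state.messages.append({"role": "assistant", "content": answer})
    with st.chat_message("assistant"):
        st.write(answer)
```

Start it inside [Container-Dev] with `streamlit run app.py` and open the forwarded port (8501 by default) in a browser.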