Goals:
- Run an LLM-based chatbot
- Local LLM and vector store
- Everything runs in Docker containers
Steps
1. Install Linux (Ubuntu) on Windows
windows + wsl
- Linux environment on Windows - Ubuntu
* (Searching keyword) windows wsl ubuntu
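A minimal sketch of this step, run from an administrator PowerShell on Windows (a reboot may be required before Ubuntu is usable):
```
# Windows PowerShell (Administrator): install WSL2 with the Ubuntu distribution
wsl --install -d Ubuntu
# after the reboot, confirm the distribution is installed and running WSL2
wsl -l -v
```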
2. Connect to Linux
SSH and telnet client
- e.g. PuTTY, MobaXterm*
* (Searching keyword) putty or mobaxterm
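One detail the keyword search will surface: the Ubuntu side needs an SSH server running before PuTTY/MobaXterm can connect. A sketch, assuming stock Ubuntu package and service names:
```
# inside the WSL Ubuntu shell
sudo apt update && sudo apt install -y openssh-server
sudo service ssh start
hostname -I   # IP address to enter in PuTTY/MobaXterm (port 22)
```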
3. Install Docker
install docker
- install Docker Engine on the Ubuntu (WSL) environment
* (Searching keyword) install docker on ubuntu
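A sketch using Docker's convenience script (the official docs also describe a manual apt-repository install; either works):
```
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
# let the current user run docker without sudo (log out and back in afterwards)
sudo usermod -aG docker $USER
docker --version
```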
4. Prepare Docker container
Download container images
* (Searching keyword) docker container download
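A quick way to confirm that pulling and running images works before moving on to step 5:
```
docker run hello-world   # pulls the test image if missing and runs it
docker images            # list downloaded images
```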
5. Setup container
5-1. for application -> ubuntu container download & execute [Container-Dev]
- setup python environment
* (Searching keyword) download docker images for ubuntu with python
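A hedged sketch for [Container-Dev]; the python:3.11 image, the container name, and the 2222->22 port mapping are illustrative choices, not fixed requirements. For the Remote-SSH connection in step 6, an SSH server also has to be installed inside the container (or use VS Code's "Attach to Running Container" instead):
```
# assumed name/image/ports for the dev container
docker run -dit --name container-dev -p 2222:22 python:3.11 bash
# open a shell inside it to install extras (e.g. openssh-server for step 6)
docker exec -it container-dev bash
```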
5-2. for ollama -> ollama container download & execute [Container-LLM] ***
* (Searching keyword) docker ollama
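The CPU-only run command from the ollama Docker Hub page (add GPU flags per its docs if a GPU is available):
```
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```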
5-3. for chroma -> chroma container download & execute [Container-VDB] ***
* (Searching keyword) docker chroma
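A minimal sketch for [Container-VDB]; the container name is arbitrary, and a volume can be added for persistence (the data path inside the image depends on the Chroma version):
```
docker run -d --name chroma -p 8000:8000 chromadb/chroma
```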
6. Setup for dev (5-1)
- VS Code setup
- Remote-SSH connection to [Container-Dev]
* (Searching keyword) vscode remote ssh
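A sketch of the ~/.ssh/config entry that VS Code Remote-SSH reads on the Windows side; the host alias, port, and user here follow the assumptions made in the 5-1 sketch:
```
Host container-dev
    HostName localhost
    Port 2222
    User root
```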
7. Prepare LLM (5-2)
On [Container-LLM]
- Choose LLM model (llama3.2)
- Pull LLM model & test run
* (Searching keyword) ollama pull, langchain**, llamaindex
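A sketch of pulling and smoke-testing the model inside [Container-LLM] (container name "ollama" as in the 5-2 sketch):
```
docker exec -it ollama ollama pull llama3.2
docker exec -it ollama ollama run llama3.2 "Say hello in one sentence."
# or test the HTTP API published on port 11434
curl http://localhost:11434/api/generate -d '{"model": "llama3.2", "prompt": "Hello", "stream": false}'
```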
8. Streamlit sample (5-1)
On [Container-Dev]
- install streamlit
* (Searching keyword) streamlit -> Documentation -> Installation / Develop / Deploy
- download sample code
- Run it
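As a stand-in for the downloaded sample, a minimal chat page that forwards the prompt to [Container-LLM]; the OLLAMA_URL value assumes the port mapping from step 5-2 and may need adjusting to your container network:
```
# app.py - run with: pip install streamlit requests && streamlit run app.py
import requests
import streamlit as st

# assumption: the ollama API is reachable from [Container-Dev] at this address
OLLAMA_URL = "http://localhost:11434/api/generate"

st.title("Local LLM Chatbot")

prompt = st.chat_input("Ask something")
if prompt:
    st.chat_message("user").write(prompt)
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3.2", "prompt": prompt, "stream": False},
        timeout=120,
    )
    st.chat_message("assistant").write(resp.json().get("response", ""))
```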