
Commit 586c97d

Documentation Updates, Screenshots and Improvements (#7)
* Improved README and added information
* Removed stray $
* Renamed dummy file names
* codegen_ui_gradio.py comment + print cleanup

Signed-off-by: okhleif-IL <[email protected]>
1 parent bc2ff89 commit 586c97d

File tree

3 files changed (+239, -192 lines)


CodeGen/docker_compose/intel/cpu/xeon/README.md

Lines changed: 119 additions & 64 deletions
@@ -100,16 +100,90 @@ export host_ip=${your_ip_address}
 export HUGGINGFACEHUB_API_TOKEN=you_huggingface_token
 ```
 
-2. Set Netowork Proxy
+2. Set Network Proxy
 
 **If you access the public network through a proxy, set the network proxy; otherwise, skip this step**
 
 ```bash
-export no_proxy=${your_no_proxy}
+export no_proxy=${no_proxy},${host_ip}
 export http_proxy=${your_http_proxy}
 export https_proxy=${your_https_proxy}
 ```
 
+## 🚀 Build Docker Images
+
+If the Docker image you need is not yet available on Docker Hub, you can build it locally.
+
+### 1. Build the LLM Docker Image
+
+```bash
+git clone https://github.com/opea-project/GenAIComps.git
+cd GenAIComps
+docker build -t opea/llm-textgen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
+```
+
+### 2. Build the Retriever Image
+
+```bash
+docker build -t opea/retriever:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/retrievers/src/Dockerfile .
+```
+
+### 3. Build the Dataprep Image
+
+```bash
+docker build -t opea/dataprep:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/dataprep/src/Dockerfile .
+```
+
+### 4. Build the MegaService Docker Image
+
+To construct the MegaService, we use the [GenAIComps](https://github.com/opea-project/GenAIComps.git) microservice pipeline within the `codegen.py` Python script. Build the MegaService Docker image via the command below:
+
+```bash
+git clone https://github.com/opea-project/GenAIExamples
+cd GenAIExamples/CodeGen
+docker build -t opea/codegen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
+```
+
+### 5. Build the Gradio UI Image (Recommended)
+
+Build the frontend Gradio image via the command below:
+
+```bash
+cd GenAIExamples/CodeGen/ui
+docker build -t opea/codegen-gradio-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f docker/Dockerfile.gradio .
+```
+
+### 5a. Build the CodeGen React UI Docker Image (Optional)
+
+Build the React frontend Docker image via the command below.
+
+**Export the value of the public IP address of your Xeon server to the `host_ip` environment variable**
+
+```bash
+cd GenAIExamples/CodeGen/ui
+docker build --no-cache -t opea/codegen-react-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile.react .
+```
+
+### 5b. Build the UI Docker Image
+
+Build the frontend Docker image via the command below:
+
+```bash
+cd GenAIExamples/CodeGen/ui
+docker build -t opea/codegen-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile .
+```
+
+Then run the command `docker images`; you should see the following Docker images:
+
+- `opea/llm-textgen:latest`
+- `opea/retriever:latest`
+- `opea/dataprep:latest`
+- `opea/codegen:latest`
+- `opea/codegen-gradio-ui:latest` (Recommended)
+- `opea/codegen-ui:latest` (Optional)
+- `opea/codegen-react-ui:latest` (Optional)
+
 ### Start the Docker Containers for All Services
 
 CodeGen supports both the TGI and vLLM services; you can start either one.
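An aside on the build commands above: they all share one pattern, a tag, proxy build-args, a Dockerfile path, and a build context. A minimal sketch of that pattern (the image table and helper below are illustrative, not part of the repo):

```python
# Sketch: generate the docker build commands from a table, forwarding the
# proxy environment variables as --build-arg, as the README's commands do.
# IMAGES is a hypothetical subset of the images listed above.
import os
import shlex

IMAGES = {
    "opea/llm-textgen:latest": "comps/llms/src/text-generation/Dockerfile",
    "opea/retriever:latest": "comps/retrievers/src/Dockerfile",
    "opea/dataprep:latest": "comps/dataprep/src/Dockerfile",
}

def build_command(tag, dockerfile, context="."):
    """Assemble the docker build argv for one image."""
    cmd = ["docker", "build", "-t", tag]
    for var in ("https_proxy", "http_proxy"):
        cmd += ["--build-arg", f"{var}={os.environ.get(var, '')}"]
    cmd += ["-f", dockerfile, context]
    return cmd

for tag, dockerfile in IMAGES.items():
    print(shlex.join(build_command(tag, dockerfile)))
```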
@@ -139,37 +213,69 @@ docker compose --profile codegen-xeon-vllm up -d
 ```bash
 curl http://${host_ip}:8028/v1/chat/completions \
   -X POST \
-  -d '{"model": "Qwen/Qwen2.5-Coder-7B-Instruct", "messages": [{"role": "user", "content": "Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception."}], "max_tokens":32}' \
-  -H 'Content-Type: application/json'
+  -H 'Content-Type: application/json' \
+  -d '{"model": "Qwen/Qwen2.5-Coder-7B-Instruct", "messages": [{"role": "user", "content": "Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception."}], "max_tokens":32}'
 ```
 
 2. LLM Microservice
 
 ```bash
 curl http://${host_ip}:9000/v1/chat/completions \
   -X POST \
-  -d '{"query":"Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception.","max_tokens":256,"top_k":10,"top_p":0.95,"typical_p":0.95,"temperature":0.01,"repetition_penalty":1.03,"stream":true}' \
-  -H 'Content-Type: application/json'
+  -H 'Content-Type: application/json' \
+  -d '{"query":"Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception.","max_tokens":256,"top_k":10,"top_p":0.95,"typical_p":0.95,"temperature":0.01,"repetition_penalty":1.03,"stream":true}'
 ```
 
-3. MegaService
+3. Dataprep Microservice
+
+Make sure to replace the file name placeholders with your actual file names.
+
+```bash
+curl http://${host_ip}:6007/v1/dataprep/ingest \
+  -X POST \
+  -H "Content-Type: multipart/form-data" \
+  -F "files=@./file1.pdf" \
+  -F "files=@./file2.txt" \
+  -F "index_name=my_API_document"
+```
+
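The same ingest call can be made from Python. A minimal sketch, assuming the `requests` package and the same placeholder file names as the curl example (the helper itself is hypothetical):

```python
# Sketch: build the multipart fields for the dataprep ingest call above.
from pathlib import Path

def ingest_fields(paths, index_name):
    """Return (files, data) in the shape requests.post(files=..., data=...) expects."""
    files = [("files", (Path(p).name, open(p, "rb"))) for p in paths]
    data = {"index_name": index_name}
    return files, data

# Hypothetical usage (requires a running dataprep service):
# import requests
# files, data = ingest_fields(["file1.pdf", "file2.txt"], "my_API_document")
# requests.post(f"http://{host_ip}:6007/v1/dataprep/ingest", files=files, data=data)
```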
+4. MegaService
 
 ```bash
-curl http://${host_ip}:7778/v1/codegen -H "Content-Type: application/json" -d '{
-  "messages": "Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception."
-}'
+curl http://${host_ip}:7778/v1/codegen \
+  -H "Content-Type: application/json" \
+  -d '{"messages": "Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception."}'
 ```
 
-If the user wants a CodeGen service with RAG and Agents based on dedicated documentation.
+To run the CodeGen service with RAG and Agents enabled, based on an ingested index:
 
 ```bash
-curl http://localhost:7778/v1/codegen \
+curl http://${host_ip}:7778/v1/codegen \
   -H "Content-Type: application/json" \
   -d '{"agents_flag": "True", "index_name": "my_API_document", "messages": "Implement a high-level API for a TODO list application. The API takes as input an operation request and updates the TODO list in place. If the request is invalid, raise an exception."}'
 ```
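The two MegaService calls above differ only in two optional fields. A hypothetical helper that builds either JSON body, mirroring the curl payloads (including `agents_flag` passed as the string "True"):

```python
# Sketch: build the JSON body for POST /v1/codegen, with or without
# the RAG/Agents options shown in the curl examples.
import json

def codegen_payload(message, index_name=None, agents_flag=False):
    """Return the request body as a JSON string."""
    payload = {"messages": message}
    if agents_flag:
        # The curl example passes the flag as the string "True".
        payload["agents_flag"] = "True"
    if index_name is not None:
        payload["index_name"] = index_name
    return json.dumps(payload)

print(codegen_payload("Implement a TODO list API."))
print(codegen_payload("Implement a TODO list API.",
                      index_name="my_API_document", agents_flag=True))
```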
 
 
-## 🚀 Launch the UI
+## 🚀 Launch the Gradio Based UI (Recommended)
+
+To access the Gradio frontend URL, follow the steps in [this README](../../../../ui/gradio/README.md).
+
+Code Generation Tab
+
+![project-screenshot](../../../../assets/img/codegen_gradio_ui_main.png)
+
+Resource Management Tab
+
+![project-screenshot](../../../../assets/img/codegen_gradio_ui_main.png)
+
+Uploading a Knowledge Index
+
+![project-screenshot](../../../../assets/img/codegen_gradio_ui_dataprep.png)
+
+Here is an example of running a query in the Gradio UI using an index:
+
+![project-screenshot](../../../../assets/img/codegen_gradio_ui_query.png)
+
+## 🚀 Launch the Svelte Based UI (Optional)
 
 To access the frontend, open the following URL in your browser: `http://{host_ip}:5173`. By default, the UI runs on port 5173 internally. If you prefer to use a different host port to access the frontend, you can modify the port mapping in the `compose.yaml` file as shown below:
 
@@ -282,54 +388,3 @@ For example:
 - Ask question and get answer
 
 ![qna](../../../../assets/img/codegen_qna.png)
-
-## 🚀 Download or Build Docker Images
-
-Should the Docker image you seek not yet be available on Docker Hub, you can build the Docker image locally.
-
-### 1. Build the LLM Docker Image
-
-```bash
-git clone https://github.com/opea-project/GenAIComps.git
-cd GenAIComps
-docker build -t opea/llm-textgen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/llms/src/text-generation/Dockerfile .
-```
-
-### 2. Build the MegaService Docker Image
-
-To construct the Mega Service, we utilize the [GenAIComps](https://github.com/opea-project/GenAIComps.git) microservice pipeline within the `codegen.py` Python script. Build MegaService Docker image via the command below:
-
-```bash
-git clone https://github.com/opea-project/GenAIExamples
-cd GenAIExamples/CodeGen
-docker build --no-cache -t opea/codegen:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f Dockerfile .
-```
-
-### 3. Build the UI Docker Image
-
-Build the frontend Docker image via the command below:
-
-```bash
-cd GenAIExamples/CodeGen/ui
-docker build --no-cache -t opea/codegen-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile .
-```
-
-### 4. Build CodeGen React UI Docker Image (Optional)
-
-Build react frontend Docker image via below command:
-
-**Export the value of the public IP address of your Xeon server to the `host_ip` environment variable**
-
-```bash
-cd GenAIExamples/CodeGen/ui
-docker build --no-cache -t opea/codegen-react-ui:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile.react .
-```
-
-Then run the command `docker images`, you will have the following Docker Images:
-
-- `opea/llm-textgen:latest`
-- `opea/codegen:latest`
-- `opea/codegen-ui:latest`
-- `opea/codegen-gradio-ui:latest`
-- `opea/codegen-react-ui:latest` (optional)