diff --git a/README.md b/README.md
index 4ed7e07..f82fe63 100644
--- a/README.md
+++ b/README.md
@@ -148,22 +148,38 @@ The Python version used for this project is Python 3.10. You can follow along th
11. Verify Kubernetes is running after deployment
```bash
- kubectl get po
+ # Get the Pods
+ kubectl get po
+
+ # Get the Nodes
+ kubectl get nodes
+
+ # Get the Services
kubectl get svc
+
+ # Get the logs of a pod
+ kubectl logs llama-gke-deploy-668b58b455-fjwvq
+
+ # Describe a pod
+ kubectl describe pod llama-gke-deploy-668b58b455-fjwvq
+
+ # Check CPU usage
+ kubectl top pod llama-gke-deploy-668b58b455-fjwvq
```
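+
+    A couple of optional checks can confirm that the rollout finished and that the Service received an external IP. The deployment name below is inferred from the pod names above and may differ in your setup:
+
+    ```bash
+    # Wait until the Deployment has fully rolled out (name inferred from the pod names)
+    kubectl rollout status deployment/llama-gke-deploy
+
+    # Watch the Service until EXTERNAL-IP changes from <pending> to a real address
+    kubectl get svc -w
+    ```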
-
-
-
-
+
+
+
+
-12. Under svc the external ip is the endpoint (34.65.157.134), that can be added in the streamlit app
+12. Under `svc`, the EXTERNAL-IP (here 34.65.157.134) is the endpoint that can be used in the Streamlit app
-
-
-
-
-13. Check some pods and logs
+13. Set the endpoint in the Streamlit app
+    ```python
+    # Set the FastAPI endpoint (the service's external IP from the previous step)
+    FASTAPI_ENDPOINT = "http://34.65.157.134:8000/query/"
+    ```
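+
+    As a quick sanity check, the endpoint can also be called directly before wiring it into Streamlit. The JSON field name below is an assumption; match it to the request schema of the FastAPI `/query/` route:
+
+    ```bash
+    # Hypothetical request body -- adjust the field name to your FastAPI schema
+    curl -X POST "http://34.65.157.134:8000/query/" \
+         -H "Content-Type: application/json" \
+         -d '{"query": "What is Kubernetes?"}'
+    ```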
+
+14. Check pod logs and resource usage
```bash
kubectl logs llama-gke-deploy-668b58b455-fjwvq
@@ -171,7 +187,7 @@ The Python version used for this project is Python 3.10. You can follow along th
kubectl top pod llama-gke-deploy-668b58b455-8xfhf
```
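+
+    If a pod is misbehaving, streaming its logs and listing recent cluster events usually points to the cause. The pod name is the one shown by `kubectl get po`:
+
+    ```bash
+    # Stream logs from a pod (use a name from `kubectl get po`)
+    kubectl logs -f llama-gke-deploy-668b58b455-fjwvq
+
+    # Recent events, newest last -- handy for scheduling or image-pull issues
+    kubectl get events --sort-by=.metadata.creationTimestamp
+    ```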
-14. Clean up to avoid costs deleting the cluster and the docker image
+15. Clean up to avoid costs by deleting the cluster and the Docker image
```bash
gcloud container clusters delete llama-gke-cluster --zone=europe-west6-a