...
Inference requests can also be submitted to the LLM service from a notebook by running the Python-based API client in the notebook environment and providing the request and model details, as sketched below.
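The following is a minimal sketch of that workflow in a notebook cell. The client package name (`llm_service_client`), class name (`LLMServiceClient`), service URL, model identifier, and method signature are all hypothetical placeholders used for illustration; substitute the actual client and connection details for your deployment.

```python
# Hypothetical sketch: submit an inference request from a notebook cell.
# The package, class, URL, model name, and parameters below are placeholders,
# not the actual API of the LLM service.
from llm_service_client import LLMServiceClient  # hypothetical client package

# Point the client at the LLM service (placeholder URL).
client = LLMServiceClient(base_url="https://llm-service.example.com")

# Provide the request (prompt) and model details, then submit the inference request.
response = client.generate(
    model="example-model",                     # which deployed model to target
    prompt="Summarize the quarterly report.",  # the inference request itself
    max_tokens=256,                            # optional generation parameter
)

print(response.text)
```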
...
...
...